Restore from S3 bucket #253

Open
Mark24Slides opened this issue Sep 21, 2023 · 2 comments
Mark24Slides commented Sep 21, 2023

Description of the feature

Add possibility to recover database backups stored in S3 bucket.

Benefits of feature

Restore from remote S3 location.

Additional context

Right now I use an additional script:
It downloads the latest S3 file matching ${SOURCE_FILE}=mysql_.*\.gz$ from ${S3_BUCKET}/${S3_PATH} to ${TEMP_LOCATION}/${TARGET_FILE}, then runs the restore command:

export AWS_ACCESS_KEY_ID=${S3_KEY_ID}
export AWS_SECRET_ACCESS_KEY=${S3_KEY_SECRET}
export AWS_DEFAULT_REGION=${S3_REGION}
export PARAM_AWS_ENDPOINT_URL=" --endpoint-url ${S3_PROTOCOL}://${S3_HOST}"

mkdir -p ${TEMP_LOCATION}

# pick the newest matching backup: with --recursive the listing lines start with the timestamp, so a reverse sort puts the latest match first; column 4 is the full object key
export LATEST_FILE=$(aws s3 ls s3://${S3_BUCKET}/${S3_PATH}/ ${PARAM_AWS_ENDPOINT_URL} --recursive | grep -E ${SOURCE_FILE} | sort -r | head -n 1 | awk '{print $4}')
aws s3 cp s3://${S3_BUCKET}/${LATEST_FILE} ${TEMP_LOCATION}/${TARGET_FILE} ${PARAM_AWS_ENDPOINT_URL}

restore ${TEMP_LOCATION}/${TARGET_FILE} ${DB_TYPE} ${DB_HOST} ${DB_NAME} ${DB_USER} ${DB_PASS} ${DB_PORT}

aws s3 cp can be used either with ${LATEST_FILE} (resolved via the ${SOURCE_FILE} grep above) or directly with ${SOURCE_FILE} when it holds an exact object key.
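
For illustration, the two variants could look like this (a minimal sketch; the fixed file name in the second copy is a made-up example, since ${SOURCE_FILE} above is a regex rather than an exact key):

# variant 1: copy the newest match resolved by the grep above
aws s3 cp s3://${S3_BUCKET}/${LATEST_FILE} ${TEMP_LOCATION}/${TARGET_FILE} ${PARAM_AWS_ENDPOINT_URL}

# variant 2: copy one known object key directly, no listing needed
aws s3 cp s3://${S3_BUCKET}/${S3_PATH}/mysql_example.sql.gz ${TEMP_LOCATION}/${TARGET_FILE} ${PARAM_AWS_ENDPOINT_URL}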

sakonn commented Dec 7, 2023

I have created a similar script. The feature would be very much appreciated.

export AWS_ACCESS_KEY_ID=$(cat ${DB01_S3_KEY_ID_FILE})
export AWS_SECRET_ACCESS_KEY=$(cat ${DB01_S3_KEY_SECRET_FILE})
export AWS_DEFAULT_REGION=${DB01_S3_REGION}
export SOURCE_FILE="mysql_.*\.gz$"
export BACKUP_LOCATION="/backup"
aws sts get-caller-identity

# without --recursive, column 4 of the listing is just the file name, so the S3 path is re-added in the copy commands below
export LATEST_FILE=$(aws s3 ls s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/ | grep -E ${SOURCE_FILE} | sort -r | head -n 1 | awk '{print $4}')
aws s3 cp s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE} ${BACKUP_LOCATION}/${TARGET_FILE}
aws s3 cp s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE}.sha1 ${BACKUP_LOCATION}/

restore ${BACKUP_LOCATION}/${TARGET_FILE} ${DB01_TYPE} ${DB01_HOST} ${DB01_NAME} ${DB01_USER} $(cat ${DB01_PASS_FILE}) ${DB01_PORT}
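
Since the .sha1 file is downloaded as well, the archive could be verified before restoring. A minimal sketch, assuming the .sha1 file holds the hash in its first field (the archive was renamed to ${TARGET_FILE} above, so sha1sum -c against the recorded file name would not work directly):

EXPECTED_SHA1=$(awk '{print $1}' ${BACKUP_LOCATION}/${LATEST_FILE}.sha1)
ACTUAL_SHA1=$(sha1sum ${BACKUP_LOCATION}/${TARGET_FILE} | awk '{print $1}')
# abort the restore if the downloaded archive does not match the published checksum
[ "${EXPECTED_SHA1}" = "${ACTUAL_SHA1}" ] || { echo "checksum mismatch, aborting"; exit 1; }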

Mark24Slides commented Jan 12, 2024

Updated for v4:

  export AWS_ACCESS_KEY_ID=${DEFAULT_S3_KEY_ID}
  export AWS_SECRET_ACCESS_KEY=${DEFAULT_S3_KEY_SECRET}
  export AWS_DEFAULT_REGION=${DEFAULT_S3_REGION}
  export DEFAULT_PARAMS_AWS_ENDPOINT_URL=" --endpoint-url ${DEFAULT_S3_PROTOCOL}://${DEFAULT_S3_HOST}"
  
  mkdir -p ${TEMP_PATH}
  
  export LATEST_FILE=$(aws s3 ls s3://${DEFAULT_S3_BUCKET}/${DEFAULT_S3_PATH}/ ${DEFAULT_PARAMS_AWS_ENDPOINT_URL} --recursive | grep -E ${SOURCE_FILE} | sort -r | head -n 1 | awk '{print $4}')
  aws s3 cp s3://${DEFAULT_S3_BUCKET}/${LATEST_FILE} ${TEMP_PATH}/${TARGET_FILE} ${DEFAULT_PARAMS_AWS_ENDPOINT_URL}
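
The snippet stops before the restore itself; assuming the restore helper still takes the same positional arguments as in the earlier comments (the DEFAULT_* database variables below are my guess at the v4 naming, not verified), the final step would look roughly like:

  # hypothetical final step, mirroring the positional restore call from the earlier comments
  restore ${TEMP_PATH}/${TARGET_FILE} ${DEFAULT_TYPE} ${DEFAULT_HOST} ${DEFAULT_NAME} ${DEFAULT_USER} ${DEFAULT_PASS} ${DEFAULT_PORT}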
