How to backup MySQL / MariaDB database to Amazon S3, shell script example
This article shows a shell script to backup or export a MySQL / MariaDB database, gzip the exported file, and upload the gzipped backup file to Amazon S3.
Table of contents
- 1. Install AWS CLI
- 2. Backup MySQL/MariaDB database to S3 (Shell Script)
- 3. How to run the backup script?
- 4. Run the backup script every day
- 5. Backup files on Amazon S3
- 6. References
1. Install AWS CLI
We must first ensure the AWS CLI is installed on the system. Type aws --version to check whether it is available.
Terminal
$ aws --version
aws-cli/1.17.0 Python/2.7.16 Linux/4.x-amd64 botocore/1.14.0
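If the command is missing, a quick check-and-hint can be scripted. This is a sketch, not from the original article; it assumes Python and pip are available for a user-level install.

```shell
# Check for the aws binary; print a hint about a user-level pip install
# if it is missing (assumption: Python/pip are available on the system).
if command -v aws >/dev/null 2>&1; then
  aws --version
else
  echo "AWS CLI not found. Install it with: pip install --user awscli"
fi
```

After installing, run aws configure to store the access key, secret key, and default region.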
2. Backup MySQL/MariaDB database to S3 (Shell Script)
Below is a shell script that dumps the database with mysqldump, compresses the output with gzip into a folder, and then uses the aws command to upload the gzipped file to Amazon S3.
backup-mysql.sh
#!/bin/bash
################################################################
##
## MySQL Database Backup To Amazon S3
## Written By: YONG MOOK KIM
## https://www.mkyong.com/mysql/mysql-backup-and-restore-a-database-or-table/
## https://www.mkyong.com/linux/how-to-zip-unzip-tar-in-unix-linux/
## https://mkyong.com/linux/linux-script-to-backup-mysql-to-amazon-s3/
##
## Daily backup, At 01:30
## 30 1 * * * /home/mkyong/scripts/backup-mysql.sh > /dev/null 2>&1
################################################################
NOW=$(date +"%Y-%m-%d")
NOW_TIME=$(date +"%Y-%m-%d %T %p")
NOW_MONTH=$(date +"%Y-%m")
MYSQL_HOST="localhost"
MYSQL_PORT="3306"
MYSQL_DATABASE="wordpress"
MYSQL_USER="user"
MYSQL_PASSWORD="password"
BACKUP_DIR="/home/mkyong/backup/$NOW_MONTH"
BACKUP_FULL_PATH="$BACKUP_DIR/$MYSQL_DATABASE-$NOW.sql.gz"
AMAZON_S3_BUCKET="s3://mkyong/backup/linode/mysql/$NOW_MONTH/"
AMAZON_S3_BIN="/home/mkyong/.local/bin/aws"
#################################################################
mkdir -p ${BACKUP_DIR}
backup_mysql(){
    mysqldump -h ${MYSQL_HOST} \
        -P ${MYSQL_PORT} \
        -u ${MYSQL_USER} \
        -p${MYSQL_PASSWORD} ${MYSQL_DATABASE} | gzip > ${BACKUP_FULL_PATH}
}

upload_s3(){
    ${AMAZON_S3_BIN} s3 cp ${BACKUP_FULL_PATH} ${AMAZON_S3_BUCKET}
}
backup_mysql
upload_s3
# check if any error
#if [ $? -eq 0 ]; then
# ok, no error, log and send email?
#else
# failed, got error, log and send email?
#fi;
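The commented-out error check can be fleshed out as follows. This is a sketch: set -o pipefail makes the dump pipeline report a failure when mysqldump itself fails (by default the pipeline's exit status is gzip's), and the if branches are where logging or email alerts would go. The backup function body here is a hypothetical stand-in for the real mysqldump pipeline.

```shell
#!/bin/bash
# Without pipefail, the pipeline's exit status is gzip's, so a failed
# mysqldump would go unnoticed.
set -o pipefail

backup_mysql() {
  # hypothetical stand-in for the real "mysqldump ... | gzip > file" pipeline
  true | gzip > /dev/null
}

if backup_mysql; then
  echo "backup OK - log and/or send a success email here"
else
  echo "backup FAILED - log the error and send an alert" >&2
  exit 1
fi
```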
3. How to run the backup script?
We must assign execute (+x) permission to the file before we can run the shell script.
Terminal
# assign execute permission to the file
$ chmod +x backup-mysql.sh
# run the backup script
$ ./backup-mysql.sh
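After a run, it is worth verifying that the dump file exists and that the gzip stream is intact. A small sketch (check_backup is a hypothetical helper; the commented path mirrors the script's naming convention):

```shell
# Succeed only if the file exists, is non-empty, and gzip can verify
# its integrity.
check_backup() {
  [ -s "$1" ] && gzip -t "$1"
}

# example, using the script's naming convention:
# check_backup "/home/mkyong/backup/$(date +%Y-%m)/wordpress-$(date +%Y-%m-%d).sql.gz"
```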
4. Run the backup script every day
We can use the cron scheduler to run the backup script daily.
Terminal
$ crontab -e
crontab
# Daily, 1am
0 1 * * * /home/mkyong/scripts/backup-mysql.sh > /dev/null 2>&1
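Redirecting to /dev/null discards all output. As a variant (not from the original article; the log path is an assumption), cron output can be appended to a log file instead, so failures leave a trace:

```shell
# Daily, 1:30am - keep stdout/stderr in a log file instead of discarding it
# (the log path is a hypothetical example)
30 1 * * * /home/mkyong/scripts/backup-mysql.sh >> /home/mkyong/backup/backup-mysql.log 2>&1
```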
5. Backup files on Amazon S3
Below are some backup files uploaded to Amazon S3.
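The uploads can also be confirmed from the command line by listing the same month-based prefix the script uploads to. A sketch; the bucket path reuses the article's example, so substitute your own, and the aws s3 ls call is left commented out because it needs configured credentials.

```shell
# Build this month's S3 prefix exactly as the backup script does.
NOW_MONTH=$(date +"%Y-%m")
S3_PREFIX="s3://mkyong/backup/linode/mysql/$NOW_MONTH/"
echo "Listing $S3_PREFIX"
# aws s3 ls "$S3_PREFIX"   # uncomment once credentials are configured
```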
Hi Yong Mook,
Thanks so much for such an excellent script.
Giving full credit to you, I took the great script and made it work with shared hosting by first installing a Python virtual environment. This allowed the AWS CLI to be installed without the restriction of the shared hosting root Python folder protections. Without this setup, the AWS CLI could not be installed on shared hosting.
- Added multi-database support
- Clean-up of local backups after X days
- Amazon S3 file tagging (using the S3 API) after upload
File tagging helps control the S3 lifecycle retention policy. S3 then takes care of file clean-up daily, weekly, monthly, or yearly, depending on the file upload folders and the set policy.
https://www.godaddy.com/community/cPanel-Hosting/Solution-MySQL-Backups-to-Amazon-S3-cPanel-Business-Hosting/td-p/148134
Your solution really helped me and I appreciate the gift of your training.
Aly
Thanks! Happy New Year! 🙂
Very good sample
if anyone finds it useful, here’s a brain-dead python script which will keep your backups from accumulating:
https://github.com/alexakarpov/s3-logrotate
Much better for this purpose: use lifecycle rules. They are available on almost any S3-compatible service.
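For reference, a lifecycle rule that expires old backups automatically can be applied with aws s3api put-bucket-lifecycle-configuration. A hedged sketch: the bucket name, prefix, and 30-day retention below are illustrative assumptions, not values from the article.

```shell
# Expire objects under backup/linode/mysql/ after 30 days
# (bucket, prefix, and retention period are illustrative assumptions).
aws s3api put-bucket-lifecycle-configuration \
  --bucket mkyong \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-old-mysql-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "backup/linode/mysql/" },
      "Expiration": { "Days": 30 }
    }]
  }'
```

With a rule like this in place, the local clean-up and manual S3 housekeeping steps become unnecessary.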