Roman Matkivskyy
Posted on April 28, 2021
This is a simple guide, with examples, showing how to set up a cron job to back up both MongoDB and PostgreSQL databases in an efficient way.
All backups are encrypted with 7z and have an expiration time.
Prerequisites
- Linux based server
- p7zip-full, curl, mongodb-tools and postgresql-client installed
- Some S3 compatible storage (in this guide I've used OVH Object Storage)
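On a Debian/Ubuntu based server these can be installed with apt (package names vary between distributions; for instance, the MongoDB tools may ship as mongodb-database-tools when taken from MongoDB's own repository):

sudo apt-get update
sudo apt-get install -y p7zip-full curl mongodb-tools postgresql-client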
MongoDB backup
With the help of the mongodump tool we can easily dump and compress an entire database, then pipe the stdout to 7z to encrypt the dump with a password.
mongodump \
--uri="${db_host}/${db_name}?authSource=admin" \
--archive \
--gzip \
| 7z a -si -t7z -p${password} ${file_name}
db_host
> URI to connect to MongoDB
db_name
> name of the database to backup
password
> password used to encrypt (and also decrypt) the 7z file
file_name
> the output file (preferably with a *.7z extension)
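Restoring works in reverse: 7z can extract the encrypted archive to stdout (the x command with -so), and mongorestore reads the archive from stdin. A minimal sketch, assuming we restore to the same host and that mongorestore recreates the dumped namespaces:

# decrypt the archive and stream it back into mongorestore
7z x -so -p${password} ${file_name} \
| mongorestore \
  --uri="${db_host}/?authSource=admin" \
  --archive \
  --gzip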
PostgreSQL backup
With the help of the pg_dump tool we can easily dump and compress an entire database, then pipe the stdout to 7z to encrypt the dump with a password.
pg_dump \
--dbname=${db_host}/${db_name} \
| 7z a -si -t7z -p${password} ${file_name}
db_host
> URI to connect to PostgreSQL
db_name
> name of the database to backup
password
> password used to encrypt (and also decrypt) the 7z file
file_name
> the output file (preferably with a *.7z extension)
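Here too the restore is just the pipe reversed: pg_dump emits plain SQL by default, so the decrypted stream can be fed straight to psql. A minimal sketch, assuming the target database already exists:

# decrypt the dump and replay the SQL against the target database
7z x -so -p${password} ${file_name} \
| psql --dbname=${db_host}/${db_name}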
Upload to S3 compatible service
Now we have the compressed and encrypted files ready to be sent to an S3 storage. We can achieve this with curl; here is an example of a PUT request with ${file_name} as the body:
curl -i ${bucket}/${file_name} \
-X PUT \
--data-binary "@${file_name}" \
-H "Content-Type: application/x-7z-compressed" \
-H "X-Auth-Token: ${token}" \
-H "X-delete-after: ${delete_after}"
In this example I've used OVH Object Storage.
bucket
> complete URI to your bucket
file_name
> filename under which the uploaded file will be saved
token
> with OVH we have to add the X-Auth-Token header to authenticate the request
delete_after
> this is an optional header; we can have the file deleted automatically after a given number of seconds
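Retrieving a backup later is a plain GET on the same URL with the same auth header; for example, to download the archive and then decrypt it:

# download the encrypted backup from the bucket
curl -o ${file_name} \
  -H "X-Auth-Token: ${token}" \
  ${bucket}/${file_name}
# extract it with the same password used to create it
7z x -p${password} ${file_name}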
Automation
These three commands are the core of our backup script; now we can wrap them in some bash logic and put everything in a file.
#!/bin/bash
# fail the whole pipeline if any command in it fails (so a failed mongodump is detected)
set -o pipefail

# pass args to script
if [ $# -eq 0 ]; then
  echo "Missing required arg: --db_name"
  exit 1
fi

while [ $# -gt 0 ]; do
  case "$1" in
    --db_name=*)
      db_name="${1#*=}"
      ;;
    --delete_after=*)
      delete_after="${1#*=}"
      ;;
    *)
      printf "***************************\n"
      printf "* Error: Invalid argument.*\n"
      printf "***************************\n"
      exit 1
      ;;
  esac
  shift
done

if [ -z "${delete_after}" ]; then
  delete_after=$((60 * 60 * 24 * 30)); # one month
fi

# db env
db_host=mongodb://user:password@host:port
#db_name is passed as an arg
file_name=${db_name}_$(date +%d-%m-%Y_%H-%M-%S).gzip.7z;

# ovh env
bucket=bucket_url;
token=personal_token;
password=password;

# Create an encrypted backup
if mongodump \
  --uri="${db_host}/${db_name}?authSource=admin" \
  --archive \
  --gzip \
  | 7z a -si -t7z -p"${password}" "${file_name}"; then
  echo 'mongodb dump created'
else
  echo 'mongodump returned a non-zero code'
  exit 1;
fi

content_length=$(wc -c < "${file_name}");

# send file to ovh swift
http_response=$(curl -i ${bucket}/${file_name} \
  -X PUT \
  --data-binary "@${file_name}" \
  -H "Content-Type: application/x-7z-compressed" \
  -H "Content-Length: ${content_length}" \
  -H "X-Auth-Token: ${token}" \
  -H "X-Delete-After: ${delete_after}" \
  | head -n 1 | cut -d ' ' -f2);

if [ "$http_response" = 201 ] || [ "$http_response" = 100 ]; then
  echo "complete";
  rm "${file_name}";
else
  echo "error sending file to ovh";
  exit 1;
fi
This file accepts two args:
db_name
> mandatory, name of the db to backup
delete_after
> optional, after how many seconds the uploaded file should be deleted
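The script above targets MongoDB; a PostgreSQL version only needs the dump step swapped out. A minimal sketch of the replacement block, assuming db_host is changed to a postgresql:// URI (the pipefail setting still reports a failed pg_dump):

# Create an encrypted backup (PostgreSQL variant)
if pg_dump \
  --dbname=${db_host}/${db_name} \
  | 7z a -si -t7z -p"${password}" "${file_name}"; then
  echo 'postgresql dump created'
else
  echo 'pg_dump returned a non-zero code'
  exit 1;
fi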
Example of how to execute it manually:
bash script_name.sh --db_name=db_to_backup
CRON
With crontab we can schedule when to execute the backup.
Add this row to the
crontab -e
file to run the script every night at 2:00am and save the result to backup_script.logs:
0 2 * * * ~/script_name.sh >> ~/backup_script.logs 2>&1
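The same args can be passed from cron too; for example, an extra weekly run on Sunday at 3:00am whose upload is kept for roughly one year (365 * 24 * 3600 = 31536000 seconds):

0 3 * * 0 ~/script_name.sh --db_name=db_to_backup --delete_after=31536000 >> ~/backup_script.logs 2>&1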