Encrypt Database Backup And Save On Google Cloud Platform

Paul Knulst

Posted on November 12, 2022


To improve my backup strategy, I added syncing encrypted backups to GCP using a cronjob, OpenSSL, and a simple Bash script. This tutorial explains how you can implement the same strategy yourself.

Introduction

After I set up my database backup using a cronjob, I was looking for a good solution to also have backups available in case of a server crash.

You can read how I created my own simple backup strategy here: https://www.paulsblog.dev/everybody-needs-backups-a-lesson-learned-the-hard-way/

My first idea was an rsync job running on my laptop that downloads the database every day. But this is not really a good solution, so I thought about one involving cloud storage. Because GCP offers $300 in free credits at the moment, I chose it over AWS.

I created a free GCP account and then created my first Storage Bucket.
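
If you prefer the command line over the cloud console, the bucket can also be created with gsutil once the SDK from the next section is installed. The bucket name and location below are placeholders:

gsutil mb -l EU gs://YOUR_BUCKET_HERE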

Installing Google Cloud SDK

After the initial setup within the cloud console was done, I performed the following steps to connect my server to Google Cloud Storage. If you are working with a Docker Swarm, this procedure has to be repeated on EVERY worker within the swarm. If you want to set up a Docker Swarm, you can read about it in another article of mine.

Downloading Google Cloud SDK

curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-353.0.0-linux-x86_64.tar.gz

Extract it

tar -xf google-cloud-sdk-353.0.0-linux-x86_64.tar.gz

Install the SDK

./google-cloud-sdk/install.sh

After the installation process is done, you have to reconnect the shell!
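
If you do not want to open a completely new terminal, reloading the shell configuration has the same effect (a minimal sketch, assuming the installer added its entries to ~/.bashrc, which it offers to do during installation):

source ~/.bashrc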

Initialize the SDK

./google-cloud-sdk/bin/gcloud init

During the initialization, I had to log in to my Google account.
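
On a headless server the browser-based login can be inconvenient. In that case gcloud can print a login URL that you open on another machine instead (a sketch; I assume an SDK version that supports this flag):

./google-cloud-sdk/bin/gcloud auth login --no-launch-browser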

Find the bucket

gsutil ls

Now the machine is connected to GCP and I can use gsutil to back up my data.
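
A quick smoke test makes sure that uploads actually work before automating anything (a small sketch; the test file and bucket name are placeholders):

echo "hello backup" > /tmp/gcs-test.txt
gsutil cp /tmp/gcs-test.txt gs://YOUR_BUCKET_HERE/
gsutil ls gs://YOUR_BUCKET_HERE/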

Encryption

Now that I had a Google connection on every server, I could start to create the backup script. But first I gathered information on how to compress the database backups and store them in an encrypted way. Google should not have my database! I know: paranoia.

I chose an encryption procedure that uses a password file on the system (which I also downloaded to my local machine). So I had to create a secure password and export the path to it into a variable that can then be used within a script.

Generate a random password in a file:

head -c 100 /dev/urandom | strings -n1 | tr -d '[:space:]' | head -c 40 >> ~/.pass

Export the file path in .bashrc to use it while I am within the shell:

echo "export PASS=~/.pass" >> ~/.bashrc && source ~/.bashrc

Make file “safe”:

chmod 400 ~/.pass

With this .pass file I can compress and encrypt a folder with:

tar czf - . | openssl enc -e -aes-256-cbc -pbkdf2 -iter 100000 -salt -out archive.enc -pass file:"$PASS"

To decrypt I have to use:

mkdir -p backup_restore && openssl enc -d -aes-256-cbc -pbkdf2 -iter 100000 -salt -in archive.enc -pass file:"$PASS" | tar zxf - --directory backup_restore
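
To verify an archive without restoring it, listing its contents is enough (a small sketch, using the same password file):

openssl enc -d -aes-256-cbc -pbkdf2 -iter 100000 -salt -in archive.enc -pass file:"$PASS" | tar tzf -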

Creating the backup shell script

The last step was to actually create the shell script that will be executed by a cronjob. After some research and testing, I ended up with this script:

#!/bin/bash
TIME=$(date +%Y-%m-%d)
HOST=$(hostname)
BACKUPSAVEPATH=/YOUR_BACKUP_STORAGE_PATH/backups_encrypted
BACKEDUPFILE=$BACKUPSAVEPATH/db-backups-$HOST-$TIME.enc
TARGET=gs://YOUR_BUCKET_HERE

# Work inside the folder that contains the raw database backups
cd /YOUR_BACKUP_STORAGE_PATH/backups || exit 1

# Abort if there is nothing to back up
if [ -z "$(ls -A .)" ]; then
   exit 1
else
  # Compress and encrypt the raw backups into the encrypted folder
  tar czf - . | openssl enc -e -aes-256-cbc -pbkdf2 -iter 100000 -salt -out "$BACKEDUPFILE" -pass file:"$PASS"
  # Sync all encrypted archives to the Google Cloud Storage bucket
  /PATH_TO_GOOGLE_SDK_FOLDER/google-cloud-sdk/bin/gsutil -m rsync -r "$BACKUPSAVEPATH" "$TARGET" || exit 1
  # Remove the raw backups after a successful upload
  rm -rf /YOUR_BACKUP_STORAGE_PATH/backups/*
fi

Important to know if you use this script:

  1. Replace YOUR_BACKUP_STORAGE_PATH with the path where you store your backups. This folder has to have two subdirectories: backups_encrypted and backups
  2. Replace YOUR_BUCKET_HERE with the bucket you want to use. Find it with gsutil ls
  3. Replace PATH_TO_GOOGLE_SDK_FOLDER with the folder in which you installed the Google Cloud SDK
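
Before wiring the script into cron, it is worth running it once by hand (a sketch; the paths and the script name are placeholders, and I assume the password file lives at /root/.pass as generated above):

chmod +x /PATH_TO_SCRIPT/SCRIPTNAME.sh
export PASS=/root/.pass
/PATH_TO_SCRIPT/SCRIPTNAME.sh && echo "Backup uploaded successfully"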

This script can now be used within a cronjob:

30 1 * * 6 export PASS=/root/.pass; /bin/sh /PATH_TO_SCRIPT/SCRIPTNAME.sh

This cronjob is executed at 1:30 AM every Saturday. Use crontab.guru to create your own schedule.
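
Because the password file lives under /root, the line belongs in the root user's crontab. Appending it non-interactively would look roughly like this (a sketch; crontab -e works just as well):

(crontab -l 2>/dev/null; echo '30 1 * * 6 export PASS=/root/.pass; /bin/sh /PATH_TO_SCRIPT/SCRIPTNAME.sh') | crontab -
crontab -l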

VERY IMPORTANT: You have to export the PASS variable before executing the script because a cronjob is not executed within the same shell session as your user login, so the previously exported PASS variable does not exist there.

ALSO VERY IMPORTANT: Remember to recreate the cronjob on every worker node within the swarm (if you use a Docker Swarm). Otherwise, backups from those nodes will be lost.
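
A small loop can save some manual work when distributing the script and the cronjob to all workers (a minimal sketch, assuming root SSH access; the hostnames, paths, and script name are placeholders):

for node in worker1 worker2; do
  scp /PATH_TO_SCRIPT/SCRIPTNAME.sh root@"$node":/PATH_TO_SCRIPT/
  ssh root@"$node" '(crontab -l 2>/dev/null; echo "30 1 * * 6 export PASS=/root/.pass; /bin/sh /PATH_TO_SCRIPT/SCRIPTNAME.sh") | crontab -'
done

Each worker still needs its own Google Cloud SDK setup and ~/.pass file, as described above.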

Closing Notes

In this simple how-to, I explained how I use the Google Cloud SDK to save an encrypted archive in my Google Cloud Storage bucket.

I have another tutorial that explains how you can upload files to Amazon S3. You can copy the encryption part from this tutorial and combine it with the file upload to S3: https://www.paulsblog.dev/how-to-backup-your-files-to-aws-s3/

I hope you find this article helpful and are also able to save your valuable backups in the cloud. I would love to hear your thoughts, and if you have any questions, please contact me.

This article was originally published on my blog at https://www.paulsblog.dev/encrypt-database-backup-and-save-on-google-cloud-platform/

Feel free to connect with me on my personal blog, Medium, LinkedIn, Twitter, and GitHub.


Did you find this article valuable? Want to support the author? (... and support development of current and future tutorials!). You can sponsor me on Buy Me a Coffee or Ko-Fi. Furthermore, you can become a free or paid member by signing up to my website. See the contribute page for all (free or paid) ways to say thank you!


Photo by panumas nikhomkhai / Pexels
