How to Automatically Back Up Ghost


Having a recent backup is obviously very important, so we put the following scripts together to automate our backup process. The first script runs on our DigitalOcean droplet and is executed every hour by a cron job.

This script is responsible for backing up our SQLite3 database. On our DigitalOcean droplet we use pm2 to start and monitor Ghost, so the script temporarily stops Ghost, makes a copy of the database, and then starts Ghost again. Lastly, the script prunes old backups to save disk space. On our Ghost themes site this backup script currently takes less than a second to run, making the stop and start of Ghost imperceptible to our end users.

At the top of the script, there are three variables that you will want to change for your environment. Change GHOST_DATABASE to the path of your Ghost database, change BACKUP_DIR to the directory that you want to back up into, and lastly change BACKUP_RETENTION_PERIOD to the number of days of backups that you want to keep on the server.

#!/bin/bash

GHOST_DATABASE=/var/www/ghost/content/data/ghost.db
BACKUP_DIR=/home/ghost/backup/allghostthemes/
BACKUP_RETENTION_PERIOD=10
LOG_FILE=/var/log/backup-allghostthemes.log
DATE=$(date '+%Y/%m/%Y-%m-%d-%H-%M')

# Make backup directory
mkdir -p "$BACKUP_DIR$DATE"

# Stop Ghost
pm2 stop ghost
echo "Ghost has been stopped - $DATE" >> $LOG_FILE

# Copy Ghost Database
cp "$GHOST_DATABASE" "$BACKUP_DIR$DATE"
echo "Ghost has been copied - $DATE" >> $LOG_FILE

# Start Ghost
cd /var/www/ghost/
pm2 start index.js --name ghost
echo "Ghost has been started - $DATE" >> $LOG_FILE

# Prune backup directory
find "$BACKUP_DIR" -type f -mtime $BACKUP_RETENTION_PERIOD -iname '*.db' -delete
find "$BACKUP_DIR" -type d -mtime +$BACKUP_RETENTION_PERIOD -delete
echo "Backup directory pruned - $DATE" >> $LOG_FILE

The crontab entry we are using for this script is:

5 * * * * /home/ghost/backup-allghostthemes.sh >> /var/log/backup-allghostthemes.log
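
This runs the backup script at five minutes past every hour and appends its output to the log file. If you are unsure where to put the entry, one option is to add it to root's crontab, since the script stops and starts Ghost with pm2 and writes to /var/log:

sudo crontab -e

and paste the entry above into the file that opens.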

The second script is what we use to pull files down from the server to have a local, offsite backup. What we are mainly interested in pulling down are the backups of the SQLite database and the images, but since we are connecting to the server anyway we also pull down the Nginx directory, for the Nginx configuration files, and the user home directories, to preserve a copy of the pm2 configuration files.

You will need to customize SERVER_USER, SERVER_URL, DESTINATION, and BACKUP_DESTINATION for your environment. You may also want to change which directories get pulled down from the server, which are listed in the rsync command as '/var/www/ /etc/nginx/ /home/ /root/'.

This script keeps a current copy of what you have on the server in DESTINATION, and any time a file is modified or deleted, the previous version is moved to BACKUP_DESTINATION with the date appended to its name.
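
For example, if the Ghost database changed on the server since the last pull, the previously downloaded copy would end up at a path along the lines of the following (the dates here are illustrative and follow the formats used in the script below):

/Users/andyboutte/Documents/backups/2016/01/2016-01-15-03/var/www/ghost/content/data/ghost.db.old-2016-01-15-03-30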

#!/bin/bash

SERVER_USER=root
SERVER_URL=YOUR_SERVERS_IP_ADDRESS
DESTINATION="/Users/andyboutte/Documents/backups/current/"
BACKUP_DESTINATION="/Users/andyboutte/Documents/backups/"
DATE=$(date '+%Y/%m/%Y-%m-%d-%H')

# Make today's backup directory
mkdir -p "$BACKUP_DESTINATION$DATE"

sudo rsync -azPbrR --stats --human-readable --progress \
        --backup-dir="$BACKUP_DESTINATION$DATE" \
        --suffix=.old-$(date '+%Y-%m-%d-%H-%M') \
        --delete \
        --delete-excluded \
        --exclude "root/.npm" \
        --exclude ".node-gyp" \
        "$SERVER_USER@$SERVER_URL":'/var/www/ /etc/nginx/ /home/ /root/' "$DESTINATION"

If you have a question about how to set this up for your environment, or any suggestions on how to improve these scripts, we would love to hear from you in the comments below.