Backups

WordPress is a content management system, which means most of the content lives in the database or in files uploaded to the server itself (such as images). If something happens to the server, we can quickly recover the code because it is stored in Bitbucket, but all of the content would be gone. To mitigate this risk, we take automated backups of the database and the WordPress uploads directory (wp-content/uploads) once a day. If the server ever gets corrupted or otherwise loses data, the backup files can be used to restore the content for the WordPress site.

Note that the entire database is backed up, but the uploads folder is only backed up for production (i.e. bridgeschool.org).

Storage of backups

As part of the backup scripts, the archived data is uploaded to the Bridge School AWS account and stored in an S3 bucket named bridgeschool-archives. The S3 integration requires the AWS CLI to be installed on the server and AWS credentials to be configured in a Boto profile.
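
Once the setup steps below are complete, a quick way to confirm that the server can reach the bucket is to list its contents (a sanity check only, not part of the backup process):

aws s3 ls s3://bridgeschool-archives/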

Update PATH

First you will need to update the server PATH variable. Log in to the server as the bridge user and create or edit the ~/.bash_profile file. Ensure the following line exists in the file:

export PATH=~/.local/bin:$PATH
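
For the change to take effect in the current session, you can re-source the profile and confirm that ~/.local/bin now appears in the PATH (a quick sanity check, not a required step):

source ~/.bash_profile
echo $PATH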

Install Pip

Pip should be installed on the server by default, but if it's not, you can install it by logging in to the server as the bridge user and running the following (yum normally requires root privileges, so these commands may need to be prefixed with sudo):

yum update
yum install python-pip

When pip is installed, the pip --version command should return the version.

Install awscli

Once pip is installed, you can log in to the server as bridge and install the AWS CLI with:

pip install awscli --upgrade --user
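
To confirm the install worked (and that the PATH change above is in effect), check the reported version:

aws --version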

Install boto

boto provides Python access to AWS and uses the same ~/.aws/credentials file as the AWS CLI. Log in to the server as bridge and install boto with:

pip install boto
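
A quick way to confirm boto installed correctly is to import it from Python; if the command exits without an error, the install succeeded (a sanity check only):

python -c "import boto"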

Set up credentials

Once boto is installed, log in to the server as bridge and create or edit the ~/.aws/credentials file. It should look like this:

[default]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = ACCESS_KEY

NOTE: you will need to replace ACCESS_KEY_ID and ACCESS_KEY with the appropriate values. You can find them in KeePass by searching for "S3 backups".
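
To verify that the CLI picks up these credentials, you can ask AWS which identity it is authenticated as (assuming the installed awscli version supports the sts subcommand):

aws sts get-caller-identity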

Cron scripts

The backups are created by two cron scripts that run as the bridge user on the server. Because the two scripts are not stored in Bitbucket, they are reproduced here in case they ever need to be recreated.

Log in to the server as the bridge user and edit the crontab with crontab -e. It should contain the following lines:

0   6   *   *   *   /bin/bash /var/www/vhosts/bridgeschool.org/scripts/backup_db_to_s3.sh
5   6   *   *   *   /bin/bash /var/www/vhosts/bridgeschool.org/scripts/backup_wp_to_s3.sh
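
The first five fields are the standard cron schedule (minute, hour, day of month, month, day of week), so the database backup runs at 06:00 and the uploads backup at 06:05 every day. To check the installed entries without opening the editor, list the crontab:

crontab -l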

Backup script for WordPress uploads

This is the script that runs daily to create a compressed tar archive of the WordPress uploads folder and back it up to S3. If it is ever deleted, this script should be recreated at /var/www/vhosts/bridgeschool.org/scripts/backup_wp_to_s3.sh.

#!/bin/bash

# Source .bash_profile to ensure PATH is set
source ~/.bash_profile

# Set vars
NOW=$(date "+%Y-%m-%d")
UPLOADS_PATH=/var/www/vhosts/bridgeschool.org/httpdocs/wp-content/uploads
FILENAME="bs_wp_$NOW.tar.gz"
SCRIPTS_DIR='/var/www/vhosts/bridgeschool.org/scripts'

# Archive the uploads folder
tar -zcvf "$SCRIPTS_DIR/tmp/$FILENAME" "$UPLOADS_PATH"

# Upload to S3
aws s3 cp "$SCRIPTS_DIR/tmp/$FILENAME" "s3://bridgeschool-archives/wp/$FILENAME"
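
If the uploads folder ever needs to be restored from one of these archives, the reverse of the backup is a copy down from S3 followed by extracting the tarball. A sketch only (the date in the filename is an example; because tar strips the leading slash when archiving an absolute path, extracting from / puts the files back in their original location):

# Download a backup archive from S3 (example date)
aws s3 cp s3://bridgeschool-archives/wp/bs_wp_2020-01-01.tar.gz .

# Extract back to the original location
tar -xzvf bs_wp_2020-01-01.tar.gz -C /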

Backup script for database

This is the script that runs daily to dump the MySQL database and back it up to S3. If it is ever deleted, this script should be recreated at /var/www/vhosts/bridgeschool.org/scripts/backup_db_to_s3.sh.

#!/bin/bash

# Source .bash_profile to ensure PATH is set
source ~/.bash_profile

# Set vars
NOW=$(date "+%Y-%m-%d")
SCRIPTS_DIR='/var/www/vhosts/bridgeschool.org/scripts'
FILENAME="bs_db_$NOW.sql"

# Take DB snapshot and push to S3
# For select databases, instead of --all-databases flag, use
# --databases bridge bridge_stage wordpress_dev etc
mysqldump --host=127.0.0.1 \
  --user=backups \
  --password="DM*VJNsb{(bx" \
  --all-databases > "$SCRIPTS_DIR/tmp/$FILENAME"

# Copy the file up to S3
aws s3 cp "$SCRIPTS_DIR/tmp/$FILENAME" "s3://bridgeschool-archives/sql/$FILENAME"
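
To restore the database from one of these dumps, copy the snapshot back down from S3 and replay it with the mysql client. A sketch only (the date in the filename is an example, and the restore must run as a MySQL user with write privileges; the backups user used for dumps may not have them):

# Download a database dump from S3 (example date)
aws s3 cp s3://bridgeschool-archives/sql/bs_db_2020-01-01.sql .

# Replay the dump (it was created with --all-databases, so this
# restores every database it contains); ADMIN_USER is a placeholder
# for a MySQL user with sufficient privileges
mysql --host=127.0.0.1 --user=ADMIN_USER -p < bs_db_2020-01-01.sql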