Automating MySQL Backups

This script has since been heavily overhauled here (5/15/2017). This version is left here for history.

I recently began having a lot of trouble with my hosting company: the site going down, the database server not running for days on end, shell access disappearing, unanswered support emails, the works. So a week or so ago, I switched to a new hosting company, and so far things have been great. One thing I used a lot was shell scripting and cron jobs. When the server was up, they worked well.

In moving to the new company there were the usual things to get used to: different path names, incompatible PHP versions, that kind of thing. One new wrinkle was that the new host puts every MySQL database on its own virtual server, so with four databases initially, I had four database servers and four users. I had been using phpMyBackupPro for backups for about three years, since their 1.5 or 1.6 release. It is a great program for backing up, emailing, or FTP’ing database dumps. Unfortunately, the one thing it can’t do is multiple servers. Multiple databases, yes; servers, no. Bummer. I decided to see what it would take to roll my own solution. Turns out, it took an hour or so. You can get the final files here (15.8k .zip) or at the bottom of the post.

My Requirements:

  • Runs with a shell script in any/most Linux distros and on a Mac
  • Extracts the databases from multiple servers or users
  • Compresses the text to conserve disk space
  • Emails me the backups for safe-keeping
  • Runs from cron so it won’t miss

I started out with the database dump. Back to ol’ mysqldump. (The names have been changed to protect the innocent; replace the parts between ** and ** with your values.)

mysqldump -h **hostname** -u **username** -p**password** -q --add-drop-table --extended-insert **database** > ~/db_backups/database.sql

Works. Great.
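One caveat worth mentioning: passing the password with -p on the command line makes it visible to anyone on the box running ps. mysqldump can read credentials from an option file instead, via --defaults-extra-file. A minimal sketch, where the file name db_backup.cnf and all the credential values are placeholders of my own invention:

```shell
# Write a hypothetical option file holding the credentials (placeholder values)
cat > db_backup.cnf <<'EOF'
[client]
host=hostname
user=username
password=password
EOF

# Keep it readable only by you
chmod 600 db_backup.cnf

# The dump would then run with no password on the command line; note that
# --defaults-extra-file must come before the other options:
# mysqldump --defaults-extra-file=db_backup.cnf -q --add-drop-table --extended-insert database > ~/db_backups/database.sql
```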

Now, I want it compressed to save space. Fine, send it through gzip. But I don’t want to have to delete the .sql files afterward, so I just won’t create them in the first place. I’ll use a named pipe instead. Below is the bash shell script; save it as “database_backup.sh”.

#!/bin/sh

# Create a pipe named 'pipe'
mkfifo pipe

# Set the input/output: compress whatever comes through the pipe, in the background
gzip -9 < pipe > ~/db_backups/**database**.sql.gz &

# Run the actual dump to the pipe
mysqldump -h **hostname** -u **username** -p**password** -q --add-drop-table --extended-insert **database** > pipe

# Save the return code of mysqldump
result=$?

# Wait until the gzip completes
wait

# Now it is safe to remove the pipe
rm pipe

# Pass mysqldump's status back to the caller (useful under cron)
exit $result

Again, works. Great.
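Since this runs unattended, it doesn't hurt to verify that the compressed dump came out intact before trusting it. gzip -t tests the archive without extracting it; here is a quick sketch using a stand-in file rather than a real dump:

```shell
# Stand-in for a real dump, just to demonstrate the check
printf 'CREATE TABLE t (id INT);\n' | gzip -9 > sample.sql.gz

# -t tests integrity without extracting; -s checks the file is non-empty
gzip -t sample.sql.gz && [ -s sample.sql.gz ] && echo "backup OK"
```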

To run for multiple servers, or different user/password settings, just repeat the “gzip -9”, “mysqldump”, and “wait” lines for each database. Each pass reattaches the same pipe to a new gzip output file, so the mechanism stays the same.
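If the list gets long, that repetition can be rolled into a loop. Below is a sketch of the pattern; the host/user/password/database tuples are placeholders, and printf stands in for the real mysqldump call so the pipe mechanics are runnable as-is:

```shell
#!/bin/sh
mkdir -p db_backups
mkfifo pipe

# One line per database: host user password database (all placeholders)
while read -r host user pass db; do
    # Reattach the pipe to a fresh gzip output for this database
    gzip -9 < pipe > db_backups/"$db".sql.gz &

    # Stand-in for:
    # mysqldump -h "$host" -u "$user" -p"$pass" -q --add-drop-table --extended-insert "$db" > pipe
    printf 'dump of %s\n' "$db" > pipe

    # Let gzip finish before reusing the pipe
    wait
done <<'EOF'
host1 user1 secret1 forum
host2 user2 secret2 blog
EOF

rm pipe
```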

Now, I want it emailed to me. No problem, dust off a PHP5 version of the PHPMailer class. Save the PHP below as “send_backups.php”.

<?php

$to_address = 'your_name@gmail.com';
$from_address = 'backups@your_domain.com';
$subject = 'MySQL Backup';
$body = 'Database backup files attached.';

require_once('class.phpmailer.php');

date_default_timezone_set('America/Denver');
define('DS', DIRECTORY_SEPARATOR);
$path = dirname(__FILE__);

// Strip stray backslashes (eregi_replace() is deprecated; stripslashes() does the job)
$body = stripslashes($body);
$subject = stripslashes($subject);

$mail = new PHPMailer();
$mail->From = $from_address;
$mail->FromName = 'Backup Agent';
$mail->Subject = $subject . ' - ' . date('Y-m-d H:i:s');
$mail->Body = $body;
$mail->AddAddress($to_address, 'Backup Recipient');

// Glob against the script's own directory, since cron won't run us from there
foreach (glob($path . DS . '*.gz') as $filename) {
    $mail->AddAttachment($filename);
}

if (!$mail->Send()) {
    echo 'Mailer error: ' . $mail->ErrorInfo . "\n";
}

In general, this creates an email object, applies properties, gathers all the .gz files, attaches them, and squirts the whole mess to you over the ether.

The last thing to do is to set the whole thing to run on a schedule. Below is my crontab entry. Save it to your crontab using “crontab -e”.

50 */2 * * * ~/db_backups/database_backup.sh
00 23 * * * /usr/local/bin/php5 ~/db_backups/send_backups.php

I have the database backup run every two hours, and the email with the latest files sent to me once a day. That’s it. I get very regular backups, more than I really use, but I also take backups with rsync to my local machine, so I always have the latest. I leave the emails in my Gmail account for about a week before I delete them. I guess I don’t really have to with 7+ GB of space, but I do.

This may not be new, the best, or the most efficient, but it does hit all my main points.

Update 8/5/2013: Just wanted to add that this script has been running flawlessly, unattended, since I installed it in 2008. It is still sending my Gmail account my compressed database backups every day.

Zip file: database_backup.zip (15.8k .zip)