Simple Off-site Backups with GMail

After my disastrous encounter with “rm” and subsequent failure to recover any files, I resolved to set up an automated backup solution. A key component of any backup system is the off-site backup: critical files are stored in a remote location so that a disaster that wipes out local storage hopefully leaves them intact.

There are many Internet-based backup options available, but they all seemed like overkill since my requirements were simply to get one or two relatively small critical files backed up off-site. However, my GMail account currently has about 5GB of free space, so why not make use of it? Simply mailing myself the files solves the problem.

The following trivial bash script:

  • optionally checks a timestamp file to see if it should continue running
  • encrypts and compresses the file to be backed up with GPG (so that GMail won’t index its contents) using an easy-to-remember password
  • uses the venerable Mutt to build an email message with the file attached and send it to my GMail account

For automation’s sake I’ve placed this script in my crontab to execute daily, and added a filter to GMail to label the backup emails and have them skip my inbox.
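
For reference, a daily crontab entry might look like the following. The paths, the run time, and the stamp file name are illustrative, not part of the script:

[code lang="bash"]
# Run gmbackup at 02:30 every day, backing up a password database.
# The optional second argument is a stamp file: the backup is skipped
# when the stamp is newer than the file being backed up.
30 2 * * * $HOME/bin/gmbackup $HOME/.secrets.kdb $HOME/.gmbackup-stamp
[/code]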

[code lang="bash"]
#!/bin/bash
# gmbackup bakfile [testfile]
# Back up bakfile to GMail.  If a test file is supplied and it is
# newer than the file to back up then the backup will not proceed.
# If a test file is supplied then its time stamp will be updated
# after the backup is complete.
bakfile=$1
testfile=$2
emailaddr=[your Google account name]
if [ "$testfile" == "" ] || [ "$testfile" -ot "$bakfile" ]; then
    gpgfile=/tmp/$(basename "$bakfile").gpg
    subject="[backup] $(date +"%Y-%m-%d %H:%M:%S") $bakfile"
    rm -f "$gpgfile"
    /usr/bin/gpg -c --no-use-agent --passphrase [your passphrase] -o "$gpgfile" "$bakfile"
    echo "$bakfile backup for $(date)" | /usr/bin/mutt -s "$subject" -a "$gpgfile" -- $emailaddr
    rm "$gpgfile"
    if [ "$testfile" != "" ]; then
        /usr/bin/touch "$testfile"
    fi
fi
[/code]

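To restore, download the attachment from the backup email and decrypt it with the same passphrase. The sketch below round-trips a throwaway file to show both directions; the file names and the passphrase “example” are illustrative, and on GnuPG 2.1 and later the loopback pinentry mode is needed to pass a passphrase non-interactively (the script’s --no-use-agent flag is a GnuPG 1.x option):

[code lang="bash"]
# Encrypt a scratch file the way the script does (symmetric cipher),
# then decrypt it as you would a downloaded backup attachment.
printf 'critical data\n' > /tmp/demo.txt
gpg --batch --yes --pinentry-mode loopback -c --passphrase example \
    -o /tmp/demo.txt.gpg /tmp/demo.txt
gpg --batch --yes --pinentry-mode loopback -d --passphrase example \
    -o /tmp/demo.out /tmp/demo.txt.gpg
cmp /tmp/demo.txt /tmp/demo.out && echo "round trip OK"
[/code]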