Tiny solution for automated backups: duply

3 Oct

There are many backup solutions out there. Many of them are free, and each one is great in its own way. One very popular solution is BackupPC. But sometimes these solutions are too complex. Imagine the following scenario: you have exactly one tiny server and an FTP storage, and you want to do incremental backups every night via cron. It sounds like a job for rsync. But since FTP and rsync don’t work together that well, here is another solution: duply.

Duply is a nice wrapper around duplicity. Duplicity is a command line backup tool which supports incremental backups and saves them on arbitrary file systems. It holds a local index to track changed files. Duply adds profiles to duplicity, making it less error-prone and reducing typing. There are many more cool features: if you can’t trust your remote storage, there is the possibility to encrypt your data with GnuPG before uploading it.

After installing duply and duplicity (which are part of most distributions), you must first create a profile. This is done via "duply myServer create". MyServer is the profile name; choose whatever you like. If you are logged in as root, the profile will be created in /etc/duply. Otherwise, it can be found in "~/.duply". You can now edit the profile; I suggest something like this:

Content of "/etc/duply/myServer/conf":

# this is where our files will go
TARGET='ftp://user:password@backup.example.com/myServer'

# base directory to backup
SOURCE='/'

# we disable gpg, see the docs how to use this feature
GPG_KEY='disabled'

# we keep backups max 2 months
MAX_AGE=2M

# we keep max 4 full backups
MAX_FULL_BACKUPS=4

# after 1 month, a new full backup (no incremental one) is done
MAX_FULLBKP_AGE=1M
DUPL_PARAMS="$DUPL_PARAMS --full-if-older-than $MAX_FULLBKP_AGE "

# we save the backup in 250 MiB chunks
VOLSIZE=250
DUPL_PARAMS="$DUPL_PARAMS --volsize $VOLSIZE "

Content of /etc/duply/myServer/exclude:

+ /etc
+ /home/myUser
+ /var/www
+ /srv
- /

Content of /etc/duply/myServer/pre (marked as executable):

mkdir -p /var/www/.backup
chmod 700 /var/www/.backup
/usr/bin/mysqldump --all-databases -u root -psecret > /var/www/.backup/db.sql

Content of /etc/duply/myServer/post (marked as executable):

rm -rf /var/www/.backup

This is only an example, but I think the idea is clear. Note the exclude file: the conf says we back up everything, and the exclude file then marks /etc, /home/myUser, /var/www and /srv as included and excludes /. The file is read from top to bottom until a line matches the currently processed path. That means only the directories marked with a "+" (including all their subdirectories) are backed up. The two scripts "pre" and "post" can be used to create database dumps and the like.
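The first-match rule can be sketched in plain sh. This is a hypothetical illustration, not duplicity's actual matcher; the patterns simply mirror the exclude file above:

```shell
#!/bin/sh
# Sketch of the first-match idea: the exclude list is scanned top to bottom,
# and the first line that matches the current path decides whether it is
# included (+) or excluded (-). The catch-all "*" plays the role of "- /".
match() {
  case "$1" in
    /etc|/etc/*)                 echo included ;;  # + /etc
    /home/myUser|/home/myUser/*) echo included ;;  # + /home/myUser
    /var/www|/var/www/*)         echo included ;;  # + /var/www
    /srv|/srv/*)                 echo included ;;  # + /srv
    *)                           echo excluded ;;  # - /
  esac
}

match /etc/passwd      # prints: included
match /opt/something   # prints: excluded
```

Because the catch-all line comes last, anything not claimed by an earlier "+" line falls through to "excluded" — which is exactly why "- /" must be the final line of the exclude file.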

The profile can now be used to create automated backups:

$ duply myServer backup # do a new backup
$ duply myServer verify # verify the backup
$ duply myServer purge  # list outdated backups (add --force to delete them)
$ duply myServer backup_verify_purge # do all in one

This can be done nightly via cron. To restore a deleted file, you can now do:
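A nightly run might be wired up like this — a sketch assuming a system-wide crontab in /etc/cron.d, an example schedule of 02:00, and an example log path; note the --force so that purge actually deletes outdated backups:

```
# /etc/cron.d/duply-backup — run the whole chain every night at 02:00
0 2 * * * root /usr/bin/duply myServer backup_verify_purge --force >/var/log/duply.log 2>&1
```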

$ duply myServer fetch file/i/deleted /tmp/restored_file

(Note that the source path is given relative to the backup root, without a leading slash.)
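fetch also accepts an optional age argument, so you can grab a file as it was at an earlier point in time. A sketch — the 3D age and the target path are made-up example values:

```
$ duply myServer fetch file/i/deleted /tmp/restored_file_3days_ago 3D
```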

All commands are documented in the manpage, of course.

Doing a backup is easy. Do it often and check it regularly. One day, you will be glad to have it.
