Create custom website backups using CRON
Several weeks ago I was looking for an alternative to Media Temple for my hosting and found WebFaction. While searching the net I found a few reviews, and all of them were very positive. Because their shared hosting package looked different from the others, I signed up for an account.
Why WebFaction?
WebFaction is a pure web hosting provider offering shared hosting plans and managed dedicated Linux web servers. What makes them special is that a shared hosting account has advanced features you won’t find with other providers. The most important feature is that you get full SSH access to your Linux user directory, and processes like PHP and HTTPD are executed as your own user. This kind of account gives you many more possibilities, like a reserved amount of memory for a single process or the ability to install new software in the user directory. I don’t think WebFaction’s shared hosting is aimed at beginning webmasters, because their control panel is very basic and is mostly used to create new websites, applications, email and database accounts. That means other functions, like protecting directories or creating CRON jobs, must be set up via SSH on the command line. They create backups of all your website accounts and all databases. For most users that is enough, but I like to have real-time access to a copy of all my files.
Creating backups for your websites and applications
While this tutorial is based on my experience with a WebFaction hosting account, it should work on most other Linux accounts or servers. All you need is full access to the user’s home directory. In the past I used the backup function provided by the DirectAdmin control panel for all my hosting. That backup function (similar to the one in other control panels) creates a single zipped archive of all the user data, which is stored in a local or remote location. This type of backup generates a huge server load as sites become bigger. In this tutorial we dump our databases to a local directory and use rsync to copy our websites and database dumps to a remote location.
Creating backups from your database
First we need a database user which has at least the rights to select and lock tables.
GRANT SELECT, LOCK TABLES ON database.* TO 'yourdumpuser'@'localhost';
If you prefer a single user with full rights for easier database access, you can use this query as an alternative:
GRANT ALL ON database.* TO 'yourdumpuser'@'localhost';
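Both GRANT statements assume that the database user already exists. If it doesn’t, create it first; a minimal sketch (the password is just a placeholder):

CREATE USER 'yourdumpuser'@'localhost' IDENTIFIED BY 'yourdumppassword';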
With these rights the dump user is able to create database backups from the command line, like this:
mysqldump -uyourdumpuser -hlocalhost -pyourdumppassword database | gzip --best > yourmysqlbackup.gz
This command creates an already gzip-compressed database backup file. If you have many databases you would need a lot of commands, so it is easier to run a bash script like this one:
#!/bin/sh
DBUSER="yourdumpuser"
DBHOST="localhost"
DBPASS="yourdumppassword"
BACKUPDIR="dbbackup"
# list all databases the dump user has access to
DBS=`mysql -u$DBUSER -h$DBHOST -p$DBPASS -e"show databases"`
for DATABASE in $DBS
do
    # skip the "Database" column header returned by the mysql client
    if [ $DATABASE != "Database" ]; then
        FILENAME=$DATABASE.gz
        mysqldump -u$DBUSER -h$DBHOST -p$DBPASS $DATABASE | gzip --best > $BACKUPDIR/$FILENAME
    fi
done
This small script reads the names of all databases the user has access to and creates a backup file for each of them. Put this code into a file, save it under the name dbbackup.sh in your Linux home directory, and also create the directory dbbackup for the backup files. The new script needs to be executable; use chmod 775 to do that.
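A minimal setup, assuming you saved the script in your home directory:

cd ~
mkdir dbbackup
chmod 775 dbbackup.sh

Now test the script with the following command from the directory where it is located.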
./dbbackup.sh
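To restore one of these dumps later you can pipe the uncompressed file back into the mysql client; a minimal sketch, assuming a user with sufficient rights on the target database (the full-rights user from the second GRANT example would work):

gunzip < dbbackup/database.gz | mysql -uyourdumpuser -hlocalhost -pyourdumppassword database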
Back up your files using rsync
Next we need a remote backup location to store our data in a safe place. This can be a second server you own or an external service. The important thing is that you can access that location via SSH. I’m using EVBackup from ExaVault. They are not very expensive and they offer a great service. They also have a great tutorial on how to configure your account to access their servers with rsync via SSH.
After you have set up the SSH connection between your server and the backup server, we can start using rsync.
Note! We followed the instructions from the EVBackup site and created an SSH key to access the remote location without providing a password. It’s strongly advised to do the same, otherwise the next steps will not work for you. You can use the instructions from their website for your own server if you change some parameters and values.
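If your backup provider doesn’t have such a guide, creating a key pair generally looks like this (a sketch; the file path and the empty passphrase are assumptions, adjust them to your setup):

ssh-keygen -t rsa -f /home/username/backup/ssh_key -N ""

Afterwards append the contents of /home/username/backup/ssh_key.pub to the ~/.ssh/authorized_keys file on the remote server.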
The following example will copy all files within the webapps directory from your local server to the remote location (directory backup-1):
rsync -avz --delete -e "ssh -i /home/username/backup/ssh_key" "/home/username/webapps" someuser@someuser.exavault.com:backup-1
- /home/username/backup/ssh_key is the location of the SSH key you created before
- /home/username/webapps is the directory with the files you want to back up
- someuser@someuser.exavault.com:backup-1 is the remote location (in this example an account at exavault.com/EVBackup) where all files are stored
Note: the parameter --delete will delete files on the remote side if they no longer exist on the local side, and the -i option belongs to the ssh command and tells it which key file to use (it does not make the backup incremental). For additional information about the parameters used with rsync run this command on the command line: rsync --help
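If you want to check what rsync would transfer or delete before it touches any files, you can add the -n (--dry-run) flag to the same command:

rsync -avzn --delete -e "ssh -i /home/username/backup/ssh_key" "/home/username/webapps" someuser@someuser.exavault.com:backup-1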
Run backups using CRON
If you have multiple directories to back up you should group them in one bash script:
#!/bin/sh
# copy the website files to the remote backup location
rsync -avz --delete -e "ssh -i /home/username/backup/ssh_key" "/home/username/webapps" someuser@someuser.exavault.com:backup-1
# copy the local database dumps to the remote backup location
rsync -avz --delete -e "ssh -i /home/username/backup/ssh_key" "/home/username/dbbackup" someuser@someuser.exavault.com:backup-1
You can add as many rsync commands as you need to that script; save it in the user’s home directory under the name data-backup.sh.
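As with the database script, make it executable and run it once by hand to confirm the transfer works (this assumes you saved it in your home directory):

chmod 775 data-backup.sh
./data-backup.sh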
To automate everything we need two CRON jobs:
- First we run the database backup script to store the current data in the dbbackup directory.
- When this backup is finished we start the rsync backup script (give the first CRON job enough time to complete).
An alternative is to merge both bash scripts into one, but that is not part of this tutorial. These are our entries for the crontab file (open it with this command: EDITOR=nano crontab -e):
0 10 * * * /home/username/dbbackup.sh
0 11 * * * /home/username/data-backup.sh
These CRON jobs will run the first script at 10:00AM and the second at 11:00AM.
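To check that both entries were saved, you can list your crontab:

crontab -l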
That’s all you need to back up your websites and databases using rsync and CRON. If you would like to try WebFaction, use finalwebsites as promo code; they offer a 60-day money-back guarantee.