Synchronisation between two Virtualmin servers

When, like me, you get paranoid about losing your data or web server functionality…

I have created a little Perl script to synchronise the MySQL databases and the /home directory between my web servers.

The master server (where the script runs) is the main production server; the slave server is on standby just in case.

I appreciate that the passwords are stored in plain text in the script, which is a security issue. Nonetheless, I am the only admin and no other users have FTP or other privileges on either server. I am sure there is a way to prevent this, but I am happy with the current situation, and obviously I also make a regular backup of the master server to an external backup provider (I use rsync.net).
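If you do want to keep the passwords out of the script, one option (which I am not using here) is a MySQL option file readable only by root; the client tools pick the credentials up from it on their own:

# /root/.my.cnf on each server, permissions 600
[client]
user=root
password=yourmysqlpassword

With that in place, the -u and -p arguments to mysqlcheck, mysqlpump and mysql in the script below can simply be dropped.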

#!/usr/bin/perl
# Performs a synchronisation of home folder and dumps sql databases 
# from one Virtual server to another using rsync and secure shell 
# 
# Written by G.Serex Sharpnet UK (c) 03.12.2020 

# Var definitions 

############### SQL Config ############# 
# SQL root username 
$username = "root"; 
# Local SQL root password 
$password = "localmysqlpassword"; 
# Remote SQL root password 
$rpassword = "remotemysqlpassword"; 
# The dumped files path . (absolute path + trailing / please) 
$dumped_dbs_path = "/root/mysql/"; 
# The dumped file name 
$dumped_db = "dump.sql"; 
# Databases to exclude from the dump (mysql, sys and the schema databases are specific to each server, so don't dump them!) 
$exclude_database = "mysql,sys,information_schema,performance_schema"; 

################ SSH Config ################# 

# The remote host name 
$remotehost = "ipaddress"; 

#The ssh username 
$sshusername = "root"; 

#The ssh port 
$sshport = "xx"; 

#____ E N D _ V A R _ D E F S. ________________ 

# First check and optimise the lot.

# A little house keeping 
system("/usr/bin/mysqlcheck --optimize --all-databases --auto-repair -u $username -p$password"); 

# Dump the dbs 
system("/usr/bin/mysqlpump -u $username -p$password --exclude-databases=$exclude_database --add-drop-table --result-file=$dumped_dbs_path$dumped_db"); 

# Transfer them abroad 
system("/usr/bin/rsync -avz -e 'ssh -p $sshport' $dumped_dbs_path $sshusername\@$remotehost:$dumped_dbs_path"); 

# Restore the dump abroad 

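# Note: --add-drop-table in the dump means existing tables on the slave are dropped and recreated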
system("/usr/bin/ssh -p $sshport $sshusername\@$remotehost 'mysql -u root -p$rpassword < $dumped_dbs_path$dumped_db'"); 

# rsync the home directory 

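# --delete makes the slave's /home an exact mirror: anything removed on the master is removed on the slave too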
system("/usr/bin/rsync -avz --delete -e 'ssh -p $sshport' /home/ $sshusername\@$remotehost:/home"); 
exit;
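
The script can simply be run from root's crontab on the master. This assumes key-based SSH access from the master to the slave, otherwise ssh and rsync will stop and ask for a password. A possible entry (the path /root/bin/sync.pl is just an example, adjust it to wherever you saved the script) would be:

# run the sync every night at 02:10 and mail the output
10 2 * * * /root/bin/sync.pl 2>&1 | mail -s "virtualmin sync report" you@youremail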

Plesk panel: backup strategy when low on disk space

Plesk backup is a real pain in the backside if your server disk space is limited, even when you choose to back up to an FTP repository: all the files are created locally first and only then sent over via FTP.

Prerequisites

Note that in order to implement this backup strategy, you must have external backup space available, such as rsync.net (which I use), so that you can use the rsync command to transfer your files.
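
For the cron jobs below to run unattended, rsync also needs passwordless (key-based) SSH access to that backup space. A minimal sketch on a normal Linux host would be the two commands below; rsync.net documents its own procedure for uploading keys, so follow that if ssh-copy-id does not work there:

ssh-keygen -t ed25519
ssh-copy-id user@domain.rsync.net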

Minimising disk usage

To avoid gigantic archives (compressed or not) piling up on your local hard drive, here is a little guide to help you keep your server disk from filling up.

The first thing to back up regularly is the Plesk server configuration. This backup does not save the website/email/database data, only the configuration of your Panel, which is the first thing you will want to restore if your server packs up.

Backing up Plesk (11) configuration

  1. Open your Plesk Panel as Admin
  2. Choose Tools & Settings from the Server Management sub-menu
  3. Click on Backup Manager
  4. Select Scheduled Backup Settings
  5. Activate the scheduled backup
  6. Select Store backup in: Server repository (or you can choose FTP there as well, since the files aren’t big, a couple of MBs)
  7. I have set the Maximum number of backups in repository to “3”, but you can choose another value here
  8. Under the Backup content section, select the option to back up the server configuration only (not user content)
  9. Press OK

Obviously the choice of schedule and prefix is at your convenience; I personally back up every day at 00:10, and the prefix is set to “configuration”.

If you aren’t using FTP, the content of this backup is stored in the /var/lib/psa/dumps/ directory. So, in order to back this up on another server, simply rsync this location with a command that could be:

rsync -avz /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps

If you don’t want old backups to accumulate on the remote side, add the --delete option:

rsync -avz --delete /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps

To automate

To automate all this, simply stick it in a cron job (you can use “Scheduled Tasks” under “Tools & Settings”, selecting the root user) with a command that could look like this:

nice -20 rsync -avz --stats --delete /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps 2>&1 | mail -s "backups dir rsync report" you@youremail
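
If you prefer editing root’s crontab directly instead of the Plesk task scheduler, the equivalent entry (02:30 here is just an example time) would be:

30 2 * * * nice -20 rsync -avz --stats --delete /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps 2>&1 | mail -s "backups dir rsync report" you@youremail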

Databases

I will refer to MySQL databases in this section.

Database content is important, and you must back it up regularly, be it incrementally or not. I have Perl scripts that are in charge of dumping the content of the entire dataset and compressing it. The file for my setup ends up being about 300MB, which is acceptable. I dump the content of the dataset via a “schedule” (root cron), and another cron simply rsyncs this file over to the backup server.
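
I’m not publishing my exact scripts here, but a minimal sketch of the idea looks something like this (the credentials, paths and the choice of mysqldump + gzip are placeholders to adapt to your own setup):

#!/usr/bin/perl
# Minimal sketch: dump all MySQL databases into one file, then compress it.
# Credentials and paths below are placeholders.

$username = "root";
$password = "mysqlpassword";
$dump_dir = "/root/mysql/";
$dump     = "all-databases.sql";

# Dump every database into a single file
system("/usr/bin/mysqldump -u $username -p$password --all-databases --add-drop-table --result-file=$dump_dir$dump");

# Compress it, overwriting any previous compressed dump
system("/bin/gzip -f $dump_dir$dump");

A second cron entry then rsyncs the dump directory over to the backup space, with the same kind of command as in the other sections.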

Email

All the email accounts are in /var/qmail/mailnames. The process is the same as above: use rsync via a cron job:

rsync -avz /var/qmail/mailnames/ user@domain.rsync.net:var/qmail/mailnames

Website content

Same as for email, but the directory is /var/www/vhosts/, so:

rsync -avz /var/www/vhosts/ user@domain.rsync.net:var/www/vhosts

There you have it. Obviously I am providing this entirely as guidance and will in no way be liable for any loss you may incur using these instructions. Even if you are on a cloud server (like me), it is a good idea to be able to extract files, especially databases or web content, in case of a mistake that a cloud server cannot prevent, like deleting a site by accident…

Feel free to comment if you have any questions, or if I have missed anything important that a Plesk backup would normally include but which isn’t covered in this article.