SSL SMTP on Virtualmin

By default, virtual servers are not set up to use SMTP over SSL. Following this information regarding SSL SMTP, I am just summarizing:

 netstat -an | grep :465

This returns nothing, so nothing is listening on the SMTPS port. Edit the Postfix master file:

vi /etc/postfix/master.cf

Find the lines:

#smtps     inet  n       -       n       -       -       smtpd
#  -o smtpd_tls_wrappermode=yes
#  -o smtpd_sasl_auth_enable=yes
#  -o smtpd_client_restrictions=permit_sasl_authenticated,reject
#  -o milter_macro_daemon_name=ORIGINATING

And uncomment them:
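Uncommented, the block looks like this (keep the -o option lines indented, since master.cf treats leading whitespace as a continuation of the previous service entry):

smtps     inet  n       -       n       -       -       smtpd
  -o smtpd_tls_wrappermode=yes
  -o smtpd_sasl_auth_enable=yes
  -o smtpd_client_restrictions=permit_sasl_authenticated,reject
  -o milter_macro_daemon_name=ORIGINATING

Then reload Postfix and re-run the netstat check from above, which should now show a listener on port 465:

service postfix reload
netstat -an | grep :465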


index.shtml and #include virtual in Virtualmin

For some reason the default Virtualmin install does not include index.shtml in the DirectoryIndex directive. To enable it, one must edit the directive:

Virtualmin > Services > Configure Website > Edit Directives 

Find the following line:

DirectoryIndex index.html index.htm index.php index.php4 index.php5

Add index.shtml at the end of the line:

DirectoryIndex index.html index.htm index.php index.php4 index.php5 index.shtml

Click “Save”, then “Apply Changes” (at the top right-hand side of the “Virtual Server Options” page). Checking your virtual website will now show the index.shtml page.
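For context, the reason to care about index.shtml is server-side includes (the #include virtual of this post's title). A minimal index.shtml, assuming SSI is already enabled for the site and a hypothetical header.html sitting in the document root, looks like:

<!--#include virtual="/header.html" -->
<p>Page content goes here.</p>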

To make it a default configuration

In order to enable this for all further virtual websites, the same change can be made in Virtualmin's server templates (System Settings > Server Templates), so that newly created virtual servers inherit the directive.


Virtualmin Virtual Servers (GPL)

Following my last post, I found out that Webmin on its own does not handle multiple virtual servers :-(, BUT there is a module called Virtualmin Virtual Servers (GPL), which is exactly what I need.

To save myself hassle, I installed a compatible OS (CentOS 6, 64-bit) and simply installed Virtualmin by downloading and running its install.sh script:

cd /root
wget http://software.virtualmin.com/gpl/scripts/install.sh
sh ./install.sh
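Once the installer finishes, the Virtualmin web interface should be answering on Webmin's usual port, 10000 (https://yourserver:10000). A quick check, reusing the netstat test from earlier:

netstat -an | grep :10000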

After this, my next move was to install CSF/LFD from configserver.com
(the archive has moved to a new URI, which I updated here on 2017-02-08):

# old location: http://www.configserver.com/free/csf.tgz
wget https://download.configserver.com/csf.tgz
tar zxvf csf.tgz
cd csf
sh ./install.sh

Once installed, integrate it into Webmin and you're good to go. Install the CSF Webmin module via:

Webmin > Webmin Configuration > Webmin Modules > From local file > /etc/csf/csfwebmin.tgz > Install Module
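CSF ships with a test script that checks for the iptables features it needs; it is worth running before enabling the firewall (note that CSF also starts in TESTING mode until you turn that flag off in /etc/csf/csf.conf):

perl /usr/local/csf/bin/csftest.pl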

Email access problems

All worked well from the word go except the SMTP/POP3 (Dovecot) side; I was getting Failed to connect to localhost:143 : Connection refused (likewise on ports 993, 995, 110, and 25) when trying to connect either via my email client or Usermin.

System Information was reporting that Dovecot IMAP / POP3 Server was offline, and trying to start Dovecot failed:

Starting dovecot: Error: socket() failed: Address family not supported by protocol
Error: service(pop3-login): listen(::, 110) failed: Address family not supported by protocol
Error: socket() failed: Address family not supported by protocol
Error: service(pop3-login): listen(::, 995) failed: Address family not supported by protocol
Error: socket() failed: Address family not supported by protocol
Error: service(imap-login): listen(::, 143) failed: Address family not supported by protocol
Error: socket() failed: Address family not supported by protocol
Error: service(imap-login): listen(::, 993) failed: Address family not supported by protocol
Fatal: Failed to start listeners

The fix was to edit the /etc/dovecot/dovecot.conf file, comment out the default listen line, and insert listen = *. Here is how the edited file looks:

# A comma separated list of IPs or hosts where to listen in for connections. 
# "*" listens in all IPv4 interfaces, "::" listens in all IPv6 interfaces.
# If you want to specify non-default ports or anything more complex,
# edit conf.d/master.conf.
#listen = *, ::
listen = *

Why Dovecot ships configured to also listen on :: (all IPv6 interfaces) when the system has no IPv6 support is baffling; that is exactly what the “Address family not supported by protocol” errors above are complaining about. Anyway, after this change Dovecot starts fine, allowing Usermin and email clients to connect without problems.
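To confirm, restart Dovecot and re-run the same kind of netstat check as earlier; the POP3/IMAP ports should now all be listening:

service dovecot restart
netstat -an | grep -E ':(110|143|993|995) '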


Server heartbeat

This little Perl program checks the availability of a list of IP addresses; run it from cron and it appends the result to a text file.

#!/usr/bin/perl
# This script pings IP addresses and reports which ones respond.
use strict;
use warnings;

# In a live application, read the host list from a config file.
my @hosts = ("192.168.1.1", "192.168.1.19");

# Build a zero-padded dd mm yy hh:mm:ss timestamp.
my ($sec, $min, $hour, $mday, $mon, $year) = localtime(time);
$year += 1900;
$mon  += 1;
$min  = sprintf("%02d", $min);
$sec  = sprintf("%02d", $sec);
$mon  = sprintf("%02d", $mon);
$mday = sprintf("%02d", $mday);
$year = sprintf("%02d", $year % 100);

# A host counts as alive when both ping packets come back
# ("2 received" in ping's summary line).
my @live = ();
foreach my $h (@hosts) {
    my $r = `ping -c2 $h`;
    if ($r =~ /2 re/) {
        push @live, $h;
    }
}

my $alive = "@live";
print("You have $alive on $mday $mon $year \@ $hour:$min:$sec\n");

This script can then be used in a cron like:

*/5 * * * * /path/to/file/pingtest.pl >> /some/path/pingtest.txt
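For cron to invoke the script directly like this, it has to be executable:

chmod +x /path/to/file/pingtest.pl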

This writes the result on a new line of pingtest.txt every 5 minutes, which looks like this:

You have 192.168.1.1 192.168.1.19 on 06 01 13 @ 20:05:01
You have 192.168.1.1 192.168.1.19 on 06 01 13 @ 20:10:01
You have 192.168.1.1 192.168.1.19 on 06 01 13 @ 20:15:01

Plesk panel, backup strategy when low in space

Plesk backup is a real pain in the backside if your server disk space is limited, even when choosing to back up to an FTP repository: all the files are created locally first and only then sent over via FTP.

Prerequisites

Note that in order to implement this backup strategy, one must have external backup space available, like rsync.net (which I use), so that you can use the rsync command to transfer your files.

Minimising disk usage

To avoid gigantic archives (compressed or not) piling up on your local hard drive, here is a little guide to help you keep your server disk from filling up.

The first thing to back up regularly is the Plesk server configuration. This backup will not save the website/email/database data, but rather the content/configuration of your Panel, which is the first thing to restore if your server packs up.

Backing up Plesk (11) configuration

  1. Open your Plesk Panel as Admin
  2. Choose Tools & Settings from the Server Management sub-menu
  3. Click on Backup Manager
  4. Select Scheduled Backup Settings
  5. Activate the scheduled backup
  6. Select Store backup in: Server repository (or you can choose FTP there too, as the files aren't big, just a couple of MB)
  7. I have set the Maximum number of backups in repository to “3”, but you can choose another value here
  8. Under the Backup content section, select the configuration-only option (we only want the Panel configuration here, not website content)
  9. Press OK

Obviously the choice of schedule and prefix is at your convenience; I personally back up every day at 00:10, with the prefix set to “configuration”.

If you aren’t using FTP, the content of this backup is stored in the /var/lib/psa/dumps/ directory. So, to back this up to another server, simply rsync that location with a command such as:

rsync -avz /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps

If you don’t want remote copies of backups that have since been deleted locally to pile up, add the --delete option:

rsync -avz --delete /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps

To automate

To automate all this, simply stick it in a cron job. One can use “Scheduled Tasks” under “Tools & Settings” (select the root user) with a command that could look like this:

nice -20 rsync -avz --stats --delete /var/lib/psa/dumps/ user@domain.rsync.net:var/lib/psa/dumps 2>&1 | mail -s "backups dir rsync report" you@youremail

Databases

I will refer to MySQL databases in this section.

Database content is important and must be backed up regularly, be it incrementally or not. I have Perl scripts in charge of dumping the content of the entire dataset and compressing it; the file for my setup ends up being about 300MB, which is acceptable. The dump runs from a “schedule” (root cron), and another cron simply rsyncs the file over to the backup server.
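As a minimal sketch of that dump step (my actual Perl scripts do the equivalent; this assumes Plesk 11's admin MySQL password in /etc/psa/.psa.shadow and a hypothetical /backup/db staging directory):

mysqldump --all-databases -uadmin -p"$(cat /etc/psa/.psa.shadow)" | gzip > /backup/db/all-databases.sql.gz
rsync -avz /backup/db/ user@domain.rsync.net:backup/db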

Email

All the email accounts are in /var/qmail/mailnames; the process is the same as above, an rsync via a cron job:

rsync -avz /var/qmail/mailnames/ user@domain.rsync.net:var/qmail/mailnames

Website content

Same as email but the directory is /var/www/vhosts/ so:

rsync -avz /var/www/vhosts/ user@domain.rsync.net:var/www/vhosts

There you have it. Obviously I am providing this entirely as guidance and will in no way be liable for any loss you may incur using these instructions. Even if you are on a cloud server (like me), it is a good idea to be able to extract files, especially databases or web content, should a mistake be made that a cloud server cannot prevent, like deleting a site by accident.

Feel free to comment if you have any questions, or if I have missed any important parts of a Plesk backup that aren’t covered in this article.