Backups on a Budget

Being on a budget and paying $5 a month to back up a $10-a-month VPS seems a bit excessive. Here is how you can achieve the same goal gratis.
First and foremost, you don't need to back up every single file on your VPS: identify which ones actually need backing up, and keep a list if you find it helpful. Your web root directory and database, for example, are good candidates. Install a cron job that runs every night, say at 4 a.m., to back up these essential files:

# Website backup
0 4 * * * /root/backup
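
This entry belongs in root's crontab on the VPS; if you have never edited it before, it is typically installed with:

$ sudo crontab -e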

Here are the contents of the /root/backup script:

#!/bin/bash
# Nightly backup script: must be run as root.
# "dir" is the website's directory under /var/www; adjust names and paths
# to match your own setup.
DATE=$(date +'%Y%m%d')

# Archive the web root
cd /var/www || exit 1
tar zcf /home/user/Backups/"$DATE"_dir.tgz dir
chown user:user /home/user/Backups/"$DATE"_dir.tgz

# Dump the database as the unprivileged user; credentials come from ~user/.pgpass
su - user -c "pg_dump >/home/user/Backups/\"$DATE\"_dir.dump"
gzip -c /home/user/Backups/"$DATE"_dir.dump >/home/user/Backups/"$DATE"_dir.dump.gz
rm -f /home/user/Backups/"$DATE"_dir.dump
chown user:user /home/user/Backups/"$DATE"_dir.dump.gz
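
Before trusting the script to cron, it is worth running it once by hand and checking that both archives appear (the date shown below is just an example; yours will differ):

# /root/backup
# ls /home/user/Backups
20190902_dir.dump.gz  20190902_dir.tgz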

This shell script, which must be run as root, backs up the web root directory and the PostgreSQL database into the user's backup directory, /home/user/Backups. The database credentials are stored in ~user/.pgpass, one line per database:

hostname:port:database:username:password
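
For example, with purely hypothetical values in place of your own, and remembering that PostgreSQL ignores the file unless it is readable by its owner only:

localhost:5432:mydb:user:s3cr3t

$ chmod 600 ~/.pgpass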

Now, as best practices require, we copy the backup files to a remote Fedora 30 box. The only prerequisites are public-key SSH authentication on your web server and rsync installed on both the client and the server.
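
If key-based authentication is not set up yet, something along these lines usually does it on the Fedora box, using the same example hostname as in the script below (on the VPS, install rsync with its own package manager):

$ ssh-keygen -t ed25519
$ ssh-copy-id user@www.example.com
$ sudo dnf install rsync
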
Nowadays, Fedora Workstation automatically starts ssh-agent on desktop login, and ssh passphrases are stored in Fedora's keyring; the agent's authentication socket lives at /run/user/$(id -u)/keyring/ssh. Most of the work is therefore already done: all that's missing is a script that runs some time after the server has prepared the backup files.
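
You can verify that the agent is reachable from a desktop terminal (1000 below is just a typical UID; ssh-add -l lists the keys currently loaded):

$ echo $SSH_AUTH_SOCK
/run/user/1000/keyring/ssh
$ ssh-add -l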

/home/user/rsync:

#!/bin/sh
# Purge all files matching pattern $2 in directory $1, keeping the $3 most recent
purge() {
    DIR=$1
    PATTERN=$2
    KEEP=$3

    count=0
    # ls -t1 lists newest first, so everything past the first $KEEP entries goes
    for file in $(ls -t1 $DIR/$PATTERN)
    do
        count=$((count + 1))
        if [ $count -le $KEEP ]
        then
            continue
        fi
        rm -f "$file"
    done
}
# Manual synchronization needed: rsync must run _after_ the backup files
# have been produced
sync() {
    /usr/bin/rsync -av www.example.com:/home/user/Backups/*_dir* \
        /home/user/Backups/receive_dir/
    purge /home/user/Backups/receive_dir '????????_dir*gz' 16
}
##########################
# Processing begins here #
##########################
DATE=$(date +'%Y-%m-%d %H:%M')
if [ -z "$SSH_AUTH_SOCK" ]
then
    # No agent socket exported (typical under cron): look for the keyring's socket
    user=$(id -u)
    filename=$(find /run/user/"$user" -name 'ssh' 2>/dev/null)
    if [ "$filename" != "" ]
    then
        export SSH_AUTH_SOCK="$filename"
        sync
    else
        echo "$DATE - SSH socket undefined and no file found" >> "$HOME"/dead.letter
    fi
else
    if [ -e "$SSH_AUTH_SOCK" ]
    then
        sync
    else
        echo "$DATE - SSH socket defined but no file found" >> "$HOME"/dead.letter
    fi
fi

Remember to make the script executable:

$ chmod 700 /home/user/rsync
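
Then run it once by hand and check that nothing ended up in the dead-letter file:

$ /home/user/rsync
$ cat ~/dead.letter 2>/dev/null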

The above script provides a retention period of 8 days: KEEP=16, that is, 8 days times the 2 files transferred per day (a number that has to be known in advance; see the adjustment example after the crontab entry below). Depending on how long the server takes to prepare the backup files, schedule the script to run once they are sure to be ready, say half an hour later:

# Receive Website backup
30 4 * * * /home/user/rsync
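
As promised, adjusting the retention is just a matter of changing the last argument of purge; for instance, a 30-day retention with the same two files per day would read:

purge /home/user/Backups/receive_dir '????????_dir*gz' 60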

Install the above entry in the user's crontab on the Fedora box and you're done: safe, reliable backups, performed automatically and on a budget.