This tutorial explains how to take automated backups of directories and files in Linux, both over the network and on the local machine. The script below compresses your directories and saves one copy of the archive to a shared folder on the network and another to a second location on the same PC. I felt the need for such a script when taking daily backups of the three databases we work on started to seem tedious. There are hundreds of similar tutorials on the net, but I had to modify the procedure so much that this is almost an entirely original script.
#!/bin/bash

# What to backup. Left unquoted below on purpose: it holds two
# space-separated paths that tar must see as separate arguments.
SOURCE="/var/lib/mysql/data1 /home"

# Where to backup to: a local directory and a mounted network share.
DEST1="/root/backup/mysql"
DEST2="/mnt/backup"

# Back up the files into date-stamped archives at both destinations.
tar vczf "$DEST1/$(date +%m-%d).tar.gz" $SOURCE
tar vczf "$DEST2/$(date +%m-%d).tar.gz" $SOURCE

echo "Backup finished"
date

# Long listing of files in both destinations to check file sizes.
ls -lh "$DEST1" "$DEST2"
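Because the script writes a new date-stamped archive every day, both backup directories grow without bound. A minimal sketch of a cleanup step you could run alongside it; the `prune_backups` function name and the 7-day retention window are my own assumptions, not part of the original script:

```shell
#!/bin/bash

# prune_backups DIR [DAYS]: delete .tar.gz archives in DIR whose
# modification time is older than DAYS days (default 7).
# The retention window is an illustrative assumption.
prune_backups() {
    local dir="$1" days="${2:-7}"
    find "$dir" -name '*.tar.gz' -mtime +"$days" -print -delete
}

# Example: keep one week of archives in the local destination
# (same path as DEST1 in the backup script above).
# prune_backups /root/backup/mysql 7
```

Both the backup script and this cleanup can be scheduled with cron (for example via `crontab -e`) so the daily run needs no manual step.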