Most of us have experienced the loss of valuable files and data at some point when a local storage system failed, and anyone who has knows the importance of keeping a current backup. Sadly, few of us actually implement a proper backup solution.

A good backup strategy creates consistent, automatic backups and stores them in a remote location.

In Linux environments, this can be achieved by automating backups to a remote SSH server using rsync.

Automatically back up to a remote server in Linux:

  1. Configure passwordless SSH login from your local machine to the remote backup server.
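    For example, generate a key pair and copy the public key to the remote server (remoteuser and remotehost are placeholders for your own account and host):

    $ ssh-keygen -t ed25519
    $ ssh-copy-id remoteuser@remotehost

    Note that cron jobs cannot answer passphrase prompts, so use a key without a passphrase or configure a key agent.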
  2. Create a directory on the remote server as a backup target.
    $ mkdir -p ~/backup_folder/folder_01
  3. Make sure the connecting SSH user has write access to the target directory on the remote backup server.
    $ chmod -R u+rwX ~/backup_folder/folder_01

    If you created the directory as the connecting user in the previous step, it already has the required access. Avoid chmod -R 777: only the backup user needs write permission, and a world-writable directory is a security risk.

  4. Manually run the backup command on your local machine to verify that the operation succeeds.
    $ rsync -av --delete /path/to/folder_01/ remoteuser@remotehost:backup_folder/folder_01

    A more complete script that backs up several folders in one run:

    backup.sh
    #!/bin/bash
    # Sync each local folder to the remote backup target, removing
    # remote files that no longer exist locally (--delete).

    SOURCE="/path/to"     # parent directory of the folders to back up
    TARGET="remoteuser@remotehost:~/backup_folder"

    for i in folder_01 folder_02 folder_03; do
    	rsync -av --delete "$SOURCE/$i/" "$TARGET/$i"
    done
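
    To run the test, make the script executable first:

    $ chmod +x backup.sh
    $ ./backup.sh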
  5. Check on the remote server that the files were successfully backed up.
    $ ls -l ~/backup_folder/folder_01
  6. Open crontab editor on the local machine.
    $ crontab -e
  7. Configure cron on your local machine to run the backup script automatically at a set time.
    # Run the backup every day at midnight, appending the output to a log file.
    0 0 * * * /path/to/backup.sh >>~/.backup.log 2>&1
  8. Save and exit crontab editor.