Losing valuable files is a frustrating experience many of us have faced. While most of us understand the importance of backing up our data, not everyone has an effective backup solution in place.
A major cause of data loss is the failure of a local storage system. Good backup strategies involve creating regular, automatic backups and storing at least one copy at a remote location.
A popular way to back up a Linux server or desktop is to use rsync to transfer files to a remote SSH server, with cron automating the backup process.
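Since cron jobs run unattended, rsync over SSH needs password-less authentication. A common approach is an SSH key pair; the sketch below assumes OpenSSH, and the key path is just an example:

```shell
# Make sure the .ssh directory exists, then generate a key pair
# with no passphrase (example path; adjust as needed).
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N "" -f "$HOME/.ssh/backup_key" -q

# Install the public key on the remote server (requires the real server):
# ssh-copy-id -i "$HOME/.ssh/backup_key.pub" remoteuser@remoteserver
```

Once the key is installed, rsync can be pointed at it with `-e "ssh -i ~/.ssh/backup_key"` if it is not the default key.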
On the remote server, create a destination directory for the backups:
remoteuser@remoteserver:$ mkdir -p ~/backup_folder/folder_01
remoteuser@remoteserver:$ chmod -R 700 ~/backup_folder/folder_01
Note that mode 700 is sufficient when the same remote user runs the backups; world-writable 777 permissions are unnecessary and insecure.
Then run rsync from the local machine. The trailing slash on the source path copies the folder's contents rather than the folder itself, and --delete removes remote files that no longer exist locally:
localuser@localhost:$ rsync -av --delete /path/to/folder_01/ remoteuser@remoteserver:backup_folder/folder_01
A more complete script can back up several folders in one run:
#!/bin/bash
TARGET="remoteuser@remoteserver:~/backup_folder"
for i in folder_01 folder_02 folder_03; do
    rsync -av --delete "$i/" "$TARGET/$i"
done
Check on the remote server that the files arrived:
remoteuser@remoteserver:$ ls -l ~/backup_folder/folder_01
To automate the backup, edit the local user's crontab:
localuser@localhost:$ crontab -e
# Run the backup every day at midnight, appending the logs to a file.
0 0 * * * rsync -av --delete /path/to/folder_01/ remoteuser@remoteserver:backup_folder/folder_01 >>~/.backup.log 2>&1
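If a backup ever takes longer than the interval between runs, overlapping rsync processes can conflict. A common safeguard is to wrap the cron command in flock so only one instance runs at a time; this is a sketch, and the lock-file path is an example:

```shell
# crontab entry: flock -n skips the run if the previous one still holds the lock.
0 0 * * * flock -n /tmp/backup.lock rsync -av --delete /path/to/folder_01/ remoteuser@remoteserver:backup_folder/folder_01 >>~/.backup.log 2>&1
```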