Most of us have experienced the loss of valuable files and data at some point when a local storage system fails, and those who have know the importance of keeping a current backup. Sadly, few actually implement a proper backup solution.
A good backup strategy is to create consistent, automatic backups and to store them in a remote location. In Linux environments, this can be achieved by building an automated system that backs up to a remote SSH server using rsync and cron.
Automatically backup to a remote server in Linux:
SSH login from your local machine to the remote backup server and create the backup directory.
$ mkdir -p ~/backup_folder/folder_01
Make sure the SSH user has full access to the backup directory on the remote server.
$ chmod -R 777 ~/backup_folder/folder_01
This is only an example; in practice the directory should not be world-writable, but instead assigned permissions just for the backup user.
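Instead of wide-open 777 permissions, a safer sketch is to restrict the directory to its owner, the backup user. This assumes you are logged in as that user on the remote server; the path mirrors the example above:

```shell
# Create the backup directory and restrict it to the owner only
# (read/write/execute for the backup user, nothing for group or others).
mkdir -p "$HOME/backup_folder/folder_01"
chmod -R 700 "$HOME/backup_folder/folder_01"

# Show the resulting permission bits (GNU stat, as found on Linux).
stat -c '%a' "$HOME/backup_folder/folder_01"
```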
Use rsync to sync the local directory to the remote backup directory. The --delete flag removes files on the remote side that no longer exist locally.
$ rsync -av --delete /path/to/folder_01/ remoteuser@remotehost:backup_folder/folder_01
A sample of a more complete script for automated backup:
#!/bin/bash
TARGET="remoteuser@remotehost:~/backup_folder"
for i in folder_01 folder_02 folder_03; do
    rsync -av --delete "$i/" "$TARGET/$i"
done
On the remote server, verify that the files have been copied.
$ ls -l ~/backup_folder/folder_01
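An unattended backup cannot stop to ask for a password, so key-based SSH authentication is typically set up before scheduling the job. A sketch (the key path /tmp/backup_key is just for illustration; ssh-copy-id must be run interactively once, entering the password):

```shell
# Generate a passphrase-less ed25519 key pair for unattended use.
# (-N "" sets an empty passphrase, -q suppresses output.)
ssh-keygen -t ed25519 -f /tmp/backup_key -N "" -q

# List the generated private and public key files.
ls /tmp/backup_key /tmp/backup_key.pub

# Then install the public key on the remote server (run once, interactively):
#   ssh-copy-id -i /tmp/backup_key.pub remoteuser@remotehost
```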
Open the crontab editor on the local machine.
$ crontab -e
Configure cron on your local machine to automatically run your backup at a set time.
# Run backup command every day at midnight, sending the logs to a file.
0 0 * * * rsync -av --delete /path/to/folder_01/ remoteuser@remotehost:backup_folder/folder_01 >> ~/.backup.log 2>&1
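If the loop above was saved as a script, the crontab entry can call it directly instead of repeating the rsync command. A hypothetical entry, assuming the script was saved as ~/bin/backup.sh and made executable:

```
# Run the backup script every day at midnight, logging output.
0 0 * * * ~/bin/backup.sh >> ~/.backup.log 2>&1
```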