I used to back up my VPS with tools like Duplicity or Rsync. Now there is a newer, more effective, and cheaper (free!) method: backing up to the cloud with Rclone.
Rclone is a data synchronization tool similar to Rsync, but focused on connecting to cloud storage services.
The advantages of cloud storage are high speed (servers located all over the world), data safety (no worrying about hardware or network failures), and, best of all, most services are free. I especially like the free part!
Rclone supports many popular cloud services, such as Google Drive, Dropbox, Amazon Drive, Amazon S3, Backblaze B2, Google Cloud Storage, Hubic, Microsoft OneDrive, OpenStack Swift, Yandex Disk, and the local filesystem.
Instead of storing backups on another VPS, I switched to Google Drive: 15 GB of free storage, and paid storage is cheap too, about 45,000 VND/month for 100 GB. If you have a free Google Apps account, even better.
This article has two main parts: first, installing Rclone on the VPS; second, using Rclone to upload the compressed backup files to Google Drive. The process for other cloud services is similar.
Creating a full backup of your VPS data is covered in detail in the article Guidelines for automatically backing up the entire VPS; this article focuses only on setting up the automatic upload of the compressed files to Google Drive. More guides on using Rclone with Google Drive and other cloud services are here.
The automatic backup scenario is as follows:
– Every day at 2:00 AM, a script backs up all databases, website files, and the Nginx configuration into compressed archives.
– The archives are uploaded to Google Drive, and the local copies are deleted once the upload completes.
– Backups on Google Drive older than 2 weeks are deleted automatically.
Now let's get started.
Rclone is a command-line program, so after downloading it I will move the executable into the /usr/sbin/ directory so it can be used anywhere on the VPS.
– Install on 64-bit Linux:
cd /root/
wget http://downloads.rclone.org/rclone-v1.33-linux-amd64.zip
unzip rclone-v1.33-linux-amd64.zip
cp rclone-v*-linux-amd64/rclone /usr/sbin/
rm -rf rclone-*
– Install on 32-bit Linux:
cd /root/
wget http://downloads.rclone.org/rclone-v1.33-linux-386.zip
unzip rclone-v1.33-linux-386.zip
cp rclone-v*-linux-386/rclone /usr/sbin/
rm -rf rclone-*
Direct download links are here.
Now you can run the rclone command to see usage information.
Rclone commands generally take the form rclone [command] [parameters], where command is the subcommand (for example copy or sync) and parameters are its arguments.
Some commonly used Rclone commands:
Details of each command are documented here.
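For quick reference, here are a few subcommands from Rclone's standard command set (remote: stands for the connection name you will configure below; the paths are just illustrative):

```shell
rclone ls remote:                        # list all files in the remote, with sizes
rclone lsd remote:                       # list only top-level directories
rclone copy /root/backup remote:backup   # copy files, skipping ones already present
rclone sync /root/backup remote:backup   # make destination identical to source (deletes extras!)
rclone move /root/backup remote:backup   # copy to the remote, then delete the local source
rclone delete remote:backup              # delete the contents of the path
rclone size remote:                      # report total size and object count
```

Note the difference between copy and sync: sync will delete files on the destination that no longer exist on the source, so use it with care.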
First, we will configure the Rclone connection to Google Drive; this only needs to be done once.
Connect to your VPS via SSH and run the command:
rclone config
You will see the message No remotes found - make a new one. Enter n and press Enter to create a new connection.
At the name prompt, enter remote to name the connection; any name will do.
A list of cloud services appears; enter 7 for Google Drive, then press Enter.
At the next two prompts, client_id and client_secret, leave both blank and press Enter.
At Use auto config? enter n and press Enter. Rclone will then print a link; click it directly or copy and paste it into your browser.
The authorization interface appears as follows:
Click Allow to agree, and you will receive a verification code as shown below:
Return to the SSH window, paste this code at the Enter verification code> prompt, and press Enter.
Rclone will ask you to confirm the information; press y to agree, then q to exit the configuration interface.
The entire configuration process will look similar to the following:
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> remote
Type of storage to configure.
Choose a number from below, or type in your own value
 1 / Amazon Drive
   \ "amazon cloud drive"
 2 / Amazon S3 (also Dreamhost, Ceph, Minio)
   \ "s3"
 3 / Backblaze B2
   \ "b2"
 4 / Dropbox
   \ "dropbox"
 5 / Encrypt/Decrypt a remote
   \ "crypt"
 6 / Google Cloud Storage (this is not Google Drive)
   \ "google cloud storage"
 7 / Google Drive
   \ "drive"
 8 / Hubic
   \ "hubic"
 9 / Local Disk
   \ "local"
10 / Microsoft OneDrive
   \ "onedrive"
11 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
   \ "swift"
12 / Yandex Disk
   \ "yandex"
Storage> 7
Google Application Client Id - leave blank normally.
client_id>
Google Application Client Secret - leave blank normally.
client_secret>
Remote config
Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine or Y didn't work
y) Yes
n) No
y/n> n
If your browser doesn't open automatically go to the following link:
https://accounts.google.com/o/oauth2/auth
Log in and authorize rclone for access
Enter verification code> x/xxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
--------------------
[remote]
client_id =
client_secret =
token = {"access_token":"xxxx.xxxxx-xxxxxxxxxxxxxxx_xxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","token_type":"Bearer","refresh_token":"1/xxxxxxxxxxxxxxxx_xxxxxxxxxxxxxxxxxxxxxxxxxx","expiry":"2016-09-12T00:12:53.66619724-04:00"}
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
remote               drive

e) Edit existing remote
n) New remote
d) Delete remote
s) Set configuration password
q) Quit config
e/n/d/s/q> q
The parts highlighted in red are what you type in.
That's it. Now you can test the connection with the command:
rclone lsd remote:
If everything is working, the output will list the folders in your Google Drive. A brand-new Google Drive account with nothing uploaded yet contains only the default Getting started file.
In the article Guidelines for automatically backing up the entire VPS I shared a script that backs up a VPS automatically; in this article I have modified it slightly so that it also uploads the archive to Google Drive after creating it.
I wrote this script for the folder structure of servers managed by HocVPS Script. On other servers, you will need to adjust it a bit for it to work.
– Create the file backup.sh in the /root/ directory (for example with nano /root/backup.sh).
– Copy the entire script below and paste it in:
#!/bin/bash
# HocVPS Script Plugin - Backup Server and Upload to Google Drive

. /etc/hocvps/scripts.conf

SERVER_NAME=HOCVPS_BACKUP

TIMESTAMP=$(date +"%F")
BACKUP_DIR="/root/backup/$TIMESTAMP"
MYSQL_USER="root"
MYSQL=/usr/bin/mysql
MYSQL_PASSWORD=$mariadbpass
MYSQLDUMP=/usr/bin/mysqldump
SECONDS=0

mkdir -p "$BACKUP_DIR/mysql"

echo "Starting Backup Database";
databases=`$MYSQL --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql)"`
for db in $databases; do
$MYSQLDUMP --force --opt --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $db | gzip > "$BACKUP_DIR/mysql/$db.gz"
done
echo "Finished";
echo '';

echo "Starting Backup Website";
# Loop through the /home directory
for D in /home/*; do
if [ -d "${D}" ]; then # if a directory
domain=${D##*/} # domain name
echo "- "$domain;
zip -r $BACKUP_DIR/$domain.zip /home/$domain/public_html -q -x /home/$domain/public_html/wp-content/cache/**\* # exclude cache
fi
done
echo "Finished";
echo '';

echo "Starting Backup Nginx Configuration";
cp -r /etc/nginx/conf.d/ $BACKUP_DIR/nginx/
echo "Finished";
echo '';

size=$(du -sh $BACKUP_DIR | awk '{ print $1}')

echo "Starting Uploading Backup";
/usr/sbin/rclone move $BACKUP_DIR "remote:$SERVER_NAME/$TIMESTAMP" >> /var/log/rclone.log 2>&1
/usr/sbin/rclone -q --min-age 2w delete "remote:$SERVER_NAME" # remove all backups older than 2 weeks
echo "Finished";
echo '';

duration=$SECONDS
echo "Total $size, $(($duration / 60)) minutes and $(($duration % 60)) seconds elapsed."
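One detail worth noting: the script extracts each site's domain name from its /home path using Bash parameter expansion. A minimal standalone illustration (the path here is made up for the example):

```shell
#!/bin/bash
# ${D##*/} removes the longest prefix ending in "/", leaving only
# the final path component -- here, the domain name.
D="/home/example.com"
domain=${D##*/}
echo "$domain"
```

Running this prints example.com, which the script then uses to name the zip archive for that site.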
If you want a different retention period, edit the 2w value in this line of the script:
/usr/sbin/rclone -q --min-age 2w delete "remote:$SERVER_NAME" # remove all backups older than 2 weeks
Change 2w (2 weeks) to 5d (5 days) or 30d (30 days) depending on your needs.
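Each run of the script stores its archives under a folder named by date +"%F", so the retention rule above always has cleanly dated folders to prune. A quick check of the format that produces:

```shell
#!/bin/bash
# date +"%F" is shorthand for +"%Y-%m-%d", an ISO date such as 2016-11-02,
# so each day's backup lands in its own dated folder on Google Drive.
TIMESTAMP=$(date +"%F")
echo "/root/backup/$TIMESTAMP"
```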
– Press Ctrl + O, Enter to save and Ctrl + X to exit.
– Make the script executable:
chmod +x /root/backup.sh
– That's it. You can now test it by running:
/root/backup.sh
Check Google Drive to see whether a new folder containing the backup data has appeared, or verify with the command:
rclone lsd remote:
Finally, I will schedule the script to run automatically at 2:00 AM:
EDITOR=nano crontab -e
Paste the following line into the editor:
0 2 * * * /root/backup.sh > /dev/null 2>&1
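The five fields before the command are minute, hour, day of month, month, and day of week. Annotated for reference:

```shell
# field:   minute  hour  day-of-month  month  day-of-week
# value:     0      2        *           *        *
# meaning: at minute 0 of hour 2 (02:00), every day, every month
0 2 * * * /root/backup.sh > /dev/null 2>&1
```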
Press Ctrl + O, Enter to save and Ctrl + X to exit
That's it. From now on, the script runs automatically at 2:00 AM every day, backs up the entire VPS, and uploads the archives to Google Drive. The backup copy on the VPS itself is deleted as soon as the upload finishes.
See also the cronjob guide.
The easiest way to restore data is to download the backup file from Google Drive to your computer, then upload whatever you need back to the VPS.
However, if you want to pull a backup file directly onto the VPS, you can use Rclone for that too:
rclone copy "remote:/VPS/VPS-2016-11-02_02-00.zip" /root/
The command above copies the file VPS-2016-11-02_02-00.zip from the VPS folder on Google Drive into the /root/ directory on the server. Upload and download speeds to and from Google Drive are very fast.
Backing up a VPS/server is extremely important: I once lost data that could not be recovered after pressing Rebuild by mistake, so don't be careless about it. Hopefully this detailed tutorial has given you a new, more efficient, and more economical backup method.
Now it's your turn to try it. If you need help or have anything to share, leave a comment below.