How to Backup and Restore VPS to Google Drive Using Rclone
This article is written to make managing a VPS (Virtual Private Server) easier; here you will find a step-by-step guide on how to back up your VPS to Google Drive storage with Rclone and how to automate the whole process.
Everyone who owns a self-managed VPS knows it is one of the cheapest options and offers far more capacity than shared hosting or managed VPS plans from popular hosting companies like Namecheap, Bluehost, InMotion Hosting and the rest.
A self-managed VPS is also cheaper and faster, but you will run into some bottlenecks along the way, especially if you do not know much about Linux-based servers. Trust me, it is still an experience worth having. So if you are reading this article before deciding whether to buy a VPS, do not be discouraged: on this blog you will find plenty of articles on how to manage your Virtual Private Server by yourself without fear.
The biggest concern with Virtual Private Servers is usually backups and restores, because running a serious project on a server without a plan for daily, weekly and even monthly backups is almost as bad as having no project online at all: you could lose all your files and data at any time.
This guide will walk you through the steps to set up and automate VPS backups to Google Drive with Rclone and also show you how to restore the data if the need arises.
Why Use Rclone for VPS Backups?
Rclone is an open-source command-line tool that lets you synchronize data between a VPS and a wide range of cloud storage providers, including Google Drive, Dropbox, and Amazon S3.
Its flexibility and speed make it an excellent choice for VPS backups, and it handles large files and encrypted backups without trouble; small transfers typically finish in a matter of seconds.
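To give you a feel for how simple it is, here is a minimal example (assuming a remote named gdrive has already been configured, which we will do later in this guide):
# Copy everything under /home to a Backup folder on the remote (copy never deletes remote files)
rclone copy /home gdrive:Backup --progress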
Key Benefits of Rclone:
- Cross-platform compatibility – works with almost all cloud storage providers.
- Efficient file transfer – large datasets are not exempted.
- Data encryption – for security purposes.
- Incremental syncs – only new or changed files are transferred (optional).
- Command-line automation – easy to script with bash and schedule with cron jobs.
Requirements and Prerequisites
To follow this guide, you’ll need:
- A VPS running a Linux-based operating system (e.g., Ubuntu, Debian, CentOS, AlmaLinux or CloudLinux).
- A Google account for you to configure Google Drive as storage.
- Root or sudo access on your VPS.
- A computer with PuTTY or Git Bash installed (any SSH-capable terminal will do).
- An internet connection.
Without further ado, now that we have all the requirements let us proceed.
How to Backup and Restore VPS to GDrive With Rclone
ZudoTech has received a lot of requests on how to back up a Contabo VPS to Google Drive, how to back up a Hetzner VPS to Google Drive, and the same for providers like DigitalOcean. Rest assured that these steps work for every VPS provider, be it Contabo, Vultr, DigitalOcean, Linode or even AWS.
They also work with any control panel, be it Virtualmin, CloudPanel, VestaCP, aaPanel and so on. For this tutorial we will be using Virtualmin, although a control panel is not strictly required for anything we are about to do.
Here, we will be using Google Cloud Console and service account authentication, which makes the old “redirect_uri_mismatch” issue a thing of the past, because we will authenticate through a Service Account instead of an OAuth flow.
Below is the step-by-step guide on how to backup and restore your VPS on GDrive storage.
Step 1: Create a Service Account in Google Cloud Console
Google Cloud Console matters here because it keeps the backup process consistent and free of the usual API limitations; it also adds an extra layer of security and gives us full control.
- Visit Google Cloud Console with the Gmail account to be used for the backup.
- Create a new project by clicking the top-left menu New Project.
- Allow it to fully create the project then select the Project you just created.
- Now you will be presented with Quick access with various options to navigate to.
- Select APIs & Services under Quick access and at the left menu click on Library.
- Scroll down or search for Google Drive API, click on it and click Enable.
Now we need to create a Service Account, which will be used to authenticate Rclone from our VPS to the Google Drive account. Follow the steps below to do that.
- Click the back button in your browser to return to the Quick access page; alternatively, click the Google Cloud logo and you will be taken back to it.
- Now click on IAM & Admin, scroll down on the left menu navigation to Service Accounts and click Create Service Account.
- Give the Service Account a name and description then click Create and Continue.
- Skip assigning roles by clicking on Continue then click Done to create it.
We are done with that part, and there is one more step in Google Cloud Console: we now need to create a Service Account Key. To do this, follow the steps below:
- Under the Actions column there are three vertical dots; click on them and select Manage Keys.
- On the Manage Keys page, click the ADD KEY dropdown, then click Create new key.
- Ensure the JSON key type is selected, then click Create; this will download a JSON credential file which will be used for authentication on your VPS server.
- Once this is done, click the back arrow to return to the previous page, then copy the service account's email address from the Email column (it also appears as client_email inside the JSON file, as shown below).
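If you open the downloaded JSON credential file, it should look roughly like this (the values below are shortened, hypothetical placeholders, not real credentials); note that the client_email value is the same address you are about to copy:
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "xxxxxxxxxxxxxxxx",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901"
}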
Step 2: Share Google Drive Folders with the Service Account
- Open a new tab on your browser and visit Google Drive.
- Create a folder e.g. Backup then right-click on the folder and click Share.
- In the sharing box provided, paste the copied email address of the Service Account, which looks something like service-account-name@your-project-id.iam.gserviceaccount.com.
- Give it an Editor privilege and click Send.
Step 3: Install and Configure Rclone with the Service Account
- First, upload the previously downloaded JSON file to your server. In my case, I will upload it to “/home”, the directory that contains the directories I intend to back up.
- SSH into the server using Git Bash or PuTTY.
- Install Rclone with the command below; if you are not logged in as root, remember to prefix commands with sudo.
curl https://rclone.org/install.sh | sudo bash
Once Rclone has been installed, we can proceed with the configuration:
rclone config
- Create a new remote: from the options n/d/r/c/s/q, type n and name it gdrive.
- For the Storage type, enter the number that corresponds to Google Drive.
- Press Enter to skip client_id and client_secret.
- For the scope, select 1 (Full access all files, excluding Application Data Folder).
- For service_account_file, enter the full path to the JSON file you uploaded earlier; in my case, as mentioned at the start of this step, that is “/home/gdrive-439617-77a484bbac93.json”.
/path/to/service-account-file.json
- Edit advanced config? Type y for YES, then press Enter to skip token, auth_url, and token_url.
- For root_folder_id you need to copy the ID of the Google Drive folder; see the example below:
NOTE: Remember we created a folder named Backup in the root of our Google Drive. Open that folder and copy its URL; the string at the end of the URL is the root_folder_id.
- Here is my Backup folder URL (https://drive.google.com/drive/folders/1Q2aXRWmgNwIOhxJoGgYcOfRp01Bd2gM), this means my root_folder_id is 1Q2aXRWmgNwIOhxJoGgYcOfRp01Bd2gM.
- Skip the rest of the advanced options by pressing Enter; keep skipping, and if you are asked again about editing the advanced config, type n for NO.
- Configure this as a Shared Drive (Team Drive)? type n for NO.
- The terminal will show a summary of the values you entered; type y for “Yes this is OK (default)”, then type q to quit the config. You can verify the saved remote as shown below.
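If everything was entered correctly, you can confirm the saved remote at any time with the command below; the output should look roughly like this, with your own JSON path and folder ID (the values shown are the ones used earlier in this guide):
rclone config show gdrive
[gdrive]
type = drive
scope = drive
service_account_file = /home/gdrive-439617-77a484bbac93.json
root_folder_id = 1Q2aXRWmgNwIOhxJoGgYcOfRp01Bd2gM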
Step 4: Test the Service Account Setup
To ensure the service account is correctly accessing your Google Drive, test it with:
rclone lsd gdrive: -vv
Once this is entered, it should list the folders in your Google Drive that the service account has access to. If nothing shows up, the folder is simply empty; create a folder or file inside the Backup directory on Google Drive and run the command again. If it is successful, the newly created folder will be listed and we can proceed to use Rclone with this service account for the backup process.
You can also create a test file from the terminal and upload it to Google Drive, as shown below:
echo "Test Backup" > /root/testfile.txt
This will create the file testfile.txt; to upload it to Google Drive, use the command below:
rclone copy /root/testfile.txt gdrive:/test-backup/ -vv
Check your Google Drive folder and the file should be there; if the Service Account has the right permissions, the command will complete without errors.
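If you prefer checking from the terminal instead of the browser, a quick listing of the test folder will confirm the upload (using the test-backup folder name from the command above):
rclone ls gdrive:/test-backup/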
Next Steps: Automate Server Backups With Rclone & Rsync
It would not make much sense to run the backup and transfer manually every time, so we will automate it: first by creating a bash script, and then by using a cron job to trigger it at a time or date of our choice.
Step 1: Create a Backup Script
ZudoTech has created a bash script for you to make the automation as easy as possible. The script compresses your files into a temporary archive in a folder on the server, transfers the archive (together with a database dump) to Google Drive, and after a successful transfer deletes the temporary copies so they do not fill up your VPS storage. (The alternative scripts later in this article use rsync to stage individual directories before compressing them.)
Create a file in the /root directory, this can be done on the terminal as shown below:
nano /root/backup-script.sh
Then paste the bash script below into it and save by pressing CTRL + X, then Y to confirm the changes, and Enter to write the file. Alternatively, you can create the script file via your control panel, such as Virtualmin or aaPanel, paste the bash script below and save.
Here’s a backup bash script:
#!/bin/bash
# Define the source directory and the backup directory
SOURCE_DIR="/home" # The main directory to back up
BACKUP_DIR="/root/backups" # Temporary backup location
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Part 1: Backing up all directories under /home in a single archive
BACKUP_FILE="$BACKUP_DIR/backup-home-$DATE.tar.gz" # Define the backup file name
echo "Backing up all directories in $SOURCE_DIR to $BACKUP_FILE..."
# Compress all contents of /home into a single backup file
tar -czvf "$BACKUP_FILE" -C "$SOURCE_DIR" .
# Upload the backup to Google Drive using rclone
echo "Uploading $BACKUP_FILE to Google Drive..."
rclone copy "$BACKUP_FILE" gdrive:/Virtualmin/Files -vv
# Remove the local backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$BACKUP_FILE" # Remove the compressed backup file
echo "Backup for all directories under $SOURCE_DIR completed, uploaded, and local file deleted."
else
echo "Failed to upload $BACKUP_FILE to Google Drive."
fi
# Retain only the last 5 file backups on Google Drive
echo "Cleaning up old file backups on Google Drive..."
rclone ls gdrive:/Virtualmin/Files | awk '{print $2}' | sort -r | sed -e '1,5d' | while read -r file; do
rclone delete "gdrive:/Virtualmin/Files/$file"
echo "Deleted old file backup: $file"
done
# Part 2: Backing up all databases in a single file
DB_BACKUP_FILE="$BACKUP_DIR/all-databases-backup-$DATE.sql.gz"
echo "Backing up all databases to $DB_BACKUP_FILE..."
mysqldump --all-databases | gzip > "$DB_BACKUP_FILE"
# Verify if the database backup was successful
if [ $? -eq 0 ]; then
echo "Database backup successful. Uploading to Google Drive..."
# Upload the database backup to Google Drive using rclone
rclone copy "$DB_BACKUP_FILE" gdrive:/Virtualmin/DB -vv
# Remove the local database backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$DB_BACKUP_FILE"
echo "All databases backup uploaded and local file deleted."
else
echo "Failed to upload the database backup to Google Drive."
fi
else
echo "Database backup failed."
fi
# Retain only the last 5 database backups on Google Drive
echo "Cleaning up old database backups on Google Drive..."
rclone ls gdrive:/Virtualmin/DB | awk '{print $2}' | sort -r | sed -e '1,5d' | while read -r db_file; do
rclone delete "gdrive:/Virtualmin/DB/$db_file"
echo "Deleted old database backup: $db_file"
done
Now make the script executable: copy and paste the command below into your terminal, then hit Enter.
chmod +x /root/backup-script.sh
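Before the first run, you can optionally ask bash to check the script for syntax errors without executing anything; if the command below prints nothing, the script parses cleanly:
bash -n /root/backup-script.sh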
What This Bash Script Does
- It creates a backup directory in /root where every temporary backup is staged.
- It backs up all directories in /home into a single archive and compresses it to tar.gz.
- It also dumps all databases and compresses them into a single file.
- Once each backup is completed, it transfers it to Google Drive and deletes the copy on the server to save you space.
- On Google Drive, it creates a folder called Virtualmin inside the Backup folder, with a Files subfolder and a DB subfolder. You can alter this structure as you want.
- It then deletes older backups from Files and DB, keeping only the latest 5.
Run the Backup With a Single Command
Before we automate the backup with a cron job, run the backup manually first and make sure everything is working perfectly. Use the command below:
/root/backup-script.sh
Check your Google Drive to confirm the backup has been uploaded successfully, then we can proceed to add a cron job for automation.
Schedule Automated Backups with Cron
We need to set up our cron job, and to do that we simply open the crontab and add a new entry; in my case, I want the bash script to run every day at 2 AM. Here is the command to open it:
crontab -e
Once the crontab has opened, add the following entry (copy and paste it):
0 2 * * * /root/backup-script.sh >> /var/log/backup.log 2>&1
Save and exit the crontab editor. The complete backup will now run daily at 2 AM and log its output to /var/log/backup.log. You can monitor the cron job by checking that log file for errors, and after the scheduled time you should check your Google Drive to be sure the files were uploaded.
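If you would rather back up weekly or monthly, only the schedule part of the cron line changes; below are two example schedules you can adapt, plus a quick way to review the latest log output:
# Every Sunday at 2 AM
0 2 * * 0 /root/backup-script.sh >> /var/log/backup.log 2>&1
# On the 1st day of every month at 2 AM
0 2 1 * * /root/backup-script.sh >> /var/log/backup.log 2>&1
# View the most recent log entries
tail -n 50 /var/log/backup.log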
CONGRATS: Those are the most important steps for creating daily backups of your files and databases. You can also modify the bash script further to refine it to your taste; below, I provide more bash scripts that simplify other common setups.
How to Restore Backups From Google Drive to Your VPS
Making a swift backup of your files and databases to Google Drive storage is one thing; restoring that backup to your VPS is another. As you already know, we will not give you an incomplete tutorial, so we have got you covered here too.
Here is how to transfer your backup from Google Drive to your VPS. Whether you want it back on the current VPS or on a different, newly purchased VPS, the process is the same.
To retrieve a file or database from Google Drive you should use the commands below:
# The command below will retrieve the file; make sure the file name matches the backup you want.
rclone copy gdrive:/Virtualmin/Files /root/backups/ --include "backup-home-2024-10-28_16-27-40.tar.gz"
# The command below will retrieve the database dump; make sure the file name matches the backup you want.
rclone copy gdrive:/Virtualmin/DB /root/backups/ --include "all-databases-backup-2024-10-28_16-27-40.sql.gz"
This will download the file or database archive you specified from the Files and DB folders of your Google Drive backup; the sketch below shows how to extract and import them afterwards.
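Downloading the archives only brings them back to /root/backups; to put the data back in place you still need to extract the files and import the database dump. Here is a minimal sketch using the file names above and the /home source directory from the backup script (adjust names, paths and MySQL credentials to your own setup):
# Extract the files archive back into /home (the archive was created relative to /home)
tar -xzvf /root/backups/backup-home-2024-10-28_16-27-40.tar.gz -C /home
# Import the full database dump back into MySQL/MariaDB
gunzip < /root/backups/all-databases-backup-2024-10-28_16-27-40.sql.gz | mysql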
Other Backup Bash Scripts and Their Usefulness
You can use any of the other backup bash scripts provided below. To make the choice easier, I will explain what each of them does so you can pick the one that fits your setup.
1. Backup All Directories Excluding foldername1, foldername2 and foldername3 & Individual Databases
#!/bin/bash
# Define the source directory and the backup directory
SOURCE_DIR="/home" # The main directory to back up
BACKUP_DIR="/root/backups" # Temporary backup location
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Part 1: Backing up all directories in /home except specified ones
for dir in "$SOURCE_DIR"/*; do
if [ -d "$dir" ]; then # Check if it's a directory
DIR_NAME=$(basename "$dir") # Get the directory name
# Skip directories that are excluded from the backup
if [[ "$DIR_NAME" == "foldername1" || "$DIR_NAME" == "foldername2" || "$DIR_NAME" == "foldername3" ]]; then
echo "Skipping $DIR_NAME..."
continue # Skip this iteration for the excluded directories
fi
BACKUP_FILE="$BACKUP_DIR/backup-$DIR_NAME-$DATE.tar.gz" # Define the backup file name
echo "Backing up $DIR_NAME to $BACKUP_FILE..."
# Rsync to gather files into a temporary directory
rsync -av --progress "$dir/" "$BACKUP_DIR/$DIR_NAME/" # Copy the contents to a temporary directory
# Compress the backup
tar -czvf "$BACKUP_FILE" -C "$BACKUP_DIR" "$DIR_NAME"
# Upload the backup to Google Drive using rclone
echo "Uploading $BACKUP_FILE to Google Drive..."
rclone copy "$BACKUP_FILE" gdrive:/Virtualmin/Files -vv
# Remove the local backup if uploaded successfully
if [ $? -eq 0 ]; then
rm -rf "$BACKUP_DIR/$DIR_NAME" # Remove the temporary directory used for backup
rm "$BACKUP_FILE" # Remove the compressed backup file
echo "Backup for $DIR_NAME completed, uploaded, and local file deleted."
else
echo "Failed to upload $BACKUP_FILE to Google Drive."
fi
fi
done
# Part 2: Backup each database individually
DB_LIST=$(mysql -e 'SHOW DATABASES;' -s --skip-column-names | grep -Ev "(information_schema|performance_schema|mysql|sys)")
for DB in $DB_LIST; do
DB_BACKUP_FILE="$BACKUP_DIR/${DB}-backup-$DATE.sql.gz"
echo "Backing up database $DB to $DB_BACKUP_FILE..."
mysqldump "$DB" | gzip > "$DB_BACKUP_FILE"
# Verify if the database backup was successful
if [ $? -eq 0 ]; then
echo "Database $DB backup successful. Uploading to Google Drive..."
# Upload the database backup to Google Drive using rclone
rclone copy "$DB_BACKUP_FILE" gdrive:/Virtualmin/DB -vv
# Remove the local database backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$DB_BACKUP_FILE"
echo "Database $DB backup uploaded and local file deleted."
else
echo "Failed to upload the database backup for $DB to Google Drive."
fi
else
echo "Database backup for $DB failed."
fi
done
What This Bash Script Does
- It backs up each directory in /home individually, excluding the ones you list.
- It also backs up each database individually.
- Once each backup is completed, it transfers it to Google Drive and deletes the copy on the server to save you space.
2. Backup Only Selected Directories (e.g. zudotech1, zudotech2, zudotech3), Excluding All Other Directories
#!/bin/bash
# Define the source directory and the backup directory
SOURCE_DIR="/home" # The main directory to back up
BACKUP_DIR="/root/backups" # Temporary backup location
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Part 1: Backing up specified directories
for dir in "$SOURCE_DIR"/*; do
if [ -d "$dir" ]; then # Check if it's a directory
DIR_NAME=$(basename "$dir") # Get the directory name
# Only back up specified directories
if [[ "$DIR_NAME" != "zudotech1" && "$DIR_NAME" != "zudotech2" && "$DIR_NAME" != "zudotech3" ]]; then
echo "Skipping $DIR_NAME..."
continue # Skip this iteration if the directory is not one of the specified ones
fi
BACKUP_FILE="$BACKUP_DIR/backup-$DIR_NAME-$DATE.tar.gz" # Define the backup file name
echo "Backing up $DIR_NAME to $BACKUP_FILE..."
# Rsync to gather files into a temporary directory
rsync -av --progress "$dir/" "$BACKUP_DIR/$DIR_NAME/" # Copy the contents to a temporary directory
# Compress the backup
tar -czvf "$BACKUP_FILE" -C "$BACKUP_DIR" "$DIR_NAME"
# Upload the backup to Google Drive using rclone
echo "Uploading $BACKUP_FILE to Google Drive..."
rclone copy "$BACKUP_FILE" gdrive:/Virtualmin/Files -vv
# Remove the local backup if uploaded successfully
if [ $? -eq 0 ]; then
rm -rf "$BACKUP_DIR/$DIR_NAME" # Remove the temporary directory used for backup
rm "$BACKUP_FILE" # Remove the compressed backup file
echo "Backup for $DIR_NAME completed, uploaded, and local file deleted."
else
echo "Failed to upload $BACKUP_FILE to Google Drive."
fi
fi
done
# Part 2: Backup each database individually
DB_LIST=$(mysql -e 'SHOW DATABASES;' -s --skip-column-names | grep -Ev "(information_schema|performance_schema|mysql|sys)")
for DB in $DB_LIST; do
DB_BACKUP_FILE="$BACKUP_DIR/${DB}-backup-$DATE.sql.gz"
echo "Backing up database $DB to $DB_BACKUP_FILE..."
mysqldump "$DB" | gzip > "$DB_BACKUP_FILE"
# Verify if the database backup was successful
if [ $? -eq 0 ]; then
echo "Database $DB backup successful. Uploading to Google Drive..."
# Upload the database backup to Google Drive using rclone
rclone copy "$DB_BACKUP_FILE" gdrive:/Virtualmin/DB -vv
# Remove the local database backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$DB_BACKUP_FILE"
echo "Database $DB backup uploaded and local file deleted."
else
echo "Failed to upload the database backup for $DB to Google Drive."
fi
else
echo "Database backup for $DB failed."
fi
done
What This Bash Script Does
- It backs up only the listed directories in /home, each one individually, and skips the rest.
- It also backs up each database individually.
- Once each backup is completed, it transfers it to Google Drive and deletes the copy on the server to save you space.
- You can use the restore method shown for the first script to restore these backups as well; the only difference is that each archive contains a single directory, as shown in the sketch below.
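Since these archives each contain a single directory, the restore is per directory too. A minimal sketch, using a hypothetical directory called example and the date format from the script (swap in your real directory name and timestamp):
# Download the archive for one directory (directory name and timestamp are placeholders)
rclone copy gdrive:/Virtualmin/Files /root/backups/ --include "backup-example-2024-10-28_16-27-40.tar.gz"
# The archive contains the directory itself, so extracting into /home recreates /home/example
tar -xzvf /root/backups/backup-example-2024-10-28_16-27-40.tar.gz -C /home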
3. Backup Only Databases Individually
#!/bin/bash
# Define the backup directory and date format for the backup files
BACKUP_DIR="/root/backups" # Temporary backup location
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Part 1: Backup each database individually
DB_LIST=$(mysql -e 'SHOW DATABASES;' -s --skip-column-names | grep -Ev "(information_schema|performance_schema|mysql|sys)")
for DB in $DB_LIST; do
DB_BACKUP_FILE="$BACKUP_DIR/${DB}-backup-$DATE.sql.gz"
echo "Backing up database $DB to $DB_BACKUP_FILE..."
mysqldump "$DB" | gzip > "$DB_BACKUP_FILE"
# Verify if the database backup was successful
if [ $? -eq 0 ]; then
echo "Database $DB backup successful. Uploading to Google Drive..."
# Upload the database backup to Google Drive using rclone
rclone copy "$DB_BACKUP_FILE" gdrive:/Virtualmin/DB -vv
# Remove the local database backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$DB_BACKUP_FILE"
echo "Database $DB backup uploaded and local file deleted."
else
echo "Failed to upload the database backup for $DB to Google Drive."
fi
else
echo "Database backup for $DB failed."
fi
done
What This Bash Script Does
- It backs up each database individually.
- Once each backup is completed, it transfers it to Google Drive and deletes the copy on the server to save you space.
- You can use the database restore method shown for the first script to restore these as well; see the sketch below for importing a single database.
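Importing one of these per-database dumps is similar to the full restore shown earlier, except you must name the target database. A minimal sketch, assuming a hypothetical database called mydb (the database must already exist, so create it first if you are restoring to a fresh server):
# Download the dump for a single database (name and timestamp are placeholders)
rclone copy gdrive:/Virtualmin/DB /root/backups/ --include "mydb-backup-2024-10-28_16-27-40.sql.gz"
# Create the database if it does not exist, then import the dump into it
mysql -e "CREATE DATABASE IF NOT EXISTS mydb;"
gunzip < /root/backups/mydb-backup-2024-10-28_16-27-40.sql.gz | mysql mydb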
4. Backup Only the Databases, All in One File (Full Database Dump)
#!/bin/bash
# Define the backup directory and date format for the backup files
BACKUP_DIR="/root/backups" # Temporary backup location
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Backup all databases in a single file
DB_BACKUP_FILE="$BACKUP_DIR/all-databases-backup-$DATE.sql.gz"
echo "Backing up all databases to $DB_BACKUP_FILE..."
# Perform the database backup
mysqldump --all-databases | gzip > "$DB_BACKUP_FILE"
# Verify if the database backup was successful
if [ $? -eq 0 ]; then
echo "Database backup successful. Uploading to Google Drive..."
# Upload the database backup to Google Drive using rclone
rclone copy "$DB_BACKUP_FILE" gdrive:/Virtualmin/DB -vv
# Remove the local database backup if uploaded successfully
if [ $? -eq 0 ]; then
rm "$DB_BACKUP_FILE"
echo "All databases backup uploaded and local file deleted."
else
echo "Failed to upload the database backup to Google Drive."
fi
else
echo "Database backup failed."
fi
# Retain only the last 5 database backups on Google Drive
echo "Cleaning up old database backups on Google Drive..."
rclone ls gdrive:/Virtualmin/DB | awk '{print $2}' | sort -r | sed -e '1,5d' | while read -r db_file; do
rclone delete "gdrive:/Virtualmin/DB/$db_file"
echo "Deleted old database backup: $db_file"
done
What This Bash Script Does
- It dumps all the databases on the server into a single file.
- Once the backup is completed, it transfers it to Google Drive and deletes the copy on the server to save you space.
- It then deletes older backups from the DB folder on Google Drive, keeping only the latest 5.
- You can use the database restore method shown for the first script to restore this dump.
Frequently Asked Questions
Why is it important to back up my VPS?
Backing up both the files and the databases on your server is crucial for data retention; whenever something goes wrong, you can easily restore your files back to the server.
How often should I back up?
Daily backups are ideal, and you can add weekly and monthly backups on top of that.
Why are the backups compressed?
Compressing files reduces their size and makes transfers to and from cloud storage faster; it also keeps all files together in one archive and makes them quicker to download.
How can I automate database backups?
With a custom bash script and a scheduled cron job on your Linux server, the database backup process runs automatically.
That brings us to the end of this tutorial, and I am confident that you can now make seamless backups of the files and databases on your Virtual Private Server for free. If you encounter any errors, kindly make use of the comment box; I am here to respond as soon as possible.