Rsync for website backup in a shared hosting environment
Back End Backup
Shared hosting is the best way for first-time webmasters to get started. But what do you do about backup?
Shared hosting remains the go-to choice for many first-time webmasters. The shared-hosting model, which allows several websites to share the same centrally managed server, lets the customer focus on web matters without getting involved with the details of the underlying OS. But the simplicity of the shared hosting environment also leads to some complications. For instance, although many shared hosts do allow users to connect to the shared server over SSH, hosting vendors typically don't provide root access for shared-hosting customers.
From a backup perspective, this lack of root access makes life a little difficult. Although there is a vibrant market of third-party vendors and managed service providers (MSPs) offering various types of cloud-to-cloud backup and data extraction solutions, many of these tools are proprietary and are not designed to allow easy replication to an on-site destination, such as a user's desktop.
Also, although many popular applications like WordPress have their own backup and recovery plugins, these will obviously not help if your website is not running WordPress – or if it's not running a content management system (CMS) at all.
Try the DIY Approach
Linux users are generally not averse to trying out commands in the Linux terminal or getting under the hood to see what makes their systems tick. For them, backing up files and databases with a utility like rsync [1] is often preferable to hoping that a third-party solution will provide the functionality they need to back up their data.
Rsync is one of the best known command-line utilities for backup and recovery. First released in the mid '90s, knowledge of rsync is a fundamental job requirement for many sysadmins and backup administrators. The tool supports transferring and synchronizing files to a variety of networked remotes. More recently, rclone [2] has extended the same approach to cloud storage repositories.
Using rsync, local backups can be taken onto a Linux-running local machine, such as a desktop or a network-attached storage (NAS) device, and then synced off-site to comply with the 3-2-1 backup rule, which stipulates keeping three copies of your data on two different types of storage media, with one copy off-site. Rclone (rsync-like syncing between remote sources and targets) is fine for the latter purpose.
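As a rough sketch of that off-site leg (assuming you have already configured a cloud remote named offsite with rclone config, and with the local path purely as an example), the local backup directory can then be mirrored with a single command:

# Mirror the local backup directory to the configured cloud remote
# ("offsite" and the bucket path are assumptions; set them up with rclone config)
rclone sync /backups/hosting offsite:website-backups --progress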
Given that, in a typical shared hosting environment, users will be caged from accessing any parts of the filesystem other than their user directory (/home/foo), a slightly more creative approach has to be employed than would be the case when backing up from a virtual private server (VPS) or dedicated machine. But it's one that's readily achievable nonetheless.
Here's what I've set up.
1. Authenticate Local Machine with Host
Because rsync runs over the SSH protocol, the first step is to make sure that the machine from which you plan on backing up your shared hosting environment's filesystem has been authenticated with the server.
This is done in the usual manner. Generate an SSH key pair if you don't already have one on the machine. Shared web-hosting environments typically include access to the cPanel web-hosting control panel for ease of administration, which has SSH key management functionality where public keys can be imported and authorized. Generate the keys and import the public key onto the host.
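A minimal key setup might look like the following (the key type, path, and comment are just illustrative choices):

# Generate a key pair (skip this if you already have one)
ssh-keygen -t ed25519 -C "backup@desktop" -f ~/.ssh/id_ed25519
# Display the public key so it can be pasted into cPanel's SSH key management page
cat ~/.ssh/id_ed25519.pub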
2. Connect over SSH and Run an Rsync Pull
Before adding this to a Bash script and scheduling it with cron, it needs to be tested. First, SSH into your shared hosting environment to make sure that the connection works. Next, pull from source to destination using rsync. To do this, pay attention to the order of the arguments (source first, then destination): make sure that the hosting environment is your source and the local filesystem your destination. Mixing up source and destination can have catastrophic effects and may cause you to wish that you had thought about backups sooner!
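A first connectivity check can be as simple as the following (username and IP are placeholders; add -p and a port number if your host uses a non-standard SSH port, as discussed below):

# Confirm that key-based SSH login to the shared host works before wiring up rsync
ssh youruser@123.456.789.71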
If your host insists that you connect over SSH with a non-standard port in order to improve security (some now do), then you'll need to pass that port with -e 'ssh -p 12345' – obviously replacing 12345 with the port number your host has asked you to use. Otherwise, you can connect as usual over port 22 and drop the -e option from the command below.
rsync -arvz -e 'ssh -p 12345' yourhost@123.456.789.71:/home/youruser/ /backups/hosting/website1
Of course, you'll want to replace yourhost with your web-hosting username and replace the example IP with the actual public IP of the shared server to which you need to connect.
Now let's break down that command a little. rsync calls the rsync utility. Then come the parameters, which are entered together and prefixed by a minus symbol:
a runs rsync in archive mode. This recursively copies across files. It also keeps all file permissions, ownerships, and symbolic links intact.
r runs rsync recursively.
v is a verbosity parameter. Three v's, in fact, can be daisy-chained to produce the most verbose output possible. This is useful for debugging the command (when used in conjunction with the dry run parameter; see the sketch below).
z compresses the file data as it is sent to the destination.
Next we have 'ssh -p XXXXX'. I provided a non-standard SSH port here, but if you're with a shared host, yours is more likely the standard port 22. After that, I provided my SSH username and the IP address of my hosting server. After a colon (:), I then provided the path that I want to back up recursively. At the end of the command comes my destination.
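Putting the debugging options together, a dry run of the pull might look like this (nothing is actually transferred; --dry-run only reports what would happen, and the host, port, and paths are the same placeholders as above):

# Preview the transfer with maximum verbosity, without writing anything locally
rsync -avvvz --dry-run -e 'ssh -p 12345' yourhost@123.456.789.71:/home/youruser/ /backups/hosting/website1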
The beauty of rsync is its block-level, delta-transfer algorithm. Only files that differ between source and destination are transferred, and only the changed portions of those files are sent. This minimizes data transmission and maximizes the efficiency of the command. Rsync has been integrated into many backup programs, where it can be used to power the full range of conventional backup options (full, incremental, and differential).
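If you want to see exactly which files rsync decides to touch on a subsequent run, the itemize option is handy (same placeholders as before):

# Show a per-file summary of what rsync would change on this run
rsync -avzi -e 'ssh -p 12345' yourhost@123.456.789.71:/home/youruser/ /backups/hosting/website1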
If you're just trying to back up the files in, say, a WordPress installation, then I recommend simply backing up from the WordPress root in order to avoid capturing the unnecessary file clutter that you'll typically find at the user root level in a shared hosting environment.
For instance, back up from :/home/youruser/public_html/wp to the target. Next, run the command, and then verify that the directory has been created successfully on your local machine.
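Once the pull is verified, the same command can be dropped into a small wrapper script and scheduled with cron, as mentioned earlier. A minimal sketch, with the script name, paths, port, and schedule purely as examples:

#!/bin/bash
# backup-website.sh: pull the WordPress root from the shared host to local storage
rsync -az -e 'ssh -p 12345' yourhost@123.456.789.71:/home/youruser/public_html/wp/ /backups/hosting/website1

Make the script executable with chmod +x, and a crontab entry along the lines of 0 3 * * * /home/you/bin/backup-website.sh would then run the pull nightly at 3am; adjust the path and schedule to taste.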