Your NAS isn't enough – you still need to back up your data!
Power Failure
Modern filesystems are moderately resistant to power failure, but even the mighty ZFS can suffer from a blackout [9]. A UPS will help, but beware of cheap units: Many budget domestic UPSs are not built for continuous operation and will wear out, eventually bringing down the NAS with them. According to a 2016 Ponemon Institute survey, UPS failure is the top cause of unplanned data center outages [10]. In practice, blackout protection reduces the risk of losing data to a power loss, but it does not remove the threat entirely.
In enterprise scenarios, administrators know that trying to make a single NAS bulletproof is not enough to guarantee true high availability. Instead, enterprises deploy Storage Area Networks (SANs) or distributed filesystems such as Ceph [11] across computer clusters, so that if one server goes down, the rest of the cluster remains operational.
The minimal (and, for serious purposes, insufficient) storage cluster that can be deployed is described in Figure 7. This is known as a Primary-Replica topology, in which the primary performs services for the clients. The replica's contents are periodically synchronized with the primary's. Should the primary go down, the load balancer will promote the replica and turn it into the new primary (Figure 8).
![](/var/linux_magazin/storage/images/issues/2022/260/high-availability-vs.-backup/figure-7/806252-1-eng-US/Figure-7_large.png)
![](/var/linux_magazin/storage/images/issues/2022/260/high-availability-vs.-backup/figure-8/806255-1-eng-US/Figure-8_large.png)
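The promotion logic described above can be sketched in a few lines. This is a simplified, hypothetical health-check loop, not the implementation of any real load balancer: The `check_primary` and `promote_replica` callables stand in for whatever probing and failover mechanism an actual product would use, and the failure threshold is an arbitrary illustrative value.

```python
import time

# Consecutive failed health checks before the replica is promoted
# (illustrative value; real load balancers make this configurable).
FAILURE_THRESHOLD = 3

def monitor(check_primary, promote_replica, interval=5, max_checks=None):
    """Poll the primary; promote the replica after repeated failures.

    check_primary   -- callable returning True while the primary is healthy
    promote_replica -- callable that turns the replica into the new primary
    Returns True if a promotion happened, False if monitoring stopped first.
    """
    failures = 0
    checks = 0
    while max_checks is None or checks < max_checks:
        checks += 1
        if check_primary():
            failures = 0            # healthy again: reset the counter
        else:
            failures += 1
            if failures >= FAILURE_THRESHOLD:
                promote_replica()   # replica becomes the new primary
                return True
        time.sleep(interval)
    return False
```

The threshold guards against promoting the replica on a single missed check (a transient network hiccup), which would otherwise cause needless failovers.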
The Cloud Option
Real-life high-availability systems are not something you are likely to be able to run at home: They typically feature redundant load balancers and might require some Border Gateway Protocol (BGP) magic thrown in. Even the naive method I just described more than doubles the cost of the storage, because it requires a redundant server and a load balancer (at which point you are likely to need a server rack in a server room).
It is therefore no surprise that many users, especially small businesses, turn to professional storage vendors, who offer cloud storage for a fee and take care of keeping the storage perpetually available. Cloud storage can also be very cost effective: It might cost you around $1,500 over four years, which is less than you are likely to spend on a good NAS. Because a NAS is likely to need an upgrade around the fourth year anyway, the cloud option is not unreasonable. Sadly, storage vendors come with their own issues: Uploading your data to them can take much longer than uploading it to a local server, and some vendors' environments might raise privacy concerns.
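The back-of-the-envelope arithmetic behind that comparison is easy to reproduce. The figures below are illustrative assumptions, not quotes from any vendor or manufacturer:

```python
# Illustrative cost comparison: cloud subscription vs. one-off NAS purchase.
MONTHS = 4 * 12                       # four-year horizon

cloud_total = 1500                    # hypothetical four-year cloud bill, USD
cloud_monthly = cloud_total / MONTHS  # what that works out to per month

nas_total = 1700                      # hypothetical price of a good NAS, USD

print(f"Cloud: ${cloud_monthly:.2f}/month, ${cloud_total} over four years")
print(f"NAS:   ${nas_total} up front, likely replaced around year four")
```

At roughly $31 per month, the cloud plan undercuts the hypothetical NAS over the same period, before even counting the NAS's electricity and disk replacements.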
Humans and Software
Even if your chosen storage solution were completely indestructible, it would still not eliminate the need for a proper backup system. If you delete a file by mistake, or lose it to a software bug or malware, it makes no difference whether it was stored on a regular laptop, a high-end NAS, or a cloud storage provider. Experience shows that human mistakes force you to restore from backups far more often than hardware failures do. Some storage vendors know this and keep a historical record of every file uploaded to them, so you can retrieve an old version if you discover you have uploaded a corrupt copy or deleted something important by accident. In effect, the vendor is running a backup policy for you.
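The versioning idea behind that vendor behavior can be sketched in miniature. This is a toy illustration of the principle (keep the old copy instead of overwriting it), not how any particular vendor implements it; the function name and timestamp scheme are my own invention:

```python
import shutil
import time
from pathlib import Path

def store_with_history(src: Path, dest_dir: Path) -> Path:
    """Copy src into dest_dir, but never overwrite a previous version.

    If a file with the same name already exists, it is first renamed
    with a timestamp suffix, so every uploaded version stays retrievable.
    """
    dest_dir.mkdir(parents=True, exist_ok=True)
    target = dest_dir / src.name
    if target.exists():
        stamp = time.strftime("%Y%m%d-%H%M%S")
        target.rename(dest_dir / f"{src.name}.{stamp}")  # archive old version
    shutil.copy2(src, target)  # copy2 preserves timestamps and permissions
    return target
```

After two uploads of `report.txt`, the destination holds the latest copy under its original name plus one timestamped predecessor, so a corrupt upload never destroys the good version that came before it.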