Avoiding data corruption in backups

Alternatives

There are a number of alternatives with a long, successful history in the field of integrity verification.

AIDE, a host-based intrusion detection system, can be repurposed for verifying the integrity of the files in a given folder [3].
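A minimal configuration only has to declare where the databases live and which folder to watch. The following sketch uses placeholder paths, and the database directives vary slightly between AIDE versions (newer releases use database_in instead of database):

database=file:/var/lib/aide/aide.db
database_out=file:/var/lib/aide/aide.db.new
Checks = sha256
/home/user/Foals Checks

With that configuration in place, aide --init creates the database; after moving it into place, aide --check reports any files that no longer match:

$ aide --config=/etc/aide.conf --init
$ mv /var/lib/aide/aide.db.new /var/lib/aide/aide.db
$ aide --config=/etc/aide.conf --check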

You can use mtree, a popular program from the BSD world, to verify that the contents of a directory tree match a specification. For example, mtree can create a specification file from a folder containing known-good data:

$ mtree -c -k md5digest -p Foals > /var/specification

Then, you can verify the contents of Foals against the specification file with:

$ mtree -f /var/specification -p Foals

mtree isn't popular in the Linux ecosystem, but you can find it in some repositories [4].
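Because the BSD implementations of mtree signal a mismatch through a non-zero exit status, verification is easy to script. The following sketch assumes that behavior:

$ mtree -f /var/specification -p Foals || echo "Foals no longer matches its specification"

Depending on the port, stronger keywords such as sha256digest may also be available in place of md5digest.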

Finally, bitrot, a Python script, can locate files damaged by hardware defects. It does not catch files lost to human error or to certain kinds of software error, but within those limits it is very easy to set up and run. If you are interested in using bitrot, I recommend reading Solène Rapenne's tutorial [5].
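Running it can be as simple as the following sketch. I am assuming here that bitrot was installed with pip; the script keeps its checksum database in a hidden file inside the folder it scans, so all you have to do is invoke it from the folder you want to monitor:

$ pip install --user bitrot
$ cd Foals
$ bitrot

The first run records the checksums; later runs compare the files against them.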

Limitations

Creating a checksum of every file within the folder you intend to back up is time-consuming. For datasets larger than a couple of terabytes, the process may take more than half an hour.

Generating a checksum file and verifying it against the previous checksum file before each backup is only practical if the data being backed up doesn't change often. If you try this approach with a busy folder, comparing the two checksum files will turn up more differences than you can reasonably verify manually. For this reason, I recommend this method for folders that don't change often, such as directories full of family pictures or ebooks, in which files are usually added but rarely modified or removed.
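As a sketch of this workflow using nothing more than standard shell tools (the file names are placeholders), you can generate a fresh checksum list, compare it against the previous one, and rotate the files afterwards:

$ find Foals -type f -exec sha256sum {} + | sort -k 2 > checksums.new
$ diff checksums.old checksums.new
$ mv checksums.new checksums.old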

If you need to verify the integrity of a folder whose contents change frequently, I recommend bitrot, because that tool only warns about files whose checksums have changed even though their modification times have not.

Conclusions

Having multiple backups of your data and keeping old versions of your files are great measures for preventing data loss, but they are not enough. In order for a backup strategy to work, you must be able to verify that your backup files are uncorrupted.

While many tools exist for verifying the integrity of your data, you don't need a complex solution. The checksum utilities in the coreutils package are enough for managing moderate amounts of data.
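For instance, assuming a checksum list like the checksums.old file sketched above, a single command run from the folder where the list was generated verifies every file against it and prints only the failures:

$ sha256sum --quiet -c checksums.old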

Ultimately, discipline is the most important factor when it comes to data integrity. You need to define a routine and stick to it. This, in my experience, is where most users fail.

Infos

  1. rsync: https://rsync.samba.org
  2. ZFS: https://docs.freebsd.org/en/books/handbook/zfs/
  3. "Detect evidence of break-in attempts with host-based intrusion detection systems" by Tobias Eggendorfer, Linux Magazine, issue 183, February 2016: https://www.linux-magazine.com/Issues/2016/183/Host-Based-IDS
  4. NetBSD's version of mtree: https://repology.org/project/mtree-netbsd/versions
  5. Solène Rapenne's bitrot tutorial: https://dataswamp.org/~solene/2017-03-17-integrity.html

The Author

Rubén Llorente is a mechanical engineer who ensures that the IT security measures for a small clinic are both legally compliant and safe. In addition, he is an OpenBSD enthusiast and a weapons collector.
