Cloud backup with MCrypt and S3cmd

Making a Hash of It

To alter the hash algorithm, use the -h parameter as follows:

# mcrypt -h tiger grrrr.doc

The role of the hash algorithm is to create a digest, which is appended to the encrypted file and serves as a checksum for detecting file corruption reliably. Here, I've changed the hash algorithm to tiger, which is one of the options shown in Figure 2.

Figure 2: A list of the available hashes in MCrypt.

If you look closely from within a text editor, you can spot the easy-to-read tiger on the far right, in among the gobbledygook:

^@m^C@rijndael-128^@ ^@cbc^@mcrypt-tiger^@
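
If you'd rather not open the file in a text editor, you can pull the readable algorithm names out of the header on the command line with grep. The sketch below fakes a tiger-style header with printf purely so the example runs without an encrypted file to hand; point the same grep at your real .nc file instead:

```shell
# Fake an MCrypt-style header for demonstration only; on a real system
# you would grep the encrypted .nc file itself.
printf 'm\003rijndael-128\000 \000cbc\000mcrypt-tiger\000' > fake-header.bin

# grep -a treats binary data as text; -o prints only the matching part.
# Prints "mcrypt-tiger".
grep -ao 'mcrypt-tiger' fake-header.bin

rm -f fake-header.bin
```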

You can change the encryption algorithm itself with the -a switch (choose carefully, because several of the available ciphers are vulnerable to known attacks). The following command switches to the famous DES – the Data Encryption Standard – created by IBM in the 1970s, an algorithm that shaped modern cryptography:

# mcrypt -a des grrrr.doc

The header of the file, as seen in a text viewer, shows what's changed: where rijndael-128 appeared before, you'll now see des.

Once your files are encrypted, you're ready to upload them into the cloud. If MCrypt doesn't suit your needs, check out the "Bcrypt" box below for an alternative.


Bcrypt

The "b" in bcrypt [2] stands for Blowfish, a network-friendly encryption algorithm designed by Bruce Schneier. I've mostly encountered it in SSH clients, where it can speed up encrypted sessions over poor connectivity because it's so amazingly lightweight. To use bcrypt, drop the package onto a Debian-based box by running this command:

# apt-get install bcrypt

Encrypted files will have the .bfe file extension. Unlike MCrypt, the efficient bcrypt will compress files automatically before performing the encryption. Additionally, it will remove any source files after it has had its way with them. Adding a -c tells bcrypt not to compress the files before wrapping them up, and -r asks it not to delete the original source files. The -o switch

# bcrypt -o linux_binnie.cfg

outputs the encrypted data directly to your console; your original file will not be changed at all. The command

# bcrypt -s100 chris_password.asc

scrubs the original, sensitive file from the disk 100 times after encrypting it.

My favorite bcrypt feature is associated with more clandestine operations. You can indulge in a little secrecy by "scrubbing" any deleted files repeatedly so that your hard drive retains no trace of them. To overwrite your original source files five times with randomized data, specify the -s5 switch. If this flag is not set, the default is three overwrites. Alternatively, if disk I/O is too precious, you can disable overwrites with -s0.
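
Putting these switches together, a batch run over several sensitive files might look like the following sketch. The file names and pass count are made up, and echo turns the loop into a dry run that only prints each command; remove it to encrypt for real (assuming bcrypt is installed):

```shell
#!/bin/sh
# Dry-run sketch of a bcrypt batch job; 'echo' prints each command
# instead of executing it. File names and PASSES are hypothetical.
PASSES=5    # overwrite each source file five times (-s5) after encrypting

for f in chris_password.asc linux_binnie.cfg; do
    echo bcrypt -s"$PASSES" "$f"    # the real run would produce "$f.bfe"
done
```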

To the Power of Three

The Python-based S3cmd utility [3] lets you use the behemoth that is Amazon S3 to store your files ultra-reliably. You might be surprised that S3cmd and other utilities let you use Amazon S3 almost as if it were a local filesystem mounted on your desktop.

Although you do need to expose your Amazon Web Services login credentials, there are ways of entering your passphrase only when it is essential. A simple script could help you automate the process somewhat. Alternatively, full automation might use a root-owned, encrypted password file to drop your security token into the main .s3cfg file so that you can run your backups periodically with a cron job.
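
As a sketch of that last approach, the periodic run itself is just a crontab entry; the script path, schedule, and logfile below are assumptions purely for illustration:

```
# Hypothetical root crontab entry: encrypt and push the backups to S3
# at 03:30 every night, logging the outcome. The script name is made up.
30 3 * * * /usr/local/sbin/s3-backup.sh >> /var/log/s3-backup.log 2>&1
```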

You can even use a third-party tool to limit the bandwidth the S3cmd utility uses, which allows you to run a bulky, and therefore lengthy, file transfer as a background process. To use S3cmd, you need an AWS account, which isn't that big a challenge (just give away all your private details, including your credit card, and you are all set).
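
One such third-party tool is trickle, a userspace bandwidth shaper. The sketch below caps s3cmd's upstream rate; the rate, file name, and bucket name are invented, and echo makes it a dry run:

```shell
#!/bin/sh
# Dry-run sketch: trickle -u caps upstream bandwidth in KB/s. Remove
# 'echo' to run the capped upload for real, and append '&' to
# background the lengthy transfer.
RATE=256                                  # KB/s - an illustrative cap
echo trickle -u "$RATE" s3cmd put backup.tar.nc s3://my-backup-bucket/
```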

On Debian or Ubuntu, install the minuscule package as follows:

# apt-get install s3cmd

This command will drop the packages python-support and s3cmd onto the system.

In the past, I had a few issues with older versions in the Debian repositories. It's no problem if your desired feature isn't immediately available; simply download the source and follow the instructions in the INSTALL file.

I was pleased to see that newer versions of S3cmd include a --configure option to help with setup. Be wary of where you expose your account details when copying and pasting the two important credentials – the access key and secret key – from AWS, because they are highly sensitive. Don't try typing them in by hand, because typos are all too common; cut and paste them instead.

Fire It Up

To configure S3cmd, use the --configure option:

# s3cmd --configure

The output of the command is shown in Listing 1.

Listing 1

S3cmd Configuration

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.
Access key and Secret key are your identifiers for Amazon S3
Access Key: chris binnie
Secret Key: linux
Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: abc
Path to GPG program [/usr/bin/gpg]: /usr/bin/gpg
When using secure HTTPS protocol all communication with Amazon
S3 servers is protected from 3rd party eavesdropping. This method
is slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: Yes

As you can see from Listing 1, S3cmd even asks if you would like to encrypt the files with GPG [4], another excellent and sophisticated encryption tool. Additionally, you will see that I've selected the HTTPS transport method to avoid network sniffing. You can enable encryption with GPG by using the -e option. However, if you're absolutely adamant that that's not what you want (because you want to use a different encryption method, e.g., the excellent bcrypt), specify --no-encrypt.
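
With configuration out of the way, a first round trip takes only a few commands. The bucket name below is invented, and echo keeps everything as a dry run; drop it once your credentials are in place:

```shell
#!/bin/sh
# Dry-run sketch of a first S3 round trip with s3cmd; the bucket name
# is hypothetical. Remove 'echo' to run the commands for real.
BUCKET=s3://my-backup-bucket

echo s3cmd mb "$BUCKET"                          # create the bucket
echo s3cmd put --no-encrypt grrrr.doc.nc "$BUCKET"/  # already encrypted locally
echo s3cmd ls "$BUCKET"                          # list what's stored
```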
