With World Backup Day in our rear-view mirror, giving our backup needs a second thought becomes utterly apparent. Most computer professionals probably have some kind of nagging voice inside their heads reminding them to create backups, which works fine to some extent, until they realize that all their backups are kept in-house and would be lost in a fire.
Many people, however, tend to have no backup at all, and these users would greatly benefit from adding a cloud-based backup solution. There are a lot of options, though, and finding one that suits a particular need is not the easiest thing to accomplish.
Given that modern files are quite large, with photo libraries growing by 20 GB worth of pictures every year or more, a solution with unlimited storage, or as low a cost per gigabyte as possible, is crucial. Not only that, setup has to be minimal, and it should by default back up everything in the normal documents folder, music, and other types of user-created content.
Having a DSLR camera that outputs raw files at about 10 MB per photo, which within a year amounts to 10 to 40 GB worth of pictures, I desperately need a remote backup solution with plenty of storage. Other types of media include captured HD video files of irreplaceable moments and purchased music, which together amount to hundreds of gigabytes of precious and irreplaceable data.
My storage needs are thus quite large and increasing by the day, so I need a service that is reasonably cheap and fast, yet also reliable and as secure as possible. These demands might sound like an oxymoron, but the perfect backup solution should encompass all these properties in some way.
I would also like a service that is reasonably priced for at least three computers backing up to the same account, though support for five or more would be optimal. That way, backing up my parents’ computer to the same account would be a breeze, at no extra cost.
The whole reason for having a cloud-based backup is to have my precious data available off-site. To make things easier, the service should preferably offer reasonable download and upload speeds, and its agent should be able to operate without intervention once everything is configured and running.
When deciding to use a cloud-based backup solution, there is a wide array of applications and services to consider. There are different types of backup services, and the most common ones are probably file synchronization services such as Dropbox and box.net.
While their goal is to synchronize files between different computers and other devices, they also have the ability to keep versions of files as they change or are deleted. This provides an excellent solution for sharing documents and other files when collaborating with other people, or when working on the same content from different devices. Storage is, however, not cheap if you plan to store more than a couple of gigabytes worth of data.
On the other hand, there is backup software that usually lacks file synchronization capability but is more focused on keeping backups of your files, with no bells and whistles. The benefit of using something like this instead is that cloud space is usually cheaper, with many backup providers claiming “unlimited” space.
There are a lot of players in this market, such as SpiderOak and Backblaze. While SpiderOak could well be a decent service, it would be too expensive for my storage needs. At a rate of $10 per 100 GB, with as many computers as you like, it is clearly an excellent service if your storage needs do not exceed that first 100 GB tier.
Backblaze, on the other hand, has a native Mac client and offers a simple plan of $5 per computer per month for unlimited storage. One of its key features is the restore service: they can overnight you a hard drive or DVD with your data for a fast restore. There is just one problem with this service: the data on the chosen media is sent unencrypted!
That brings me to the topic of security, and the fact that none of the services above have (to my knowledge) support for using your own encryption key. This means having to trust the provider to keep your password and key secure, instead of knowing that your own encryption key never leaves your computer.
Another option I considered was CrashPlan, which was featured on the World Backup Day website. Having never tried it or even heard of it before, I was reluctant to consider it. The client is written in Java, making it easier to run on multiple platforms, but memory and performance issues usually lurk there.
The user interface is quite pleasing to the eye, and once the client is launched for the first time and an account is created, a backup of the home directory starts automatically. Most people would be satisfied with leaving the application in its default state, since their entire account would be backed up. There is, however, a lot more than meets the eye.
The most prominent feature when starting the application is the destination selection, providing the ability to back up to different storage endpoints. While backing up to “CrashPlan Central” will cost you money, the other backup options are free.
If you have a friend running CrashPlan, you can add each other as destinations for your backups, giving both parties the benefits of off-site backups while still using the free version. You will, however, need to provide enough storage for each other’s backup needs, which is not free in itself.
The same procedure can be used between different computers within your own account. They can act as destinations as well, potentially providing you with off-site backups if you have computers at different physical locations.
As mentioned earlier, an online backup combined with a large backup size requires plenty of bandwidth to work properly. Having backed up a considerable amount of data to the CrashPlan servers, I saw big differences in how fast the server nodes were able to receive the data.
Before measuring the upload speed, I tweaked the settings for CPU and bandwidth usage to allow maximum throughput. My internet link is a 100 Mbit fiber connection, so any delays or bandwidth issues reside on the server side.
I started by backing up my music collection on my MacBook Pro, which performed at a fairly constant rate of 3.2 Mbit/s. Even though this was fairly slow, it was bearable, given the not-so-large music library on this particular computer.
Backing up the NAS was a completely different story, however. Another server was chosen as the target for the backup (this is done automatically), but this time around the throughput maxed out at about 700 Kbit/s, which is terribly slow if the data to be backed up exceeds 100 GB, which it did in this case.
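To put those rates in perspective, here is a quick back-of-the-envelope calculation of my own (using decimal units, 1 GB = 8000 megabits) of how long an initial 100 GB upload would take at the observed speeds:

```python
# Rough upload-time estimates for an initial backup at the sustained
# rates I observed: 3.2 Mbit/s on the MacBook Pro, 0.7 Mbit/s on the NAS.

def upload_days(size_gb: float, rate_mbit_s: float) -> float:
    """Days needed to push size_gb gigabytes at rate_mbit_s megabits/second."""
    size_mbit = size_gb * 8 * 1000       # decimal units: 1 GB = 8000 Mbit
    seconds = size_mbit / rate_mbit_s
    return seconds / 86400               # seconds per day

print(f"100 GB at 3.2 Mbit/s: {upload_days(100, 3.2):.1f} days")
print(f"100 GB at 0.7 Mbit/s: {upload_days(100, 0.7):.1f} days")
```

At the NAS rate, the initial 100 GB backup alone would tie up the link for roughly two weeks, which explains my frustration.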
Having a backup solution in the cloud inherently raises privacy and security concerns. A lot of people will be uneasy giving up their data to a third party without knowing their data is safe from prying eyes.
CrashPlan uses Blowfish with a 448-bit key to secure the data at rest, and the communication is additionally encrypted using normal SSL connections with AES and a 256-bit key. The Blowfish key is then escrowed together with your data on the CrashPlan servers, encrypted with your account password.
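The escrow idea can be illustrated with a small sketch. Note that this is my own conceptual illustration, not CrashPlan’s actual implementation: PBKDF2 and the XOR wrap below are toy stand-ins for whatever key derivation and cipher the real client uses.

```python
import hashlib
import secrets

# Conceptual sketch of key escrow: the random data key that encrypts the
# backup is itself stored on the server, wrapped with a key derived from
# the account password. (Toy illustration, not CrashPlan's real scheme.)

def derive_wrapping_key(password: str, salt: bytes) -> bytes:
    # PBKDF2 stands in for whatever key-derivation function the client uses.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000, dklen=56)

def wrap(data_key: bytes, wrapping_key: bytes) -> bytes:
    # XOR is a toy stand-in for a real key-wrapping cipher such as Blowfish.
    return bytes(a ^ b for a, b in zip(data_key, wrapping_key))

salt = secrets.token_bytes(16)
data_key = secrets.token_bytes(56)  # 448 bits, the Blowfish key size mentioned above
escrowed = wrap(data_key, derive_wrapping_key("account password", salt))

# Only the correct account password recovers the original data key.
recovered = wrap(escrowed, derive_wrapping_key("account password", salt))
assert recovered == data_key
```

The point is that the server only ever holds `escrowed` and `salt`; without the account password, the wrapped key is useless.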
For most people, this solution is perfect, given the simple nature of the setup. The end user never has to touch the encryption key or remember anything more complicated than their own account password. When restoring files on a new computer, it is just a matter of logging into the account and restoring the files from the server.
The downside of this solution is that there is no way to partition the computers associated with the account, meaning that any computer logged into the user account can restore any file from any other computer.
There is another security mode which separates the encryption key from the user account. That way, you still have the CrashPlan user account, but the encryption key is protected with a separate password. The benefit of using this mode is that different computers can have different passwords, and thus separate encryption keys. This fixes the problem of all computers being able to access all information associated with the account.
The third option is to provide the encryption key directly, instead of using passwords to encrypt the key stored on the server. This makes it impossible for anyone without knowledge of the encryption key to decrypt the data. The downside is that the key needs to be kept secure, since it must be provided when doing a restore. Keeping the key on paper in a safe deposit box or some other secure location will be necessary, since losing it means it will be impossible to decrypt the data on the CrashPlan servers.
Security conscious people will undoubtedly distrust the implementation of the client handling the encryption key. Who knows if the key is secretly transmitted to CrashPlan without the user’s knowledge?
Having started my CrashPlan trial only a few days ago, I have yet to uncover any severe misbehavior or inconsistencies. It has been a fairly smooth ride so far, setting up my own encryption key and backing up three computers.
There was, however, one weird kink when creating and using encryption keys. When the key was created on Windows, it could for some reason not be validated on the Mac, which at first made me doubt the service. However, when I created a new key on the Mac, it could successfully be used both on the Mac and on Windows, as well as on my Linux server.
If you are planning to use CrashPlan on the Mac, you may experience an unusually high memory load, which is partly the result of CrashPlan running on 64-bit Java. There is a simple way to switch to 32-bit execution, however, which involves editing /Library/LaunchDaemons/com.crashplan.engine.plist and adding “-d32” to the ProgramArguments section. For other memory optimizations and a discussion, have a look at the Reduce memory usage thread on the CrashPlan forums.
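Since launchd property lists are ordinary plist files, the tweak can also be scripted. The sketch below uses Python’s plistlib on a throwaway copy with a made-up ProgramArguments list (the Label and arguments are placeholders, not CrashPlan’s actual values); on a real system you would edit the actual file with root privileges and reload the daemon with launchctl.

```python
import os
import plistlib
import tempfile

# Hypothetical stand-in for the real launchd plist; the real file lives at
# /Library/LaunchDaemons/com.crashplan.engine.plist and needs root to edit.
sample = {
    "Label": "com.crashplan.engine",
    "ProgramArguments": ["/usr/bin/java", "com.example.EngineMain"],  # placeholder args
}

path = os.path.join(tempfile.mkdtemp(), "com.crashplan.engine.plist")
with open(path, "wb") as f:
    plistlib.dump(sample, f)

# Load, add "-d32" right after the java binary so it is parsed as a JVM
# option, and write the plist back.
with open(path, "rb") as f:
    plist = plistlib.load(f)

args = plist["ProgramArguments"]
if "-d32" not in args:
    args.insert(1, "-d32")

with open(path, "wb") as f:
    plistlib.dump(plist, f)
```

After editing the real file, unloading and reloading the daemon with launchctl picks up the new arguments.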
Another thing that could be improved upon is upload and download speeds, which are abysmal compared to the available throughput. The speed when backing up my Mac seemed to stabilize at about 3.2 Mbit/s, while my NAS is running at about 1 Mbit/s. Extremes ranged from about 500 Kbit/s to 20 Mbit/s, which is basically all over the place. This is not usually a problem once the initial backup has completed, but it could be a lot faster. It is also one of the reasons I am hesitant to become a paying member once the trial runs out, though I may change my mind, since the service is extremely convenient.
The other reason, however, is privacy. While I am confident that CrashPlan does not “back up” my encryption key once I have chosen to use my own, there could be programming errors or other problems exposing this key in some manner.
The alternative would be to set up a backup server at some other location with plenty of disk space to mirror all my data, including changes made to files, using some kind of rsync snapshot solution. This requires a somewhat hefty investment in hardware, however, while CrashPlan is ready to back up anything I throw at it.
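The rsync scheme I have in mind would look roughly like this: each run creates a dated snapshot directory, and rsync’s --link-dest option hard-links unchanged files against the previous snapshot, so only changed files consume new disk space. The host name and paths below are placeholders for my hypothetical backup server.

```python
import datetime

# Build the rsync invocation for one snapshot run. The command is only
# constructed here, not executed; in practice it would run from cron.

def snapshot_command(source: str, dest_root: str, previous: str) -> list:
    today = datetime.date.today().isoformat()
    return [
        "rsync", "-a", "--delete",
        f"--link-dest={dest_root}/{previous}",  # hard-link unchanged files
        source,                                 # trailing slash: copy contents
        f"{dest_root}/{today}",                 # new dated snapshot directory
    ]

cmd = snapshot_command("/Users/me/", "backup-host:/snapshots", "latest")
print(" ".join(cmd))
```

A small wrapper script would then repoint a “latest” symlink at the new snapshot after each successful run, so the next invocation links against it.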
When the trial expiration starts to creep up, I will hopefully have more insight into reasons to stay or quit. Until then, I am sticking with CrashPlan.