Worst Practices for Data Backup

Here are some of the worst practices in backing up data:

  1. Doing no data backup at all. This seems like a no-brainer, but it is a common and often unintentional mistake: backup jobs are neglected and fail to back up all of the important data.

  2. Failing to keep offsite backup copies of data. This is one of the most common problems. Storing backup copies in the same building as the computers you are backing up offers no protection against fire, flood, theft, or other site-wide disasters.

  3. Failing to monitor your backup jobs effectively. Don't assume that no news is good news: backup jobs often fail, don't run at all, or silently skip files. Check your backup jobs and logs often and correct any problems as soon as possible (a minimal monitoring sketch appears after this list).

  4. Using manual procedures. Manual procedures are often forgotten, postponed, or simply not performed. Use automated procedures to make your data backup reliable (see the automation sketch after this list).

  5. Not encrypting backup data. Most backups contain some sensitive data, and backup media, including portable disk drives, can be lost or stolen and fall into the wrong hands. Even a tape or disk drive you believe you have erased may be salvaged, and its data scavenged by identity thieves, long after you have discarded it (an encryption sketch follows the list).

  6. Not performing test restores of your data. Just because a backup job finished without errors doesn't guarantee that you can restore your data. The only way to be sure is to periodically test the restore procedure (see the test-restore sketch below).
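
As a concrete illustration of point 4, here is a minimal sketch of an automated backup job in Python. The SOURCE and DEST_DIR paths are hypothetical; the idea is that a script like this runs on a schedule (cron, Task Scheduler) rather than by hand.

    # A minimal sketch of an automated backup job. SOURCE and DEST_DIR are
    # hypothetical paths; adjust them and schedule this script instead of
    # running it manually.
    import tarfile
    from datetime import datetime
    from pathlib import Path

    SOURCE = Path("/home/user/documents")   # hypothetical data to protect
    DEST_DIR = Path("/mnt/backup")          # hypothetical backup volume

    def run_backup() -> Path:
        """Create a timestamped .tar.gz archive of SOURCE under DEST_DIR."""
        DEST_DIR.mkdir(parents=True, exist_ok=True)
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        archive = DEST_DIR / f"documents-{stamp}.tar.gz"
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(SOURCE, arcname=SOURCE.name)
        return archive

    if __name__ == "__main__":
        print(f"Backup written to {run_backup()}")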
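
For point 3, a monitoring check can be equally small: actively verify that a fresh archive exists instead of assuming silence means success. This sketch assumes the archive naming used above, and the 26-hour threshold is an arbitrary choice for a daily job.

    # A minimal monitoring sketch. DEST_DIR and the 26-hour threshold
    # (a daily job plus slack) are assumptions.
    from datetime import datetime, timedelta
    from pathlib import Path

    DEST_DIR = Path("/mnt/backup")          # hypothetical backup volume
    MAX_AGE = timedelta(hours=26)

    def newest_backup_age():
        """Return the age of the newest archive, or None if there is none."""
        archives = list(DEST_DIR.glob("documents-*.tar.gz"))
        if not archives:
            return None
        newest = max(a.stat().st_mtime for a in archives)
        return datetime.now() - datetime.fromtimestamp(newest)

    age = newest_backup_age()
    if age is None or age > MAX_AGE:
        print("ALERT: no recent backup found")  # in practice, e-mail or page someone
    else:
        print(f"OK: newest backup is {age} old")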
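
For point 5, one way to encrypt an archive before it leaves the machine is with the third-party cryptography package (pip install cryptography). This is only a sketch: key management is the hard part in practice, and the key must be stored away from the backup media itself.

    # A sketch of encrypting a finished archive, assuming the third-party
    # `cryptography` package. The key is written to a local file here for
    # brevity; in practice keep it off the backup media, or a stolen copy
    # carries its own key.
    from pathlib import Path
    from cryptography.fernet import Fernet

    def encrypt_archive(archive: Path, key_file: Path) -> Path:
        """Encrypt archive into a sibling .enc file using a stored key."""
        if key_file.exists():
            key = key_file.read_bytes()
        else:
            key = Fernet.generate_key()
            key_file.write_bytes(key)
        encrypted = archive.with_suffix(archive.suffix + ".enc")
        # Reads the whole file into memory; fine for a sketch,
        # not for very large archives.
        encrypted.write_bytes(Fernet(key).encrypt(archive.read_bytes()))
        return encrypted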
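
Finally, for point 6, a test restore can be automated as well: extract the archive into a scratch directory and compare checksums against the live data. The paths and archive layout carry over from the earlier sketches and are assumptions.

    # A sketch of a test restore: extract to a temporary directory and
    # compare SHA-256 checksums of every file against the original.
    import hashlib
    import tarfile
    import tempfile
    from pathlib import Path

    def sha256(path: Path) -> str:
        """Checksum a file in 1 MiB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_restore(archive: Path, source: Path) -> bool:
        """Return True if every file in source matches its restored copy."""
        with tempfile.TemporaryDirectory() as scratch:
            with tarfile.open(archive, "r:gz") as tar:
                tar.extractall(scratch)
            restored_root = Path(scratch) / source.name
            for original in source.rglob("*"):
                if not original.is_file():
                    continue
                restored = restored_root / original.relative_to(source)
                if not restored.is_file() or sha256(original) != sha256(restored):
                    return False
        return True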

Data Backup and Recovery

3 comments:

Anonymous said...

I completely agree with your ideas. That's why the software I use encrypts and sends the data to an offsite server. I have been using it for two years now, and it is very reliable and covers everything in your guidelines! Try www.remote-backup.com if you are interested in something that fits your goals for a backup.

Mike Mavilia said...

I completely agree with this post. These practices can be suicidal for your data, and knowing the facts in detail is essential. Thanks for sharing them.

hnblog said...

I completely agree with all of these best practices. Test restores are especially important for ensuring data integrity. However, when you are backing up petabytes of data on high-density storage, integrity tests need to rely on proven mathematical techniques such as the checksumming that backup software provides, not least because restoring a very large volume may require duplicating the primary storage infrastructure. In addition, sampling-based test restores during non-critical times, as mentioned above, are equally important to guarantee media sanity and catch deterioration with age. This is especially true of old tapes and disk drives.
While this holds for enterprises managing their own data-protection infrastructure, businesses relying on the cloud need not be as worried. Cloud providers such as Amazon (AWS) offer both archival and backup features for backing up local data center data to the cloud, and can guarantee SLAs based on the tier you sign up for. Because these providers ensure data integrity themselves, it may not be necessary to be overly concerned about executing your own test-restore procedures.