Curt Frierson, EVP, Technology and Education
Archiving critical data is inarguably one of the most important technology functions of any organization. For many businesses, data is the most valuable company asset. If their data is lost, so too is their business. So why, then, are data backups so often improperly managed? Perhaps this is due to a corporate culture that views this function as a distraction – a constant impediment that prevents real work from getting done. Or maybe it is simply the result of this process being the bane of a network administrator’s existence – a source of constant problems which must be continuously dealt with and overcome. Either way, the fact remains that network backups are the single largest issue that Safe Systems sees on a daily basis. This is a bit frightening considering the importance of the backup process.
In an effort to minimize the burden and headaches associated with data backups, Safe Systems has continuously searched for more stable, reliable, and cost-effective solutions. Understandably, the constant theme voiced by our customers has been the desire for less hands-on involvement in the backup process, achieved in a cost-effective manner. To this end, our recommended solutions have evolved from local, single-server backups using tape drives to a centralized model where all network servers are backed up through a single server to an external hard drive. Each solution has had its share of challenges. For the most part, though, the burden on the financial institution has effectively been reduced. In this way, the evolution from local to centralized backups has been a considerable success.
A responsible backup plan, however, still presents a few elements which can be painful to address adequately. There are three primary pillars of any good backup plan. First and foremost, data must be stored offsite. If you’re backing up your data but storing it in the same location as your production systems, a disaster could wipe out both the live and archived data together. Second, data should be archived offline. Online backup data (data that is accessible in real-time – typically disk-based) is usually subject to the same risks of corruption, malware infection, and data manipulation as online production data. Keeping an offline copy eliminates these threats. Third, multiple copies of backup data should be kept to allow recovery to a previous point in time. Without multiple copies, a backup containing a deleted file may be overwritten by a more current backup before anyone realizes the file is missing. How far back you need to be able to recover will vary with the specific needs of the environment, but a good rule of thumb is to be able to recover data from at least three weeks ago. Adhering to these three basic principles provides a solid foundation for backups and reduces the threat of irrecoverable data loss.
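The third pillar lends itself to a simple automated check. As a minimal sketch (the function name and twenty-one-day default are illustrative, not from any particular product), a script could verify that the oldest retained backup still reaches back the recommended three weeks:

```python
from datetime import date

def retention_ok(backup_dates, today, min_history_days=21):
    """Return True if the oldest retained backup reaches back at
    least min_history_days (three weeks, per the rule of thumb)."""
    if not backup_dates:
        return False  # no backups at all certainly fails the test
    # The oldest snapshot defines the furthest recoverable point.
    return (today - min(backup_dates)).days >= min_history_days
```

A nightly job could run a check like this against the backup catalog and alert an administrator when the retention window has shrunk below policy.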
Although these requirements sound simple, there are many factors which make them difficult to achieve logistically. For institutions with widely dispersed branches, expensive courier services may be required to transfer backups to an offsite location. Also, many of the latest backup solutions provide online backups only. This is a problem when the backup data resides on the same network as the production data: a widespread malware infection could destroy the online backup data just as easily as the live production data. Finally, some of these backup systems provide only Continuous Data Protection (CDP), without multiple backup versions. CDP replicates data to provide essentially a mirrored copy of production data on a central storage server. If a file or folder is deleted from a server, the change is replicated through the CDP application to mirror the current state of the production data – thus deleting the file or folder from the central storage server as well. This scenario creates obvious problems if files need to be restored due to deletion or undesirable changes.
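The CDP pitfall is easy to see in a toy illustration (the dictionaries below stand in for file systems; nothing here models a real CDP product). A mirror faithfully replicates a deletion, while a dated snapshot still holds the earlier copy:

```python
import copy

# Toy "file systems": file name -> contents.
production = {"loans.xlsx": "v1", "policies.doc": "v1"}

# Versioned backup: keep a dated snapshot of each backup run.
snapshots = [copy.deepcopy(production)]

# CDP-style mirror: a continuously replicated copy of production.
mirror = copy.deepcopy(production)

# A file is accidentally deleted from production...
del production["loans.xlsx"]

# ...and CDP faithfully replicates the deletion to the mirror.
mirror = copy.deepcopy(production)

assert "loans.xlsx" not in mirror    # gone from the mirror too
assert "loans.xlsx" in snapshots[0]  # still recoverable from the snapshot
```

The mirror alone cannot restore the deleted file; only the retained snapshot can, which is why multiple backup versions matter.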
Now that many of the problems have been highlighted, the question remains, “What is the ideal backup solution for a community bank or credit union?” The answer, of course, depends on the goals and preferences of the particular institution. However, the solution that is quickly emerging as the leading candidate for many institutions is data vaulting (sometimes referred to as online backup).
Data vaulting is the process of using a third-party provider to back up an organization’s data over the Wide Area Network (WAN). Backup agents are typically installed on each production server and configured to archive critical data to the service provider’s data center. The data is sent over an encrypted channel to provide the security required for transmission across the Internet. A central management console is typically provided to configure the backup jobs and select the appropriate data to archive, making the look and feel of this solution similar to that of a traditional, customer-hosted solution. The key difference is that this is a purely online backup solution which satisfies all three of the key requirements of a solid backup plan while avoiding many of the issues involved in a customer-hosted solution. The data is sent offsite to the service provider’s facility. The data is also effectively offline relative to the institution’s production environment, because it is not exposed to threats within the customer’s network; a virus outbreak or data corruption in the institution’s network would not affect the data at the service provider’s facility. In addition, retaining multiple versions of backup data is an available option in virtually all data vaulting solutions. All three of these requirements are satisfied without the need for the institution to lift a finger. There are no tapes to rotate, no courier services required to send backup media to an offsite facility, and no hardware or media to purchase and constantly keep updated. Additionally, the archived data is accessible from anywhere in a disaster scenario: it can either be downloaded over the Web or shipped on an external disk drive. And since this is a completely hosted solution, the institution’s administrator is freed from the constant hassle of troubleshooting backup issues.
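Conceptually, the per-run work of a vaulting agent can be sketched in a few lines. This is a hypothetical illustration only (the function name and payload layout are invented for this article): the agent reads each selected file, compresses it, and records a checksum so the provider can verify integrity after the upload. The actual transport and encryption (typically TLS) are handled by the provider’s agent and are omitted here:

```python
import hashlib
import zlib
from pathlib import Path

def prepare_backup(paths):
    """Hypothetical sketch of one vaulting-agent run: read each
    selected file, compress it, and attach a SHA-256 checksum so
    the receiving data center can verify integrity after the
    encrypted upload. Transmission itself is not modeled here."""
    payload = []
    for p in paths:
        data = Path(p).read_bytes()
        payload.append({
            "name": str(p),
            "blob": zlib.compress(data),             # smaller over the WAN
            "sha256": hashlib.sha256(data).hexdigest(),  # integrity check
        })
    return payload
```

Real agents add scheduling, block-level deduplication, and client-side encryption on top of this basic select-compress-verify loop.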
So what’s the catch? While data vaulting has been around for several years, cost has historically been the main barrier to widespread adoption. The good news is that prices have continued to fall and are now at a point that is acceptable for many institutions. Typical prices are around $10/GB per month depending on the amount of data, which is roughly half of what it cost a few years ago. This may still sound a little expensive for some, but considering the cost of hardware, software, maintenance, media, labor, courier services, and outside support, it is not an unrealistic option. A comprehensive cost/benefit analysis can usually make a strong case for data vaulting as a viable alternative. Many community banks and credit unions are finding that the benefits far outweigh any additional costs involved. Now may be the time for your community bank or credit union to investigate a data vaulting solution.
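For the cost/benefit analysis, the vaulting side of the ledger is simple arithmetic. A back-of-the-envelope sketch using the ~$10/GB/month figure quoted above (actual rates vary by provider and data volume, and the 50 GB figure is purely illustrative):

```python
def vaulting_cost_per_year(gb, rate_per_gb_month=10.0):
    """Rough annual data-vaulting cost at a flat per-GB monthly
    rate; real pricing is usually tiered by volume."""
    return gb * rate_per_gb_month * 12

# e.g. 50 GB of critical data:
# vaulting_cost_per_year(50) -> 6000.0 dollars per year
```

Compare that figure against the fully loaded cost of the in-house alternative – hardware, software, maintenance, media, labor, courier services, and outside support – rather than against the hardware line item alone.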