Small and mid-sized organizations are increasingly looking to modernize backup and recovery processes that have been around for 30 years. Several technologies underpin this trend, including disk-to-disk backup, available network bandwidth, and Cloud infrastructure. The key business drivers are improved quality of backups, less risk to the business, and potentially lower costs. Backup is becoming one of the most common use cases for the Cloud and in some form has been around for more than a decade, both as an enterprise service (e.g. disaster recovery) and for individual users (e.g. Mozy, Carbonite, etc.).
As always, however, IT practitioners need to manage issues including Cloud security, recovery performance, and other potential risks to the business. But the simplicity, lower costs, and faster recovery make backup a compelling application for the Cloud.
Based in Oklahoma City, The National Cowboy & Western Heritage Museum is the country’s most prominent institution of Western history, art, and culture, serving millions of visitors each year. A few years ago, management of the museum took a hard look at backup and recovery and decided that it needed to make some changes. On June 16, 2010, the Wikibon community welcomed Susan Adams, the assistant director of development at the Museum, and Sharon Kasper, its manager of IT.
Why Cloud Backup?
In fact, the museum’s small, two-person IT staff did not set out with the intention of implementing Cloud backup per se. Rather, the goal was to address the following problems:
- 10-15% of the IT staff’s time was spent on backups;
- Only about 20% of the backups were succeeding;
- Recovery was time-consuming, unreliable and inefficient; and,
- The organization was bearing unnecessary reputational risks.
Susan Adams shared with the Wikibon community that as a ‘small business’ the museum was interested in solving a problem, not acquiring a specific technology. As such, the museum engaged Corevault, a local service provider with an offsite data protection offering. The solution uses Asigra backup technology as a core component of the offering.
The Project
In late 2007, the museum identified around 400GB of tier 1 data that needed to be protected and chose to go with a SaaS solution. Key applications included SQL databases, email, and retail applications from the museum store. One key issue for the organization: while the move to Cloud backup promised to shift capital expense to a monthly operating expense, Corevault charges on a per-GB basis, so the museum's financial team had concerns about the variable nature of the expense. Specifically, the CFO wanted to eliminate surprises and ensure that the expense would remain under a threshold.
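The CFO's concern can be made concrete with a simple projection. The sketch below uses hypothetical numbers (the actual Corevault rates were not disclosed) to show how a per-GB charge combined with data growth can push a monthly bill past a budget cap:

```python
def projected_costs(start_gb, monthly_growth, rate_per_gb, months):
    """Project monthly per-GB backup spend, assuming compound data growth.

    All inputs are illustrative; real rates and growth vary by provider.
    """
    gb = float(start_gb)
    costs = []
    for _ in range(months):
        costs.append(round(gb * rate_per_gb, 2))
        gb *= 1 + monthly_growth
    return costs

def months_until_over_cap(costs, cap):
    """Return the first month (0-based) the projected bill exceeds the cap,
    or None if it stays under budget for the whole projection."""
    for month, cost in enumerate(costs):
        if cost > cap:
            return month
    return None

# 400GB at a hypothetical $0.50/GB, growing 5% per month:
costs = projected_costs(400, 0.05, 0.50, 12)
```

This is why a retention and migration policy matters: without one, growth alone eventually breaches any fixed threshold.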
To manage the cost exposure, Corevault helped the museum IT staff predict the volume of activity and put a retention and migration policy in place that would tier the storage and allocate backups to the most cost-effective tiers. Policies are set and managed using Asigra’s software. The Asigra solution allows the customer to carve data into four tiers:
- Tier 1 – on-site disk-to-disk – recovery in seconds;
- Tier 2 – remote in the cloud – recovery in minutes;
- Tier 3 – archive/nearline – recovery in minutes or hours;
- Tier 4 – public cloud (i.e. deeper archive) – recovery in hours or more.
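The tiering decision above boils down to matching each data set's recovery time objective (RTO) against what each tier can deliver. The sketch below is a generic illustration of that logic (the tier names and recovery-time bounds are assumptions drawn from the list above, not Asigra's actual API or policy engine):

```python
# (tier number, description, worst-case recovery time in seconds)
# Assumed bounds for illustration; tier 4 recovery is open-ended ("hours+").
TIERS = [
    (1, "on-site disk-to-disk", 60),
    (2, "remote cloud", 60 * 60),
    (3, "archive/nearline", 4 * 60 * 60),
    (4, "public cloud deep archive", float("inf")),
]

def choose_tier(rto_seconds):
    """Pick the highest-numbered (i.e. cheapest) tier whose worst-case
    recovery time still meets the required RTO. Falls back to tier 1
    when even seconds-level recovery is demanded."""
    best = None
    for tier, desc, max_recovery in TIERS:
        if max_recovery <= rto_seconds:
            best = (tier, desc)  # later tiers are assumed cheaper
    return best or (1, "on-site disk-to-disk")
```

In practice the choice also weighs cost and business risk, as the next paragraph notes; RTO is just the hardest constraint to relax after the fact.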
The client, through Asigra’s solution, chooses how much data goes into each tier based on a number of factors, including RPO, RTO, cost, and business risk. One other benefit of the Asigra technology Kasper emphasized was that the museum can now perform restores at the individual e-mail level. A lost e-mail can be recovered much more quickly, without reloading the entire mailbox and losing any mail received after the backup, improving IT productivity, end-user productivity, and client service.
Since late 2007, the data set at the museum has grown to around 1TB today. Kasper shared with Wikibon that she intends to pare that back through a manual data deduplication process: using Asigra tools, she flags duplicate files and then deletes them manually. Given the size of the organization, the museum has found this a practical way to manage costs and get rid of unneeded files.
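The general technique behind flagging duplicates is straightforward: group files by a hash of their contents, and any group with more than one member is a duplicate set. The sketch below shows this generically (Asigra's tools work their own way; this is not their implementation):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 content hash and return only
    the groups that contain more than one file (i.e. duplicates)."""
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

An administrator would review each group and delete all but one copy; automating the deletion is riskier, since identical contents can still serve different purposes in different locations.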
Advice to Peers
Adams and Kasper were asked what, if anything, they would do differently if they had to initiate the project from scratch again. Their answers serve as guidelines for smaller organizations just getting started:
- Perform housekeeping before implementation – specifically identify duplicate and unneeded data and eliminate it to cut costs.
- Educate the staff prior to going live. Set organizational policies and encourage good storage practices – i.e. keep unnecessary junk off the servers.
- Think about retention rules – how long does data need to be online? Can it be nearline to lower costs?
Action Item: For many smaller organizations, simplified approaches to managing data, getting rid of unneeded files, and demystifying backup can save substantial time and money and avoid unnecessarily expensive backup and restore solutions. Look for service providers that deliver turnkey cloud backup offerings, and partner with them to set retention policies, understand RPO and RTO requirements, and share certain operational risks of protecting data.