2018 Resolution: No More Branch Office Backups

by Michael Fiorenza on February 12, 2018

How much easier would your life be if you never backed up branch office data again, and if you never had to worry about recovering branch office data following a local disruption or even a ransomware attack? What if you never had to ask anyone to ship tapes from a branch office to your central data center, or vice versa? What if you never had to worry about whether each of your branch offices could meet its recovery time and recovery point objectives (RTO and RPO) following a disaster?

If this life sounds appealing to you, you're ready to lose some weight in 2018—the weight of dealing with branch office backups, that is.

Now let's take it a step further. What if we told you that, in addition to eliminating local backup to hardware or enterprise cloud file storage, you may not even need local file servers and storage arrays at all? Imagine how much time, money, and hassle you would save with only one authoritative data set to back up, govern, and protect.

This year, you may start by eliminating branch office backups and end up eliminating a lot of unneeded branch office infrastructure. In our experience, clients end up saving as much as 70 percent on their storage costs. Let's take a look at why continuing to deal with local backup makes no sense. Then, we'll explain how to quit branch office backups, cold turkey, for good.

Branch Offices and Unstructured Data

Cisco has compiled some stats about enterprises and branch offices.

  • 80 percent of enterprise employees work away from the company headquarters.
  • The average branch office has 4 to 6 servers.
  • These servers run at 15 percent or less utilization.
  • Businesses spend $6 billion per year on branch servers, storage, backup, and management.

Eighty percent of an enterprise's stored data is unstructured; we estimate that approximately 60 to 80 percent of unstructured data lives outside the data center at branch locations. At any given time, less than 10 percent of that data is actively being used. Word docs, CAD files, electronic health records, medical imaging, security logs, and other images and videos you own but rarely use are left untouched for years, taking up valuable space.

Enterprises approach branch office backup in many ways. They may use a combination of tapes, local file servers, SSDs, SANs, and enterprise cloud file storage, depending on factors like data sensitivity, how often the data is accessed, and how quickly it would need to be recovered in case of disaster. This unstructured data has to be preserved, sometimes because it contains intellectual property, other times because there's a regulatory requirement to do so, and always to prevent permanent loss. In some ways, enterprises are lucky to have more storage options than ever before, but in other ways, too many choices have led to way too much complexity. A lot of time, money, and effort is dedicated to data that's active less than 10 percent of the time.

Local backup and storage also ignore the fact that multiple employees at different locations often collaborate on the same file. Not only are you backing up these files at one branch; you're backing up different versions of the same file at other branches as well. This workflow creates islands of data at different branches, with inconsistent versions of the file, a lot of duplication, and no authoritative copy.

A FAST™ Solution

It's easy to see the benefits of transitioning to a consolidated, centralized storage model. Capital costs are lower, and operational costs are more predictable when you can consolidate storage onto fewer machines. Functions like governance, security, and backup, which, at the end of the day, are cost centers, become easier, more effective, and less expensive.

At the same time, a lot of the people we talk to worry that they don't have sufficient network infrastructure to support data centralization. They think applications won't perform as well as they would if users were accessing local file storage. The good news is that FAST™ was created to solve the performance problems associated with centralizing data, so you can reduce server complexity, infrastructure, and storage, and eliminate your need for branch office backups.

The FAST™ Fabric

You can centralize enterprise data in one or more locations, in a traditional data center, hybrid cloud, or public enterprise cloud file storage. The FAST™ fabric transparently extends centralized file shares to all branch locations. The process is simple:

  1. You enable the FAST™ fabric by deploying a software instance in the data center. This instance runs on Windows Server 2012 or later, serves in the core role, and is mounted directly to your corporate file shares.
  2. FAST™ integrates with any storage environment, including Azure and AWS. It runs on commodity storage accessed via SMB/CIFS or iSCSI targets, or integrates directly with your cloud storage. You can continue to manage your data using NetApp or whatever other solution you're already using.
  3. To enable distributed locations, you deploy additional FAST™ instances, configured as edge instances. These can be either software installation packages or virtual appliance templates that run on Windows Server 2012 or later. Each edge instance connects to the core instance already mounted to your data center's file storage.
  4. At each location, the software creates a virtual file share and Intelligent File Cache. Users can access centralized files through drive mappings to the Talon virtual file share or through integration with the organization's existing DFS namespace. The Intelligent File Cache is stored and automatically managed on an NTFS volume to keep the active dataset close to users, so they have immediate access to important files and projects.
  5. When users access the central dataset, a copy of the file is stored in the Intelligent File Cache. As the user works, the FAST™ fabric updates the cache and streams changes to the data center, updating the authoritative copy of the file. Files are locked centrally, so only one user can make changes at a time and there is always one authoritative copy (see the sketch after this list for an illustration of this caching and locking behavior).
  6. The core instance compresses data streamed in transit between the data center and the edge instances. This compression, combined with an Intelligent File Cache that sends only changes to the central file, mitigates latency and bandwidth challenges.
  7. The Intelligent File Cache purges inactive data automatically, which means you don't store unstructured data files locally when users aren't working on them.
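If it helps to picture how the edge cache, central locking, delta streaming, and purging fit together, here is a minimal Python sketch of the general idea. It is a conceptual illustration only, not Talon's code: the CentralStore and EdgeCache classes, the 64 KB block size, and the one-week purge window are all assumptions made for the example.

```python
# Conceptual sketch only -- not Talon's implementation. Class names
# (CentralStore, EdgeCache), the block size, and the purge window are all
# hypothetical, chosen just to illustrate steps 4 through 7.
from __future__ import annotations

import hashlib
import time

BLOCK_SIZE = 64 * 1024                 # granularity for change detection
PURGE_AFTER_SECONDS = 7 * 24 * 3600    # evict files untouched for a week


def block_hashes(data: bytes) -> list[str]:
    """Hash fixed-size blocks so only changed blocks need to cross the WAN."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]


class CentralStore:
    """Stand-in for the authoritative file copy in the data center."""

    def __init__(self) -> None:
        self.files: dict[str, bytes] = {}
        self.locks: dict[str, str] = {}          # path -> edge holding the lock

    def acquire_lock(self, path: str, edge_id: str) -> bool:
        # Central locking: only one location may modify a file at a time.
        if self.locks.get(path) in (None, edge_id):
            self.locks[path] = edge_id
            return True
        return False

    def release_lock(self, path: str, edge_id: str) -> None:
        if self.locks.get(path) == edge_id:
            del self.locks[path]

    def read(self, path: str) -> bytes:
        return self.files.get(path, b"")

    def apply_update(self, path: str, new_data: bytes) -> int:
        """Store the new version and report how many blocks actually changed
        (in a real fabric only those blocks would travel over the wire)."""
        old_hashes = block_hashes(self.files.get(path, b""))
        new_hashes = block_hashes(new_data)
        changed = sum(a != b for a, b in zip(old_hashes, new_hashes))
        changed += abs(len(old_hashes) - len(new_hashes))
        self.files[path] = new_data
        return changed


class EdgeCache:
    """Stand-in for the branch-side Intelligent File Cache."""

    def __init__(self, edge_id: str, center: CentralStore) -> None:
        self.edge_id = edge_id
        self.center = center
        self.cache: dict[str, tuple[bytes, float]] = {}  # path -> (data, last use)

    def open(self, path: str) -> bytes:
        # First access pulls a copy into the local cache; later reads are local.
        if path not in self.cache:
            self.cache[path] = (self.center.read(path), time.time())
        data, _ = self.cache[path]
        self.cache[path] = (data, time.time())
        return data

    def save(self, path: str, new_data: bytes) -> None:
        # Writes take the central lock, then stream the delta to the data center.
        if not self.center.acquire_lock(path, self.edge_id):
            raise PermissionError(f"{path} is locked by another location")
        try:
            changed = self.center.apply_update(path, new_data)
            self.cache[path] = (new_data, time.time())
            print(f"[{self.edge_id}] synced {path}: {changed} changed block(s)")
        finally:
            self.center.release_lock(path, self.edge_id)

    def purge_inactive(self) -> None:
        # Step 7: drop cached copies that nobody has touched recently.
        cutoff = time.time() - PURGE_AFTER_SECONDS
        for path, (_, last_used) in list(self.cache.items()):
            if last_used < cutoff:
                del self.cache[path]


if __name__ == "__main__":
    center = CentralStore()
    branch = EdgeCache("branch-london", center)
    branch.save("projects/plan.docx", b"rev 1 " + b"x" * 200_000)
    draft = branch.open("projects/plan.docx")
    branch.save("projects/plan.docx", draft[:100_000] + b" edited " + draft[100_000:])
    branch.purge_inactive()
```

The point of the design is that the branch only ever holds a disposable cache: the authoritative copy, the lock state, and the durable data all live in the data center, which is why nothing at the branch needs its own backup.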

Because there's no need for local file storage, and all changes are saved to the central file copy, the FAST™ fabric eliminates your need to back up unstructured data at the branch. You can consolidate storage and save as much as 70 percent on your current storage costs. Whenever inactive data needs to be accessed by regulators or for e-discovery, you know where it is, and you know you have a single source of truth. When you need to restore services to a branch, it's far simpler to meet your RTO and RPO targets.

For a detailed example of how FAST™ has helped our clients eliminate branch office backups, check out our work with Capita Property & Infrastructure.

A Resolution That's Easy to Keep

Giving things up is hard to do, even when you know it's good for you. Maybe you've been thinking that a middle way, like continuing to store some data locally and backing up Tier 2 and Tier 3 unstructured branch data to enterprise cloud file storage, is the right approach for you. We encourage you to go cold turkey in 2018: give up branch office backups for good. FAST™ is the missing link that's going to help you drop the weight of branch backups. For more detailed information, check out our FAST™ data sheet.
