How SDS Can Keep Your Data At Home

by Shirish Phatak on May 5, 2017

Data storage is moving to the cloud at a rapid pace. The distinctive benefits of enterprise-level cloud storage offerings, including rapid deployment, accessibility, scalability, security, reliability, and substantial cost savings over on-premises installations, are proving to be very attractive to IT decision makers. According to a forecast by ReportsnReports, the global cloud storage market is expected to grow at a CAGR of 25.8% between 2016 and 2021.

Yet most storage professionals also acknowledge that not all information belongs in the public cloud. Although the percentage of corporate data residing in the cloud is increasing, a majority of enterprises continue to keep some or all of their data either under their own roofs or at a secure colocation facility. In fact, the dominant approach to enterprise storage today is the hybrid model, in which an organization retains the most sensitive portions of its data in house while entrusting the rest to the cloud. The great strength of the hybrid approach is that it lets companies capture the financial and operational benefits of cloud storage for the bulk of their data, while keeping their most sensitive or frequently used information safely at home.

Growing numbers of storage professionals have reached the conclusion that the key to maximizing the effectiveness of both their on-premises and public cloud-based storage is the use of Software-Defined Storage (SDS). In a 2016 survey of more than 1,200 senior IT decision makers, 63 percent of respondents indicated that they expected to begin implementing SDS within the next year.

Reasons Companies Keep Parts Of Their Data At Home

The most common reason for enterprises to divide their data between in-house data centers and the public cloud is a concern for security. Many storage administrators worry that cloud storage is less secure than on-premises storage (although the reverse is often true), pointing to issues such as the cloud’s multi-tenancy model, in which multiple customers share the same physical storage and compute resources, and the potential vulnerability of data while it is in transit across the internet.

In some cases legal or regulatory compliance issues pose obstacles to a company storing sensitive information in the public cloud. For example, companies subject to SSAE 16 financial reporting standards, or HIPAA requirements for the management of healthcare records, are often reluctant to consider cloud storage for such data because of the regulatory complexities involved.

For companies that depend on the highest levels of storage performance, for example to serve online customers in real time, cloud latency is a concern. Because data stored in the public cloud may reside in locations physically remote from the user, limitations imposed by the speed of light, if nothing else, make access delays an unavoidable consideration in any cloud storage solution. For that reason, organizations with performance-intensive workloads often opt to keep their storage facilities in house, so that “hot” (frequently accessed) data is as close as possible to the servers and users that need it. Information that is required less frequently or less urgently may be stored in the cloud.
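
To make the physics concrete, here is a rough back-of-the-envelope calculation. It is a sketch with illustrative numbers, not a benchmark: light in optical fiber travels at roughly 200,000 km/s, so distance alone puts a hard floor under round-trip time, before any routing or protocol overhead is added.

```python
# Back-of-the-envelope latency floor for remote cloud storage access.
# Assumes signals traverse optical fiber at roughly 200,000 km/s
# (about two-thirds of light speed in a vacuum); the distance is illustrative.

FIBER_SPEED_KM_PER_S = 200_000  # approximate signal speed in fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time imposed by physics alone."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A cloud region 2,000 km away adds at least ~20 ms per round trip,
# regardless of how fast the storage hardware at either end is.
print(f"{min_round_trip_ms(2000):.0f} ms minimum round trip")  # -> 20 ms
```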

Issues With On-Site Storage

A major concern with traditional on-premises storage solutions is that they tend to encourage data silos. Often the data used by a particular workload is contained within a storage subsystem that’s dedicated to that use, making sharing information with other applications that are serviced by similarly isolated storage extremely difficult. This is especially the case when data is generated or used in remote or branch offices (ROBOs), each of which may maintain its own copy of relevant information. Providing global access to ROBO information, while keeping the various local instances of such datasets in sync with one another, can be a daunting challenge.

The prevalence of silos in data center storage also makes storage management more complex. Each silo must be managed and backed up individually. Plus, there’s normally no way to perform load balancing between silos, resulting in some storage subsystems having too little capacity, while others in the same facility have more than they need.

One issue that especially affects hybrid storage arises because copying extensive amounts of data between on-premises storage and the public cloud is a slow process that requires a lot of bandwidth. That can limit an on-site data center’s ability to temporarily transfer applications and data to the cloud when demand spikes, a practice called “cloud bursting.”
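
A quick, hypothetical calculation illustrates the scale of the problem; the dataset size, link speed, and efficiency figure below are assumptions chosen purely for illustration.

```python
# Rough estimate of how long a bulk transfer to the cloud takes.
# The dataset size, link speed, and efficiency are illustrative assumptions.

def transfer_hours(size_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Hours to move size_tb terabytes over a link_gbps connection,
    assuming the link sustains `efficiency` of its rated throughput."""
    bits = size_tb * 1e12 * 8
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# Moving 10 TB over a 1 Gbps link at 80% efficiency takes about 28 hours,
# far too slow to shift a workload to the cloud during a demand spike.
print(f"{transfer_hours(10, 1):.0f} hours")
```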

How SDS Facilitates Keeping Sensitive Data At Home

The distinguishing feature of the software-defined approach is that it lifts storage system intelligence out of hardware and into software. This allows SDS to manage a heterogeneous mix of storage units and subsystems as a single, unified pool of storage. Using policy-based directives, the software can programmatically place and replicate data among various storage units in real time to accommodate the needs of different workloads, and for backup and disaster recovery purposes. Thus, with a well-designed SDS system, data silos can be eliminated.
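
As a conceptual illustration of what policy-based placement means in practice, consider the minimal sketch below. The tier names, policy rules, and Dataset class are all invented for this example and do not represent any vendor’s actual policy engine, which would expose far richer rules.

```python
# Conceptual sketch of policy-based data placement in an SDS layer.
# All names (tiers, rules, the Dataset class) are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    sensitive: bool        # e.g., subject to HIPAA or SSAE 16
    accesses_per_day: int  # rough measure of how "hot" the data is

def place(ds: Dataset) -> str:
    """Return the storage tier a policy engine might choose."""
    if ds.sensitive:
        return "on-premises"          # compliance keeps it at home
    if ds.accesses_per_day > 1000:
        return "on-premises-flash"    # hot data stays close to its users
    return "public-cloud"             # everything else spills to the cloud

for ds in [Dataset("patient-records", True, 50),
           Dataset("web-sessions", False, 25_000),
           Dataset("2014-archives", False, 2)]:
    print(f"{ds.name} -> {place(ds)}")
```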

In particular, a best-of-breed SDS solution, such as Talon FAST™, can effectively address the ROBO data isolation issue. With Talon FAST™, an organization can consolidate all of its data into a central repository in a public or private cloud. Through Intelligent File Caching, the most frequently used data is cached at the remote site, and only changes are sent back to central storage, substantially reducing both the time and the bandwidth those transmissions require. In addition, the Talon solution’s global file locking mechanism ensures that simultaneous changes to remote copies of shared datasets are disallowed, so that data integrity is maintained and users always see the most up-to-date versions of the data.
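
The idea behind global file locking can be sketched in a few lines. To be clear, this is a generic illustration of the concept, assuming a single authoritative lock table shared by all sites; it is not Talon’s implementation or API.

```python
# Minimal conceptual sketch of global file locking: one authoritative
# lock table prevents two branch offices from editing the same file at once.
# A generic illustration only, NOT the Talon FAST implementation.

import threading

class GlobalLockManager:
    def __init__(self):
        self._locks = {}               # file path -> site holding the lock
        self._mutex = threading.Lock() # protects the lock table itself

    def acquire(self, path: str, site: str) -> bool:
        """Grant the write lock only if no other site holds it."""
        with self._mutex:
            holder = self._locks.get(path)
            if holder is None or holder == site:
                self._locks[path] = site
                return True
            return False  # another site is editing; caller gets read-only

    def release(self, path: str, site: str) -> None:
        with self._mutex:
            if self._locks.get(path) == site:
                del self._locks[path]

mgr = GlobalLockManager()
print(mgr.acquire("/projects/budget.xlsx", "Chicago"))  # True: lock granted
print(mgr.acquire("/projects/budget.xlsx", "Berlin"))   # False: must wait
mgr.release("/projects/budget.xlsx", "Chicago")
print(mgr.acquire("/projects/budget.xlsx", "Berlin"))   # True: now granted
```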

Because SDS allows transparent spillover of data into the cloud, it minimizes the difficulty and cost of provisioning storage to keep pace with the exploding demand for capacity that most companies face today. Only the most sensitive or performance-intensive data need be kept in house, while the bulk of a company’s information can be conveniently and safely stored in the cloud.

The proportion of corporate data being migrated to the cloud continues to grow, but it will probably never reach 100 percent. For the foreseeable future, many companies will have valid reasons for keeping their most critical data at home. But they don’t have to forgo the benefits of the cloud storage revolution. With Software-Defined Storage, companies can have the best of both storage worlds.

To learn more about how a top-flight SDS solution can help you keep your data at home, please watch this video about the Talon FAST™ solution.
