Rethinking the Approach to Enterprise Data Storage

by Andrew Mullen on October 19, 2016

As the pace of technological advancement has accelerated, seemingly day by day, IT has often been the driver of change that brought the entire corporation along in its wake. But now IT managers and CIOs are themselves struggling to adjust to rapidly changing conditions, particularly in the area of data storage.

Enterprise data storage is growing at an exponential rate. New application areas, such as social media and the Internet of Things, are driving an explosion in the amount of data corporations generate, use, and store. But in many cases, corporate data centers have been slow to adapt. They are still doing business the way they have for decades, even though it's becoming clearer every day that those traditional practices are no longer as effective as they once were.

Often corporate data centers are filled with relatively inexpensive commodity storage hardware, such as hard disk drives (HDDs). IT managers are finding that as they attempt to accommodate modern application workloads by expanding the storage farms in their data centers, those facilities are becoming more complex, more difficult to manage, and more costly.

The problem is exacerbated by the fact that the practice in many corporate IT shops has been to purchase three to five years' worth of storage up front, holding a significant portion of that capacity in reserve until needed. But with the greatly expanded performance demands of today's applications, and in light of new storage technologies that are now available, many of those pre-purchased storage units have become obsolete before ever being used. Yet, because of the amount of capital tied up in those older devices, IT managers often find themselves locked into outdated hardware, unable to replace it with newer, more capable technology.

The effect has been to back IT managers into something of a corner. As Ellen Rubin, CEO of ClearSky Data, puts it, "[Customers] need to be able to decommission hardware when the time is exactly right for the business." She goes on to say, "Enterprise IT teams believe they have no choice but to invest and reinvest in data center infrastructure every few years as their data stores grow and technology changes. They are stuck on a commodity storage treadmill."

Worse, when IT shops reach a point in their procurement cycle where they can do major upgrades to their storage hardware, they are often faced with an extremely complex data migration effort that may cause application downtime, and sometimes even loss of data.

With such issues becoming more urgent as the pace of change continues to accelerate, it's time for IT professionals to change the way they do business. Here are some suggestions about new directions your IT department may wish to consider.

1. Move to a scale-out architecture

Most legacy storage can be characterized as "scale-up" rather than "scale-out." In a scale-up system, adding storage units does not increase the compute power or bandwidth they all share. The result is that system performance degrades as capacity is added.

But in a scale-out arrangement, storage subsystems are implemented as logical nodes, each of which contains not only its own set of storage devices, but also its own compute engine and its own I/O bandwidth. Capacity is increased by adding more nodes. Since each node brings with it more compute power and bandwidth, a scale-out system actually becomes faster as capacity grows. This relieves the IT department of the necessity of over-provisioning in anticipation of future demand growth.
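To make the contrast concrete, here is a minimal Python sketch of the two models. The bandwidth and capacity figures are purely hypothetical and are not drawn from any particular product; the point is simply that per-terabyte throughput collapses in a scale-up design but holds steady as a scale-out cluster grows.

```python
# Hypothetical model: compare how throughput behaves as capacity grows in a
# scale-up array (one shared controller) versus a scale-out cluster
# (each node adds its own compute and I/O bandwidth).

CONTROLLER_MBPS = 2_000       # assumed fixed bandwidth of the scale-up controller
NODE_MBPS = 500               # assumed bandwidth each scale-out node contributes
UNIT_CAPACITY_TB = 100        # assumed usable capacity per shelf or node

def scale_up_mbps_per_tb(shelves: int) -> float:
    """All shelves share one controller, so throughput per TB shrinks."""
    return CONTROLLER_MBPS / (shelves * UNIT_CAPACITY_TB)

def scale_out_mbps_per_tb(nodes: int) -> float:
    """Each node brings its own bandwidth, so throughput per TB holds steady."""
    return (nodes * NODE_MBPS) / (nodes * UNIT_CAPACITY_TB)

for units in (1, 4, 16):
    print(f"{units:>2} units: scale-up {scale_up_mbps_per_tb(units):5.2f} MB/s per TB, "
          f"scale-out {scale_out_mbps_per_tb(units):5.2f} MB/s per TB")
```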

2. Gradually decommission older devices, and upgrade to new technology

One great advantage of scale-out systems is that they allow use of different technologies within the storage array. This means that an IT department's legacy storage can be mixed with new acquisitions. Over time the older storage devices that are approaching their end of life can be replaced with newer technology.
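As a rough illustration of what that rolling refresh looks like, here is a short Python sketch. The node names, capacities, and the even-rebalance step are all illustrative assumptions, not a vendor procedure: pick the oldest node, spread its data over the remaining nodes, and retire it.

```python
# Illustrative sketch of a rolling refresh in a mixed-technology cluster:
# drain the oldest node, rebalance its data onto the survivors, retire it.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    year_installed: int
    data_tb: float

def decommission_oldest(cluster: list[Node]) -> list[Node]:
    oldest = min(cluster, key=lambda n: n.year_installed)
    survivors = [n for n in cluster if n is not oldest]
    share = oldest.data_tb / len(survivors)   # rebalance the drained data evenly
    for node in survivors:
        node.data_tb += share
    return survivors

cluster = [Node("hdd-2013", 2013, 80.0),
           Node("hdd-2014", 2014, 60.0),
           Node("flash-2016", 2016, 20.0)]
cluster = decommission_oldest(cluster)
print([(n.name, n.data_tb) for n in cluster])   # hdd-2013 is gone, its data absorbed
```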

3. Break dependence on legacy devices by taking advantage of cloud storage

As commodity devices (those HDDs that have served so well over the years) are taken out of service, some of that capacity can be replaced with cloud-based storage rather than by the purchase of new on-site hardware.

Use of a Storage as a Service (STaaS) provider can relieve budget pressures by allowing almost infinitely scalable capacity to be acquired through small monthly operating expenses (OpEx) rather than large capital expenditures (CapEx). In addition, tasks such as maintenance support, backups, data security, and disaster recovery can be taken on by the STaaS vendor, allowing on-site staff to concentrate on more strategic issues.
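A quick back-of-the-envelope comparison makes the cash-flow difference clear. Every figure below is hypothetical; the takeaway is not that one model is always cheaper, but that the pay-as-you-grow model avoids committing capital to capacity that may sit idle, or become obsolete, before it is ever used.

```python
# Hypothetical cash-flow comparison: pre-purchasing five years of capacity
# up front versus paying a STaaS provider only for the capacity in use.

UPFRONT_CAPEX = 500_000        # assumed cost of buying 5 years of capacity on day one
PRICE_PER_TB_MONTH = 25        # assumed STaaS price per TB per month
START_TB = 100                 # assumed footprint today
GROWTH_TB_PER_MONTH = 10       # assumed monthly growth

opex_total = sum((START_TB + GROWTH_TB_PER_MONTH * month) * PRICE_PER_TB_MONTH
                 for month in range(60))

print(f"CapEx model: ${UPFRONT_CAPEX:,} committed on day one")
print(f"OpEx model:  ${opex_total:,} spread across 60 monthly bills for used capacity")
```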

4. Implement an in-house/cloud hybrid strategy

The biggest concerns many IT managers have about moving their data to the cloud are security and latency. Both issues are addressed by a hybrid solution in which storage for mission-critical or performance-driven data is kept in-house, while data that is less sensitive, or that is accessed less frequently, is committed to the cloud. A 2014 survey by 451 Research found that 39 percent of enterprises were already implementing a hybrid strategy.
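In practice, a hybrid strategy comes down to a placement rule. Here is a minimal sketch of such a rule in Python; the field names and thresholds are assumptions made up for this example, not a reference to any specific product or to the survey above.

```python
# Illustrative placement rule for a hybrid strategy: sensitive or frequently
# accessed data stays on-premises, everything else goes to cloud storage.

def place(dataset: dict) -> str:
    """Return 'on-premises' or 'cloud' for a dataset description."""
    if dataset["sensitive"]:                  # regulated or mission-critical data stays local
        return "on-premises"
    if dataset["reads_per_day"] > 1_000:      # latency-sensitive hot data stays local
        return "on-premises"
    return "cloud"                            # cold, non-sensitive data moves off-site

for d in [{"name": "customer-db",   "sensitive": True,  "reads_per_day": 50_000},
          {"name": "video-archive", "sensitive": False, "reads_per_day": 12}]:
    print(f'{d["name"]:>14} -> {place(d)}')
```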

5. Take advantage of the mix-and-match capabilities of new technologies

Storage solutions built around new technologies such as flash memory, Software Defined Storage (SDS), virtualization, and tiering can increase the performance, flexibility, agility, and cost effectiveness of a company's data storage infrastructure many times over.

For example, a tiered system allows storage to be segregated and accessed based on performance and capacity requirements. Thus, Tier 1 (or Tier 0) data used by applications that demand the greatest performance might be stored on speedy but costly flash-based Solid State Drives (SSDs). But the data of applications that work fine with a lower level of responsiveness might be housed in Tier 2 storage such as inexpensive, large capacity HDDs. And through the use of technologies such as SDS and virtualization, the management required to slot data into the appropriate tiers, and their associated storage devices, can be done automatically and on the fly by software.
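The kind of decision an SDS or tiering layer automates can be sketched very simply. The threshold and the workload numbers below are illustrative assumptions only; real systems weigh many more signals (I/O size, latency targets, capacity headroom) before moving data between tiers.

```python
# Illustrative tiering rule of the sort an SDS layer applies automatically:
# data accessed often enough earns a spot on flash, the rest lives on HDD.

HOT_THRESHOLD = 100   # assumed accesses per day that justify Tier 1 flash

def assign_tier(accesses_per_day: int) -> str:
    return "tier1-ssd" if accesses_per_day >= HOT_THRESHOLD else "tier2-hdd"

workload = {"oltp-index": 25_000, "monthly-report": 3, "web-assets": 450}
for name, accesses in workload.items():
    print(f"{name:>15}: {assign_tier(accesses)}")
```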

This has been just a brief overview of the issues facing enterprise IT managers today, and the opportunities available to them for once again getting ahead of the technological change curve in their organizations. If you'd like to further explore how your IT department can lower costs and improve service by adopting new data storage solutions, please watch the Talon FAST™ video.
