Is Your Team on Top of These Seven Disruptive Enterprise Data Storage Trends?

by Jaap van Duijvenbode on December 7, 2016

Enterprise data storage is changing fast. Traditional solutions are quickly becoming outmoded as both the pace of change in storage technology and the amount of data being produced continue to increase rapidly. What's more, according to a report by 451 Research, the IT budgets of most enterprises are not keeping pace with the growth in their storage requirements.

As the field continues to evolve, IT teams will need to be aware of some significant trends that are already having an impact on data storage operations. Let's look at a few of them.

1. Migration To The Cloud Is Accelerating

Many organizations have begun to recognize the benefits they can gain by moving some or all of their storage capacity to the cloud. The ability to shift storage management responsibilities, as well as CapEx spending, to a cloud services vendor can provide welcome relief to strained IT budgets.

Understanding and preparing for the issues that arise as data is shifted from on-premises data centers to the cloud is of paramount importance. For example, which of your workloads can comfortably run on remote servers, and which might be compromised by the latency effects that can slow down access to data stored in the cloud? Such issues will require careful thought and planning.

2. Software-Defined Storage Is Becoming Strategic

Software-Defined Storage (SDS) is becoming a key technology in both the data center and the cloud. A report published by storage market analyst firm Wikibon states that between 2014 and 2015 SDS revenues rose almost 100 percent, from $1.25 billion to $2.5 billion.

Why all the excitement? With SDS, management of a data storage system is accomplished via a software layer that presents the same interface, regardless of the specific hardware employed to store the data. This device-independence permits mix-and-match use of a variety of storage platforms, including low-cost commodity hardware. The cost savings, operational flexibility, and almost limitless scalability of the SDS model are making it more and more attractive to enterprise IT managers.
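That device-independent software layer can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual API; the class and method names are hypothetical, and in-memory dictionaries stand in for real disk and flash arrays:

```python
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Uniform interface that hides the specific hardware underneath."""

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def read(self, key: str) -> bytes: ...


class CommodityDiskBackend(StorageBackend):
    """Stand-in for a low-cost commodity disk array."""

    def __init__(self):
        self._store = {}

    def write(self, key, data):
        self._store[key] = data

    def read(self, key):
        return self._store[key]


class FlashBackend(StorageBackend):
    """Stand-in for a high-performance flash array."""

    def __init__(self):
        self._store = {}

    def write(self, key, data):
        self._store[key] = data

    def read(self, key):
        return self._store[key]


class SoftwareDefinedStorage:
    """Management layer: callers see one interface regardless of device."""

    def __init__(self, backends: dict):
        # Mix-and-match hardware registered under arbitrary pool names.
        self._backends = backends

    def write(self, pool: str, key: str, data: bytes) -> None:
        self._backends[pool].write(key, data)

    def read(self, pool: str, key: str) -> bytes:
        return self._backends[pool].read(key)
```

Because every backend satisfies the same `StorageBackend` contract, adding a new class of hardware means writing one adapter class, with no changes to the applications above the management layer.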

As the technology matures, SDS seems well positioned to force significant changes in the way enterprises implement their data storage infrastructure.

3. Object Storage Is Penetrating Both The Cloud and Data Centers

With object storage, each dataset is stored as an object that includes the data itself, metadata that describes the data, and a unique global identifier. Rather than employing a hierarchical file structure that must be traversed to reach a particular file, the object storage architecture is based on a flat address space that uses the global identifier to quickly address any of a huge number of objects in a single operation.
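The object model above can be sketched briefly. This is an illustrative toy, not a real object store; deriving the global identifier from a content hash is one common convention, assumed here for simplicity:

```python
import hashlib


class ObjectStore:
    """Flat address space: any object is one lookup away via its global ID."""

    def __init__(self):
        self._objects = {}  # global identifier -> (data, metadata)

    def put(self, data: bytes, metadata: dict) -> str:
        # Derive a unique global identifier from the object's content.
        object_id = hashlib.sha256(data).hexdigest()
        self._objects[object_id] = (data, metadata)
        return object_id

    def get(self, object_id: str):
        # No directory hierarchy to traverse: a single flat lookup
        # returns the data together with its descriptive metadata.
        return self._objects[object_id]
```

Note what is absent: there are no directories, paths, or file handles, which is why this architecture scales simply by adding nodes to the address space.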

Object storage is particularly well suited for storing large amounts of unstructured data. This makes it ideal for applications such as backups, archives, and streaming audio or video files. With its ability to store extremely large datasets and to scale endlessly simply by adding nodes to the address space, object storage is becoming more and more central to both cloud and data center storage solutions.

4. Big Data and the Internet of Things Bring Huge Storage Demands

Big data is big and getting bigger, both in size and in reach. Whether it's an oil drilling company gathering and analyzing huge datasets generated by seismic sensors, a medical research institute attempting to extract diagnostic patterns from millions of patient records, or an internet content provider analyzing user histories in order to bring up personalized advertisements as individuals surf the web, the ability to capture, retain, and retrieve large amounts of unstructured data is becoming central to the operations of many enterprises. And now, with the advent of the Internet of Things (IoT), the amount of data that must be captured and retained every day is exploding.

Because the traditional expedient of simply installing more and more storage hardware to meet expanding storage requirements would quickly become cost-prohibitive, meeting the demands of the big data revolution will require careful planning.

5. Use of Commodity Storage Hardware Is Increasingly Crucial

As the demand for storage increases exponentially, IT budgets are increasing (if at all) at a much slower rate. Many organizations are addressing this issue through the use of low-cost commodity storage hardware.

In this approach, low-cost servers with hard disk drive (HDD) arrays are used to store data that does not require quick access. But when IOPS performance is critical, a tiered approach may be used. Tier 1 data, which demands the fastest access times, is staged as needed to a small amount of very high-performance (and very expensive) flash memory. Tier 2 data, for which the performance demands are less stringent, is stored on the low-cost commodity hardware.
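The placement logic behind that tiering can be sketched as follows. This is a simplified illustration under stated assumptions (in-memory dictionaries stand in for the flash and HDD arrays, and the "hot" flag stands in for whatever heuristic identifies IOPS-critical data):

```python
class TieredStorage:
    """Route hot data to a small flash tier, the rest to commodity HDDs."""

    def __init__(self, flash_capacity_bytes: int):
        self.flash = {}  # tier 1: small, fast, expensive
        self.hdd = {}    # tier 2: large, slower, cheap
        self.flash_capacity = flash_capacity_bytes
        self.flash_used = 0

    def write(self, key: str, data: bytes, hot: bool = False) -> None:
        # Stage IOPS-critical data to flash only while capacity remains;
        # everything else lands on the commodity tier.
        if hot and self.flash_used + len(data) <= self.flash_capacity:
            self.flash[key] = data
            self.flash_used += len(data)
        else:
            self.hdd[key] = data

    def read(self, key: str) -> bytes:
        # Check the fast tier first, then fall back to commodity storage.
        if key in self.flash:
            return self.flash[key]
        return self.hdd[key]
```

A production system would also demote data that has gone cold back to tier 2; that eviction policy, and how "hot" is actually measured, are where most of the planning effort mentioned below goes.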

Though conceptually simple, this approach to lowering the overall cost of a company's storage infrastructure can be difficult to implement in practice. Doing so will require a lot of forethought and planning.

6. Flash Memory Prices Are Still High, But Falling Rapidly

As flash memory technology matures, its cost is coming down more and more quickly. Flash-based solid-state drive (SSD) arrays are rapidly becoming less expensive while also increasing in capacity. Most observers believe that SSDs will displace HDDs as the standard storage solution within the next several years.

IT managers will need to carefully plan the changeover from legacy HDD arrays to new SSD arrays. Or, they can simply commit their data to the public, private, or hybrid cloud, and let their cloud services provider handle the whole thing.

7. Keeping Data Secure Is Becoming More Difficult

Threats to business-critical data are relentless and increasing. For most IT teams, there is no greater responsibility than ensuring that the organization's sensitive information is protected.

This is true no matter whether the information is stored in an on-site datacenter, or on the servers of a cloud services provider. In either case, an IT manager must have a well thought out security plan that limits access only to authorized users. As Henry Newman has so aptly put it, "The quickest way to lose your job is to have a security breach with your architecture or your technical decisions. Security needs to be the number one thing you think about all of the time."
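A security plan that limits access only to authorized users typically starts with a deny-by-default permission check. The sketch below is a hypothetical illustration of that idea, not a substitute for the identity and access management tooling a real deployment would use:

```python
class AccessControl:
    """Deny by default; permissions are granted per user, per dataset."""

    def __init__(self):
        self._grants = {}  # (user, dataset) -> set of permissions

    def grant(self, user: str, dataset: str, *permissions: str) -> None:
        self._grants.setdefault((user, dataset), set()).update(permissions)

    def is_allowed(self, user: str, dataset: str, permission: str) -> bool:
        # Anything not explicitly granted is refused.
        return permission in self._grants.get((user, dataset), set())
```

The essential design choice is the default: an unknown user or an ungranted permission falls through to a refusal, so a gap in configuration fails closed rather than open.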

This has been just a brief summary of some of the trends that your company's IT team should be aware of. If you'd like to know more, please watch the Talon FAST™ video.
