Location, Location, Location: Why Your Next Microsoft Cloud File Sharing Data Center Might Be Under the Deep Blue Sea

by Andrew Mullen on April 18, 2016

There are three primary concerns any time a new data center is in the works. First, where will you put it? Locating a data center near a populated area is expensive. In Silicon Valley, for instance, a simple residential lot with a modest home starts at about $1 million, and commercial real estate starts at mind-boggling amounts and only goes up from there. Situating a data center in the boonies means additional latency, among other considerations, such as how many of your senior engineers you can convince to move to East Whereisthis, Nowhereville?

The second consideration is power, and with it cooling: how will you keep the computing stacks cool? Heat is the number-one contributing factor to server failure. Google has experimented with keeping servers at a balmy 80 degrees Fahrenheit, claiming that it's cheaper to replace servers at a higher rate than to cool the data center year-round. Google, Facebook, Microsoft, and others have also been experimenting with putting data centers in cold regions. First, the air is cool, so all you need is ventilated walls and some fans, and boom: instant cooled equipment. Second, the real estate is cheap, because there aren't hordes of folks lining up to purchase a bungalow in Finland, Iceland, or near the Arctic Circle (yes, Facebook actually has a data center practically down the street from the Arctic Circle). But again, convincing your best database administrators to pack up and move to the far north is no small task.

The cooling issue ties directly into the third consideration: the environmental impact of data centers. Not only do these quite-necessary operations take a lot of heat (pun intended) for driving up fossil fuel consumption, they also take it for gobbling up land that some feel rightly belongs to the deer and the antelope at play. Others simply think data centers are ugly and have no place on planet Earth.

More Servers, More Cloud File Sharing, Less Real Estate and Power Consumption

A data center in Silicon Valley is going to cost how much? Never mind. Just chuck the whole thing into the Pacific.

Microsoft has an entirely new take on the whole thing, one that solves all of these problems simultaneously: dunk the data center into the Pacific Ocean. The whole mad tale begins with a team of Microsoft data center employees back in 2013. One of them had previously served on a naval submarine, so apparently the ocean was in his blood. The team wrote a white paper on the potential for sea-submerged data centers, and by 2014 Microsoft had a working prototype.

By 2015, an 8-foot steel capsule, outfitted with a single computing rack and pressurized with nitrogen, was lowered into the ocean off the coast of California. The server-sub was rigged with more than 100 sensors, measuring humidity, pressure, motion, and more, to monitor conditions both inside and outside the unit, as well as its impact on the environment. The sensors determined, first, that the server fans were actually less noisy than the local shrimp population and, second, that the temperature of the water was affected only for a few inches around the unit.

The server ran for 105 days, and according to Microsoft, the experiment was even more successful than they had hoped: cloud file sharing can be done underwater with little or no impact on the ocean. The next such experiment is supposed to be three times the size of the first, which was affectionately named Leona Philpot after a character in the video game Halo.

Underwater Servers are Still a Work in Progress

No more complaints from the neighbors about how your data center is obstructing the view or affecting the local population of red-tailed hawks. They won't even be able to argue that you're taking more than your fair share of the power grid.

The biggest challenge the team will need to overcome is building and deploying servers capable of running for long periods with no regular maintenance. Microsoft hopes that these underwater data centers will eventually be able to operate for 10 years at a time, cooled and perhaps powered by underwater turbines and/or tidal energy. Microsoft believes that if it can mass-produce these sea-based data centers, it can have one set up and operational within 90 days, versus the two-plus years it normally takes to build a data center (not counting the time for locating the real estate, bidding on the property, navigating local zoning laws, addressing environmental concerns, and other land-based considerations).

The underwater data center would address several problems at once. First, it would incur no real estate costs and would not take up valuable land needed for wildlife habitat or human use. About half of the earth's population lives within 125 miles of an ocean, meaning data centers dunked offshore could be close to the people who need them without being in the way. Second, it would ease the energy consumption issues surrounding conventional data centers, especially if these units could eventually be self-powered by tidal or turbine energy. Finally, an underwater data center would have a far lower (perhaps negligible) impact on the environment. It's a win-win-win all around.
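To see why that 125-mile proximity matters, here is a rough back-of-envelope sketch of one-way propagation delay over fiber. The numbers are illustrative assumptions, not Microsoft's figures: signals in optical fiber travel at roughly two-thirds the speed of light, and real-world latency is higher still once routing and switching overhead are added.

```python
# Rough estimate of one-way propagation delay over optical fiber.
# Assumption: signals travel at ~2/3 the speed of light in glass;
# actual latency is higher due to indirect routes and switching hops.

SPEED_OF_LIGHT_KM_S = 299_792                    # km/s in a vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # ~200,000 km/s in fiber

def propagation_delay_ms(distance_km: float) -> float:
    """One-way delay in milliseconds over a straight fiber run."""
    return distance_km / FIBER_SPEED_KM_S * 1000

# 125 miles is about 200 km (a coastal data center), versus a
# ~3,000 km cross-country hop to a remote inland facility.
for km in (200, 3000):
    print(f"{km:>5} km: {propagation_delay_ms(km):.2f} ms one-way")
```

Even before routing overhead, the coastal placement is an order of magnitude closer in propagation terms, which is the intuition behind dunking data centers near population centers.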

Are you looking for better cloud file sharing tools? Visit Talon Storage today to view our demo video and see how this solution works above the deep blue sea.
