
The Internet Gets The Underwater Treatment As Microsoft Reveals Its Seabed Server System

  • Written By Andy
  • Posted February 4, 2016
  • 3 Minute Read Time

While we’re all busy enjoying the internet, from online shopping to social media (and, if you’re a customer of ours, taking advantage of Magento support), Microsoft has been looking at something quite remarkable, and we’ve got the info for you, here…

Microsoft has, in recent months, been carrying out a series of tests on a prototype data centre designed to operate up to hundreds of feet below the surface of the ocean.

The company hopes this will help speed up everything from browsing the internet to downloading music.

The underwater test, which the team named Project Natick, ran for 105 days and saw experts place a steel capsule, eight feet in diameter, 30 feet underwater in the Pacific Ocean off the coast of San Luis Obispo, California.

Microsoft fitted the capsule with 100 sensors to measure humidity, pressure, motion and other conditions, so that the team could detect leaks and other failures.

The server offers the same computing power as 300 desktop PCs, and Microsoft researchers say this was the first time a data centre had been positioned below the surface of the ocean.

Test Results

The good news for Microsoft is that the capsule held up to the rigours of the testing. Furthermore, the engineers were even able to run commercial data-processing projects from Microsoft’s Azure cloud computing service.

As a result of the positive testing, Microsoft has now begun designing an underwater data centre set to be some four times bigger than the prototype used in the test.

What’s the Point?

Microsoft’s experts have stated that by going underwater they could solve a series of problems.

First of all, cooling is an essential aspect of running data centres: a regular centre runs up substantial costs operating the chiller plants needed to keep its computers from overheating.

By placing the servers underwater, the cold sea provides natural, automatic cooling, making the data centre less costly and more energy efficient to run.

Additionally, harnessing the hydrokinetic energy of tides and waves to power the computing could make the centres even more energy efficient.

The implication is that data centres could work independently of existing energy sources, located closer to coastal cities and powered by renewable ocean energy.

Placing computing power closer to users lowers the delay people experience when using the internet, so this would also mean faster browsing and download speeds.

Half of the world’s population lives within 120 miles of the sea, so the idea is certainly an appealing one.
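To get a feel for why proximity matters, here is a rough back-of-the-envelope sketch (our own illustration, not from Microsoft) of the minimum round-trip signal delay over optical fibre. The fibre speed and the 2,000 km inland distance are assumptions; the common rule of thumb is that light travels through fibre at roughly two-thirds of its vacuum speed, about 200,000 km/s.

```python
# Back-of-the-envelope estimate of round-trip light delay over optical fibre.
# Assumption: signals propagate at ~200,000 km/s (about 2/3 the speed of
# light in a vacuum), a common rule of thumb for fibre links.

FIBRE_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time, in milliseconds, for the given distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_S * 1000

# 120 miles is roughly 193 km, the distance within which half the
# world's population lives.
coastal = round_trip_ms(193)
# A hypothetical far-inland data centre 2,000 km away, for comparison.
inland = round_trip_ms(2000)

print(f"coastal: {coastal:.2f} ms, inland: {inland:.2f} ms")
```

Real-world latency is dominated by routing, queuing and indirect cable paths, so actual delays are several times these floor values; still, the gap illustrates why shoreline placement helps.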

The project also showed it’s possible to deploy data centres faster: building the vessel that housed the experimental data centre took a mere 90 days.

What’s Next?

The team is still studying the data from the experiment, and the results so far are promising.

The experts behind the initiative are now planning the project’s next phase, which could feature a vessel four times the size of the current container with as much as 20 times the computing power, the equivalent of 6,000 PCs!

They are also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source.