Are Microsoft’s underwater data centers the wave of the future or just a cool science project?

Microsoft recently unveiled details about Project Natick, an experiment in which they tested the world’s first underwater data center off the coast of Central California, near San Luis Obispo. Their vessel, the Leona Philpot (named after a character in the Halo series), spent August through November of 2015 on the seafloor to test the feasibility of housing data centers underwater. The Leona Philpot was outfitted with over 100 sensors to monitor how the vessel handled the harsh conditions of the sea floor.

The Project Natick team’s vessel held together for the three months it spent on the seafloor, but the team has a lot of work ahead of them to reach their goal of a five-year deployment. Since these data centers will be underwater, they have to be extracted and brought back to land for repairs. Five years is a long time for the Leona Philpot’s 300 computers to go without maintenance, and the challenge will only grow when Microsoft begins testing larger models that house more computers. Microsoft’s goal for phase 2 of the project is a container four times larger that houses 20 times the computing power.


Microsoft made a pretty neat video for Project Natick

Whether or not Microsoft can pull this off remains to be seen, but their goal of making energy-efficient and environmentally friendly data centers is admirable. The idea for Project Natick began in 2013 when Sean James, a Microsoft employee who formerly served on a US Navy submarine, submitted a ThinkWeek paper proposing an underwater data center that could be powered by renewable energy from the ocean. These data centers would also take significantly less time to manufacture than traditional data centers, and the vessels can be recycled. Norm Whitaker liked the idea so much that he decided to build a team to test the feasibility of underwater servers and kicked off Project Natick in late 2014.

The project has been met with a fair amount of challenges and skepticism. One of the major concerns about its viability is the wear and tear that underwater tidal power generators take. Several corporations and countries have invested in tidal power, but none has been commercially viable, and their machines have not been able to withstand the conditions as long as hoped. Many think that operating conditions under the ocean are simply too harsh.

Those who are more cynical dismiss Project Natick as simply a publicity stunt. After all, the project kept people’s minds off Windows 10 for a little while, and eco-friendly machines are all the rage. The video and coverage of Project Natick feature several convenient placements of the Microsoft logo, but then again, what company today wouldn’t take that kind of brand exposure? Microsoft’s commitment to these underwater data centers is both selfless and self-serving. Making an energy-efficient, environmentally friendly machine is a noble goal, but it would also save Microsoft a ton of money if they can pull it off. Even if Microsoft’s goals with the project are more self-serving, there’s plenty of incentive for them to take their research as far as they can.

At least Aquaman will know this thing was made by Microsoft

Microsoft isn’t the only company experimenting with alternative cooling methods for data centers. Facebook built a data center in Luleå, Sweden that uses the freezing air from outside for cooling. It’s also powered by renewable hydroelectricity. These air-cooled data centers are not without their own problems, however. Since they can only be built in extremely cold climates, the servers are forced into a relatively small area of the world. Keeping them so far from their users can lead to latency issues, since the data has to travel further. Microsoft took the opposite approach, asking: “50% of us live near the coast. Why doesn’t our data?”
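The latency point can be made concrete with a rough back-of-envelope estimate: signals in optical fiber travel at roughly two-thirds the speed of light, about 200,000 km/s, so every extra 1,000 km of distance adds on the order of 10 ms of round-trip time before any routing or processing overhead. A minimal sketch (the distances below are illustrative assumptions, not measured network routes):

```python
# Rough lower bound on round-trip latency from fiber distance alone.
# Light in fiber travels at ~2/3 c, i.e. roughly 200,000 km/s = 200 km/ms.
# Real routes are longer than straight lines and add switching delays,
# so actual latencies are higher than these figures.
FIBER_SPEED_KM_PER_MS = 200.0

def min_round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a fiber path."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Illustrative distances (assumptions for comparison only):
print(min_round_trip_ms(100))    # coastal data center ~100 km away: 1.0 ms
print(min_round_trip_ms(8000))   # a far-northern site ~8,000 km away: 80.0 ms
```

Even this idealized comparison shows why placing data centers near coastal population centers, rather than only in remote cold regions, could matter for latency-sensitive services.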

With the advent of technologies within the growing IoT industry, the demand for cloud storage will be ever-increasing for the foreseeable future. By making data centers less expensive and more environmentally friendly, Microsoft has a chance to make helpful technology accessible to more people than ever. Both of these alternative cooling methods have their challenges, but the commitment of companies like Microsoft and Facebook to explore them could lead to the next big breakthrough.






  • Juan E Jiménez 2016-02-19

How quaint—a brand new way to DIRECTLY contribute to global warming by DIRECTLY warming the ocean. Whoever came up with that should be given the prize for the most stupid IT invention of the decade. I guess that’s a lot easier than recycling the heat back into some form of energy…

    • richard_h 2016-02-19

      I’m not sure the directness makes much difference - but if it manages to do it without powered aircon, then the total heat output will be a lot less than a data centre that does.

    • CFM 2016-02-19

      If the power is coming from tidal generators, that energy is already in the ocean, no heat/energy is being added to the system from outside.

      All of the energy the moon is constantly putting into the ocean in the form of tidal flow converts to heat anyway, as it always has.

With this system, the mechanical tidal energy is only being converted from mechanical (kinetic) form to electrical, then to waste heat, then back to the ocean as it would have been anyway, just by a different path. There is no extra heat-energy being created.

  • redrooster01 2016-02-20

    Why not build an under sea building with access from the top so technicians can service the computers and equipment from inside? We have the technology to do it.

  • tranzz4md 2016-09-26

    Yeah, tidal power is cool I guess, and MS has got the bux to burn, but if they, or somebody else got serious about it, there’s sure some advantages there.  They could easily go surface windmills, and access from the surface need not be difficult, because great depths aren’t necessary.