Underwater computers may sound like a bizarre idea but, when you look at the amount of power computers dissipate as heat, the idea becomes far less strange.
Currently, multiple forms of cooling exist for computers, including methods that use air, water, mineral oil, and even liquid helium for some of the most heavily overclocked processors. If so many forms of cooling already exist and are in wide use, why bother with underwater cooling? Researchers at the National Institute of Informatics have been doing just that, and there is a very good reason for it.
More Tech Means More Energy Consumption
The Internet is a truly amazing invention and arguably the second most important invention of the 20th century (first place goes to the transistor). It has given nearly 3.2 billion people access to the most valuable possession in existence: instant access to knowledge and information.
With the rise of cloud computing and the IoT, Internet speeds need to be increased, more cable is needed to connect more people, and data centers are being built like there's no tomorrow. With each data center that goes up and with the increasingly large number of electronic devices being manufactured, power generation has to cope with the demand.
The increase in demand puts ever more pressure on energy companies, dwindling resources, and the environment as a whole. On top of that, as time progresses, devices become more powerful which often coincides with a need for more energy to run. While this is not necessarily the case with mobile devices (and tech companies are increasingly researching low-power devices), servers are constantly consuming more and more power.
The Challenge of Heat Dissipation
One of the biggest issues that data centers, server farms, and even individual users have to tackle is heat dissipation. In the past, computers such as the ZX Spectrum produced so little heat that a small piece of metal on the power regulator was enough to keep them cool. In most cases, this heat sink could be removed without damaging the machine.
But as computers have become more compact and powerful, heat dissipation has shot up so much that you could actually cook meat on a modern CPU running without its heatsink.
Therefore, computers require cooling methods to prevent damage and maximize processing power.
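To get a feel for why this matters, here is a back-of-the-envelope sketch comparing heat flux on a CPU die to a kitchen stove burner. All figures are assumptions for illustration (a ~100 W desktop CPU with a ~1.5 cm² die, a ~1500 W burner roughly 175 cm² in area), not measurements from any specific part:

```python
# Rough illustration (assumed figures): why modern CPUs need active cooling.
# Power density (W/cm^2) shows which surface has to shed heat faster.

def power_density(watts, area_cm2):
    """Heat flux through a surface, in W/cm^2."""
    return watts / area_cm2

cpu = power_density(100, 1.5)      # assumed: 100 W TDP, 1.5 cm^2 die
burner = power_density(1500, 175)  # assumed: 1500 W burner, 175 cm^2 surface

print(f"CPU die:      {cpu:.1f} W/cm^2")
print(f"Stove burner: {burner:.1f} W/cm^2")
```

Even with generous assumptions, the die's heat flux comes out several times higher than the burner's, which is why an unassisted chip overheats in seconds.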
Cooling methods that currently exist include:
- Air flow
- Water
- Mineral oil
- Liquids such as helium and nitrogen at very low temperatures
Water-cooled systems require energy to work. Image courtesy of Don Richards
While these methods work, they all have one issue in common: they all require external power. The use of external power results in higher operating costs and also has a direct impact on the environment if the power generated is from non-renewable sources.
So how can we cool systems without needing to use more power?
Use The Environment!
One solution that is currently in use is careful location choice for the data centers, namely to places where it is already cold.
For example, Iceland is becoming a popular location for data centers and server farms as the air from the outside can be directly drawn in for cooling. The hot air that is expelled can then be transferred to local housing, which provides heating. This method not only saves money on cooling bills but it also prevents further energy use with heating in housing, which has a very positive impact on the environment.
There is one issue with such a move: not many people live in or near cold regions like Iceland. Distance from clients matters in the interconnected world of the Internet because, as the distance between client and server increases, so does the latency (i.e., the time for the client to send a request and receive the response). With more devices becoming reliant on the cloud, this added latency is largely unacceptable, which is why many data centers are still found near large populations, even in hot climates.
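The latency penalty of distance can be sketched with a simple physical lower bound. Light in optical fiber travels at roughly two-thirds of c, about 200,000 km/s; the distances below are assumed round figures, and real-world latency is always higher once routing and processing are added:

```python
# Back-of-the-envelope sketch (assumed values): the minimum round-trip time
# imposed purely by distance over fiber, before any routing or processing.

FIBER_SPEED_KM_S = 200_000  # ~2/3 the speed of light in vacuum

def min_rtt_ms(distance_km):
    """Best-case round-trip time in milliseconds over optical fiber."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

for label, km in [("Same city", 50),
                  ("London to Iceland", 1900),
                  ("New York to Iceland", 4200)]:
    print(f"{label:>20}: >= {min_rtt_ms(km):.1f} ms")
```

Even the best case adds tens of milliseconds for intercontinental hops, which is why physical proximity to users still drives data center placement.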
However, there is one biome within 120 miles of nearly half of all human beings: the ocean. Microsoft has recently begun experiments of submerging small data centers underwater to see if the ocean can keep the system cool without the need for external energy or cooling systems.
The Microsoft underwater test data center
So far, the program shows positive results, but even these devices require complex designs and pressure vessels to prevent leaks, as the internal electronics are not waterproof.
Waterproofing electronics is a popular topic these days, especially for mobile devices. However, the term "waterproof" for electronics can mean many different things. Plenty of electronics are actually more water-resistant than waterproof and aren't designed to function while submerged. Such ratings are insufficient for the level of waterproofing required for truly submersible computers.
Research professors Michihiro Koibuchi and Kazuki Fujiwara at the National Institute of Informatics have been experimenting with methods for cooling electronics by directly submerging them without the need for leak-proof vessels or special IP-rated equipment.
Their plan is to coat electronics in a waterproof material that will enable direct submerging in water while preventing shorts between electrical conductors and also increasing heat dissipation efficiency.
The researchers went through several rounds of trial and error. Initial experiments involved coating pre-fabricated computers with an epoxy resin to insulate all exposed parts. However, this did not succeed, as protecting every single connection was near impossible (remember, all it takes is two missed pins on a 100-pin TQFP package for the circuit as a whole to fail).
The second attempt involved sealing the computer inside an aluminum box with joints filled with a waterproof sealant, but this also failed as water would eventually leak in.
The Sony Xperia ZR, a "waterproof" and "dust resistant" (IP58/IP55) phone. Image courtesy of Sony Mobile
The researchers eventually came across a substance called parylene, which has one major advantage over the epoxy used in the first attempt: it can be applied in gaseous form, which produces a far more consistent coating. The vapor adheres to every exposed surface and quickly sets into a waterproof skin around 0.1mm thick, which is enough to prevent shorting and corrosion.
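A quick estimate suggests why such a thin film need not trap much heat. Assuming steady-state conduction through a slab (ΔT = q·t/k), a thermal conductivity of roughly 0.08 W/m·K for parylene-type films, and a hypothetical 10 W board spreading its heat over ~100 cm² of coated surface (all assumed figures, not from the researchers' work):

```python
# Hedged estimate (assumed figures): temperature drop added by a thin coating.
# Steady-state conduction through a uniform slab: dT = q * t / k,
# where q = heat flux (W/m^2), t = thickness (m), k = conductivity (W/m.K).

def coating_delta_t(power_w, area_m2, thickness_m, k_w_mk):
    """Temperature drop (K) across a uniform coating layer."""
    flux = power_w / area_m2           # heat flux through the coating
    return flux * thickness_m / k_w_mk

# Assumed: 10 W board, ~100 cm^2 coated area, 0.1 mm film, k ~ 0.08 W/m.K
dt = coating_delta_t(power_w=10, area_m2=0.01,
                     thickness_m=0.1e-3, k_w_mk=0.08)
print(f"Added temperature drop: {dt:.2f} K")
```

Under these assumptions the coating adds only about a degree of temperature rise, so heat still passes readily from the board into the surrounding water.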
So what did these two researchers do once they had found a viable coating? They took a computer motherboard, waterproofed it, and then submerged it in a fish tank. For three months, the computer ran continuously with no issues at a comfortable temperature!
With that experiment successful, they moved to the ocean to see if a computer could survive ocean conditions. The choice of motherboard was an ASRock Mini-ITX. Many of the computers did not last long in these experiments, but one of the computers survived for a month.
When the failed computers were examined for clues, the signs pointed to wildlife: crustaceans and seaweed had been living inside the machines, and one motherboard even had a crab squatting in its case. The researchers believe that some of the critters may have eaten through the protective waterproof layer, causing the failures.
The researchers are continuing their underwater experiments in their quest to produce computers that can withstand ocean conditions. They are also looking at tidal and wave energy, which could make these computers independent of any external power supply.
Underwater computing, as strange as it may seem, could really be the answer to cooling and environmental issues in the technological world. Waterproofed computers could simply sit in large containers of water, kept cool by adding fresh cool water from the mains. If this technology works, it could be a game changer for tech companies and individuals alike.