There are also the hobbyists who cool their motherboards with liquid nitrogen so they can overclock them aggressively. I think a major issue with all liquid-cooled systems is that the coolant requires more maintenance than the underlying computer. It also represents a single point of failure that can take everything else down. Air cooling may be crude, but it is relatively robust.
Before Seymour Cray used Fluorinert at Cray Research, he used Freon in two Control Data systems: the CDC 6600/6400 and the CDC 7600.
And BTW: Cray could not get a similar cooling system working for what was to be the CDC 8600. That failure is why he left Control Data and formed Cray Research.
What would be really interesting is if someone (Intel) promoted a standard location for a cold plate on a standard 1U chassis. Vendors could arrange heat pipes inside the chassis however they liked, and the rack vendor would be responsible for circulating coolant through plates that mated with the chassis plates. That would be modular, non-proprietary, and would not require a coolant hookup for each server.
Technology triumphs! Let's try this:
I have been continually surprised that a liquid-based cooling system has not been put in place for servers: it is much more efficient than cooling with air.
(I am also a little disappointed there is no opportunity to edit or delete a post when errors are made.)
Replay available now: A handful of emerging network technologies are competing to be the preferred wide-area connection for the Internet of Things. All claim lower costs and power use than cellular, but none have wide deployment yet. Listen in as proponents of the leading contenders make their case to be the metro or national IoT network of the future. Rick Merritt, EE Times Silicon Valley Bureau Chief, moderates this discussion. Join in and ask his guests questions.