Cooling AI Datacenters consumes water, right? Maybe not.
Over 2 decades ago, a tent datacenter was deployed in Seattle to prove outside air could be used to cool a datacenter.

When I was running a datacenter in Seattle, I met Christian Belady, who had just joined Microsoft from the computer manufacturer Hewlett-Packard. He told me something that blew my mind: servers were a lot tougher than the manufacturers led us to believe. You didn’t need to keep them in near-sterile conditions at chilly temperatures. Around the same time, Daniel Costello joined Microsoft from Intel. He also proposed that servers were a bit more thick-skinned than most people thought, and together we decided to put that theory to the test, literally.

We moved some of my servers into a tent outside to prove they could handle warmer, dustier environments without breaking a sweat... or breaking anything at all, really. It was a gamble, but it worked. That experiment set off a revolution in our industry regarding how we cool servers. It showed the world we didn’t need massive, energy-intensive chilled-water plants or sealed-off rooms to keep everything running smoothly. Instead, we could leverage outside air, an approach called air-side economization. By pulling in fresh air under the right conditions and using a little bit of water for the hottest hours of the year, we improved energy efficiency and cut water usage compared to conventional cooling towers and chillers.
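
If you’re curious what “the right conditions” means in practice, here is a simplified Python sketch of the economizer decision. The temperature and humidity thresholds are purely illustrative assumptions, not any facility’s actual setpoints.

def cooling_mode(outside_temp_c: float, outside_rh_pct: float) -> str:
    # Simplified air-side economizer decision. Thresholds are illustrative only.
    if outside_temp_c <= 27 and 20 <= outside_rh_pct <= 80:
        return "free-air cooling"        # fans only: no chillers, no water
    if outside_temp_c <= 35:
        return "evaporative assist"      # a little water knocks the intake air temperature down
    return "mechanical cooling"          # rare peak hours: fall back to chillers

if __name__ == "__main__":
    for temp, rh in [(18, 55), (31, 30), (39, 20)]:
        print(f"{temp} C / {rh}% RH -> {cooling_mode(temp, rh)}")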

Fast forward to today, and the question of cooling and water use in datacenters is under the spotlight again, especially with concerns about drought and wildfires. Are datacenters, particularly AI datacenters, really draining local water supplies? Let’s take a candid look at how datacenters evolved from tents in the Seattle rain to zero-water evaporative cooling, and why that’s a much bigger deal than most people realize.

Why People Are Concerned

Droughts have worsened in recent years, making every drop of water feel critical, especially during wildfires. With entire communities on edge, it’s no wonder people look at datacenters with suspicion. After all, these facilities host thousands of servers that must be kept cool, traditionally consuming water in the process. The burning question is whether they’re siphoning off water that should go to households or firefighting efforts, and whether the rapid expansion of AI will make the situation worse.

How Datacenters Typically Use Water

Every watt of electricity going into a server comes out as heat, much like your laptop after hours of Netflix or gaming. Now imagine how hot your house would get if you had thousands of computers running at once; that’s the challenge datacenters are designed to handle. When datacenter design moved to evaporative cooling, water was sprayed over coils or special air filters. As the water evaporated, it lowered temperatures (similar to how sweating cools your body), but it also needed constant replenishment. The more servers you have, the more water gets lost to evaporation.

Video: How evaporative cooling works. Fast forward to 1:30
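
To put rough numbers on it, here is a back-of-envelope Python sketch of how much water a fully evaporative design can consume. The latent-heat figure, blowdown fraction, and 10 MW load are illustrative assumptions, not measurements from any specific site.

# Illustrative assumptions: latent heat of vaporization at typical ambient
# conditions, plus extra "blowdown" water flushed to control mineral buildup.
LATENT_HEAT_J_PER_KG = 2.45e6
BLOWDOWN_FRACTION = 0.25

def evaporative_water_m3_per_day(it_load_mw: float) -> float:
    """Estimate daily water use if all server heat is rejected by evaporation
    (1 kg of water is roughly 1 liter)."""
    heat_w = it_load_mw * 1e6
    kg_per_day = heat_w / LATENT_HEAT_J_PER_KG * 86_400
    return kg_per_day * (1 + BLOWDOWN_FRACTION) / 1000   # liters -> cubic meters

if __name__ == "__main__":
    # A hypothetical 10 MW IT load:
    print(f"~{evaporative_water_m3_per_day(10):.0f} cubic meters per day")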


Next-Generation Cooling: Zero-Water Evaporation

One major shift in modern datacenter design is zero-water evaporative cooling, championed by AI companies like Microsoft. The chips in AI servers generate more heat than air cooling alone can handle, so modern AI servers pump cool water directly through the server, similar to how a dip in a pool cools you faster than standing in front of an air conditioner. Microsoft has introduced chip-level liquid cooling and sealed-loop systems that circulate the same water rather than letting it evaporate. Once filled, that water keeps cycling between servers and outdoor heat exchangers, dissipating heat without consuming water. This approach can save millions of liters of water per year, per datacenter.
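
As a rough illustration of why a sealed loop barely touches the water supply, here is a minimal Python sketch: the coolant flow needed to carry a rack’s heat follows from Q = m·cp·ΔT, and that same water just keeps circulating. The 100 kW rack and 10 C temperature rise are assumed values for illustration.

WATER_SPECIFIC_HEAT_J_PER_KG_K = 4186.0   # specific heat of water

def required_flow_kg_per_s(heat_w: float, delta_t_k: float) -> float:
    """Coolant mass flow needed to absorb heat_w with a delta_t_k temperature
    rise, from Q = m_dot * c_p * delta_T."""
    return heat_w / (WATER_SPECIFIC_HEAT_J_PER_KG_K * delta_t_k)

if __name__ == "__main__":
    # A hypothetical 100 kW rack of AI servers with an assumed 10 C coolant rise:
    flow = required_flow_kg_per_s(100_000, 10)
    print(f"~{flow:.1f} kg/s (about {flow * 60:.0f} L/min), recirculated")
    # That water hands its heat to an outdoor heat exchanger and returns to the
    # servers, so the loop is filled once instead of being consumed continuously.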

Trade-Offs: Water vs. Electricity

One frequent debate revolves around balancing water usage and energy consumption. If a datacenter opts not to use water-intensive cooling, does that inevitably translate into higher electricity costs for air conditioning? Although this was once a legitimate concern, modern chip-level liquid cooling enables datacenters to operate efficiently at higher temperatures, which in turn permits the use of outside air to cool the water loop, offsetting a substantial portion of the additional power demand. A slight increase in electricity usage may remain, but it is generally small compared to the significant reduction in water consumption. In some grids, however, more electricity consumption means more water use back at the power plant, so the comparison isn’t automatically free. Engineers continually refine these methods, for example by raising the water-loop temperature so outside air can do more of the work, and the net result remains firmly in favor of water-saving approaches.
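
Here is a simplified Python sketch of that trade-off, combining on-site water use (WUE) with the water embedded in the electricity the facility draws (PUE times an assumed grid water intensity). Every number below is an illustrative assumption, not a figure for any particular site or grid.

def total_water_l_per_it_kwh(pue: float, wue_l_per_kwh: float,
                             grid_water_l_per_kwh: float) -> float:
    """Direct cooling water (WUE) plus the water used to generate every kWh
    the facility draws from the grid (IT energy scaled by PUE)."""
    return wue_l_per_kwh + pue * grid_water_l_per_kwh

if __name__ == "__main__":
    GRID_WATER = 1.9   # assumed liters evaporated per kWh generated; varies widely by grid

    evaporative = total_water_l_per_it_kwh(pue=1.12, wue_l_per_kwh=1.8,
                                           grid_water_l_per_kwh=GRID_WATER)
    zero_water = total_water_l_per_it_kwh(pue=1.20, wue_l_per_kwh=0.0,
                                          grid_water_l_per_kwh=GRID_WATER)

    print(f"Evaporative design: {evaporative:.2f} L per IT kWh")
    print(f"Zero-water design:  {zero_water:.2f} L per IT kWh")

Under these assumptions the zero-water design still comes out well ahead, even after accounting for its slightly higher electricity use.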

The Bottom Line

Whenever I talk about datacenter cooling, I come back to that tent in Seattle. It wasn’t just a stunt; it was proof that conventional wisdom about extreme temperature controls was outdated. Today’s AI boom is sparking even more innovation, showing the world that running huge server farms doesn’t have to come at the expense of local water supplies. Next time someone rails against “water-guzzling datacenters,” ask if they know how far the technology has come. Our industry never stops innovating.

Anjali Mann

From Circuits to Clouds: Driving Scalable AI Infrastructure @ Microsoft | Technical Program Leader

6mo

I heard about the tent lore when I joined the CTO. Thanks for sharing more insights into this wonderful experiment that has shaped our current forms of datacenter cooling.

George Rockett

DCD Founder | Media Entrepreneur

7mo

I was just listening to a UK political podcast run by two very smart people. They covered the DeepSeek topic and kept reiterating the huge energy AND water consumption of AI. I'll send this link to help with the re-education process. Great write-up by the way.

Chris Regier

Controls R&D, Fluid dynamics, HVAC, Data centers.

7mo

Interesting read, Sean. Question: How did you find the increase in temperatures affected chip life? Or did it? From what I've seen, reliability decreases as temperature increases. Curious as to your experience.

Brian Groh

Key Account Director at Google Cloud | Minneapolis and Midwest

7mo

No doubt that your tent experience kicked off a massive mind shift in data center design. Thank you Sean, Christian, Doug, Brian, and many others at Microsoft for your continued innovative work.

Edward Zemaitis

Mission Critical Facilities Data Centers - Construction & Design

7mo

Sean, you should get out here to see the progress being made on the construction of the ACHE’s and Chiller Plants (closed loop) here at MKE03 in Mount Pleasant. A far cry from the chilled water plant from yesteryear (CH1)….
