Microsoft’s AI investments skyrocketed in 2022 – and so did its water consumption

In its rush to lead the generative ML world, Redmond may have developed a datacenter drinking problem

Microsoft's water consumption surged 34 percent to 6.4 million cubic meters in 2022, and a generative AI guzzle may be to blame.

"It's fair to say the majority of the growth is due to AI," Shaolei Ren told the Associated Press in a report published Saturday, who pointed to Microsoft's relationship with OpenAI and massive investment in generative AI products as evidence.

Ren, a researcher at the University of California, Riverside, has been studying the impact of generative AI adoption on datacenter water consumption, and penned a paper on the subject this spring.

In its latest environmental, social, and governance (ESG) report, Microsoft said the higher rate of water consumption was in line with business growth. According to that report, water consumption increased by a third from 4.8 million cubic meters of water in 2021 to 6.4 million cubic meters last year. That's compared to the 14 percent increase in water consumption the software giant reported between 2020 and 2021.

While Microsoft doesn't name AI adoption as the formal cause of this increase, we do know that the cloud provider has been deploying tens of thousands of GPUs to power the large language models behind Bing Chat, GitHub Copilot, and other services. The mega-corp is also working closely with OpenAI, the developer behind the large language models (LLMs) used in ChatGPT.

The Register reached out to Microsoft for comment on its increased water consumption. A rep told us:

As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application. We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power datacenters, purchasing renewable energy, and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030.

Datacenter water consumption has become a point of concern, especially in America where scientists have warned changing weather patterns are likely to result in widespread and persistent drought conditions. 

According to US climate models developed at the Department of Energy's Argonne National Laboratory, by the middle of the century — just a short 27 years from now — large portions of the country could be in a state of persistent drought. The models show that these conditions are likely to be followed by brief but devastating floods.

What's Microsoft using all of that water for?

While the water consumption figures highlighted in Microsoft's ESG report are for the company as a whole, not just datacenters, it's no secret that these facilities suck down a lot of water.

The GPUs used to power these models are typically deployed in sets of eight and consume a prodigious amount of power compared to traditional datacenter infrastructure. It's not unusual for an eight-GPU system to consume between 6kW and 10kW under full load. To put that in perspective, a single such server can draw as much power as a typical cloud rack.
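To put some rough numbers on that comparison, here is a minimal sketch. The per-server range comes from the figures above; the 10kW conventional rack and the four-servers-per-rack density are assumptions for illustration only, not Microsoft specifics:

```python
# Back-of-the-envelope comparison of an eight-GPU AI server against a
# conventional cloud rack, using the ballpark figures cited above.
# The four-servers-per-rack density is an assumption for illustration.
ai_server_kw_low, ai_server_kw_high = 6, 10   # eight-GPU box under full load
typical_rack_kw = 10                          # assumed draw of a traditional cloud rack

print(f"One AI server: {ai_server_kw_low}-{ai_server_kw_high} kW "
      f"(vs ~{typical_rack_kw} kW for a whole conventional rack)")

servers_per_rack = 4
print(f"A rack of {servers_per_rack} such servers: "
      f"{ai_server_kw_low * servers_per_rack}-{ai_server_kw_high * servers_per_rack} kW")
```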

All of that heat needs to go somewhere, and depending on the datacenter cooling technology, hotter systems can translate into greater water consumption.

Within these facilities, water is used in a variety of datacenter cooling applications. Deionized water and other fluids can be used in direct liquid-cooled (DLC) systems. Rather than blowing air over a heatsink to cool the processors and GPUs, this tech removes heat by passing coolant through cold plates attached to hotspots throughout the system.

As we've previously reported, this approach is considerably more efficient than traditional air cooling. Google actually uses DLC to cool the Tensor Processing Units (TPUs) behind its AI training and inference workloads. However, many modern GPU systems are still air cooled. No matter which technology is used, complex air-handling and thermal-management systems are required to get the heat out of the datacenter.

One of the most efficient ways to do this is to use evaporative coolers, such as cooling towers. This technique uses water to pull heat out of the air exiting the datacenter: as the hot air causes some of the water to evaporate, the heat is carried off with the vapor and the rest is chilled back to a usable temperature. Evaporative cooling tech is quite popular among datacenter operators, as it tends to use less power than the alternatives, and in many climates the coolers only need to run during the hottest months of the year.
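One way to see how cooling choices feed into the water bill is through water usage effectiveness (WUE), the litres of water a facility consumes per kilowatt-hour of IT energy. The sketch below uses illustrative WUE values and an assumed 20MW IT load, not Microsoft's actual figures, to show how the same compute can translate into very different water demands:

```python
# Rough annual water estimate from IT load and water usage effectiveness (WUE).
# The WUE values below are illustrative assumptions, not Microsoft's numbers:
# evaporative cooling in a hot, dry climate might land near 1.8 L/kWh, while a
# "zero water" design approaches 0 L/kWh at the cost of extra power draw.

def annual_water_m3(it_load_mw: float, wue_l_per_kwh: float) -> float:
    """Cubic metres of cooling water per year for a given IT load and WUE."""
    kwh_per_year = it_load_mw * 1_000 * 24 * 365   # MW -> kWh over a year
    litres = kwh_per_year * wue_l_per_kwh
    return litres / 1_000                          # litres -> cubic metres

for label, wue in [("evaporative, hot climate", 1.8),
                   ("evaporative, temperate climate", 0.5),
                   ("zero-water design", 0.0)]:
    print(f"20 MW IT load, {label}: {annual_water_m3(20, wue):,.0f} m^3/year")
```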

This approach to cooling can be problematic in regions where access to water or treatment facilities is limited. In datacenter hubs such as Phoenix, Arizona, many operators have transitioned to alternative cooling technologies, which don't consume water directly but do tend to use more power and generate more noise pollution.

How much water are we talking about?

Microsoft says it used about 1.6 million cubic meters more water in 2022 than in 2021. To put that in perspective, that's enough water to fill 640 Olympic-sized swimming pools, or a volume equivalent to more than three billion grapefruit. And that's just the new consumption. Microsoft's total water consumption for 2022 is the equivalent of 2,560 Olympic pools.
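Those pool equivalents fall straight out of the reported totals, assuming the usual ~2,500 cubic meter volume for an Olympic pool:

```python
# Convert Microsoft's reported water figures into Olympic-pool equivalents.
OLYMPIC_POOL_M3 = 2_500                # standard 50m x 25m x 2m pool

increase_m3 = 6_400_000 - 4_800_000    # 2022 total minus 2021 total
print(increase_m3 / OLYMPIC_POOL_M3)   # -> 640.0 pools of new consumption
print(6_400_000 / OLYMPIC_POOL_M3)     # -> 2560.0 pools for 2022 as a whole
```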

In 2021, researchers at the University of Oxford estimated that US datacenters collectively consumed about 1.7 billion litres of water per day. However, they noted that measuring datacenter water consumption is often hampered by a lack of transparency.

Despite this, researchers at the University of California, Riverside and the University of Texas at Arlington recently attempted to determine just how much water generative AI was using.

"ChatGPT needs to 'drink' a 500ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where ChatGPT is deployed," the researchers estimated in an April paper.

However, as we've previously discussed, the actual water consumption associated with running LLMs like ChatGPT depends on a variety of factors, including the thermal-management technologies used by the facility, and where and when those models are trained and run.

For example, datacenters in colder climates can take advantage of lower ambient temperatures and are likely to consume less water than those in hot, arid desert climes. According to the AP report, the Iowa datacenter where Microsoft and OpenAI trained GPT-4 only consumes water when the temperature rises above 29.3 degrees Celsius.

What is Microsoft doing to mitigate its water consumption?

In its ESG report, Microsoft claims it's using water consumption metrics to guide water reclamation efforts as it seeks to be net "water positive" by 2030.

Projects to reach that elusive "water positive" state typically involve funding work to protect watersheds, restore wetlands, and improve infrastructure in water-stressed regions. For example, in Chennai, India, where Microsoft operates a datacenter, the company worked with The Nature Conservancy to restore the wetlands around Lake Sembakkam.

To date, Microsoft says it has signed agreements for water reclamation projects amounting to 35 million cubic meters.

The Windows maker also says it's working on thermal management technologies to reduce the water consumption of its facilities. This includes a geoexchange system at its Thermal Energy Center in Redmond, which rejects heat to the ground rather than relying on cooling towers. The technology is expected to reduce that facility's water consumption by about 30,280 cubic meters — or about 12 Olympic swimming pools.

Microsoft has also moved away from evaporative coolers at some of its most water-challenged datacenters. This spring, the company agreed to transition its final two datacenters outside Phoenix to "zero water" cooling infrastructure.

How do the other cloud providers stack up?

Microsoft isn't the only cloud provider grappling with water consumption. As Amazon Web Services (AWS) and Google Cloud have steadily expanded their cloud empires, they've announced several efforts to curb their datacenter drinking problem.

Last fall, AWS committed to becoming "water positive" by 2030, which is to say it claims it will return more water to communities than its operations consume.

Google has made similar commitments, but in April admitted AI adoption was making the situation more difficult. ®
