The Water Thirst of AI: The Environmental Cost of Large Language Models


Large language models such as OpenAI’s ChatGPT and Google’s Bard are known for their power and versatility. However, that capability comes at a significant cost in energy and water. Recent research reveals that training GPT-3 alone consumed an estimated 185,000 gallons (700,000 liters) of water, roughly the amount needed to fill a nuclear reactor’s cooling tower. This raises concerns about the strain on water supplies, particularly in the US, where drought and environmental uncertainty are growing problems.

A study by the University of California, Riverside and the University of Texas at Arlington sheds light on the water consumption of AI models. It highlights the distinction between water “withdrawal” and water “consumption”: withdrawal means physically taking water from a river, lake, or other source, while consumption refers to water that evaporates in data center cooling and cannot be reused.

To keep server rooms cool, data centers rely on cooling towers that consume substantial amounts of water: roughly one gallon is evaporated for every kilowatt-hour of energy an average data center uses. Freshwater is required to prevent corrosion and to control humidity. On top of this on-site use, the electricity that powers data centers drives additional, indirect water consumption off site.
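To make that rule of thumb concrete, here is a minimal back-of-envelope sketch in Python. The only figure taken from the article is the approximate one-gallon-per-kilowatt-hour on-site consumption rate; the example workload size is a hypothetical placeholder, so treat the output as an illustration rather than a measurement.

```python
# Back-of-envelope estimate of on-site cooling water consumed by a data center
# workload, using the ~1 gallon per kilowatt-hour figure cited above.

GALLONS_PER_KWH = 1.0       # approximate on-site consumption rate cited in the article
LITERS_PER_GALLON = 3.785   # standard unit conversion


def cooling_water_consumed(energy_kwh: float) -> tuple[float, float]:
    """Return the estimated (gallons, liters) of water evaporated for a workload."""
    gallons = energy_kwh * GALLONS_PER_KWH
    return gallons, gallons * LITERS_PER_GALLON


if __name__ == "__main__":
    workload_kwh = 50_000  # hypothetical training workload, not a real measurement
    gallons, liters = cooling_water_consumed(workload_kwh)
    print(f"~{gallons:,.0f} gallons (~{liters:,.0f} liters) of water consumed on site")
```

For the hypothetical 50,000 kWh workload above, this prints roughly 50,000 gallons (about 189,000 liters); real figures depend heavily on the data center’s cooling design, location, and weather.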

The water consumption issue is not limited to OpenAI. In 2019, Google requested more than 2.3 billion gallons of water for its data centers in three states, and its LaMDA and Bard models may require millions of liters of water for training. The energy requirements of these large language models are staggering as well: training GPT-3 alone is estimated to have released 502 metric tons of carbon.

The concern over water usage is amplified by climate change and worsening droughts. The World Economic Forum estimates that millions of US residents lack access to water and adequate water systems. Climate change and population growth are expected to exacerbate water scarcity issues. Rising temperatures and severe droughts in the American West have already had a significant impact on freshwater supplies.

To address these concerns, AI companies and data centers can take steps to improve water efficiency. Training AI models during cooler hours of the day, or in more water-efficient data centers, could reduce water usage, and chatbot users could likewise interact with models during “water-efficient hours.” Achieving either change, however, requires greater transparency from tech companies about where and when their models are trained.

In conclusion, the water footprint of AI models is a pressing issue that needs to be addressed. Transparency, efficient data center practices, and user awareness can contribute to mitigating the impact on water resources, especially as AI becomes more pervasive across various sectors.


Author: robot learner