Apr 12, 2023

GPT-3 training consumed 700k liters of water, ‘enough for producing 370 BMWs’

Posted in categories: information science, robotics/AI, transportation

The data centers that help train ChatGPT-like AI are very ‘thirsty,’ finds a new study.

A new study estimates how much water is consumed when training large AI models like OpenAI's ChatGPT and Google's Bard. The estimates of AI water consumption were presented by researchers from the University of California, Riverside and the University of Texas at Arlington in a preprint paper titled "Making AI Less 'Thirsty.'"
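As a rough check on the headline comparison, the implied water cost per vehicle can be computed directly. The figures below are the article's numbers (the study's ~700,000-liter training estimate and the 370-car equivalence), and the per-car value is simply their ratio:

```python
# Back-of-the-envelope check on the headline figures.
# Both inputs are the study's estimates, not independent measurements.
training_water_liters = 700_000   # estimated water consumed training GPT-3
bmws_produced = 370               # vehicles the same water could reportedly produce

liters_per_car = training_water_liters / bmws_produced
print(f"Implied water per vehicle: {liters_per_car:.0f} liters")  # ~1,892 liters
```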

Of course, the water used to cool these data centers doesn't simply vanish into the ether, nor does it appear from nowhere: it is usually drawn from sources such as rivers and lakes. The researchers therefore distinguish between water "withdrawal" and "consumption" when estimating AI's water usage.


Image credit: Pp76/iStock.

Withdrawal involves physically removing water from a river, lake, or other source, while consumption refers mainly to water lost to evaporation when it is used to cool data centers. The consumption component of that equation, where the study claims "water cannot be recycled," is where most of the study on AI's water use is concentrated.
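The withdrawal-versus-consumption distinction amounts to a simple water balance: whatever is withdrawn but not consumed is returned to the source. A minimal sketch, using illustrative numbers that are not from the study:

```python
# Toy water-balance illustrating withdrawal vs. consumption.
# All quantities are hypothetical, chosen only to show the relationship.
withdrawal_liters = 1_000.0   # water physically taken from a river or lake
discharged_liters = 800.0     # water returned (often warmer) to the source

# Consumption is the portion lost to evaporation and not returned.
consumption_liters = withdrawal_liters - discharged_liters
print(f"Consumed (evaporated): {consumption_liters} liters")  # 200.0 liters
```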
