
Alex de Vries-Gao, a PhD candidate at the VU Amsterdam Institute for Environmental Studies, has published an opinion piece on the results of a simple study he conducted estimating the amount of electricity AI companies may be using to generate answers to user queries. In his paper, published in the journal Joule, he describes how he calculated past and current global electricity usage by AI data centers and how he made estimates about future usage.

Recently, the International Energy Agency reported that data centers were responsible for up to 1.5% of global electricity use in 2024—a number that is rising rapidly. Data centers are used for more than crunching AI queries, as de Vries-Gao notes; they are also used to process and store cloud data and to mine bitcoin.

Over the past few years, AI makers have acknowledged that running LLMs such as ChatGPT takes a lot of computing power—so much so that some of them have begun generating their own electricity to ensure their needs are met. Over the past year, as de Vries-Gao notes, AI makers have become less forthcoming with details regarding energy use. Because of that, he set about making some estimates of his own.
