Artificial intelligence has progressed from sci-fi fantasy to mainstream reality. AI now powers online tools from search engines to voice assistants, and it is used in everything from medical imaging analysis to autonomous vehicles. But the advance of AI will soon collide with another pressing issue: energy consumption.
Much like cryptocurrencies today, AI risks becoming a target for criticism and regulation because of its appetite for electricity. Partisans are forming into camps: AI optimists extol continued progress through more compute power, while pessimists portray AI's power usage as wasteful and even dangerous. The attacks echo those leveled at crypto mining in recent years, and there will undoubtedly be further efforts to choke off AI innovation by cutting its energy supply.
The pessimists raise some valid points. Developing ever-more capable AI does require vast computing resources. For example, the computing power used to train OpenAI's GPT-3 reportedly reached 800 petaflops, on par with the 20 most powerful supercomputers in the world combined. Similarly, ChatGPT receives on the order of hundreds of millions of queries each day. Estimates suggest that answering all of these queries consumes around 1 GWh of electricity daily, roughly the daily energy consumption of about 33,000 U.S. households. Demand is expected to increase further.
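The household comparison is easy to sanity-check with a back-of-envelope calculation. The sketch below assumes an average U.S. household uses roughly 30 kWh of electricity per day (an assumed figure, not one stated in the article):

```python
# Rough sanity check of the "33,000 households" comparison above.
# Assumption (not from the article): an average U.S. household uses
# about 30 kWh of electricity per day (~11,000 kWh per year).

DAILY_QUERY_ENERGY_KWH = 1_000_000      # 1 GWh expressed in kWh
HOUSEHOLD_KWH_PER_DAY = 30              # assumed U.S. household average

households = DAILY_QUERY_ENERGY_KWH / HOUSEHOLD_KWH_PER_DAY
print(f"~{households:,.0f} households")  # -> ~33,333 households
```

Under that assumption, 1 GWh per day works out to roughly the estimate cited above.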