ChatGPT's Electricity Usage Is 17,000 Times That of an Average US Household

It's undeniable that AI chatbots have surged in popularity, and ChatGPT is among the most used of them all. Few have considered how much energy and other resources an AI company needs to serve all those users, but it turns out to be a lot.



ChatGPT's Electricity Usage

OpenAI has had huge success with ChatGPT: the free version now serves a little over 180 million users, and the paid version is already integrated into companies and businesses to support their operations.

That staggering user base translates into hundreds of millions of requests per day. Running that much activity takes a lot of power, around half a million kilowatt-hours of electricity daily, as reported by Business Insider.

To put that number into perspective, an average American household uses only around 29 kilowatt-hours daily. That means OpenAI's servers consume as much energy as roughly 17,000 households, which, it goes without saying, is a whole lot.
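The household comparison is a simple division; a quick back-of-the-envelope check using the article's two figures (500,000 kWh per day for ChatGPT, 29 kWh per day per household) confirms it:

```python
# Figures from the article: ~500,000 kWh/day for ChatGPT (per Business Insider)
# and ~29 kWh/day for an average US household.
chatgpt_kwh_per_day = 500_000
household_kwh_per_day = 29

# How many average households use the same daily energy?
households_equivalent = chatgpt_kwh_per_day / household_kwh_per_day
print(round(households_equivalent))  # ~17,241, i.e. "around 17,000 households"
```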

What's more, AI is still not at its full potential, which means energy consumption will rise significantly as the technology advances and more users adopt AI services.

Dutch National Bank data scientist Alex de Vries said AI is very energy intensive, adding, "Every single of these AI servers can already consume as much power as more than a dozen UK households combined. So the numbers add up really quickly."

In his paper, de Vries estimated that by 2027 the AI industry will use around 85 to 134 terawatt-hours annually (a terawatt-hour is a billion kilowatt-hours).

"You're talking about AI electricity consumption potentially being half a percent of global electricity consumption," he added.

This raises the question of what AI companies are doing to power their servers more sustainably. And energy is not the only resource being used intensively: cooling the supercomputers also requires a huge amount of water.


Microsoft's Water Consumption

You've probably heard of water cooling, one of the most efficient ways to keep electronics at the ideal temperature and avoid overheating. Since OpenAI is backed by Microsoft, its water consumption shows up in the tech giant's figures.

Amid the AI company's growth, Microsoft saw a 34% increase in water consumption at its data centers, which draw from the watershed of the Raccoon and Des Moines rivers in central Iowa, according to Fortune.

All that activity generates a lot of heat, much like an overworked PC. With the 34% spike, Microsoft used around 1.7 billion gallons of water between 2021 and 2022, about the volume of 2,500 Olympic-sized swimming pools.
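The pool comparison checks out with standard conversions. An Olympic pool holds 2,500 cubic meters, and one cubic meter is about 264.17 US gallons; both conversion figures are assumptions brought in for the check, not numbers from the article.

```python
# Figure from the article: ~1.7 billion gallons used between 2021 and 2022.
gallons_used = 1_700_000_000

# Standard conversions (assumed, not from the article):
pool_m3 = 2_500          # volume of an Olympic-sized pool in cubic meters
gallons_per_m3 = 264.17  # US gallons per cubic meter

pool_gallons = pool_m3 * gallons_per_m3  # ~660,000 gallons per pool
pools = gallons_used / pool_gallons
print(round(pools))  # ~2,574, i.e. "about 2,500 pools"
```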

University of California, Riverside researcher Shaolei Ren estimated in his paper that ChatGPT uses about 16 ounces of water for every series of five to 50 questions or prompts. The exact consumption depends on where the servers are located and the local temperature.
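Ren's figure implies a per-prompt range, obtained by dividing the 16 ounces across the shortest and longest series:

```python
# Figure from the article: ~16 ounces (~500 mL) of water per series
# of 5 to 50 prompts.
ounces_per_series = 16

# Implied water use for a single prompt, at each end of the range.
high = ounces_per_series / 5   # short series: more water per prompt
low = ounces_per_series / 50   # long series: less water per prompt
print(f"{low:.2f} to {high:.1f} ounces per prompt")
```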


© 2024 iTech Post All rights reserved. Do not reproduce without permission.
