According to earlier estimates, ChatGPT consumes about 3 watt-hours of energy to answer a single query, roughly 10 times the energy of an average Google Search. However, a new report from Epoch AI, a research institute that studies the key trends and questions shaping the trajectory of artificial intelligence, contradicts these figures and suggests that the OpenAI chatbot’s energy consumption is significantly lower than previously estimated.


Epoch AI’s report states that ChatGPT, powered by the GPT-4o model, consumes just 0.3 Wh of energy when generating a response. Speaking to TechCrunch, Epoch AI data analyst Joshua You noted, “The energy consumption is actually not that high compared to using typical appliances, heating or cooling your home, or using your car.”
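To put those figures in perspective, here is a minimal back-of-envelope sketch in Python. The 0.3 Wh and 3 Wh per-query values come from the article; the assumed query volume and the 10 W LED-bulb reference point are illustrative assumptions, not data from the report.

```python
# Back-of-envelope comparison of the old and new per-query energy estimates.
OLD_ESTIMATE_WH = 3.0   # earlier estimate, Wh per query (from the article)
NEW_ESTIMATE_WH = 0.3   # Epoch AI estimate for GPT-4o, Wh per query (from the article)
QUERIES_PER_DAY = 15    # assumed usage level, purely illustrative

for label, per_query_wh in [("old (3 Wh)", OLD_ESTIMATE_WH),
                            ("new (0.3 Wh)", NEW_ESTIMATE_WH)]:
    daily_wh = per_query_wh * QUERIES_PER_DAY
    yearly_kwh = daily_wh * 365 / 1000
    # A 10 W LED bulb draws 10 Wh per hour, so daily_wh / 10 gives "bulb-hours" per day.
    bulb_hours = daily_wh / 10
    print(f"{label}: {daily_wh:.1f} Wh/day, {yearly_kwh:.2f} kWh/year, "
          f"≈ {bulb_hours:.2f} hours of a 10 W LED bulb per day")
```

Under these assumed numbers, the Epoch AI figure works out to roughly 1.6 kWh per year for fifteen queries a day, versus about 16 kWh under the older estimate, which is the sense in which You calls the consumption low compared with household appliances.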

According to You, previous estimates of ChatGPT’s energy consumption relied on outdated data: the widely cited “universal” figures assumed that OpenAI was still running the chatbot on older, less efficient chips.

“In addition, some of my colleagues pointed out that the most widely used estimate of 3 Wh per query was based on fairly old studies. And judging by some rough calculations, this figure seemed too high,” You added.

That said, Epoch AI’s estimate of ChatGPT’s energy consumption is not definitive either, as it does not account for some key AI capabilities, such as the chatbot’s image generation.

The expert said he doesn’t expect ChatGPT’s current per-query energy consumption to change dramatically, but as AI models become more advanced, they will require more energy to operate. Leading AI companies, including OpenAI, are shifting toward so-called reasoning models, which don’t just answer a question but also work through the chain of reasoning that leads to that answer, and that extra computation requires more energy.

Multiple reports in recent years have found that technologies like Microsoft Copilot and ChatGPT (or rather the hardware they run on) consume the equivalent of a bottle of water for cooling when generating a response to a query. These findings follow an earlier report that the combined energy consumption of Microsoft and Google exceeds the electricity use of more than 100 individual countries.

One of the most recent studies detailed how OpenAI’s GPT-3 model consumed four times more water than previously thought, while GPT-4 consumed up to three bottles of water to generate just 100 words. AI models clearly grow more resource-hungry as they become more advanced. However, the findings of the latest report suggest that ChatGPT may not be as voracious as previously thought.
