GPT-4 “drinks” up to one and a half liters of water to generate one hundred words

Using generative artificial intelligence comes with significant hidden costs, a study from the University of California, Riverside has found. AI requires large amounts of water to cool the servers that run it, even when it is simply generating text, and that is before accounting for the heavy load it places on the electrical grid.

Image source: Growtika / unsplash.com

Exact water consumption in the United States varies by state and by the consumer's proximity to the data center: where less water is consumed, electricity in that region tends to be cheaper, and more of it is consumed instead. Thus, generating a one-hundred-word email requires 235 ml of water in Texas and 1408 ml in Washington. This may not seem like much, but the numbers add up quickly when users run the large GPT-4 language model several times a week, or even a day, and these figures cover plain text generation only.
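
To illustrate how quickly those per-email figures scale, here is a minimal back-of-envelope sketch in Python. The per-email water values are the ones cited above; the assumed frequency of five generated emails per day is hypothetical and not part of the study.

```python
# Back-of-envelope sketch (not from the study itself): how per-email
# water use adds up under a hypothetical usage frequency.

ML_PER_EMAIL = {          # figures cited in the article, per 100-word email
    "Texas": 235,         # ml of cooling water
    "Washington": 1408,   # ml of cooling water
}

EMAILS_PER_DAY = 5        # hypothetical: a user generating a few emails daily
DAYS_PER_YEAR = 365

for state, ml in ML_PER_EMAIL.items():
    liters_per_year = ml * EMAILS_PER_DAY * DAYS_PER_YEAR / 1000
    print(f"{state}: ~{liters_per_year:,.0f} liters of water per user per year")
```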

Data centers are major consumers of water and electricity, which means prices for these resources rise in the cities where such facilities are built. For example, training Meta's LLaMA-3 model required 22 million liters of water: the amount needed to grow 2,014 kg of rice and, according to the researchers, the amount 164 Americans consume in a year. GPT-4's electricity consumption is not cheap either. If one in ten working Americans used the model once a week for a year (52 requests each for 17 million people), it would take 121,517 MWh of electricity, enough to power every household in Washington, D.C. for 20 days. And that is an unrealistically light usage scenario for GPT-4.
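
For a sense of what that aggregate electricity figure implies per request, here is a rough Python sketch using only the numbers cited above; the per-request energy it derives is an inference from those numbers, not a value reported by the study.

```python
# Rough check of the article's aggregate electricity figure (assumptions labeled):
# 17 million users (one in ten working Americans), each sending 52 requests a year.

USERS = 17_000_000
REQUESTS_PER_USER_PER_YEAR = 52          # once a week
TOTAL_MWH = 121_517                      # total cited in the article

total_requests = USERS * REQUESTS_PER_USER_PER_YEAR
kwh_per_request = TOTAL_MWH * 1000 / total_requests   # implied, not stated in the study

print(f"Total requests per year: {total_requests:,}")
print(f"Implied energy per 100-word request: ~{kwh_per_request:.2f} kWh")
```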

The Washington Post, which covered the study, quoted representatives of OpenAI, Meta, Google and Microsoft, the largest companies in the field of AI. Most of them reaffirmed a commitment to reducing resource consumption but offered no concrete action plans. Microsoft spokesman Craig Cincotta said the company intends to "work on methods for cooling data centers that completely eliminate water consumption," but did not say how. So far, practice suggests that profits from AI take priority over the environmental goals these companies proclaim.
