GPT-4 “drinks” up to one and a half liters of water to generate one hundred words

Using generative artificial intelligence comes with significant costs, a study from the University of California, Riverside has found. AI requires large amounts of water to cool the servers that run it, even when it is simply generating text, and this does not take into account the heavy load on the electrical grid.

Image source: Growtika / unsplash.com

The exact amount of water consumed in the United States varies by state and by the consumer's proximity to the data center: in regions where less water is consumed, electricity tends to be cheaper and electricity consumption higher. In Texas, generating a one-hundred-word email requires 235 ml of water, while in Washington it takes 1,408 ml. That may not seem like much, but the numbers add up quickly when users run the large GPT-4 language model several times a week, or even a day, and these figures apply to plain text generation alone.
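As a back-of-the-envelope illustration (a sketch that uses only the per-email figures quoted above; the weekly and daily usage frequencies are assumptions for the example), the per-user water footprint scales roughly like this:

```python
# Back-of-the-envelope scaling of the per-email water figures quoted above.
# The 235 ml (Texas) and 1408 ml (Washington) values come from the article;
# the weekly/daily usage frequencies are illustrative assumptions.

ML_PER_EMAIL = {"Texas": 235, "Washington": 1408}  # ml of water per 100-word email

for state, ml in ML_PER_EMAIL.items():
    weekly_user_per_year = ml * 52 / 1000   # one email per week, in liters per year
    daily_user_per_year = ml * 365 / 1000   # one email per day, in liters per year
    print(f"{state}: ~{weekly_user_per_year:.1f} L/year (weekly use), "
          f"~{daily_user_per_year:.0f} L/year (daily use)")
```

For a daily user in Washington that already comes to roughly 500 liters of water per year, for plain text generation alone.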

Data centers are major consumers of water and electricity, which drives up the prices of these resources in the cities where such facilities are built. Training Meta✴'s LLaMA-3 model, for example, required 22 million liters of water, the amount needed to grow 2,014 kg of rice and, according to the researchers, what 164 Americans consume in a year. The electricity consumed by GPT-4 is not cheap either. If one in ten working Americans used the model once a week for a year (52 requests each for 17 million people), 121,517 MWh of electricity would be required, enough to power all households in the American capital for 20 days. And that is an unrealistically light usage scenario for GPT-4.
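A quick sanity check of that aggregate figure (a sketch derived only from the numbers in the paragraph above) shows what it implies per request:

```python
# Implied per-request energy, derived only from the figures quoted above.
requests = 52 * 17_000_000      # one request per week for a year, for 17 million people
total_kwh = 121_517 * 1000      # 121,517 MWh expressed in kWh

per_request_kwh = total_kwh / requests
print(f"total requests: {requests:,}")                    # 884,000,000
print(f"energy per request: {per_request_kwh:.3f} kWh")   # ~0.137 kWh per 100-word email
```

That works out to roughly 0.14 kWh per hundred-word email, which is why even this "light" scenario adds up to a city-scale electricity bill.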

The Washington Post, which drew attention to the study, quoted representatives of OpenAI, Meta✴, Google and Microsoft, the largest companies in the AI field. Most of them confirmed a commitment to reducing resource consumption but offered no concrete action plans. Microsoft spokesman Craig Cincotta said the company intends to "work on data center cooling methods that completely eliminate water consumption," but did not say how. So far, practice shows that profits from AI take priority over the environmental goals the companies proclaim.
