Researchers have noted the continuing exponential growth in the performance, cost, and power consumption of supercomputers built for AI tasks. If the trend continues, by 2030 AI supersystems will cost about $200 billion each and draw up to 9 GW of power, equivalent to the output of nine standard nuclear reactors.
Image source: AI-generated with Grok 3 / 3DNews
Researchers from Georgetown University (USA) have published an analysis of this trend. They note that between 2019 and 2025, the hardware costs and power consumption of leading AI supercomputers doubled every year. If that trend continues, it will produce supersystems with around two million processors and AI accelerators, equivalent to roughly $200 billion at today's prices and a power draw of 9 GW.
At the same time, improvements in energy efficiency simply cannot keep pace with the growth in power consumption: between 2019 and 2025, computing performance per watt improved by about 1.34 times per year, while power consumption doubled annually.
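A back-of-the-envelope calculation shows why. Treating total performance as the product of per-watt efficiency and power draw (an assumption of this sketch, not a figure from the study), the rates above imply that most of the growth in delivered compute comes from drawing more power rather than from efficiency gains:

```python
import math

# Back-of-the-envelope check of the growth rates cited above (2019-2025).
# Assumption of this sketch: performance = (performance per watt) x (power draw).
perf_per_watt_growth = 1.34   # annual improvement in compute per watt (from the study)
power_growth = 2.0            # annual growth in power consumption (from the study)

# Implied annual growth in total delivered performance:
perf_growth = perf_per_watt_growth * power_growth
print(f"Implied performance growth: ~{perf_growth:.2f}x per year")        # ~2.68x

# How much of that growth comes from efficiency vs. simply drawing more power:
efficiency_share = math.log(perf_per_watt_growth) / math.log(perf_growth)
print(f"Efficiency gains account for ~{efficiency_share:.0%} of the growth; "
      f"~{1 - efficiency_share:.0%} comes from higher power draw")        # ~30% vs ~70%
```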
The most advanced AI supercomputer to date is xAI’s $7 billion Colossus, which runs on 200,000 processors and AI accelerators and consumes 300 megawatts of power.
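The 2030 figures quoted above follow from simple compounding. Here is a minimal, purely illustrative sketch that takes Colossus as the 2025 baseline and mechanically applies the roughly 2x annual growth in cost and power cited in the study; the five-year horizon and the choice of baseline are assumptions of the sketch, not part of the study's methodology:

```python
# Illustrative projection from the Colossus baseline (2025) to 2030.
years = 5                       # 2025 -> 2030 (assumed horizon)

cost_2025_usd = 7e9             # Colossus: ~$7 billion
power_2025_mw = 300             # Colossus: ~300 MW
chips_2025 = 200_000            # Colossus: ~200,000 processors and accelerators

annual_growth = 2.0             # cost and power roughly double every year (per the study)

cost_2030 = cost_2025_usd * annual_growth ** years
power_2030_mw = power_2025_mw * annual_growth ** years
print(f"Projected cost in 2030:  ~${cost_2030 / 1e9:.0f} billion")   # ~$224 billion
print(f"Projected power in 2030: ~{power_2030_mw / 1000:.1f} GW")    # ~9.6 GW

# The study's figure of ~2 million chips implies a slower annual growth
# rate for chip count than for cost and power:
implied_chip_growth = (2_000_000 / chips_2025) ** (1 / years)
print(f"Implied chip-count growth: ~{implied_chip_growth:.2f}x per year")  # ~1.58x
```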
It is also worth noting that AI supercomputers have become a largely commercial affair: in 2019, private companies owned 40% of AI supercomputing capacity, and by 2025 that share had grown to 80%.
“While supercomputers were previously used only as research tools, they are now used as industrial machines that generate economic benefits,” the analysts explain.
Business interest has driven faster growth in the scale of private-sector AI systems: 2.7 times per year, compared with 1.9 times per year for public-sector systems. And the new AI investments planned by businesses are staggering: hundreds of billions of dollars from each of the industry's leading players.
According to the study, the United States now controls about 75% of all AI computing power. China is second with 15%, while former supercomputing leaders such as Japan and Germany are now playing catch-up.
At the same time, it is worth remembering that the physical location of an AI data center does not necessarily reflect who uses it, since many provide remote access via the cloud. The regions that host them may even lose out because of the various incentives offered to data-center tenants, while the benefits of hosting such facilities are not always obvious.
For example, according to a report by Good Jobs First, at least 10 US states lose more than $100 million in tax revenue annually because of generous incentives for data centers. Moreover, these centers consume huge amounts of water and occupy large tracts of land, straining local ecosystems. Government regulation in this area, meanwhile, has apparently not kept pace.
There are also signs of cooling interest in giant AI systems: AWS and Microsoft, in particular, have suspended some projects. Whether this marks the beginning of a slowdown or merely a strategic pause is not yet clear. One thing, however, seems certain: each subsequent system will be larger, more powerful, and more expensive.