The ability of GPUs to keep scaling computing performance despite the classic slowdown of Moore's Law has long been portrayed by Nvidia's management as a lifeline for all of humanity. Now, as the explosive growth of artificial intelligence systems shows signs of slowing, new performance-scaling challenges loom on the horizon.
As the Financial Times notes, for many in Silicon Valley, Moore's Law has been supplanted by a new concept: the "scaling law" of artificial intelligence. Until recently, it was widely believed that scaling up computing infrastructure and feeding it ever-larger volumes of data would produce qualitative improvements in artificial intelligence systems; in essence, AI was expected to become "smarter" simply by growing. As a result, the large technology companies have spent several consecutive quarters aggressively expanding the computing power of their data centers.
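For reference, the "scaling laws" in question are empirical power laws from the research literature (the form below follows the widely cited "Chinchilla" study by Hoffmann et al., and is offered here as background rather than something stated in this article). One commonly quoted form is:

L(N, D) = E + A / N^\alpha + B / D^\beta

Here L is the model's expected loss, N the number of parameters, D the number of training tokens, and E an irreducible loss floor; A, B, \alpha, and \beta are empirically fitted constants. Because both correction terms shrink toward zero as N and D grow, each additional increment of compute buys a smaller improvement, which is one way to read the slowdown described below.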
Previously, it was also assumed that the current rate of growth in data center capacity would continue until "superintelligence" was created: a system surpassing human intelligence, yet still built on software algorithms. Only in recent weeks have experts begun to voice concern that the latest large language models from OpenAI, Google, and Anthropic are no longer improving at the pace earlier trends predicted.
OpenAI co-founder Ilya Sutskever, who left the company to launch his own startup, recently said: "The 2010s were the era of scaling, but now we are back in the era of discovery and amazement." Remarkably, just a year ago Sutskever was confident that the entire surface of the Earth would need to be covered with solar panels to power an equally vast fleet of data centers.
Many market participants agree that the stage of aggressively scaling the training of language models is coming to an end, and that a new stage must take over if the current pace of progress is to be maintained. Microsoft CEO Satya Nadella believes the slowdown in training large language models will not particularly limit progress, since artificial intelligence systems are gaining the ability to reason. According to Nvidia founder Jensen Huang, even a reduced need for computing resources to train language models would not mean lower demand for the company's products: AI developers will strive to cut the time their systems take to respond to user queries, and that race will require even more hardware, which, in Huang's view, is good for Nvidia's business. Microsoft president Brad Smith, for his part, is convinced that market demand for accelerator chips will keep growing for at least another year.
However, the transition of AI systems to a new stage of development will have to be justified by the emergence of genuinely useful business applications. There are still problems on that front: any innovation must ultimately deliver material benefits, and in many sectors of the economy the payoff from AI in its current form is far from obvious. That has not stopped the tech giants from pouring enormous sums into expanding their computing resources. This year, the combined capital expenditures of Microsoft, Amazon, Google, and Meta should exceed $200 billion, and next year they will likely exceed $300 billion, according to Morgan Stanley.