01.ai’s achievement is especially noteworthy given the limited access Chinese companies have to Nvidia’s advanced GPUs. Founder and CEO Kai-Fu Lee points out that even though US export rules leave Chinese companies with virtually no access to Nvidia’s top GPUs, the Yi-Lightning model ranked sixth on the LMSYS leaderboard run by researchers at the University of California, Berkeley.

“My friends in Silicon Valley are shocked not only by our performance, but also by the fact that we trained the model for only $3 million,” said Kai-Fu Lee. “It is rumored that around $1 billion has already been spent on training GPT-5.” He added that US sanctions force companies in China to look for more efficient and cost-effective solutions, which is what 01.ai achieved: by optimizing resources and engineering choices, it obtained results comparable to GPT-4 at a significantly lower cost.

Instead of scaling up computing power the way its competitors do, the company focused on optimizing algorithms and removing processing bottlenecks. “When we only have 2,000 GPUs, we have to figure out how to use them [effectively],” Lee said.

As a result, inference costs came to just 10 cents per million tokens, roughly 30 times lower than for comparable models. “We turned the computational problem into a memory problem by building a multi-level cache, creating a special inference engine, and so on,” Lee explained.
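01.ai has not published details of its inference engine, so the following is only a minimal sketch of the multi-level caching idea Lee describes: keep results that have already been computed in a small, fast tier backed by a larger, slower one, and pay the compute cost only on a full miss. The class name, tier sizes, and compute function below are illustrative assumptions, not 01.ai’s implementation.

```python
from collections import OrderedDict

class MultiLevelCache:
    """Two-tier lookup: a small, fast L1 in front of a larger, slower L2.

    The point is to trade repeated computation (GPU work) for memory lookups:
    results computed once are served from cache on later requests.
    Tier sizes and the compute function are illustrative assumptions.
    """

    def __init__(self, compute_fn, l1_size=1_000, l2_size=100_000):
        self.compute_fn = compute_fn      # expensive computation (e.g. a forward pass)
        self.l1 = OrderedDict()           # hot tier (e.g. GPU or host RAM)
        self.l2 = OrderedDict()           # warm tier (e.g. host RAM or SSD)
        self.l1_size, self.l2_size = l1_size, l2_size

    def get(self, key):
        if key in self.l1:                # L1 hit: cheapest path
            self.l1.move_to_end(key)
            return self.l1[key]
        if key in self.l2:                # L2 hit: promote the entry to L1
            value = self.l2.pop(key)
        else:                             # full miss: pay the compute cost once
            value = self.compute_fn(key)
        self._put_l1(key, value)
        return value

    def _put_l1(self, key, value):
        self.l1[key] = value
        if len(self.l1) > self.l1_size:   # evict least recently used entry down to L2
            old_key, old_val = self.l1.popitem(last=False)
            self.l2[old_key] = old_val
            if len(self.l2) > self.l2_size:
                self.l2.popitem(last=False)

# Usage: cache results keyed by the prompt (a hypothetical compute_fn stands in
# for the real model call).
cache = MultiLevelCache(compute_fn=lambda prompt: f"<generated for {prompt!r}>")
print(cache.get("What is Yi-Lightning?"))   # computed once
print(cache.get("What is Yi-Lightning?"))   # served from L1 on repeat
```

In real LLM serving stacks the same pattern is usually applied to attention key/value blocks or shared prompt prefixes rather than to whole responses, which is how caching turns a computational problem into a memory problem.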

Despite the claims about the low cost of training Yi-Lightning, questions remain about the type and number of GPUs used. The head of 01.ai says the company has enough resources to carry out its plans for the next year and a half, but a simple calculation shows that 2,000 modern Nvidia H100 GPUs at the current price of about $30,000 per unit would cost around $60 million (2,000 × $30,000), twenty times the stated $3 million training budget. This discrepancy raises questions and calls for further clarification. Even so, the company’s achievement has already drawn international attention and shows that AI innovation can emerge even under tight limits on computing resources.
