OpenAI faces high costs and data shortages when training next-generation Orion AI model

OpenAI is reportedly struggling to develop its next flagship AI model, codenamed Orion. The model has shown strong results in natural-language processing tasks, but its performance on programming tasks remains weak. These limitations, combined with a scarcity of training data and rising operating costs, call into question the model's profitability and commercial appeal.

Image source: AllThatChessNow / Pixabay

One challenge is the cost of running Orion in OpenAI's data centers, which is significantly higher than that of previous-generation models such as GPT-4 and GPT-4o. This sharp increase in costs undermines Orion's value for money and could dampen interest among enterprise customers and subscribers looking for cost-effective AI solutions. The high cost of operation raises questions about the model's economic viability, especially given the modest gains in its performance.

Expectations were high for the transition from GPT-4 to Orion, but the leap was not as dramatic as the jump from GPT-3 to GPT-4, which somewhat disappointed the market. A similar trend is evident among other AI developers: Anthropic and Mistral are also recording only moderate improvements in their models. For example, test results for Anthropic's Claude 3.5 Sonnet show that quality gains in each new flagship model are increasingly incremental. Competitors are trying to shift attention away from this limitation by focusing on new features such as AI agents, signaling a pivot from improving overall model performance to building unique capabilities on top of it.

To compensate for the weaknesses of current models, companies are fine-tuning outputs with additional filters. This approach, however, is only a stopgap and does not address the underlying limitations of the models' architecture. The problem is compounded by restricted access to licensed and publicly available data, which has prompted OpenAI to form a dedicated team tasked with solving the training-data shortage. It remains unclear whether the team will be able to gather enough data to improve Orion's performance and meet customer requirements.
