Meta outdoes OpenAI: its AI assistant is twice as popular as ChatGPT

Meta has presented another version of its latest text-focused artificial intelligence model, Llama 3.3 70B. The new model matches the performance of the previous flagship, Llama 3.1 405B, while offering much higher efficiency: as the name suggests, it has 70 billion parameters versus 405 billion for its predecessor. The company also revealed the audience size of its AI assistant, which turned out to be twice as popular as ChatGPT.

Image source: Igor Omilaev / unsplash.com

"Using the latest advances in post-training techniques, this model has improved core performance at a significantly lower cost," Ahmad Al-Dahle, vice president of generative AI at Meta, said on the social network X. He also published a chart showing that Llama 3.3 70B outperformed Google Gemini 1.5 Pro, OpenAI GPT-4o, and Amazon Nova Pro in a number of industry benchmarks, including MMLU, which assesses a model's language-understanding ability.

Image source: x.com/Ahmad_Al_Dahle

The new model has been published as open source and is available for download on the Hugging Face platform and other sources, including the official Llama website. Meta has, however, partially limited how developers may use Llama models: projects with more than 700 million users require a special license. According to Meta, the Llama models have been downloaded more than 650 million times in total. The company also uses Llama itself: it powers the Meta AI assistant, which, according to CEO Mark Zuckerberg, has almost 600 million monthly active users — twice as many as ChatGPT.

Earlier, Zuckerberg said that training the upcoming Llama 4 will require ten times more computing resources than the current Llama 3. For this purpose, Meta has purchased more than 100,000 Nvidia accelerators; Elon Musk's xAI also operates a cluster of comparable scale. It is also known that in the second quarter of 2024, Meta's capital expenditures grew by almost 33% year over year to $8.5 billion, with the funds spent on servers, data centers, and network infrastructure.
