Meta outdoes OpenAI: its AI assistant is twice as popular as ChatGPT

Meta has presented a new version of its latest text-oriented artificial intelligence model, Llama 3.3 70B. The new release matches the performance of the previous flagship, Llama 3.1 405B, while being considerably more efficient: as the name suggests, it has 70 billion parameters versus 405 billion in the older model. The company also revealed the audience size of its AI assistant, which turns out to be twice as popular as ChatGPT.

Image source: Igor Omilaev / unsplash.com

"Using the latest advances in post-training techniques, this model improves core performance at a significantly lower cost," Ahmad Al-Dahle, vice president of generative AI at Meta, wrote on the social network X. He also published a chart showing Llama 3.3 70B outperforming Google Gemini 1.5 Pro, OpenAI GPT-4o, and Amazon Nova Pro in a number of industry benchmarks, including MMLU, which assesses a model's language-understanding ability.

Image source: x.com/Ahmad_Al_Dahle

The new model has been published as open source and is available for download from the Hugging Face platform and other sources, including the official Llama website. Meta has, however, partially restricted how developers may use Llama models: projects with more than 700 million users require a special license. According to Meta, the Llama models have been downloaded more than 650 million times. The company also uses Llama itself: it underpins the Meta AI assistant, which, according to CEO Mark Zuckerberg, has almost 600 million monthly active users, twice as many as ChatGPT.
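For readers who want to try the released weights, below is a minimal sketch of loading the model with the Hugging Face transformers library. The repository name meta-llama/Llama-3.3-70B-Instruct and the generation settings are assumptions for illustration; Meta's gated models also require accepting the license on Hugging Face and authenticating with an access token, and running a 70B model needs substantial GPU memory.

```python
# Sketch: loading and prompting Llama 3.3 70B via transformers.
# Assumes the repository ID below and prior license acceptance / HF login.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.3-70B-Instruct"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread the 70B weights across available GPUs
    torch_dtype="auto",  # keep the checkpoint's native precision
)

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Summarize what Llama 3.3 70B is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```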

Earlier, Zuckerberg said that training the upcoming Llama 4 will require ten times more computing resources than the current Llama 3. For this purpose, Meta has purchased more than 100,000 Nvidia accelerators (Elon Musk's xAI operates a cluster of similar scale). It is also known that in the second quarter of 2024, Meta's capital expenditures rose by almost 33% year over year to $8.5 billion, with the funds going to servers, data centers, and network infrastructure.
