Meta has presented a new version of its latest text-oriented artificial intelligence model, Llama 3.3 70B. The new release matches the performance of the previous flagship, Llama 3.1 405B, while offering higher efficiency: as the name suggests, it has 70 billion parameters versus 405 billion in the older model. The company also revealed the audience size of its AI assistant, which turned out to be twice as popular as ChatGPT.

Image source: Igor Omilaev / unsplash.com

"Using the latest advances in post-training techniques, this model has improved core performance at a significantly lower cost," Ahmad Al-Dahle, vice president of generative AI at Meta, said in a post on X. He also published a chart showing that Llama 3.3 70B outperformed Google's Gemini 1.5 Pro, OpenAI's GPT-4o, and Amazon's Nova Pro in a number of industry benchmarks, including MMLU, which assesses a model's language-understanding ability.

Image source: x.com/Ahmad_Al_Dahle

The new model has been published as open source and is available for download on the Hugging Face platform and from other sources, including the official Llama website. Meta has, however, partially restricted how developers may use Llama models: services with more than 700 million monthly active users require a special license. According to Meta, the Llama models have been downloaded more than 650 million times. The company also uses Llama itself: it powers the Meta AI assistant, which, according to CEO Mark Zuckerberg, has almost 600 million monthly active users, twice as many as ChatGPT.
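For developers who want to try the model after downloading it from Hugging Face, the following is a minimal sketch of loading it with the transformers library. The repository id "meta-llama/Llama-3.3-70B-Instruct", the example prompt, and the hardware setup (multiple GPUs or quantization for a 70-billion-parameter model) are illustrative assumptions, not details from the article, and access to the weights requires accepting Meta's license on Hugging Face.

# Minimal sketch: loading Llama 3.3 70B via Hugging Face transformers.
# Assumes the repo id below, an accepted license, and enough GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.3-70B-Instruct"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # spread layers across available GPUs
)

# Build a chat prompt with the model's chat template and generate a reply.
messages = [{"role": "user", "content": "Summarize what Llama 3.3 70B is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))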

Earlier, Zuckerberg said that training the upcoming Llama 4 will require ten times more computing resources than the current Llama 3. For this purpose, Meta has purchased more than 100,000 Nvidia accelerators (Elon Musk's xAI operates a cluster of comparable scale). It is also known that in the second quarter of 2024, Meta's capital expenditures rose by almost 33% year over year to $8.5 billion, with the funds spent on servers, data centers, and network infrastructure.
