Meta outdoes OpenAI: its AI assistant is twice as popular as ChatGPT

Meta has presented a new version of its latest text-focused artificial intelligence model, Llama 3.3 70B. The new model matches the performance of the previous flagship, Llama 3.1 405B, while being far more efficient: as the name suggests, it has 70 billion parameters versus 405 billion for its predecessor. The company also revealed the audience size of its AI assistant, which turned out to be twice as popular as ChatGPT.

Image source: Igor Omilaev / unsplash.com

"By using the latest advances in post-training techniques, this model delivers improved core performance at a significantly lower cost," Ahmad Al-Dahle, vice president of generative AI at Meta, wrote on the social media platform X. He also published a chart showing Llama 3.3 70B outperforming Google Gemini 1.5 Pro, OpenAI GPT-4o, and Amazon Nova Pro in a number of industry benchmarks, including MMLU, which assesses a model's language-understanding ability.

Image source: x.com/Ahmad_Al_Dahle

The new model has been published as open source and is available for download on the Hugging Face platform and from other sources, including the official Llama website. Meta has, however, partially restricted how developers may use Llama models: projects with more than 700 million users require a special license. According to Meta, the Llama models have been downloaded more than 650 million times. The company also uses Llama itself: it underpins the Meta AI assistant, which, according to CEO Mark Zuckerberg, has almost 600 million monthly active users, twice as many as ChatGPT.

Earlier, Zuckerberg said that training the upcoming Llama 4 will require ten times more computing resources than the current Llama 3. For this purpose, Meta has purchased more than 100,000 Nvidia accelerators; Elon Musk's xAI operates a cluster of comparable capacity. It is also known that in the second quarter of 2024, Meta's capital expenditures rose by almost 33% year over year to $8.5 billion, with the funds spent on servers, data centers, and network infrastructure.
