A Chinese company has unveiled the open AI model DeepSeek V3 – it outperforms GPT-4o and was much cheaper to train

The Chinese company DeepSeek has introduced a powerful open artificial intelligence model, DeepSeek V3 – its license allows the model to be freely downloaded, modified, and used in most projects, including commercial ones.


DeepSeek V3 handles a variety of text-processing tasks, including writing articles and emails, translation, and code generation. According to the developer's own benchmark results, the model outperforms most open and closed counterparts: in programming-related tasks it proved stronger than Meta's Llama 3.1 405B, OpenAI's GPT-4o, and Alibaba's Qwen 2.5 72B, and it also beat its competitors on the Aider Polyglot benchmark, which tests, among other things, the ability to generate code for existing projects.

The model was trained on a dataset of 14.8 trillion tokens. When published on the Hugging Face platform, DeepSeek V3 weighed in at 685 billion parameters – roughly 1.7 times as many as the 405 billion of Llama 3.1 405B. The number of parameters, that is, the internal variables a model uses to predict responses and make decisions, typically correlates with a model's capability: the more parameters, the more capable it is. But running such AI systems also requires more computing resources.
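Because the weights are published openly on Hugging Face, the model can in principle be loaded with the standard transformers API. The sketch below is purely illustrative: the repository id deepseek-ai/DeepSeek-V3 and the loading flags are assumptions, and actually running a 685-billion-parameter checkpoint requires a multi-GPU cluster rather than a desktop machine.

```python
# Illustrative sketch (not the developer's own instructions): loading an open
# checkpoint from Hugging Face with the standard transformers API. The repo id
# and flags below are assumptions; a 685B-parameter model needs far more
# memory than a single GPU provides.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "deepseek-ai/DeepSeek-V3"  # assumed repository id on Hugging Face

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,   # the model may ship custom code on the Hub
    torch_dtype="auto",       # keep the precision of the published weights
    device_map="auto",        # shard across available GPUs (requires accelerate)
)

# Sanity check: count parameters; per the article this should be ~685 billion.
total_params = sum(p.numel() for p in model.parameters())
print(f"Parameters: {total_params / 1e9:.0f}B")

# Simple text-generation example.
prompt = "Write a short email confirming a meeting on Friday."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice, a model of this size is more commonly served through a hosted inference endpoint or a dedicated serving stack than loaded directly as shown here.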

DeepSeek V3 was trained in two months in a data center on Nvidia H800 accelerators, whose delivery to China is now prohibited by American sanctions. The developer claims the training cost was $5.5 million, significantly lower than what OpenAI spends for the same purposes. At the same time, DeepSeek V3 is politically filtered: it refuses to answer questions that official Beijing considers sensitive.

In November, the same developer presented DeepSeek-R1, a counterpart to OpenAI's "reasoning" model o1. One of DeepSeek's backers is the Chinese hedge fund High-Flyer Capital Management, which uses AI to make trading decisions and operates several of its own clusters for training models. One of the latest reportedly contains 10,000 Nvidia A100 accelerators and cost 1 billion yuan (about $138 million). High-Flyer aims to help DeepSeek develop "superintelligent" AI that will outperform humans.
