Chinese company Alibaba Group Holding has unveiled QwQ-32B, an open-source artificial intelligence (AI) reasoning model that it says outperforms DeepSeek's R1 in a number of areas while using far fewer resources.
Image source: Alibaba Group Holding
Following the announcement, Alibaba shares rose 7.5% in Hong Kong trading, their biggest intraday gain in nearly two weeks.
Alibaba’s new AI model has 32 billion parameters and, the company says, outperforms DeepSeek R1, which has 671 billion parameters, in areas such as mathematical reasoning, coding, and general question answering. The team says the smaller parameter count lets the model run with lower computational requirements, which should make it easier to adopt widely. To improve the model’s reasoning performance, the team used reinforcement learning, an approach similar to the one DeepSeek used to develop R1. Alibaba also said that QwQ-32B outperforms OpenAI’s o1-mini, a model with 100 billion parameters.
QwQ-32B is available on Hugging Face, the world’s largest open-source AI model platform. You can also try it via the Qwen chatbot, where it is listed under the name QwQ-32B-Preview.
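As a rough illustration of what "available on Hugging Face" means in practice, the following Python sketch loads the model with the transformers library and asks it a question. The model id "Qwen/QwQ-32B-Preview" is assumed from the name mentioned above, and running a 32-billion-parameter model locally requires a machine with substantial GPU memory.

# Minimal sketch, assuming the Hugging Face model id "Qwen/QwQ-32B-Preview"
# and an environment with the transformers and accelerate packages installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B-Preview"  # assumed model id on Hugging Face
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # spread layers across available GPUs
)

# Ask a simple reasoning question using the model's chat template.
messages = [{"role": "user", "content": "How many prime numbers are there below 30?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))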
Alibaba previously announced plans to invest more than 380 billion yuan ($52 billion) in cloud computing and AI infrastructure over the next three years, the largest AI project ever funded by a single private company in China. Alibaba CEO Eddie Wu said the company’s key focus is on developing Artificial General Intelligence (AGI), which he defined as the point at which AI can achieve 80% of human capabilities.