The AI market, which is experiencing an unprecedented boom, can be assessed by various criteria. The most obvious are performance and energy consumption, but Morgan Stanley analysts chose to look at the consumption of silicon wafers for AI processors. By that measure, Nvidia is on track to take 77% of the global market in 2025.

Image source: NVIDIA

Nvidia continues to operate at an unprecedented scale and is sharply increasing production, while AMD’s share of wafer consumption is set to decline over the year. The report also includes data on AWS, Google, Tesla, Microsoft and Chinese suppliers. By the end of 2025, Nvidia will account for up to 535,000 300-mm wafers for AI chips, or 77% of the global market. For comparison, the company’s share in 2024 was 51%, Morgan Stanley analysts point out.

Alternative chips, including Google’s TPU v6 and AWS Trainium, are gaining momentum in absolute terms, but they still lag far behind Nvidia, and their shares of wafer consumption are shrinking as Nvidia’s grows. AWS’s share will decline from 10% to 7% over the course of the year, while Google’s will fall from 19% to 10%. Google will need 85,000 wafers for TPU v6; AWS will need 30,000 for Trainium 2 and 16,000 for Trainium 3.

Image source: x.com/Jukanlosreve

AMD’s share will fall from 9% to 3%. Its Instinct MI300, MI325 and MI355 AI accelerators will require between 5,000 and 25,000 wafers, depending on the model. In absolute terms, AMD does not intend to reduce its wafer consumption, but its market share will decrease. Intel’s Gaudi 3 (Habana) processors will take up only 1%; the shares of Tesla, Microsoft and Chinese suppliers are also insignificant.

Tesla’s Dojo and FSD chip volumes remain small, as the company is a niche player in the AI market. Microsoft’s silicon needs are also modest: its Maia 200 accelerator and its improved version are used in limited quantities, as the company continues to rely on Nvidia solutions both to train and to run AI models. The report does not specify whether Nvidia’s dominance this year is driven by demand or by the amount of capacity it has reserved at TSMC.

The AI chip market in 2025 is expected to require 688,000 wafers, worth $14.57 billion. However, this figure may be understated, since TSMC earned $64.93 billion in 2024, of which 51% (more than $32 billion) came from the high-performance computing (HPC) segment.
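As a rough cross-check of these figures, the implied average price per wafer and Nvidia’s share can be derived directly from the numbers quoted in the report. The short Python sketch below is purely illustrative and uses only the values cited above.

```python
# Illustrative back-of-the-envelope check using the figures quoted in the article.
total_wafers = 688_000          # projected 300-mm wafers for AI chips in 2025
total_revenue_usd = 14.57e9     # reported market value in USD

avg_price_per_wafer = total_revenue_usd / total_wafers
print(f"Implied average wafer price: ${avg_price_per_wafer:,.0f}")
# -> roughly $21,000 per wafer

nvidia_wafers = 535_000         # Nvidia's projected wafer consumption
print(f"Nvidia share of wafers: {nvidia_wafers / total_wafers:.0%}")
# -> about 78%, in line with the ~77% figure cited by Morgan Stanley
```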

Technically, this area includes not only AI accelerators, but also processors for consumer PCs and chips for gaming consoles. However, a significant portion of revenue is associated with graphics and central processors for data centers.

The B200 model is the biggest contributor to Nvidia’s numbers: its production will require 220,000 wafers, equivalent to $5.84 billion in revenue. The company will strengthen its position with the H100, H200, and B300 accelerators. All of them are manufactured using TSMC’s 4 nm process technology, and their compute dies range from 814 to 850 mm², which explains the high demand for silicon wafers.
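To illustrate why dies of this size translate into heavy wafer demand, the sketch below applies the standard first-order die-per-wafer approximation to an ~850 mm² die on a 300-mm wafer. The formula and the 200 mm² comparison die are illustrative assumptions, not figures from the Morgan Stanley report, and defect yield is ignored.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order approximation of candidate dies per wafer:
    wafer area divided by die area, minus an edge-loss term.
    Ignores defect density and scribe lines."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# A ~850 mm^2 compute die, the upper bound cited for Nvidia's accelerators
print(dies_per_wafer(850))   # -> about 60 candidate dies per 300-mm wafer
# A hypothetical ~200 mm^2 die for comparison
print(dies_per_wafer(200))   # -> about 300 candidate dies per 300-mm wafer

# With only a few dozen candidates per wafer before yield losses,
# every large accelerator shipped consumes wafer capacity quickly.
```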
