AI server sales will reach $187 billion in 2024 – they will account for 65% of the entire server market

High demand for advanced AI servers from large cloud service providers and their customers will continue through the end of 2024, TrendForce analysts say. Capacity expansions at TSMC, SK hynix, Samsung and Micron significantly eased the shortage in Q2 2024, cutting order lead times for Nvidia’s flagship H100 AI accelerators from the previous 40–50 weeks to less than 16 weeks.

Image source: nvidia.com

According to a preliminary TrendForce estimate, AI server shipments in the second quarter grew almost 20% quarter-on-quarter, and the annual forecast has been revised to 1.67 million units, a 41.5% year-on-year increase. Large cloud providers are directing their budgets this year toward AI servers at the expense of conventional servers, whose shipments will grow only 1.9%. AI servers are expected to reach 12.5% of total server shipments, roughly 3.4 percentage points higher than in 2023.
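A quick back-of-envelope check ties these shipment figures together; note that the 2023 AI server unit count and the total 2024 server volume below are derived from the stated numbers, not quoted from TrendForce:

```python
# Sanity check of the TrendForce shipment figures (derived values are estimates)
ai_units_2024 = 1.67e6      # forecast 2024 AI server shipments, units
yoy_growth = 0.415          # 41.5% year-on-year growth
ai_share_2024 = 0.125       # AI servers as share of all server shipments

ai_units_2023 = ai_units_2024 / (1 + yoy_growth)    # implied 2023 base
total_units_2024 = ai_units_2024 / ai_share_2024    # implied total server market

print(f"Implied 2023 AI server shipments: {ai_units_2023 / 1e6:.2f} million")
print(f"Implied total 2024 server shipments: {total_units_2024 / 1e6:.2f} million")
```

The implied 2023 base of roughly 1.18 million AI servers is consistent with the quoted 3.4 percentage-point rise in market share.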

In value terms, AI servers contribute more to revenue growth than conventional servers. By the end of 2024, the AI server market will exceed $187 billion, a 69% growth rate, and will account for 65% of total server market value, TrendForce forecasts. North American operators AWS and Meta, as well as the Chinese giants Alibaba, Baidu and Huawei, are actively expanding their own ASIC solutions. By the end of 2024, ASIC-based servers are expected to reach 26% of AI server shipments, while AI servers with mainstream GPUs will account for about 71%.
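The same kind of check works for the value figures; the 2023 AI server value and the total 2024 server market value below are implied by the stated growth rate and share, not stated directly in the source:

```python
# Implied market values from the stated 2024 figures (derived estimates)
ai_value_2024 = 187e9    # AI server market value in 2024, USD
value_growth = 0.69      # 69% year-on-year growth in value
ai_value_share = 0.65    # AI servers' share of total server market value

ai_value_2023 = ai_value_2024 / (1 + value_growth)       # implied 2023 value
total_value_2024 = ai_value_2024 / ai_value_share        # implied total market

print(f"Implied 2023 AI server market value: ${ai_value_2023 / 1e9:.1f} billion")
print(f"Implied total 2024 server market value: ${total_value_2024 / 1e9:.1f} billion")
```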

Image source: trendforce.com

Nvidia will retain the largest share, about 90%, of the market for AI GPUs used in AI servers, while AMD’s share will be only around 8%. Counting all AI chips used in AI servers (GPUs, ASICs, FPGAs), Nvidia’s share for the year is about 64%. Demand for advanced AI servers will remain high throughout 2025, TrendForce analysts expect, especially as Nvidia’s Hopper is replaced by the new Blackwell generation of AI accelerators (GB200, B100/B200). This will increase demand for TSMC’s CoWoS chip packaging and for HBM memory: the chip in the Nvidia B100 accelerator is twice as large. TSMC’s CoWoS production capacity will reach 550,000–600,000 units by the end of 2025, a growth rate of about 80%. The mainstream Nvidia H100 in 2024 ships with 80 GB of HBM3; by 2025, Nvidia Blackwell Ultra and AMD MI350 chips will carry up to 288 GB of HBM3e, more than tripling memory consumption per chip. Total HBM supply is expected to double by 2025 on the back of high demand in the AI server market.
