AI server sales will grow to $187 billion in 2024, occupying 65% of the entire server market

TrendForce analysts are confident that high demand for advanced AI servers from large cloud service providers and their clients will continue through the end of 2024. Production expansions at TSMC, SK hynix, Samsung and Micron significantly eased the shortage in Q2 2024, cutting order lead times for Nvidia's flagship H100 AI accelerators from the previous 40-50 weeks to less than 16 weeks.

Image source: nvidia.com

According to a preliminary TrendForce estimate, AI server shipments grew by almost 20% quarter-on-quarter at the end of Q2, and the annual supply forecast was revised to 1.67 million units, a 41.5% year-on-year increase. Large cloud providers are directing this year's budgets toward AI server purchases at the expense of conventional servers, whose shipment growth will be only 1.9%. The share of AI servers in total server shipments is expected to reach 12.5%, approximately 3.4 percentage points higher than in 2023.
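As a quick sanity check, the 2023 baselines implied by these figures can be back-calculated. The sketch below derives them from the quoted 2024 numbers; the 2023 values are computed here, not quoted from TrendForce:

```python
# Back-of-the-envelope check of the TrendForce figures quoted above.
# The 2023 values are derived from the stated 2024 numbers, not quoted.

shipments_2024_m = 1.67   # million AI servers forecast for 2024
yoy_growth = 0.415        # 41.5% year-on-year growth

# Implied 2023 shipments: 1.67 / 1.415 ≈ 1.18 million units
shipments_2023_m = shipments_2024_m / (1 + yoy_growth)

share_2024 = 12.5         # AI servers' % of total server shipments, 2024
share_gain_pp = 3.4       # percentage-point gain over 2023

# Implied 2023 share: 12.5 - 3.4 = 9.1%
share_2023 = share_2024 - share_gain_pp

print(f"Implied 2023 AI server shipments: {shipments_2023_m:.2f} million")
print(f"Implied 2023 share of total shipments: {share_2023:.1f}%")
```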

In terms of market value, AI servers are contributing more to revenue growth than conventional servers. TrendForce forecasts that by the end of 2024 the market value of AI servers will exceed $187 billion, a growth rate of 69%, accounting for 65% of total server market value. Both the North American operators AWS and Meta and the Chinese giants Alibaba, Baidu and Huawei are actively expanding their own ASIC solutions. By the end of 2024, ASIC servers are expected to account for 26% of the AI server market, while AI servers with mainstream GPUs will account for about 71%.

Image source: trendforce.com

Nvidia will retain the largest share, about 90%, of the market for GPU suppliers to AI servers, while AMD's share will be only around 8%. Counting all AI chips used in AI servers (GPUs, ASICs, FPGAs), however, Nvidia's market share for the year is about 64%. According to TrendForce analysts, demand for advanced AI servers will remain high throughout 2025, especially as Nvidia's Hopper gives way to the new Blackwell generation of AI accelerators (GB200, B100/B200). This will in turn increase demand for TSMC's CoWoS chip packaging and for HBM memory: the chip size of the Nvidia B100 is twice as large. TSMC's CoWoS production capacity will reach 550-600 thousand units by the end of 2025, a growth rate of about 80%. The mainstream Nvidia H100 in 2024 is equipped with 80 GB of HBM3; by 2025, Nvidia Blackwell Ultra and AMD MI350 chips will carry up to 288 GB of HBM3e, tripling the consumption of memory components. Total HBM supply is expected to double by 2025 on the back of high demand from the AI server market.
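The "tripling" of memory consumption follows directly from the per-accelerator capacities quoted above; a minimal sketch of the arithmetic:

```python
# Rough check of the HBM capacity jump described above
# (capacities taken from the article's figures).

hbm_h100_gb = 80    # HBM3 on the mainstream Nvidia H100 in 2024
hbm_next_gb = 288   # HBM3e on Blackwell Ultra / AMD MI350 by 2025

# 288 / 80 = 3.6, i.e. memory per accelerator more than triples
ratio = hbm_next_gb / hbm_h100_gb
print(f"Per-accelerator HBM capacity grows {ratio:.1f}x")
```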
