AI server sales will reach $187 billion in 2024, accounting for 65% of the entire server market

TrendForce analysts expect high demand for advanced AI servers from large cloud service providers and their customers to continue through the end of 2024. Capacity expansions at TSMC, SK hynix, Samsung and Micron significantly eased shortages in Q2 2024, cutting order lead times for Nvidia's flagship H100 AI accelerators from the previous 40–50 weeks to less than 16 weeks.

Image source: nvidia.com

According to a preliminary TrendForce estimate, AI server shipments grew almost 20% quarter-on-quarter in the second quarter, and the full-year forecast was revised to 1.67 million units, a 41.5% year-on-year increase. Large cloud providers are directing this year's budgets toward AI servers at the expense of conventional servers, whose shipments will grow by only 1.9%. AI servers are expected to reach 12.5% of total server shipments, roughly 3.4 percentage points higher than in 2023.

In terms of market value, AI servers contribute more to revenue growth than conventional servers. TrendForce forecasts that by the end of 2024 the AI server market will exceed $187 billion, up 69%, which will account for 65% of the total value of the server market. North American operators AWS and Meta, as well as Chinese giants Alibaba, Baidu and Huawei, are actively expanding their own ASIC solutions. By the end of 2024, ASIC-based systems are expected to make up 26% of AI servers, while AI servers with mainstream GPUs will account for about 71%.
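The 2024 value and growth rate above imply a 2023 baseline. A minimal sketch of that arithmetic; the 2023 figure is derived here, not quoted by TrendForce:

```python
# Derive the implied 2023 AI server market value from the forecast
# 2024 value ($187B) and the stated 69% year-on-year growth rate.
value_2024_bn = 187.0
growth = 0.69

value_2023_bn = value_2024_bn / (1 + growth)
print(f"Implied 2023 AI server market value: ${value_2023_bn:.1f}B")  # ≈ $110.7B
```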

Image source: trendforce.com

Nvidia will retain the largest share, about 90%, of the market for AI server GPUs, while AMD's share will be only around 8%. Counting all AI chips used in AI servers (GPUs, ASICs, FPGAs), Nvidia's share for the year is about 64%. Demand for advanced AI servers, according to TrendForce analysts, will remain high throughout 2025, especially as Nvidia's Hopper generation is succeeded by the new Blackwell AI accelerators (GB200, B100/B200).

This transition will push up demand for TSMC's CoWoS chip packaging and for HBM memory: the Nvidia B100 chip is roughly twice the size of its predecessor. TSMC's CoWoS production capacity will reach 550–600 thousand units by the end of 2025, a growth rate of about 80%. The mainstream Nvidia H100 in 2024 ships with 80 GB of HBM3; by 2025, Nvidia Blackwell Ultra and AMD MI350 chips will carry up to 288 GB of HBM3e, roughly tripling per-chip memory consumption. Total HBM supply is expected to double by 2025 on the back of strong demand in the AI server market.
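The per-accelerator memory jump can be checked directly: moving from 80 GB of HBM3 to 288 GB of HBM3e is a 3.6× increase, consistent with the "roughly tripling" of memory consumption. A sketch of the arithmetic, not a TrendForce calculation:

```python
# HBM capacity per accelerator: H100 (2024) vs. Blackwell Ultra / MI350 (2025),
# using the figures quoted in the article.
hbm_h100_gb = 80    # HBM3 on the mainstream H100
hbm_2025_gb = 288   # up to 288 GB of HBM3e

multiple = hbm_2025_gb / hbm_h100_gb
print(f"Per-chip HBM growth: {multiple:.1f}x")  # 3.6x
```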
