Sales of AI servers will grow to $187 billion in 2024, occupying 65% of the entire server market

High demand for advanced AI servers from large cloud service providers and their customers will persist through the end of 2024, TrendForce analysts believe. Capacity expansions at TSMC, SK hynix, Samsung and Micron significantly eased the shortage in Q2 2024, cutting order lead times for Nvidia’s flagship H100 AI accelerators from the previous 40-50 weeks to less than 16 weeks.

Image source: nvidia.com

According to TrendForce’s preliminary estimate, AI server shipments grew by almost 20% quarter-on-quarter at the end of Q2, and the annual forecast has been revised to 1.67 million units, a 41.5% year-on-year increase. Large cloud providers are directing this year’s budgets toward AI servers at the expense of conventional servers, whose shipments will grow by only 1.9%. AI servers are expected to account for 12.5% of total server shipments, roughly 3.4 percentage points higher than in 2023.
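The forecast figures above imply a 2023 baseline that the article does not state. A minimal sketch, using only the numbers quoted from TrendForce (the derived 2023 values are back-calculated, not from the report):

```python
# Back-calculating the implied 2023 baselines from TrendForce's 2024 forecast.
shipments_2024_m = 1.67   # million units, 2024 forecast
yoy_growth = 0.415        # 41.5% year-on-year growth

implied_2023_m = shipments_2024_m / (1 + yoy_growth)
print(f"Implied 2023 AI server shipments: {implied_2023_m:.2f} million")  # ~1.18

share_2024_pct = 12.5     # AI servers' share of total server shipments in 2024
share_gain_pp = 3.4       # gain in percentage points vs 2023
print(f"Implied 2023 share: {share_2024_pct - share_gain_pp:.1f}%")       # 9.1%
```

So the forecast is consistent with roughly 1.18 million AI servers shipped in 2023, at about a 9.1% share of total shipments.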

In terms of market value, AI servers contribute more to revenue growth than conventional servers. By the end of 2024, the market value of AI servers will exceed $187 billion, growing 69% and accounting for 65% of the total value of the server market, TrendForce forecasts. North American operators AWS and Meta, as well as Chinese giants Alibaba, Baidu and Huawei, are actively expanding their own ASIC solutions. By the end of 2024, ASIC-based servers are expected to account for 26% of the AI server market, while AI servers equipped with mainstream GPUs will account for about 71%.

Image source: trendforce.com

Nvidia will retain the largest share, about 90%, of the market for AI GPUs used in AI servers; AMD’s share will be only around 8%. But counting all AI chips used in AI servers (GPUs, ASICs, FPGAs), Nvidia’s market share for the year is about 64%. Demand for advanced AI servers, according to TrendForce analysts, will remain high throughout 2025, especially as Nvidia’s Hopper is replaced by the new generation of Blackwell AI accelerators (GB200, B100/B200). This will in turn increase demand for TSMC’s CoWoS chip packaging and for HBM memory: the die of the Nvidia B100 accelerator is twice as large as its predecessor’s. TSMC’s CoWoS production capacity will reach 550-600 thousand units by the end of 2025, a growth rate of about 80%. The mainstream Nvidia H100 in 2024 is equipped with 80 GB of HBM3; by 2025, Nvidia Blackwell Ultra and AMD MI350 chips will carry up to 288 GB of HBM3e, roughly tripling the consumption of memory components. The total supply of HBM is expected to double by 2025 due to high demand in the AI server market.
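The packaging and memory claims in this paragraph can be sanity-checked with simple arithmetic. A sketch using only the figures quoted above (the implied 2024 CoWoS base is derived, not quoted):

```python
# Per-accelerator HBM capacity jump cited by TrendForce.
hbm_h100_gb = 80    # HBM3 on the mainstream H100 (2024)
hbm_next_gb = 288   # HBM3e on Blackwell Ultra / MI350 (2025)
print(f"HBM capacity ratio: {hbm_next_gb / hbm_h100_gb:.1f}x")  # 3.6x

# CoWoS: 550-600k units by end of 2025 at ~80% growth implies a 2024 base of:
low_k, high_k = 550, 600  # thousand units
growth = 0.80
print(f"Implied 2024 CoWoS capacity: "
      f"{low_k / (1 + growth):.0f}-{high_k / (1 + growth):.0f} thousand units")
```

The per-chip capacity ratio is actually about 3.6x, consistent with the report's "roughly tripling" of memory consumption, and the 80% growth figure implies a 2024 CoWoS base of roughly 306-333 thousand units.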
