The latest research from Omdia shows that rapidly growing demand for Google's custom AI accelerators (TPUs) is forming a distinct trend, one that may become strong enough to start chipping away at NVIDIA's dominance in the accelerator market.
The third-quarter results of Broadcom, whose Semiconductor Solutions division fulfills custom chip orders for Google, Meta✴ and several other IT giants, offer a fresh look at the accelerator market. In particular, they make it possible to gauge purchasing trends and indirectly glean information that is usually kept hidden, such as how many custom processors Google buys.
Broadcom CEO Hock Tan has repeatedly revised his revenue forecast for AI semiconductors; this year the company expects to earn $12 billion. Of that, Google's TPUs are expected to account for between $6 billion (close to Omdia's current estimate) and $9 billion, depending on how revenue is split between compute and networking products. The total also fully includes revenue from Meta✴'s MTIA chips. Next year, Broadcom is likely to gain a mysterious third customer.
According to Omdia experts, even though the exact split between compute and networking revenue is not known, TPU shipments at the "lower limit" of $6 billion already point to growth fast enough for Google to take market share from NVIDIA for the first time. TechInsights estimates that TPU shipments reached 2 million units in 2023, while NVIDIA shipped 3.8 million data center accelerators.
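For context, a quick back-of-the-envelope calculation (not from Omdia or TechInsights; the vendor scope and resulting percentages are purely illustrative) shows what those unit figures imply if one compares only Google's TPUs and NVIDIA's data center accelerators:

```python
# Rough illustration based on the TechInsights figures cited above
# (2 million TPUs vs. 3.8 million NVIDIA data center accelerators in 2023).
# It compares only these two product lines by unit count and ignores AMD,
# Intel and other players, so the percentages are indicative only.

tpu_units_m = 2.0       # TechInsights estimate, millions of units (2023)
nvidia_units_m = 3.8    # TechInsights estimate, millions of units (2023)

total = tpu_units_m + nvidia_units_m
print(f"TPU unit share (vs. NVIDIA only): {tpu_units_m / total:.1%}")     # ~34%
print(f"NVIDIA unit share (vs. TPU only): {nvidia_units_m / total:.1%}")  # ~66%
```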
It is worth noting that Google Cloud Platform's share of Google's overall revenue continues to grow, and the division's profitability is rising as well. This may indicate that TPU-based instances are driving Google Cloud's growth and are highly profitable products. In mid-November, Google and NVIDIA published the first results for the TPU v6 and B200 in the MLPerf Training AI benchmark, where the accelerators showed mixed results across different comparisons.