Amazon is investing billions in developing AI chips to reduce dependence on Nvidia

AWS, the cloud division of the American internet giant Amazon, has long been one of the largest players in the cloud services market. It remains heavily dependent on Nvidia hardware and software, but at the same time it is building out its own infrastructure based on the work of Annapurna Labs, the chip designer Amazon acquired in 2015 for $350 million.

Image source: Amazon

Next month, the Financial Times reports, the company is due to publicly unveil its Trainium 2 accelerators, which are designed for training large language models. Samples of these accelerators are already being used by the startup Anthropic, in which Amazon has invested $4 billion. Amazon’s customers in this area also include Databricks, Deutsche Telekom, Ricoh and Stockmark.

Dave Brown, AWS Vice President of Compute and Networking Services, said: “We want to be the absolute best place to run Nvidia, but at the same time we think it’s okay to have an alternative.” Accelerators in the Inferentia family are already about 40% cheaper than Nvidia solutions for generating responses from AI models, and for customers spending tens of millions of dollars, savings on that scale can be decisive when choosing a computing platform.
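As a rough illustration of what that discount means in practice, the sketch below works through the arithmetic. The only figure taken from the article is the roughly 40% price advantage claimed for Inferentia-based inference; the baseline annual spend is an assumed example value, not a number reported by Amazon or its customers.

```python
# Back-of-the-envelope comparison of annual inference spend.
# The ~40% discount comes from the article; the baseline spend is hypothetical.

NVIDIA_ANNUAL_SPEND = 50_000_000   # assumed: $50M/year on Nvidia-based inference
INFERENTIA_DISCOUNT = 0.40         # from the article: ~40% cheaper for inference

inferentia_spend = NVIDIA_ANNUAL_SPEND * (1 - INFERENTIA_DISCOUNT)
savings = NVIDIA_ANNUAL_SPEND - inferentia_spend

print(f"Nvidia-based spend:     ${NVIDIA_ANNUAL_SPEND:,.0f}")
print(f"Inferentia-based spend: ${inferentia_spend:,.0f}")
print(f"Annual savings:         ${savings:,.0f}")  # $20,000,000 in this example
```

At that scale the savings alone approach the cost of an entire additional deployment, which is the point Brown’s comment implies.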

By the end of this year, Amazon’s capital expenditures could reach $75 billion, and next year they are expected to be even higher. Last year they amounted to $48.4 billion, and the size of the increase shows how much importance the company attaches to financing its infrastructure amid the rapid growth of the market for AI systems. Analysts at Futurum Group note that large cloud providers are trying to build vertically integrated, homogeneous chip stacks of their own. Most of them are developing in-house chips for compute accelerators, which lets them cut costs, increase margins, and gain tighter control over chip supply and over their business as a whole. “It’s not so much about the chip, but about the system as a whole,” explains Rami Sinno, director of development at Annapurna Labs. Few companies, he added, can replicate what Amazon does at scale.

The proprietary chips let Amazon reduce power consumption and improve the efficiency of its own data centers. Analysts at TechInsights compare Nvidia’s chips to station wagons, while Amazon’s own solutions are more like compact hatchbacks built for a narrower range of tasks. Amazon is in no hurry to publish benchmark results for its accelerators, but according to available data the Trainium 2 chips should be four times faster than their predecessors. For AWS customers, the mere emergence of an alternative to Nvidia’s solutions is already welcome.
