Apple may eventually rely on its own server processors to develop its artificial intelligence systems, but for now it has to use third-party hardware. Unlike many of its competitors, which lean heavily on Nvidia for this purpose, Apple has turned to Google's tensor processors.

Image Source: Apple

As CNBC notes, this follows from a technical paper published by Apple that describes the configuration of the computing clusters the company used to train the large language models underlying its Apple Intelligence technology. A preliminary version of Apple Intelligence has already begun appearing on some devices, which prompted Apple to disclose details of how its artificial intelligence infrastructure was built.

Apple does not directly name the maker of the processors it used to train its Apple Foundation Model language model, but the paper refers to "TPU-based cloud clusters." TPU (Tensor Processing Unit) is Google's name for its tensor processors. Among other things, this wording indicates that Apple leased cloud computing resources from Google, a perfectly reasonable approach at this early stage of Apple's artificial intelligence efforts.

The large language model that will run on Apple's end-user devices was trained on a cluster of 2,048 of Google's TPU v5p chips, which are considered state-of-the-art. The server-side model was trained on a cluster of 8,192 TPU v4 chips. Google rents such clusters at $2 per hour for each chip a client uses. Notably, Google trains its own language models not only on its own chips but also on Nvidia hardware. Apple's documentation does not say whether any Nvidia chips were used.
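As a rough illustration of what those figures imply, the sketch below multiplies the reported cluster sizes by the quoted $2 per chip-hour rate. This is only a back-of-the-envelope hourly estimate under the assumption that the same rate applies to both TPU generations; the article does not disclose training duration, so no total cost can be derived.

```python
# Hypothetical hourly rental cost for the clusters described in the article,
# assuming the quoted $2 per chip-hour applies to both TPU generations.

RATE_PER_CHIP_HOUR = 2  # USD, as quoted in the article

clusters = {
    "on-device model (TPU v5p)": 2048,
    "server model (TPU v4)": 8192,
}

# Hourly cost = number of chips x rate per chip-hour
hourly_cost = {name: chips * RATE_PER_CHIP_HOUR for name, chips in clusters.items()}

for name, cost in hourly_cost.items():
    print(f"{name}: ${cost:,} per hour")
```

At the stated rate, the on-device cluster would cost about $4,096 per hour and the server cluster about $16,384 per hour, before any discounts or commitments that a customer of Apple's scale would likely negotiate.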
