Apple says it uses Amazon’s custom AI chips

Apple is currently using Amazon Web Services’ custom artificial intelligence chips for services like search and will evaluate whether Amazon’s latest AI chip can be used to pretrain models such as those behind Apple Intelligence.

Apple revealed its usage of Amazon’s proprietary chips at the annual AWS re:Invent conference on Tuesday. Benoit Dupin, Apple’s senior director of machine learning and AI, took the stage to discuss how Apple uses the cloud service. It’s a rare example of the company officially allowing a supplier to tout it as a customer.

“We have a strong relationship, and the infrastructure is both reliable and able to serve our customers worldwide,” Apple’s Dupin said.

Apple’s appearance at Amazon’s conference and its embrace of the company’s chips is a strong endorsement of the cloud service as it vies with Microsoft Azure and Google Cloud for AI spending. Apple uses those cloud services, too.

Dupin said Apple had used AWS for more than a decade for services including Siri, Apple Maps and Apple Music. Apple has used Amazon’s Inferentia and Graviton chips to power search services, for example, and Dupin said Amazon’s chips had led to a 40% efficiency gain.

But Dupin also suggested that Apple would use Amazon’s Trainium2 chip to pretrain its proprietary models. It’s a sign that Amazon’s chips aren’t just a cost-effective way to run inference on AI models compared with x86 central processors made by Intel and AMD, but can also be used to develop new AI. Amazon announced on Tuesday that its Trainium2 chip was generally available to rent.

“In early stages of evaluating Trainium2 we expect early numbers up to 50% improvement in efficiency with pretraining,” Dupin said.

AWS CEO Matt Garman said in an interview with CNBC on Tuesday that Apple had been an early adopter and beta-tester for the company’s Trainium chips.

Apple “came to us, and said to us, ‘how can you help us with our Generative AI capabilities, we need infrastructure in order to go build,’ and they had this vision for building Apple Intelligence,” Garman told CNBC’s Kate Rooney.

Earlier this year, Apple said in a research paper that it had used Google Cloud’s TPU chips to train its iPhone AI service, which it calls Apple Intelligence.

The majority of AI training is done on pricey Nvidia graphics processors. Cloud providers and startups are racing to develop alternatives to lower costs and are exploring different approaches that could lead to more efficient processing. Apple’s usage of custom chips could signal to other companies that non-Nvidia training approaches can work.

AWS is expected to announce new details on Tuesday about offering Nvidia Blackwell-based AI servers for rent, too.

Apple released its first major generative AI product this fall. Apple Intelligence is a series of services that can summarize notifications, rewrite emails and generate new emojis. Later this month, it will integrate with OpenAI’s ChatGPT, the company says, and next year, Siri will get new abilities to control apps and speak naturally.

Unlike leading chatbots like OpenAI’s ChatGPT, Apple’s approach to AI isn’t based on large clusters of Nvidia-based servers in the cloud. Instead, Apple uses an iPhone, iPad or Mac chip to do as much of the processing as possible, and then sends complicated queries to Apple-operated servers using its own M-series chips.

source: cnbc.com