Apple Reveals AI Models Were Trained on Google’s Custom Tensor Processing Units

Apple disclosed this week that the AI models powering Apple Intelligence were pretrained on Google’s Tensor Processing Units (TPUs), a sign that the company is exploring alternatives to Nvidia’s GPUs for AI training.

The details appear in a 47-page technical paper in which Apple states that its Apple Foundation Model (AFM) was trained on Google’s Cloud TPU clusters. The decision underscores growing competition in AI infrastructure, as companies like Apple look for options beyond Nvidia’s in-demand GPUs.

“This system allows us to train the AFM models efficiently and scalably, including AFM-on-device, AFM-server, and larger models,” Apple said in the paper.
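Apple’s paper does not include code, but for readers unfamiliar with TPU-scale training, the sketch below shows what a minimal data-parallel training step looks like in JAX, a framework commonly used on TPUs. This is an illustrative assumption only; the model, batch shapes, and learning rate are hypothetical placeholders and do not describe Apple’s actual training stack.

```python
# Illustrative only: a minimal data-parallel training step in JAX, the kind of
# pattern commonly run on TPU pods. NOT Apple's training code; the model,
# shapes, and learning rate below are hypothetical placeholders.
from functools import partial

import jax
import jax.numpy as jnp


def loss_fn(params, batch):
    # Hypothetical linear model: predictions = x @ w + b, mean squared error.
    preds = batch["x"] @ params["w"] + params["b"]
    return jnp.mean((preds - batch["y"]) ** 2)


@partial(jax.pmap, axis_name="devices")
def train_step(params, batch):
    # Each device computes gradients on its own shard of the batch.
    grads = jax.grad(loss_fn)(params, batch)
    # Average gradients across all devices, then apply a plain SGD update.
    grads = jax.lax.pmean(grads, axis_name="devices")
    return jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)


if __name__ == "__main__":
    n_devices = jax.local_device_count()  # TPU cores, GPUs, or CPU (1)
    params = {"w": jnp.zeros((4, 1)), "b": jnp.zeros((1,))}
    # Replicate parameters and shard the batch across the leading device axis.
    params = jax.device_put_replicated(params, jax.local_devices())
    batch = {
        "x": jnp.ones((n_devices, 8, 4)),
        "y": jnp.ones((n_devices, 8, 1)),
    }
    params = train_step(params, batch)
```

In this pattern, each accelerator core holds a replica of the parameters and processes its own slice of the batch, with gradients averaged across cores at every step, which is one common way large models are scaled across TPU clusters.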

Apple’s AI system, Apple Intelligence, introduced on Monday, includes features such as an improved Siri, natural language processing, and AI-generated summaries. The company plans to roll out additional generative AI functions over the next year.

Google’s TPUs, initially built for internal use and later offered to customers through Google Cloud, provide a cost-effective alternative for AI training. Even so, Google remains a major customer of Nvidia’s GPUs, using both its own TPUs and Nvidia hardware to train its AI systems.

Apple’s adoption of Google’s TPUs marks a significant step in its AI development strategy as the company continues to expand Apple Intelligence.
