Google’s Gemini to power Apple’s AI features like Siri
What Happened
Apple and Google have entered a non-exclusive, multi-year partnership under which Apple will use Gemini models and Google Cloud infrastructure for its future foundation models.
Our Take
Apple doesn't want to build LLMs. Google doesn't want Apple locked into Claude. So they made a non-exclusive deal that benefits both. Standard play.
The cynical read: Apple's outsourcing intelligence to Google (again) and pretending it's about choice. 'Non-exclusive' means Apple can swap in Claude later, but the default will be Gemini and that's what matters for UX. Google gets distribution into 2B+ iPhones.
The real question is whether Siri actually gets better. If the deal just means routing Siri's queries through an LLM without rearchitecting Siri's broken intent detection, you're wrapping garbage in a fancy model.
What To Do
Don't bet on Siri being usable — keep your voice automation budget for Alexa or Google Assistant where the infrastructure is actually less broken.
