Google’s first AI glasses expected next year
What Happened
Google is working on several types of AI-powered glasses. One model is designed for screen-free assistance, using built-in speakers, microphones, and cameras to let the wearer talk to Gemini and take photos. The other model has an in-lens display that is visible only to the person wearing the glasses.
Our Take
Look, Google has been saying "next year" on glasses for years. The screen-free version is essentially a speaker that talks back, which we already carry in our pockets. Google is banking on the in-lens display model, but that one is stuck in prototyping while the core consumer question goes unanswered: why would anyone want Gemini narrating their walk to the coffee shop instead of just asking their phone?
The real gap is that these glasses need an actual use case beyond "it's wearable tech." Until Google solves that, this is vaporware marketing.
What To Do
Skip the hype. These glasses won't matter until Google ships something that does what a phone can't, and that isn't happening next year.