Here’s How Google’s Android XR Glasses Put Gemini on Your Face [Watch]




In a recent announcement, Google offered a closer look at its Android XR glasses, built to work directly with Gemini, its on-device AI.

The glasses come packed with a forward-facing camera, microphones, and speakers. That setup lets Gemini process your surroundings—what you see, what you hear, and what you say. It can respond through audio or display context-aware information on a built-in screen that only you can see.

You can take calls, ask questions, or control music, all hands-free. The glasses also sync with your Android phone, giving you access to apps without pulling it out. It’s an attempt to push computing into your line of sight, not just your pocket.


Google has already started testing the prototype with a closed group. There’s no timeline for a public release yet, but you can sign up for updates and watch the demo, now live from I/O.

While I/O 2025 focused on broader Gemini integration, these glasses stole the show. They point to where Android is heading: toward devices that don’t just respond to you but stay aware of what’s around you.

