Until now, Google’s Android XR glasses had appeared only in carefully curated teaser videos and limited hands-on previews shared with select publications. These early glimpses hinted at the potential of integrating artificial intelligence into everyday eyewear but left lingering questions about real-world performance. That changed when Shahram Izadi, Google’s Android XR lead, took the TED stage – joined by Nishtha Bhatia – to demonstrate the prototype glasses in action.

The live demo showcased a range of features that distinguish these glasses from previous smart-eyewear attempts. At first glance, the device resembles an ordinary pair of glasses. However, it is packed with advanced technology, including a miniaturized camera, microphones, speakers, and a high-resolution color display embedded directly into the lens.

The glasses are designed to be lightweight and discreet, with support for prescription lenses. They can also connect to a smartphone to leverage its processing power and access a broader range of apps.
