
The Ray-Ban Meta smart glasses are about to get a serious upgrade, thanks to Meta’s AI smarts finally learning to see and hear.

Meta, formerly known as Facebook, has announced some exciting new features for its Ray-Ban Meta smart glasses that will make them more useful and interactive. The company is testing a new “multimodal” AI assistant that can respond to your queries based on what you see and hear through the glasses’ camera and microphones.

Multimodal AI at play

The multimodal AI assistant can suggest outfits, translate text, caption images, and describe objects you point the glasses at. Meta CEO Mark Zuckerberg showed off some of these capabilities in an Instagram reel, asking the glasses to recommend pants that would go well with a shirt he was holding. The assistant gave him two options and described the shirt’s color and pattern.

Optimus Gen 2’s new moves

A new video from Tesla demonstrates that its Optimus robot has improved balance and body control and can move more naturally and gracefully.

Perhaps the most impressive advancement is its dexterous “brand new hands.” Whereas the Gen 1 prototype struggled with even basic manipulation, Optimus Gen 2 handles delicate objects with surprising finesse. Footage of the robot gently handling an egg testifies to its improved dexterity, paving the way for broader applications in the future.

New capabilities for wood

Modified wood’s potential uses range from ultra-durable smartphone screens to soft, luminous light fixtures.

In pursuit of these applications, researchers have devised innovative strategies for modifying wood in recent years, imbuing it with new capabilities.

These advancements open the door to replacing conventional, non-renewable, petroleum-based materials in applications as diverse as automobiles, energy storage, construction, and environmental remediation, a shift that takes wood well beyond its traditional domains of construction and papermaking.