The Ray-Ban Meta smart glasses are about to get a serious upgrade, thanks to Meta’s AI smarts finally learning to see and hear.
Meta, formerly known as Facebook, has announced new features for its Ray-Ban Meta smart glasses that will make them more useful and interactive. The company is testing a "multimodal" AI assistant that can respond to your queries based on what the glasses see and hear through their camera and microphones.
Multimodal AI at play
The multimodal assistant can suggest outfits, translate text, caption images, and describe objects you point the glasses at. Meta CEO Mark Zuckerberg showed off some of these capabilities in an Instagram reel, asking the glasses to recommend pants that would go well with a shirt he was holding. The assistant described the shirt's color and pattern, then offered two pant options to match.
Finally, smart glasses that actually look like glasses!