Meta CEO Mark Zuckerberg announced updates to the company’s Ray-Ban Meta smart glasses at Meta Connect 2024 on Wednesday. Meta continued to make the case that smart glasses can be the next big consumer device, announcing new AI capabilities and familiar smartphone features coming to Ray-Ban Meta later this year.
Some of Meta’s new features include real-time AI video processing and live language translation. Other announcements, like QR code scanning, reminders, and integrations with iHeartRadio and Audible, seem aimed at giving Ray-Ban Meta users the smartphone features they already know and love.
Meta says its smart glasses will soon have real-time AI video capabilities, meaning you can ask the Ray-Ban Meta glasses questions about what’s in front of you and Meta AI will answer aloud in real time. Currently, the glasses can only take a picture, then describe it or answer questions about it; the video upgrade should make the experience feel more natural, in theory at least. These multimodal features are slated to arrive later this year.
In a demo, users could ask Ray-Ban Meta questions about a meal they were cooking or about city scenes unfolding in front of them. The real-time video capability means Meta’s AI should be able to process live action and respond audibly.
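To make the shift concrete, here is a minimal sketch of the difference between today’s single-photo flow and the announced live-video flow. Meta has not published a developer API for Ray-Ban Meta, so every name below (GlassesCamera, MultimodalModel, and the two Q&A helpers) is hypothetical, invented purely for illustration.

```python
# Hypothetical sketch only: Meta has not released a Ray-Ban Meta developer
# API, so all classes and functions here are invented for illustration.
import time


class GlassesCamera:
    """Stand-in for the glasses' camera feed."""

    def capture_frame(self):
        return b"<jpeg bytes>"  # placeholder image data


class MultimodalModel:
    """Stand-in for a vision-language model like Meta AI."""

    def answer(self, frames, question):
        return f"Answer to {question!r} based on {len(frames)} frame(s)"


def single_shot_qa(camera, model, question):
    # Today's flow: snap one photo, then answer about that single still.
    frame = camera.capture_frame()
    return model.answer([frame], question)


def live_video_qa(camera, model, question, seconds=3, fps=2):
    # Announced flow: sample a short window of live video so the model
    # can reason about motion and changing scenes, not just one still.
    frames = []
    for _ in range(int(seconds * fps)):
        frames.append(camera.capture_frame())
        time.sleep(1 / fps)
    return model.answer(frames, question)


if __name__ == "__main__":
    cam, model = GlassesCamera(), MultimodalModel()
    print(single_shot_qa(cam, model, "What am I looking at?"))
    print(live_video_qa(cam, model, "What step of the recipe am I on?"))
```

The design point is simply that the live flow hands the model a short window of frames instead of one still, which is what would let it follow action as it happens rather than describe a frozen moment.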