Meta updates its smart glasses with real-time AI video

By Kyle Wiggers

Meta's Ray-Ban Meta smart glasses are getting several new AI-powered upgrades, including the ability to have an ongoing conversation and translate between languages.

Ray-Ban Meta owners in Meta's early access program for the U.S. and Canada can now download firmware v11, which adds "live AI." First unveiled this fall, live AI lets wearers converse continuously with Meta's AI assistant, Meta AI, which can reference things discussed earlier in the conversation. Without having to say the "Hey Meta" wake word, wearers can interrupt Meta AI to ask follow-up questions or change the topic.

Live AI also works with real-time video. Wearers can ask questions about what they're seeing in real time -- for example, what's around their neighborhood.

Real-time AI video for Ray-Ban Meta was a significant focus of Meta's Connect developer conference earlier this fall. Positioned as an answer to OpenAI's Advanced Voice Mode with Vision and Google's Project Astra, the tech allows Meta's AI to answer questions about what's in view of the glasses' front-facing camera.

With Monday's update, Meta becomes one of the first tech giants to market with real-time AI video on smart glasses. Google recently said it plans to sell AR glasses with similar capabilities, but the company hasn't committed to a concrete timeline.

Meta claims that, in the future, live AI will even give "useful suggestions" before a wearer asks. What sort of suggestions? The company wouldn't say.

Firmware v11 also introduces live translation, which enables Ray-Ban Meta wearers to translate real-time speech between English and Spanish, French, or Italian. When a wearer is talking to someone speaking one of those languages, they'll hear what the speaker says in English through the glasses' open-ear speakers and get a transcript on their paired phone.

Ray-Ban Meta also gains Shazam support as of firmware v11. Wearers can say "Hey Meta, Shazam this song" to have the glasses try to identify the currently playing tune.

Meta warns that the new features, in particular live AI and live translation, might not always get things right. "We're continuing to learn what works best and improving the experience for everyone," the company wrote in a blog post.
