The Ray-Ban Meta Glasses are the first truly successful AI wearable. They’re actually pretty good: They have Ray-Ban’s chic styling and don’t look as goofy as other big, heavy mixed reality face computers. The AI assistant on board can answer questions and even identify what you’re looking at using the built-in camera. Plus, you can use voice commands to take photos and videos of whatever’s in front of you without having to pull out your smartphone.
More AI-powered voice capabilities will soon be added to Meta’s smart glasses. Meta CEO Mark Zuckerberg announced the latest update to the smart glasses’ software at the company’s Meta Connect event today.
“The reality is, most of the time the smart features aren’t being used, and people want to have something on their face that they’re proud of, that looks good, and that’s designed in a really nice way,” Zuckerberg said at Connect. “So these are great glasses, and as we continue to update the software and build out the ecosystem, they’re going to get even smarter and be able to do more.”
The company also used Connect to unveil the new Meta Quest 3S, a lower-cost version of its mixed reality headset, and rolled out a number of other AI features across its platforms, adding new capabilities to its Meta AI assistant and Llama large language models.
As far as the Ray-Bans are concerned, Meta isn’t messing around too much with a good thing. The smart glasses gained AI features earlier this year, and while Meta has continued to add more, the enhancements here are fairly minimal. As before, you can ask Meta AI a question and hear its response through speakers built into the temples of the frames. Now, there are a few new things you can ask Meta AI or tell it to do.
Perhaps most impressive is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will understand what book it is and set a reminder. A week later, the Meta AI will tell you it’s time to buy that book.
Meta says the glasses will soon get a live transcription feature, allowing people who speak different languages to hear live, or at least near-real-time, translated speech. It’s unclear how well this will work, as the glasses’ previous translation features have proven hit-or-miss.
New frame and lens colors have been added, and customers now have the option of transition lenses that darken or lighten depending on the ambient sunlight.
Meta didn’t say exactly when these additional AI features will arrive on the Ray-Bans, only that they’re coming sometime this year. With just three months left in 2024, that means they’re almost here.