At the Meta Connect event on Wednesday, CEO Mark Zuckerberg announced a series of new AI-powered features for the Ray-Ban Meta smart glasses, the product of the company's collaboration with Ray-Ban. The most intriguing of these was the addition of real-time translation through the glasses' speakers.
Meta explains:
Soon, your glasses will be able to translate speech in real time. If you’re talking to a Spanish, French, or Italian speaker, you’ll hear what’s being said in English through the glasses’ open-ear speakers. Not only is this great for traveling, it helps break down language barriers and bring people closer together. We plan to add support for even more languages in the future to make this feature even more useful.
The companies have yet to announce a timeline for this particular AI feature, but depending on the implementation, it could be a genuinely useful addition to the livestreaming glasses.
Live translation has been something of a holy grail for established hardware companies and startups alike. Google, most notably, unveiled concept glasses with a heads-up display capable of real-time translation, but they never made it beyond the prototype stage.
Meta has yet to announce the full list of launch languages, but judging by the statement above, the feature will initially be limited to translating the Romance languages Spanish, French, and Italian into English.