Meta’s popular smart glasses are regularly updated with new features, and Live AI makes the wearable even more useful.
Meta has been consistently adding new features to its Ray-Ban smart glasses over the past few months. Now, the tech giant has rolled out three new capabilities: live AI, live translation, and Shazam integration. For now, these features are limited to users who signed up for the early access programme in the US and Canada.
According to the company, the new AI features are part of the v11 software update, which is now rolling out to eligible devices. The headline addition is Live AI, which taps the glasses’ camera to give the Ray-Ban Meta glasses continuous, real-time video processing. This lets the AI assistant see your surroundings and answer questions about them in real time.
With Live AI, users can talk to Meta AI without saying the “Hey Meta” activation phrase. They can also change the subject mid-conversation and return to earlier topics easily. Notably, the feature works much like Google’s Project Astra, which is powered by Gemini 2.0.
Secondly, the Ray-Ban Meta glasses now support live translation between English and Spanish, French, or Italian. If someone is speaking to you in one of those three languages, Meta AI can translate their speech in real time, and you will hear it in English.
Moreover, if the other person is also wearing Ray-Ban Meta glasses, they can hear the translated audio through their own glasses’ open-ear speakers as well. In addition, users can view the translation as a transcript on their smartphone. This update should be a boon for travellers, for whom language can be a barrier.
Last but not least, the Ray-Ban Meta glasses, which already support music streaming via platforms like Spotify and Amazon Music, can now also recognise music playing in the background. Just ask, “Hey Meta, what is this song?” and Meta will come back with an answer in a few seconds.
These features were also demonstrated by CEO Mark Zuckerberg during Meta Connect 2024 in September, and they are now being rolled out to early adopters.
However, Meta cautions that these new features may not always get things right, and says it will continue to gather user feedback and improve its AI features. As of now, there is no word on when the features will be released globally.