Meta has expanded the functionality of its Ray-Ban Meta smart glasses, adding support for Live AI, real-time translation, and the Shazam music recognition service. The Live AI and translation features are available only to members of Meta’s Early Access Program, while Shazam support is now available to all Ray-Ban Meta smart glasses owners in the US and Canada.

Image source: Ray-Ban

The Live AI and translation features were first introduced at the Meta Connect 2024 event. Live AI is designed to improve interactions with Meta’s voice-based AI assistant by taking the user’s surroundings into account. For example, while in a grocery store, the user can ask the AI to suggest recipes based on the products they are looking at. According to the company, on a full charge the smart glasses can run Live AI for approximately 30 minutes.

The translation function lets the glasses translate speech between English, Spanish, French, and Italian in real time. The translation can be heard through the glasses’ speakers or read as text in the companion smartphone app. To use the function, the required language pairs must be downloaded in advance, and the user must specify which languages they and their conversation partner speak.

The Shazam integration on the glasses works much like the mobile application: the user can ask Meta AI by voice to identify a song playing nearby.

The new features require glasses firmware version 11 and Meta View app version 196.
