Meta is rolling out several new software updates for its Ray-Ban smart glasses. As of December 16, 2024, users enrolled in the early access program have access to three notable features: live AI, live translation, and Shazam song identification.
Live AI: A Game Changer
The most significant of these updates is the live AI feature. It allows the glasses to see what the user sees, providing real-time assistance and answers to questions, much like Google's Project Astra, which is powered by Gemini 2.0.
These enhancements were first introduced at the Meta Connect 2024 event held in September and are now rolling out to early adopters in the United States and Canada.
Seamless Communication Across Languages
The glasses now offer real-time translation between English and three other languages: Spanish, French, and Italian. This makes conversations while traveling smoother: users can speak in one of the supported languages and have their words translated into English, and vice versa.
Musical Recognition at Your Fingertips
The Shazam feature lets users identify songs playing nearby with a simple voice command: "Hey Meta, what is this song?" This streamlines music discovery, especially for travelers keen to explore local music.
With these enhancements, Meta's Ray-Ban glasses continue to blend style with intelligent capabilities, making them a noteworthy topic in the tech world.