Shazam integration allows users to identify songs by prompting Meta AI. To access these features, users need glasses running the v11 software update and version 196 of the Meta View app; those not yet in the Early Access Program can apply online.

The updates align with a broader trend in Big Tech of positioning AI assistants as the centerpiece of smart glasses. Google recently announced Android XR and its Gemini AI assistant, while Meta CTO Andrew Bosworth called 2024 a pivotal year for AI glasses, suggesting they could become the first hardware category defined entirely by AI.
Key takeaways:
- Meta has introduced three new features for its Ray-Ban smart glasses: live AI, live translation, and Shazam, with the first two limited to Early Access Program members.
- Live AI enables natural conversation with Meta's AI assistant, offering suggestions based on the wearer's surroundings, while live translation provides real-time speech translation between English and Spanish, French, or Italian.
- Shazam support is available to all users in the US and Canada, allowing song identification through a prompt to Meta AI.
- The updates align with a broader trend of AI integration in smart glasses, with Meta and other tech giants like Google emphasizing AI assistants as key features.