The company plans to develop this "multimodal AI," which combines camera input and voice chat, to take in more sensory data and work more seamlessly over time. Despite privacy concerns raised during the early-access phase, Meta is using anonymized query data to improve its AI services. The glasses, already available for $300, represent a new frontier in wearable AI, and their assistive awareness capabilities are expected to evolve significantly.
Key takeaways:
- Meta is rolling out a new feature for its second-generation Ray-Ban glasses that uses generative AI to interpret images and provide information to the wearer.
- The AI feature can recognize what the wearer sees: the glasses take a photo, which the AI then analyzes. Responses and photos are stored in the Meta View phone app that pairs with the glasses.
- Meta CTO Andrew Bosworth has said the company is working toward making the glasses more seamless over time, with sensors that can detect an event and trigger the AI automatically.
- Despite being in early-access beta and having some limitations, the glasses mark a new frontier of wearable AI products, with potential assistive uses and more.