Future developments for Meta's glasses could include gesture recognition and heads-up displays, as indicated by Meta's CTO, Andrew Bosworth. These advancements could improve both user interaction and AI training, for example by letting users point at objects to direct the assistant's attention. However, these features are not yet available, and the current AI features significantly reduce battery life. The article suggests that future iterations of the glasses might integrate more advanced technology, such as the 3D displays and gesture tracking shown in Meta's Orion prototype, but these are still years away from shipping.
Key takeaways:
- Meta's Ray-Bans have introduced a new live AI feature that offers an always-aware assistant experience, but it currently feels more like a companion than a practical helper.
- The Live AI feature is in early access and can be toggled on and off, but it has limitations: mixed-quality responses, a lack of clear purpose, and a significant hit to battery life.
- Live translation is available for a few languages and is more immediately useful than Live AI, though it requires downloading specific language packs and has some translation delays.
- Future developments for Meta's glasses could include heads-up displays and gesture recognition, potentially enhanced by a neural input wristband, but these advancements are still in the planning stages.