
I Wore Meta Ray-Bans in Montreal to Test Their AI Translation Skills. It Did Not Go Well

Jun 26, 2024 - wired.com
The article discusses the author's experience testing the new AI translation feature on Meta’s Ray-Ban smart sunglasses in Montreal, a city where the author doesn't speak the local language. The feature is designed to provide a quick, hands-free way to understand text written in foreign languages. However, the author found that while it could sometimes translate accurately, it often gave broad summaries instead of detailed translations, struggled with certain tasks, and could only interpret written words, not spoken language.

The author tested the AI translation on a variety of texts, including street signs, menus, and books, with mixed results. For instance, it could translate the title of a children's book accurately but failed to provide detailed translations of menu items at restaurants. The author concludes that the AI translation feature is more of a temperamental party trick than a genuinely useful travel tool at this point.

Key takeaways:

  • The author tested the new AI translation feature on Meta’s Ray-Ban smart sunglasses in Montreal, a city whose local language the author doesn't speak.
  • The AI translation feature is designed to provide a quick, hands-free way to understand text written in foreign languages, but it currently only works with written text, not spoken language.
  • The author found the AI translation feature inconsistent: it sometimes provided accurate translations but often failed to deliver detailed or specific ones, especially for menus and signs.
  • To use the AI translation, the user says "Hey Meta, look at …" and then asks it to translate what it’s looking at. The glasses take a snapshot of whatever is in front of the wearer and, after a few seconds of processing, describe the text.
