
Google's Gemini Live Can Now Read Phone Screens, Camera Feeds And Answer Queries

Mar 24, 2025 - ndtvprofit.com
Google's Gemini Live, a conversational AI chatbot, can now read phone screens and camera feeds and respond to queries in real time. The feature is currently exclusive to select subscribers of Gemini Advanced under the Google One AI Premium plan. It grew out of Google's 'Project Astra', an initiative led by Demis Hassabis of Google DeepMind that aims to create a multimodal AI assistant capable of analyzing the physical world. The AI can process audio, video, images, and text, providing real-time answers and suggestions. The feature was first showcased at Mobile World Congress 2025 in Barcelona, with interactive demonstrations and YouTube videos highlighting its conversational abilities.

The new capabilities allow users to share their screens with Gemini Live and initiate video streams, enabling the AI to scan its surroundings and answer questions about what it observes. The conversational aspect was demonstrated through the AI's ability to engage in organic discussions, adapting to new questions and building on previous answers. However, some AI features may not be available on the Gemini Nano-powered Google Pixel 9a.

Key takeaways:

  • Google's Gemini Live can read phone screens and camera feeds and answer queries in real time, but is currently available only to select subscribers of the Google One AI Premium plan.
  • The features were developed under Google's 'Project Astra', led by Demis Hassabis, which aims to create a multimodal AI assistant.
  • The AI can process audio, video, images, and text, providing real-time answers and suggestions.
  • Google demonstrated these capabilities at the Mobile World Congress 2025 and released videos showcasing Gemini Live's conversational abilities.
