  • Here’s a demo of Gemini Live’s new live video and screen-sharing capabilities.
  • These features allow Gemini Live to answer questions based on what it sees from your phone’s camera feed or screen.
  • Google is slowly rolling these features out to Gemini Advanced users, so you may not see them yet on your device.

When Google first unveiled Gemini Live, the more conversational version of its Gemini assistant, it was only able to respond to your voice, making it quite limited in what it could do. In recent updates, Google made Gemini Live more useful by allowing it to answer questions about files, images, and YouTube videos. Now, Google is slowly rolling out yet another update to Gemini Live that allows it to answer questions based on what it sees from your phone’s camera feed or screen. We managed to take it for a spin.

Google announced earlier this month that it would bring some Project Astra capabilities to Gemini Live, starting with live video and screen-sharing. Project Astra is Google's experimental, next-generation AI assistant that can react to your surroundings in real time. Google demoed Astra at last year's I/O, illustrating its capabilities by showing a person using a smartphone and smart glasses to ask questions about their surroundings. Nearly a year later, this is now possible with Gemini Live on Android phones.