Google announced that Gemini Live, a feature in the Gemini app that lets the AI assistant “see” and react to what’s on your screen or in your camera view, is now available to all Android users. Previously, the feature required a Gemini Advanced subscription.
Image source: Amanz/unsplash.com
The AI-powered Gemini Live feature officially launched earlier this month for all Pixel 9 and Samsung Galaxy S25 users with the Gemini app pre-installed. “We’ve received great feedback about Gemini Live with camera and screen sharing, so we’ve decided to bring it to more people,” Google said in a post on X. “Starting today and over the coming weeks, we’ll be rolling it out to all Android users with the Gemini app.”
To get more information about a recipe you’re reading or a plant you’ve spotted, tap the “Share Screen with Live” button in the Gemini tab; the assistant will then “see” exactly what you’re asking about. To share your camera view instead, open the full-screen Gemini Live interface and tap the camera button. This opens a viewfinder that lets you switch between the front and rear cameras.
Separately, Microsoft announced today that its similar AI tool, Copilot Vision, is now available for free in the Edge browser.