Google has begun rolling out new AI features to Gemini Live that let it "see" your screen or through your smartphone's camera and answer questions about them in real time, Google spokesperson Alex Joseph confirmed in an email to The Verge. The features arrive almost a year after Google first demonstrated the "Project Astra" work that powers them.
A Reddit user said the feature appeared on their Xiaomi phone, as spotted by 9to5Google. Today, the same user posted the video below demonstrating Gemini's new screen-reading ability. It's one of two features Google said in early March would "start rolling out to Gemini Advanced subscribers as part of the Google One AI Premium plan" later in the month.
The other Astra capability rolling out now is live video, which lets Gemini interpret the feed from your smartphone's camera in real time and answer questions about it. In the demonstration video below, which Google published this month, a person uses the feature to ask Gemini for help deciding on a paint color for their freshly glazed pottery.