This new assistant will be available on Google Pixel smartphones via the Gemini app later this year.

A demo at Google I/O showed Gemini analyzing its surroundings through a camera and inferring what is happening, much as ChatGPT does with its vision features.

Google demonstrated how the AI can identify objects, help users understand code, and even suggest a name for a group that includes a tiger and a dog in the frame.

Project Astra is multi-modal, allowing it to process and combine different types of data, such as video and text, to better understand context.

During the presentation, Google also showed the assistant recognizing a person's glasses in real time through a smartphone camera.

Although the exact release date of Project Astra is not yet known, the company says it aims to deliver it to users as soon as possible.

Source: Ferra


