May 19, 2024

Google I/O: Project Astra hands-on

As usual, Google I/O 2024 was an absolute whirlwind of news and announcements. This year, instead of focusing on hardware, Android, or Chrome, Google spent most of its developer conference convincing us that its AI features deserve priority. One such project is Project Astra, a multimodal AI assistant that you can almost talk to and that can simultaneously use a camera to recognize objects and people.

I say “almost” because it became clear after the demo that this part of Gemini is still in its infancy. I spent a few short minutes with Project Astra on a Pixel 8 Pro to see how it works in real time. I didn’t have time to test it to its fullest or try to trip it up, but I got a feel for what the future might look like for an Android user.

Ask it almost anything

The goal of Project Astra is to be an assistant that also guides you through the real world. It can answer questions about your environment by identifying objects, faces, moods, and textures. It can also help you remember where you last put something.

A preview of Project Astra’s Pictionary mode, another demo inside the demo room.
Photo: Florence Ion/Gizmodo

There were four different demos to choose from for Project Astra. They included Storyteller mode, which asks Gemini to compose a story based on different inputs, and Pictionary mode, which is basically a computer doodle-guessing game. There was also an alliteration mode, where the AI showed off its prowess at finding words with the same starting letter, and a Free-Form mode that lets you chat back and forth.

The demo I got was a version of Free-Form mode on the Pixel 8 Pro. Another journalist in my group requested it outright, so most of our demo time centered on using the device with this Assistant-like mode.

An image of a person holding a Pixel 8 Pro, with Project Astra identifying the phone in their hand

Gemini correctly identifies the phone in the person’s hand.
Photo: Florence Ion/Gizmodo

With the Pixel 8 Pro’s camera pointed at another journalist, Gemini was able to determine that the subject was a person; we explicitly told it that he identified as a man. It then correctly identified that he was holding his phone. When I asked a follow-up question about his clothes, it gave the general answer that he “appears to be dressed casually.” Next, we asked what he was doing, and Project Astra replied that it looked like he was wearing sunglasses (he was) and striking a casual pose.

I held the Pixel 8 Pro for a quick minute. I had Gemini identify an artificial flower arrangement; it correctly recognized that the flowers were tulips and noted that they were colorful. From there, I wasn’t sure what else to do, and then my time was up. I left with more questions than I came in with.

Photo of a person holding a Pixel 8 Pro

Gemini correctly identified that the artificial flowers were tulips.
Photo: Florence Ion/Gizmodo

As with much of Google’s AI, this feels like a leap of faith. I can see how identifying a person and their actions could become an accessibility tool, helping someone who is blind or has low vision navigate the world around them. But that’s not what this demonstration was about. It was meant to showcase the capabilities of Project Astra and how we will interact with it.

My biggest question is: Will something like Project Astra replace Google Assistant on Android devices? After all, this AI can remember where you put your stuff and pick up on nuances — at least, that’s what the demo conveyed. I couldn’t get an answer from the few people I asked at Google. But I have a strong hunch that the future of Android will rely less on tapping the phone and more on talking to it.
