
I saw Google’s futuristic Project Astra, and it was jaw-dropping


If there’s one thing to come out of Google I/O 2024 that really caught my eye, it’s Google’s Project Astra. In short, Astra is a new AI assistant with speech, vision, text, and memory capabilities. You can talk to it as if it were another person in the room, ask it to describe things it sees, and even ask it to remember information about those things.

During the I/O keynote announcing Astra, one of the most impressive moments came when a presenter running Astra on a phone asked it to describe things in a room. When the presenter asked Astra where their glasses were, Astra quickly pointed out where in the room they were — even though it hadn’t been asked about them earlier in the video.

But does Project Astra actually work like that in the real world? I got to see it in action during a quick 10-minute demo at I/O, and you know what? I’m pretty impressed.

‘That’s a good stick figure!’


Google walked us through a handful of Astra demos: Alliteration, Pictionary, Storytelling, and Free Form. They all did what you’d expect, and they were all equally impressive. For some context, the Astra demo Google showed during its I/O keynote had the AI running on a phone. In the demo I saw (which I wasn’t allowed to take photos or videos of), Astra was running on a laptop and connected to a camera plus a touchscreen display.

In the Alliteration demo, members from the Project Astra team had it “look” at random objects (with the camera pointed at a demo table). It accurately identified what it was looking at — a stuffed animal, a toy apple, and a toy hotdog — and talked in alliterations the whole time it was describing what it saw. It was all a bit goofy, but Astra knew everything it was looking at, and it did put a smile on my face.

Another fun moment happened during the Pictionary demo. Someone from the Astra team used the connected touchscreen to draw a stick figure. As she was explaining that she was drawing the stick figure first, Astra — unprompted — exclaimed, “That’s a good stick figure!” with much enthusiasm.


It was a subtle moment, but it really drove home just how different Astra is from, say, Google Assistant. No one needed to ask, “Hey Astra, what do you think about this stick figure?” It saw the stick figure, heard the Googler talk about it, and provided feedback all on its own. It was kind of jaw-dropping. From there, the Astra team member put a skull emoji on the stick figure’s outstretched hand. When asked what play the drawing was supposed to represent, Astra immediately guessed Hamlet.

Storytelling and Free Form had their moments, too. For the Storytelling demo, Astra was shown a toy crab and asked to tell a story about it. Astra started telling a detailed story about the crab walking along a beach. A fidget spinner was then placed on the table, and Astra was asked to incorporate it into the story. It did so without skipping a beat.

As the name suggests, the Free Form demo put Astra in a position to do whatever was asked of it. It was shown three stuffed animals and told their names. Someone then asked Astra to recall the names of the various animals, and it got two out of three correct. Just like you and me, Astra remembers things it sees and hears. Google is still figuring out how much Astra should remember and how long it should retain that information, and those are critical details to be ironed out. But the fact that this happens at all is nothing short of magical.

Hearing is believing


Perhaps what stuck out to me the most during my demo was just how natural Astra felt. The Astra team members never needed to say “Hey Astra” or “OK Astra” to get its attention for voice commands. Once Astra was up and running, it continuously listened for questions, commands, and comments, and responded to them as if it were another person in the room.

The quality of its responses was just as impressive. Listening to Astra, I never once felt like I was hearing a virtual assistant speak to me. The voice inflections and natural speaking patterns Astra delivered were really something. If I closed my eyes, I might have been able to trick myself into thinking I was listening to someone else in the room with me — not a computer.

If we’re ever going to get to a point where AI feels like a friendly, helpful, and personable assistant, it needs to feel like you’re talking to a friend. Astra feels like it’s really close to that, and that’s infinitely more exciting than Gems, tokens, or any of the other AI jargon Google spent two hours talking about during its keynote.

Is Astra really the AI of the future?


As the name “Project Astra” suggests, Astra is very much still a work in progress and not something Google is ready to ship anytime soon. Will Astra eventually replace the Google Assistant on my Android phone? Will I even need a phone if I can just have a pair of smart glasses with Astra integrated into them? Perhaps more importantly, are we anywhere close to Astra being ready for normal, everyday use?

Those are all very big questions Google still needs to address, and I imagine it will be a while before we have answers to any of them. But after experiencing Astra for myself and reflecting on my time with it, I can’t help but feel excited about its potential.


It’s very easy to feel bad about AI, and rightfully so. When Google spent parts of the I/O keynote bragging about AI image generation, using AI to create movies, or having AI summarize Google Search results — which could very well kill the modern internet as we know it — I couldn’t help but dread the AI-riddled future we’re rapidly barreling toward. But a smart, friendly, memorable, and easy-to-talk-to AI assistant that actually feels like something out of a sci-fi movie? That’s something to talk about.

I don’t know if Astra will ever be as cool or encompassing as I’m dreaming it up to be. But it really feels like there could be a future where that happens, and I hope that’s the AI future Google puts its efforts toward.
