A.I. and Google News: The push to broaden perspectives

Sundar Pichai stands in front of a Google logo at Google I/O 2021.

There is no editorial team at Google News. There is no building filled with hundreds of moderators monitoring the thousands of stories hitting the web every second, making sure the full story is presented. Instead, Google relies on artificial intelligence algorithms, along with partnerships with fact-checking organizations, to surface headlines from credible, authoritative sources.

“Humans are generating the content,” Trystan Upstill, Google News engineering and product lead, told Digital Trends. “We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence. In a way, the A.I. is controlling this fire hose of human stuff going on.”

A.I. is a big part of the redesigned Google News app, which was recently announced at the annual Google I/O developer conference in Mountain View, California. The algorithms filter or demote stories after detecting the spread of misinformation, and they also understand terms and fragments of text coming through the news cycle, aligning them with fact checks from partner organizations.
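Google has not published how this ranking works, but the description above — demote stories a misinformation classifier flags, and match text fragments against partner fact checks — can be sketched in a few lines. Everything here is hypothetical: the `Story` fields, the demotion factor, and the fragment-matching rule are illustrative assumptions, not Google's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    headline: str
    score: float                      # base ranking score from other signals
    flagged: bool = False             # set by a misinformation classifier (assumed)
    fact_checks: list = field(default_factory=list)

# Hypothetical fact checks from partner organizations, keyed by a
# text fragment the claim addresses.
FACT_CHECKS = {
    "miracle cure": "Claim rated false by a partner fact-checker.",
}

def rank(stories, demotion=0.5):
    """Demote flagged stories and attach matching fact checks."""
    for story in stories:
        if story.flagged:
            story.score *= demotion   # pushed down the feed, not removed
        for fragment, verdict in FACT_CHECKS.items():
            if fragment in story.headline.lower():
                story.fact_checks.append(verdict)
    return sorted(stories, key=lambda s: s.score, reverse=True)

ranked = rank([
    Story("Election results certified", 0.9),
    Story("Miracle cure found, doctors stunned", 0.95, flagged=True),
])
print([s.headline for s in ranked])
```

Note the design choice the article implies: flagged stories are demoted rather than deleted, consistent with Upstill's point that Google "isn't in the business of censorship."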

But one of the A.I.’s main tasks is to provide a full picture of major, nuanced stories through a feature called “Full Coverage.” It’s a small button attached to stories that leads you to related articles from a variety of publications, including ones you do not follow or may not like. The main section of Google News shows content tailored to you, but “Full Coverage” ignores your likes and dislikes: everyone sees the same information pulled together by the A.I.

That includes modules for fact checks, frequently asked questions, a timeline of events, international coverage, and even tweets from primary sources. Everyone reading “Full Coverage” sees the same information, which Upstill said is crucial.
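The defining property of “Full Coverage,” as described, is that personalization is deliberately switched off: a given story maps to one shared bundle of modules for every reader. A minimal sketch of that contract, with invented names and toy data (none of this is Google's actual API):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class CoverageBundle:
    """The modules the article lists, one bundle per story."""
    fact_checks: List[str]
    faqs: List[str]            # frequently asked questions
    timeline: List[str]        # key events, oldest first
    international: List[str]   # coverage from non-domestic outlets
    tweets: List[str]          # posts from primary sources

# Toy shared index: one bundle per story, identical for everyone.
COVERAGE_INDEX = {
    "story-42": CoverageBundle(
        fact_checks=["Central claim rated true"],
        faqs=["What happened?"],
        timeline=["Event announced", "Officials respond"],
        international=["Overseas outlet report"],
        tweets=["@primary_source statement"],
    ),
}

def full_coverage(story_id: str,
                  user_profile: Optional[dict] = None) -> CoverageBundle:
    """user_profile is accepted but deliberately unused: unlike the
    personalized feed, Full Coverage returns the same bundle to
    every reader of a given story."""
    return COVERAGE_INDEX[story_id]
```

Two readers with opposite interest profiles therefore receive identical results, which is the property Upstill calls crucial for productive conversation.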

“The core premise we have is that in order to have a productive conversation about something, everyone basically needs to be able to see the same thing,” he said.

While the breadth of data the algorithms pull is impressive, it’s entirely on the user to click on the small “Full Coverage” button to read more perspectives on the topic at hand. It’s why the button features Google’s red, green, blue, and yellow colors — it stands out from a page that’s mostly black and white.

“Fundamentally, we’re trying to build tools that are easy, that people can use to develop their understanding,” Upstill said. “A part of the challenge for people to break out of their bubbles and echo chambers is that it’s just hard; it’s hard work, and we set out to make that easy.”

Pulling together a variety of sources has been part of Google News since its inception. The desktop service launched shortly after the 9/11 attacks in 2001, when people were scrambling to find as much information as they could about the tragedy.

“It came to the table with this idea that in terms of understanding a story, you shouldn’t read a single article,” Upstill said. “You should read a set of articles around that story to really position what you’re reading. That is a key message that resonates with people even today, in this age of people having increasingly polarized views.”

Google has been criticized for helping people stay in their bubbles. Search results are personalized based on location and previous searches, and people end up seeing what they want to see rather than the full picture. Upstill said Google isn’t in the business of censorship, and “in Search, if you come in and say ‘give me the fake news publication’ or type ‘fakenews.com,’” it will show up. But with Google News, Upstill said you shouldn’t find disreputable sources.

The new Google News app is currently rolling out on both Android and iOS, and the desktop redesign will go live early next week. Both will share the same features, but the desktop version will have a different format.

Julian Chokkattu
Mobile and Wearables Editor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…