26 September 2023

Mission impossible? YouTube launches a fight against fake information

Adi Robertson* says YouTube is fighting off conspiracy theories and accusations of misinformation by adding ‘authoritative’ context and external links to its search results.

YouTube is adding “authoritative” context to search results about conspiracy-prone topics like the Moon landing and the Oklahoma City bombing, as well as putting $25 million toward news outlets producing videos.

Last week, the company announced a new step in its Google News Initiative, a program it launched in March.

The update is focused on reducing misinformation on YouTube, including the conspiracy theories that have flourished after events like the Parkland school shooting.

This update includes new features for breaking news updates and longstanding conspiracy theories.

YouTube is implementing a change it announced in March, annotating conspiracy-related pages with text from “trusted sources like Wikipedia and Encyclopedia Britannica.”

And in the hours after a major news event, YouTube will supplement search results with links to news articles, reasoning that rigorous outlets often publish text before producing video.

“It’s very easy to quickly produce and upload low-quality videos spreading misinformation around a developing news event,” said YouTube chief product officer Neal Mohan, but harder to make an authoritative video about a developing story.

YouTube is also funding a number of partnerships.

It’s establishing a working group that will provide input on how it handles news, and it’s providing money for “sustainable” video operations in 20 markets around the world, in addition to expanding an internal support team for publishers.

It has previously invested in digital literacy programs for teenagers, recruiting prominent YouTube creators to promote the cause.

Will this be effective?

It’s hard to say.

YouTube is proposing links to text articles as a cure for misinformation, but Google Search’s featured results — including its Top Stories module — have included links to dubious sites like 4chan and outright false answers to basic questions.

Unlike deliberate “fake news” purveyors, Google obviously isn’t surfacing these results intentionally, but such lapses make it harder to believe it will provide truly authoritative answers.

The Wikimedia Foundation was also initially ambivalent about having Wikipedia articles added to YouTube results, worrying that it would increase the burden on Wikipedia’s community of volunteers.

A Wikipedia or “mainstream media” link seems unlikely to convince anyone who’s already invested in a conspiracy theory, especially if that theory holds that YouTube or the media are politically biased against them.

On the other hand, the new changes could stop some people from going down a conspiracy rabbit hole in the first place.

As Wired reports, YouTube is trying to short-circuit the process by which its algorithms recommend ever more fringe videos based on a user’s viewing history, albeit only for breaking news stories, where it limits recommendations to sources it has deemed trustworthy.

(This breaking news feature is currently available in 17 countries, and it’s now being expanded to more.)

Like a lot of digital platforms, YouTube is fighting extremely complicated problems by supporting good actors and developing new automated systems — and it’s still not clear how powerful those strategies are.

* Adi Robertson is a senior reporter for The Verge. She tweets at @thedextriarchy.

This article first appeared at www.theverge.com.
