After New Zealand Massacre, YouTube's Algorithm Still Promotes Islamophobic Videos

The suspected shooter posted xenophobic YouTube videos online days before the attack.
The suspect behind the New Zealand massacre shared Islamophobic videos on YouTube. (Associated Press)

In the days leading up to the New Zealand massacre that left dozens of Muslim worshippers dead, the suspected shooter, a 28-year-old Australian man, left a trail of xenophobic and Islamophobic postings online.

On Tuesday, he appeared to share a series of YouTube videos on Twitter, including a montage of bloodied corpses from previous terror attacks and a clip showing a crowd of people beating a group of Muslims. On Friday, he posted a manifesto detailing his white supremacist beliefs, announced his impending “attack against the invaders” online and then live-streamed himself killing 49 people.

The carnage seemed designed to go viral and induce fear, much like the fringe content that may have inspired it. Despite fierce condemnation of its algorithmic promotion of videos used to spread violent ideologies, including the ones the suspected shooter appeared to tweet, YouTube has consistently failed to address the issue.

Just hours after the attack, a search for the term “Muslims” from an incognito browser yielded a list of YouTube’s top-recommended videos, including one with 3.7 million views. The video argued, without evidence, that the majority of Muslims are radicalized. From there, YouTube’s autoplay function took over and recommended another round of videos. One purported to expose “the truth” about Muslims; another, to “destroy” Islam.

That’s exactly how the algorithm works: It’s designed to learn what piques your interest, and to keep you watching by recommending increasingly extreme videos on that topic ― facts be damned.
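
To make that dynamic concrete, here is a minimal, purely illustrative sketch of an engagement-driven recommender. It is not YouTube’s actual system, and every name in it (Video, predicted_watch_time, recommend) is hypothetical; it only demonstrates the pattern described above, in which candidates are ranked by how long a viewer is expected to keep watching, with a boost for topics they have already engaged with, and nothing in the scoring checks whether a video is accurate.

```python
# Illustrative sketch only: a toy engagement-driven recommender, NOT YouTube's
# actual system. All names here are hypothetical.

from dataclasses import dataclass


@dataclass
class Video:
    title: str
    topic: str
    predicted_watch_time: float  # model's guess at minutes this viewer will watch


def recommend(candidates: list[Video], watch_history: list[Video], k: int = 3) -> list[Video]:
    # Infer the viewer's apparent interests from what they have already watched.
    watched_topics = {v.topic for v in watch_history}

    # Score each candidate by expected watch time, boosting topics the viewer
    # has engaged with before. Accuracy or quality never enters the score.
    def score(v: Video) -> float:
        topic_boost = 2.0 if v.topic in watched_topics else 1.0
        return v.predicted_watch_time * topic_boost

    return sorted(candidates, key=score, reverse=True)[:k]


if __name__ == "__main__":
    history = [Video("Is immigration bad?", "immigration", 8.0)]
    candidates = [
        Video("Immigration statistics, explained", "immigration", 4.0),
        Video("The REAL truth about immigrants", "immigration", 9.5),
        Video("Gardening basics", "gardening", 6.0),
    ]
    for v in recommend(candidates, history):
        print(v.title)
    # The most attention-grabbing video on the familiar topic ranks first,
    # regardless of whether it is informative or misleading.
```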

“It’s an echo chamber,” said Joshua Fisher-Birch, a content review specialist at the Counter Extremism Project. “The more an idea is repeated, the more it may reinforce a viewer’s belief that it is true.”

Typing a question such as “Is immigration bad?” into YouTube may ultimately lead to a rabbit hole of anti-immigrant content, Fisher-Birch explained.

The platform’s incentive to promote such videos, he added, is a financial one. The longer people stay hooked on YouTube videos, the more advertising dollars the company rakes in.

YouTube, which is owned by Google, did not immediately respond to a request for comment.

YouTube is aware of the problem with its algorithm: In a rare move, it pledged in January to recommend fewer conspiracy theory videos, which, the company stressed, represent less than 1 percent of all content on the site.

But with hundreds of hours of footage uploaded to YouTube every minute, even that small share amounts to an enormous volume of problematic content, and conspiracy theories are only one kind of fringe video the platform promotes. Its recommendation algorithm generates more than 70 percent of user traffic.

The video giant’s dominance in the media world lends legitimacy to the videos it recommends. More than half of adult users say the site is an important source for understanding what’s happening in the world, according to a recent Pew Research Center study, and the number of users who turn to YouTube for news nearly doubled between 2013 and 2018.

Fisher-Birch is doubtful that YouTube will take action to slow the spread of fear-mongering extremist content any time soon.

“It all comes down to people spending time on the site,” he said. “YouTube’s in this to make money.”
