Various large social media platforms have been accused of supporting far-right extremist content for one big reason: it makes them money. They rarely ban white nationalists, racists, and harassing trolls from their platforms. And despite policies stating that hate speech isn’t allowed on, say, Twitter, what actually counts as hate speech remains murky. Maybe that’s because if they properly enforced their own rules, the president would have been de-platformed long ago.
YouTube is frequently criticized for hosting videos that promote conspiracy theories and white supremacist propaganda, but one of the most insidious things about the platform is how easily its recommendation algorithm steers viewers toward that content.
When a video ends, another one usually starts playing automatically unless you pause or click away. The next video is supposed to be related to the one you just watched, but “related” can mean a lot of different things.
Reporter Emily Gadek shared how YouTube directed her viewing in a series of viral tweets. She says she was watching a video posted by The Atlantic about rats. Within three videos, she was being shown racist videos about immigrants flooding European countries and the U.S.:
So, I was on The Atlantic’s YouTube page looking for a short doc to share with a student as an example of a certain technique. It happened to be about rats. Left the video running on mute when another meeting began. Let’s see where the algorithm took us in just 3 hops, shall we? pic.twitter.com/qBr7HbOZcT
— Emily Gadek (@emilygadabout) September 24, 2019
Gadek also says that once the autoplay reached Fox News clips, it played a string of them in a row:
By the way - the autoplay ran for another half hour or so. Once it hit a Fox clip, every single video played after it was a Fox segment - eight stories. How is a documentary I'm watching about animals on The Atlantic taking me there?
— Emily Gadek (@emilygadabout) September 24, 2019
“So, uh…@Youtube, you might want to come get that algorithm,” Gadek wrote. “The one that’s apparently equating immigrants to vermin? Unless we’re finally admitting it’s good for your business model to push extreme and racist content.”
Some commenters suggested the algorithm was simply responding to Gadek’s usual viewing habits, but she said this isn’t the kind of content she watches. Another user tested it in incognito mode and got the same results:
I don’t make a habit of watching the content it aimed me towards, on this computer or any other.
— Emily Gadek (@emilygadabout) September 24, 2019
I just ran this on my computer on a guest profile that deletes all files every time you sign out, on private browse mode, and got the exact same results. So, @YouTube fix this trash algorithm.
— The Critic (@The_Critic) September 24, 2019
And as some commenters pointed out, once these videos show up in your feed, they’re more likely to show up again. Autoplay adds them to your watch history whether you chose to see them or not, and YouTube then treats that history as a signal to recommend even more of the same:
Even worse, if you were logged in while those videos played, they are now in your watch history, further steering the algorithm to feed you that kind of content since you “engaged” with it previously!
— Bret Mogilefsky (@bmogilefsky) September 24, 2019
So why is this a big deal? Well, not everyone is a media-literate reporter like Gadek. Lots of people, especially young people, watch YouTube and can get led down a rabbit hole of disinformation and Fox News scare tactics. Check in on the people in your life who might get pulled in by the algorithm; by design, they almost can’t help it.