Building radical identities: how YouTube continually connects its users to extremist content

Article - Matei Norbert Balan

There’s been much debate over the past few years about what role social media platforms play in the process of right-wing self-radicalization. It’s a difficult context: the rise of populist political parties and violent white nationalist movements puts more and more pressure on researchers to understand what is happening and, if possible, even find some sort of solution. Here is how YouTube fits into this confusing picture.

Photo: Leon Bublitz / Unsplash

In the winter of 2019, two researchers, Mark Ledwich and Anna Zaitsev, self-published a study in which they claimed that YouTube’s recommendation algorithm appears to be engineered to benefit, above all else, mainstream channels and content, like that provided by cable news outlets, over independently produced content by YouTube creators. Their study also claimed that YouTube’s algorithm favors left-leaning and centrist or politically neutral channels over right-leaning and conspiracy-centered ones, contrary to what had previously been thought.

The study, however, has quite a few hiccups, as other researchers have pointed out. The most notable one might be that the authors never experienced how the algorithm actually works for a real, logged-in user. In their study, Ledwich and Zaitsev point out that their main limitation was the anonymity of the data set: the recommendations the algorithm provided “were not based on videos watched over extensive periods. We expect and have anecdotally observed that the recommendation algorithm gets more fine-tuned and context-specific after each video that is watched. However, we currently do not have a way of collecting such information from individual user accounts, but our study shows that the anonymous user is generally directed towards more mainstream content than extreme.”

What this means is that the researchers failed to address the individual experience of the algorithm and the extremely personal nature of online radicalization.

In a Medium post titled Algorithmic Radicalization - The Making of a New York Times Myth, Ledwich writes that “contrary to the narrative promoted by the New York Times, the data suggests that YouTube’s recommendation algorithm actively discourages viewers from visiting content that one could categorize as radicalizing or otherwise questionable.” Ledwich is referring to a series of articles that the NYT has been publishing throughout the past five years, focusing on right-wing extremism and radicalization. At first glance, Ledwich certainly seems eager to prove the NYT wrong.

But in the same Medium post he mentions the case of Caleb Cain, a college dropout whose process of self-radicalization and subsequent de-radicalization has been documented by the NYT, including as part of a recent podcast, Rabbit Hole, which aims to paint a broader picture of what happens when our lives move online. Ledwich accurately points out that “just as Caleb Cain was radicalized by far-right videos, he was also then de-radicalized by left-wing ones.” He then quotes Kevin Roose, the reporter who interviewed Caleb, who observed that “What is most surprising about Mr. Cain’s new life, on the surface, is how similar it feels to his old one. He still watches dozens of YouTube videos every day and hangs on the words of his favorite creators. It is still difficult, at times, to tell where the YouTube algorithm stops and his personality begins.” Instead of telling Ledwich something very specific about Caleb’s personality, and about how important it is to study YouTube as a real user, Caleb’s case seems to have served him as a good enough reason to dismiss the whole article as a NYT conspiracy theory.

However, in the Rabbit Hole podcast, Roose interviews Guillaume Chaslot, a former Google engineer who helped YouTube put together its current recommendation system. What Guillaume observed while working to improve the AI at the core of that system was that it created filter bubbles. In the podcast episode called Wonderland, Guillaume explains what these bubbles mean: if you watch a cat video, and then another one, YouTube pours a giant bucket of cat videos on your head. He then talks about the violent protests that erupted in Cairo in 2011 and how “You would see a video from the side of the protesters, and then it will recommend another video from the side of protesters. So you would only see the side of protesters. If you start with the side of the police, you would only see the side of the police.”

On a similar note, towards the end of the episode he recounts talking to a man sitting next to him on a train to Lyon who was watching a lot of YouTube. Guillaume tried to see what he was watching and was surprised to discover that it was videos about QAnon-style conspiracy theories, fed one after another by the AI he had helped create. While he worked at Google, Guillaume repeatedly raised these problems with his superiors and even discussed fixing them, but he claims that in a company focused on profit the answer was always the same: this is not our goal.
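
To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python. It is not YouTube’s actual code, which is not public; it only simulates a toy recommender that scores videos by how often their topic already appears in a user’s watch history, so each click narrows the next round of recommendations. The catalog, topics, and scoring rule are all assumptions made for the example.

```python
import random
from collections import Counter

# Toy catalog: each "video" is just a topic label. The topics and counts
# are invented for illustration, not real YouTube data.
CATALOG = ["cats"] * 50 + ["cooking"] * 50 + ["news"] * 50 + ["conspiracy"] * 50

def recommend(history, catalog, k=10):
    """Score every video by how often its topic appears in the watch
    history, then return the k highest-scoring ones (ties broken randomly)."""
    counts = Counter(history)
    scored = [(counts[topic] + random.random(), topic) for topic in catalog]
    scored.sort(reverse=True)
    return [topic for _, topic in scored[:k]]

# A user who starts with one cat video and always clicks the top recommendation.
history = ["cats"]
for _ in range(20):
    recs = recommend(history, CATALOG)
    history.append(recs[0])  # watch the top recommendation

print(Counter(history))  # after a few steps, almost everything is "cats"
```

Run it and the watch history collapses onto a single topic within a handful of iterations: the giant bucket of cat videos. Swap the starting topic for something political and the same dynamic produces the one-sided bubbles Guillaume describes.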

The fact that we cannot pinpoint who is on the right track here is not due to bias or a job poorly done. The truth is that YouTube’s algorithm is not something researchers can easily access. That is why almost everyone who studies it has to find some way to reverse-engineer the algorithm and work out what is going on in what we could metaphorically call pitch-black darkness. Of course, engineers like Guillaume might have a more precise idea of what happens inside a black-box algorithm. But nobody can really know the answer to these questions unless YouTube decides to cooperate.
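
What that reverse-engineering usually looks like in practice is a snowball crawl: start from a set of seed videos, repeatedly collect the recommendations shown to an anonymous visitor, and build a graph of which content points to which. The sketch below is a hedged outline of that general approach, not a description of any particular study’s code; fetch_recommendations is a placeholder that fabricates data so the example runs, since the real lists would have to come from scraping or from whatever access YouTube grants.

```python
import random
from collections import defaultdict, deque

def fetch_recommendations(video_id):
    """Placeholder for the hard part: a real study would scrape the
    'Up next' list an anonymous, logged-out visitor sees for video_id.
    Here we fabricate IDs so the crawl below is runnable."""
    random.seed(video_id)
    return [f"{video_id}-rec{i}" for i in range(random.randint(2, 4))]

def crawl_recommendation_graph(seeds, max_videos=50):
    """Breadth-first snowball crawl: follow recommendations outward from the
    seeds and record every observed 'A recommends B' edge."""
    graph = defaultdict(list)
    queue = deque(seeds)
    seen = set(seeds)
    while queue and len(seen) < max_videos:
        video = queue.popleft()
        for rec in fetch_recommendations(video):
            graph[video].append(rec)
            if rec not in seen:
                seen.add(rec)
                queue.append(rec)
    return graph

# Researchers then label the nodes (mainstream, fringe, conspiracy, ...) and
# ask which categories the recommendation edges tend to flow towards.
graph = crawl_recommendation_graph(["seed-video-A", "seed-video-B"])
print(sum(len(v) for v in graph.values()), "recommendation edges collected")
```

Precisely because such a crawl is anonymous, it captures what a fresh, logged-out visitor would see, which is the limitation Ledwich and Zaitsev acknowledge: it cannot reproduce the personalized spiral that someone like Caleb experienced.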

Until then, people like Caleb and the man on the train to Lyon are stuck in bubbles of content that is certainly not good for them, but that rewards content creators and YouTube itself with a fair amount of revenue.