Online radicalization is a social phenomenon that has become a popular point of discussion due to recent affairs in the modern political landscape. In light of events such as the Charlottesville protest, the Christchurch shooting, and an overall rise in hate crimes, many have pointed to major social media platforms as contributing factors. By hosting and promoting extremist fringe content through their recommender systems, platforms like YouTube have become fertile ground for radical ideologues. Caleb Cain is an example of an individual who, as he describes, had “fallen down the alt-right rabbit hole.” In a piece published by The New York Times, he speaks of his slow indoctrination into far-right ideology through YouTube’s own recommender system as it suggested ever more dubious content for him to view.

In a 2018 report from Data & Society, Rebecca Lewis outlines the potential pathways to extremist ideology one might take on YouTube by producing a graph of various political pundits and their relationships through appearances in the same videos. Nodes represent the pundits’ YouTube channels, while an edge connects two channels whose owners contributed to the same video. Because the algorithm tends to suggest videos from creators and collaborators the user has already viewed, she lays out how a viewer could start with videos by mainstream political pundits like Ben Shapiro and eventually be led to content from devout white nationalists such as Richard Spencer.
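The co-appearance graph described above can be sketched in a few lines of Python. This is a minimal illustration, not the report's actual pipeline; the video and channel names are invented placeholders:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical toy data (video IDs and channel names are invented):
# each video maps to the channels that appeared in it.
videos = {
    "vid1": ["ChannelA", "ChannelB"],
    "vid2": ["ChannelB", "ChannelC"],
    "vid3": ["ChannelA", "ChannelC", "ChannelD"],
}

# Nodes are channels; an undirected edge links two channels that
# contributed to the same video, weighted by how many videos they share.
edge_weights = defaultdict(int)
for channels in videos.values():
    for a, b in combinations(sorted(channels), 2):
        edge_weights[(a, b)] += 1
```

Sorting the channel pair before counting keeps the edge undirected, so `(A, B)` and `(B, A)` accumulate into one weight.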


The findings in this report were further built upon in a 2019 study titled “Auditing Radicalization Pathways on YouTube”. The researchers found a strong overlap between the users commenting on videos ranging from conservative talk shows to those of alt-right figureheads. To test YouTube’s recommender system, they performed random walks: starting from a channel belonging to one of their devised community groups, they repeatedly followed a randomly chosen suggested-channel link for five steps and recorded the destination. They found that roughly one out of every twenty-five such walks landed on a channel belonging to their “Alt-right” cluster. While that percentage may seem small, it can make a meaningful difference given the scale at which YouTube operates.
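The random-walk procedure can be sketched as a toy simulation. The suggestion graph and all channel names below are invented placeholders standing in for YouTube's real recommendations, and the estimated rate is an artifact of this made-up graph, not the study's result:

```python
import random

# Hypothetical suggestion graph: each channel maps to channels that
# might be suggested alongside it (all names are invented).
suggestions = {
    "TalkShowA": ["TalkShowB", "PunditC"],
    "TalkShowB": ["PunditC", "TalkShowA"],
    "PunditC":   ["AltRightD", "TalkShowA"],
    "AltRightD": ["AltRightE"],
    "AltRightE": ["AltRightD"],
}
alt_right_cluster = {"AltRightD", "AltRightE"}

def random_walk(start, steps=5):
    """Follow `steps` randomly chosen suggested-channel links; return the endpoint."""
    node = start
    for _ in range(steps):
        node = random.choice(suggestions[node])
    return node

# Estimate how often a walk starting from a mainstream channel
# ends inside the "Alt-right" cluster.
random.seed(0)  # fixed seed for reproducibility
trials = 1000
hits = sum(random_walk("TalkShowA") in alt_right_cluster for _ in range(trials))
hit_rate = hits / trials
```

One notable property this toy graph shares with the study's setting: once a walk enters the extreme cluster it tends to stay there, since those channels mostly suggest each other.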
Having encountered recommender systems during our discussions of signed networks in lecture, I thought this would be a good opportunity to look into how these systems are applied in the real world. They are no doubt effective at engaging users, given that virtually every social media platform employs them in some shape or form. Whether for finding products you might enjoy, movies, restaurants, friends, or more, these recommender systems do a great job of analyzing a network and giving users more of what already suits their tastes. They are clearly optimized for engagement, given the monetary incentives involved: the more time you spend clicking the next video suggestion on YouTube, the more opportunities the platform has to make money from advertisers. Optimizing solely for engagement, however, leaves these systems prone to unintended consequences.
This brings forward the idea that, with recommender systems so ubiquitous on the web, perhaps the internet isn’t the open sea people pictured it to be; in certain instances it seems more prone to forming bubbles instead. The same qualities that make these systems so convenient can be damaging in other respects. It will be interesting to see how companies like Google choose to augment their systems, if at all, to address problems such as user radicalization in the future.
Relevant Links:
The Making of a YouTube Radical –
https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html
Alternative Influence: Broadcasting the Reactionary Right on YouTube –
https://datasociety.net/output/alternative-influence/
Auditing Radicalization Pathways on YouTube –
https://arxiv.org/abs/1908.08313