YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users

YouTube tends to recommend videos that are similar to what people have already watched. New research has found that those recommendations can lead users down a rabbit hole of extremist political content.

A multidisciplinary research team in communication and computer science at the University of California, Davis, performed a systematic audit of YouTube’s video recommendations in 2021 and 2022. They tested how a person’s ideological leaning might affect which videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations were more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content. Recommendations of such content for left-leaning users were markedly fewer, the researchers said.

YouTube has to be the fastest path to accidental algorithmic fascism I am aware of – it’s nuts. I’m an extreme socialist (like, radical left), and I constantly have to tell YouTube I’m not interested in its Jordan Peterson recommendations.

Your fascism is leaking, Google.