Many people enjoy YouTube recommendations, but they can also lead viewers toward extremism and other serious harms. YouTube has a history of recommending harmful content, from pandemic conspiracies to political disinformation, even to users who have previously viewed only harmless videos.
Despite these serious consequences, YouTube’s recommendation algorithm remains entirely opaque to its users. With a new browser extension from Mozilla called The Regrets Reporter, users can take action when they are recommended harmful videos. These reports will help Mozilla better understand the problems with YouTube’s recommendations and may help illuminate the path toward a more trustworthy recommendation system.
Why is Mozilla doing this?
YouTube’s recommendation AI is one of the most powerful curators on the internet. YouTube is the second-most visited website in the world, and its AI-enabled recommendation engine drives 70% of total viewing time on the site. It’s no exaggeration to say that YouTube significantly shapes the public’s awareness and understanding of key issues across the globe.
For years, people have raised the alarm about YouTube recommending conspiracy theories, misinformation, and other harmful content. YouTube’s most consistent response has been to say that it is making progress and has reduced harmful recommendations by 70%. But there is no way to verify those claims, or to understand where YouTube still has work to do.
Download the extension at mzl.la/regrets-reporter