YouTube is still suggesting videos containing violent content and misinformation, including COVID-19 falsehoods, according to a major new study published this month. Notably, this isn't just an issue with ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face. My YouTube ...
New research from Mozilla shows that user controls have little effect on which videos YouTube’s influential AI recommends. YouTube’s recommendation algorithm drives 70% of what people watch on the ...