On YouTube
The "Dislike," "Not interested," and "Remove from watch history" buttons on YouTube are of little use, a new study from the Mozilla Foundation reveals. Content similar to what users reject continues to appear in their feeds. The research drew on a sample of about five hundred million recommended videos and more than twenty thousand YouTube users. The study represents, as Mozilla writes in the final report, "the largest experimental audit on YouTube by independent researchers".
Using a control group, the analysts found that the "Dislike" and "Not interested" buttons block only about 12 and 11 percent of unwanted recommended videos, respectively. "Don't recommend channel" and "Remove from watch history" are somewhat more useful, stopping about 43 and 29 percent of unwanted recommendations.
Among the examples reported are users who asked to stop seeing videos about firearms or cryptocurrencies, yet continued to be recommended content of the same type by the platform.
The central problem lies in users' lack of effective control over their own homepage and the videos they are shown. Platforms are not transparent about their recommendation algorithms, and this affects not only the quality of the user experience but also carries political implications. The YouTube algorithm has, in various instances, been linked to phenomena such as radicalization, political extremism, and ideological polarization. In this light, the fact that violent content or political disinformation keeps being recommended despite users' choices represents, according to the researchers, a worrying finding.
A YouTube spokesperson commented to The Verge: "Our controls do not exclude entire topics or points of view, as this could have negative effects for viewers, such as creating echo chambers." She added that the Mozilla study does not take into account how YouTube's recommendation systems actually work, making it very difficult to draw meaningful conclusions.
Clicking ‘dislike’ on YouTube probably doesn’t do much to customize your feed
A new report from Mozilla, the makers of the privacy-focused Firefox browser, suggests that YouTube’s user controls are ineffective at controlling what people see on the platform—despite what Google claims. Using data from almost 23,000 volunteers, Mozilla was able to show that YouTube kept recommending similar videos even when people used the various options to indicate that they didn’t want to see that kind of content.
YouTube is the second-most popular website in the world (the first is Google) and according to Mozilla, an estimated 70 percent of the 1 billion hours viewed daily on the platform are as a result of algorithmic recommendations. Various reports have shown how the algorithm can polarize people and recommend misinformation and harmful content—something that Google claims it has worked hard to fix. In this study, Mozilla set out to test the effectiveness of the controls YouTube offers to users to manage the recommended videos they see.
In a previous report released in July last year, Mozilla found that people were routinely recommended videos they didn’t want to see and felt that the controls available to them were ineffective. This new study used a browser plug-in Mozilla developed called RegretsReporter to see if this was true.
Mozilla looked at four different Google-suggested controls: clicking the thumbs-down “Dislike” button, “Not interested,” “Don’t recommend channel,” and “Remove from watch history.” Meanwhile, users of the RegretsReporter plug-in saw a “Stop Recommending” button on YouTube videos. When they clicked it, the control option corresponding to their test group (such as the Dislike button) was sent to YouTube, while data about subsequently recommended videos were sent to Mozilla. (There was also a control group where clicking the button did nothing.)
[Related: Why YouTube is hiding dislikes on videos]
Over the course of the study, 22,722 participants used the RegretsReporter, allowing Mozilla to analyze 567,880,195 recommended videos. To assess this huge amount of data, the researchers reviewed 40,000 pairs of recommended videos and rated their similarity. This allowed the team to quantitatively study whether the videos participants were being recommended were similar to videos that they had previously rejected; in other words, whether YouTube’s tools effectively reduced the number of bad recommendations.
For example, if someone saw an anti-vax video recommended to them, and clicked “Not interested,” and then got recommended a cat video, that would be a good recommendation. On the other hand, if they kept getting suggested anti-vax videos after indicating that they weren’t interested in them, those would be bad recommendations. Page 22 of the report [PDF] has some good visual examples.
Mozilla’s report found that no user control was especially effective at preventing unwanted recommendations. The “Don’t recommend channel” option had the biggest impact, preventing 43 percent of bad recommendations, with “Remove from watch history” preventing 29 percent, and “Dislike” and “Not interested” preventing 12 percent and 11 percent, respectively. Mozilla argues that its “research suggests that YouTube is not really that interested in hearing what its users really want, preferring to rely on opaque methods that drive engagement regardless of the best interests of its users.”
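The percentages above can be read as relative reductions in the rate of bad recommendations compared with the control group. A minimal sketch of how such a figure might be computed, using hypothetical counts for illustration only (this is not Mozilla's actual data or code):

```python
def prevention_rate(bad_treatment, total_treatment, bad_control, total_control):
    """Relative reduction in the bad-recommendation rate versus a control group
    that sent no feedback to YouTube."""
    rate_treatment = bad_treatment / total_treatment
    rate_control = bad_control / total_control
    return 1 - rate_treatment / rate_control

# Hypothetical numbers: the control group sees 200 bad recommendations out of
# 1,000, while a group using "Don't recommend channel" sees 114 out of 1,000.
rate = prevention_rate(114, 1000, 200, 1000)
print(f"{rate:.0%}")  # → 43%
```

A relative measure like this accounts for how often bad recommendations occur even without any feedback, which is why a control group is needed at all.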
As a result of its findings, Mozilla is calling on people to sign a petition asking YouTube to fix its feedback tools and give users actual control over the videos they get recommended. It also has four specific recommendations for YouTube and policy makers based on its study.
Mozilla suggests that YouTube’s user controls should be easy to use and understand, and be designed to put “people in the driver’s seat.” It also wants YouTube to grant researchers better access to data (so they don’t have to use browser extensions to study these kinds of things). Finally, it calls on policy makers to pass laws providing legal protections for those engaged in public interest research.
Whether this report is enough to get Google to add some real user controls to YouTube remains to be seen. For now, it’s a fairly damning indictment of the ineffective controls that are currently in place.