YouTube says its system is working as expected. “Mozilla’s report doesn’t take into account how our systems actually work, so it’s hard for us to gather a lot of insights,” said YouTube spokeswoman Elena Hernandez, adding that viewers can control their recommendations. This includes “the ability to prevent videos or channels from being recommended to them in the future”.
Mozilla and YouTube appear to interpret the purpose of a “do not recommend” input differently: Mozilla measured success by similarity in subject matter, person, or content, while YouTube said that asking its algorithm not to recommend a video or channel will only prevent that particular video or channel from being recommended, and will not affect a user’s access to a given topic, point of view, or speaker. “Our controls don’t filter out entire topics or viewpoints, as that could have negative effects for viewers, like creating echo chambers,” Hernandez said.
That distinction is not entirely clear from YouTube’s public statements and published research on its recommender system, said Jesse McCrosky, a data scientist who worked with Mozilla on the research. “We have little understanding of the black box,” he said, suggesting that YouTube broadly considers two types of feedback: implicit feedback, such as engagement — how long users watch and how many videos they view — and explicit feedback, including dislikes. “They have some balance in the degree to which they respect those two kinds of feedback,” McCrosky said. “What we saw in this study is that the balance is tilted heavily toward engagement, while other types of feedback are only minimally respected.”
Robyn Caplan, a senior fellow at Data & Society, a New York-based nonprofit that has previously investigated YouTube, believes the distinction between what YouTube says its controls do and what Mozilla’s participants expected them to do is important. “Some of these findings don’t contradict what the platform says, but they show that users don’t have a good understanding of which features are meant to control their experience and which are meant to provide feedback to content creators,” she said. Caplan welcomed the study and its findings, saying that while the revelation may be more low-key than the slam dunk the researchers had hoped for, it still highlights an important issue: users’ confusion about their control over YouTube’s recommendations. “This study does illustrate a broader need to regularly survey users about site functionality,” Caplan said. “If these feedback mechanisms don’t work as expected, it could drive people away.”
Confusion about what user input is supposed to do was a key theme in the second part of the Mozilla study: a qualitative follow-up survey of roughly one in 10 people who installed the RegretsReporter extension and participated in the research. Those who spoke to Mozilla said they appreciated that their input was specific to individual videos and channels, but they had hoped it would inform YouTube’s recommendation algorithm more broadly.
“I think it’s an interesting subject because it shows people saying: ‘It’s not just me telling you that I blocked this channel. It’s me trying to exert more control over the other kinds of recommendations I get in the future,’” Ricks said. In its research, Mozilla suggested that YouTube give users more options to proactively shape their own experience by outlining their content preferences, and that the company do a better job of explaining how its recommendation system works.
For McCrosky, the key issue is the gap between what users think their input into YouTube’s algorithm achieves and what it actually does. “There’s a disconnect in how well they respect those signals,” he said.