Study: What content makes YouTube's recommendation algorithms visible?

Current study shows: YouTube algorithm recommends hardly any disinformation, but also hardly any in-depth information.

How do recommendations on YouTube work in times of crisis? Are disinforming videos recommended for topics such as the "Covid-19 pandemic," "climate change" or "refugees," or do the recommendations enable in-depth, informed opinion-forming? These questions are examined in the current study "Recommendations in times of crisis - What content do YouTube's recommendation algorithms make visible?" by the Medienanstalt Berlin-Brandenburg (mabb), the Berlin Senate Chancellery, the Bavarian Regulatory Authority for Commercial Broadcasting (BLM), the Media Authority of North Rhine-Westphalia and the Media Authority of Rhineland-Palatinate. Using systematic video views, the study examined the extent to which YouTube's non-personalized recommendations help point users to sources and information that adhere to journalistic due diligence and are supported by scholarly consensus.

"The good news is that YouTube's recommendation algorithm is not a disinformation catalyst. Only six percent of the recommended content studied came from potentially disinforming channels," said mabb director Dr. Anja Zimmer. "At the same time, our study indicates that the algorithm rarely recommends niche offerings or content that delves deeper into topics. Established media providers are made visible more often than average. This aspect should be discussed as part of the commitment to securing and promoting diversity of opinion on media platforms, as there is still clear potential for more diversity in the platform's recommendations. The challenge is to better present the diversity of information offerings and perspectives to users. And to recommend even less disinformation in the process."

Key findings of the study

  • YouTube recommendation algorithm presents hardly any disinformative content. With the controversial topics of "Covid-19," "climate change" and "refugees" as the starting point of the study, only six percent of the videos that the algorithm subsequently recommended were classified as potentially disinforming.
  • In-depth topics are rarely recommended. At the same time, only eleven percent of the recorded recommendations were for videos on the topics selected as starting points: "Covid-19," "climate change" and "refugees."
  • YouTube algorithm relies on a limited range of popular channels. 69 percent of all recommendations captured pointed to videos from just 61 channels. These include many public service and established private media providers. Niche offerings and content from lesser-known providers are very rarely recommended.

The study results thus indicate that YouTube's recommendations serve less to provide in-depth content and more to broadly direct users to established media offerings and providers. Disinforming content is largely left out.

About the study
With 7.2 million daily users, YouTube has become an important source of information for the population in Germany (cf. Diversity Report of the Media Authorities 2020). A central feature of the video-sharing platform is its algorithmic recommendations. Against the backdrop of the coronavirus pandemic, socio-political debates on climate change and migration, and elections at federal and state level, it is currently particularly relevant how YouTube deals with disinforming content. To gain a better overview of how YouTube's recommendation algorithm works in times of crisis, the state media authorities of Bavaria, Berlin-Brandenburg, North Rhine-Westphalia and Rhineland-Palatinate, together with the Berlin Senate Chancellery, commissioned the study "Recommendations in Times of Crisis - What Content Do YouTube's Recommendation Algorithms Make Visible?". The study was conducted by Kantar, Public Division and the Rheinisch-Westfälische Technische Hochschule Aachen (RWTH).

The study defined a total of 90 starting videos and search terms on the topics of the "Covid-19 pandemic," "climate change" and "refugee movements." To determine the visibility of disinforming videos, one-third of the starting points were videos with disinforming content. The subsequent recommendations were recorded automatically in August 2020, ensuring that previous usage did not affect the results (laboratory conditions). Over 33,000 possible recommendations were collected in this way, over 8,000 of which were distinct German-language videos recommended multiple times.

Source: press release "In algorithm we trust - (dis)information offers by recommendation?"

For press inquiries:
Medienanstalt Rheinland-Pfalz, Hans-Uwe Daumann
e-mail: daumann@medienanstalt-rlp.de

More on the topics at klicksafe: