Subscriptions and external links help drive resentful users to alternative and extremist YouTube channels.
Science Advances 2023; 9 (35): eadd8080
Do online platforms facilitate the consumption of potentially harmful content? Using paired behavioral and survey data provided by participants recruited from a representative sample in 2020 (n = 1181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment. These viewers often subscribe to these channels (prompting recommendations to their videos) and follow external links to them. In contrast, nonsubscribers rarely see or follow recommendations to videos from these channels. Our findings suggest that YouTube's algorithms were not sending people down "rabbit holes" during our observation window in 2020, possibly due to changes that the company made to its recommender system in 2019. However, the platform continues to play a key role in facilitating exposure to content from alternative and extremist channels among dedicated audiences.
DOI: 10.1126/sciadv.add8080
PubMed ID: 37647396
Challenges in Understanding Human-Algorithm Entanglement During Online Information Consumption.
Perspectives on Psychological Science
Most content consumed online is curated by proprietary algorithms deployed by social media platforms and search engines. In this article, we explore the interplay between these algorithms and human agency. Specifically, we consider the extent of entanglement or coupling between humans and algorithms along a continuum from implicit to explicit demand. We emphasize that the interactions people have with algorithms not only shape users' experiences in the moment but, because of the mutually shaping nature of such systems, can also have longer-term effects through modifications of the underlying social-network structure. Understanding these mutually shaping systems is challenging because researchers presently lack access to relevant platform data. We argue that increased transparency, more data sharing, and greater protections for external researchers examining the algorithms are required to help researchers better understand the entanglement between humans and algorithms. This better understanding is essential to support the development of algorithms with greater benefits and fewer risks to the public.
DOI: 10.1177/17456916231180809
PubMed ID: 37427579
Users choose to engage with more partisan news than they are exposed to on Google Search.
Nature
If popular online platforms systematically expose their users to partisan and unreliable news, they could potentially contribute to societal issues such as rising political polarization [1,2]. This concern is central to the 'echo chamber' [3-5] and 'filter bubble' [6,7] debates, which critique the roles that user choice and algorithmic curation play in guiding users to different online information sources [8-10]. These roles can be measured as exposure, defined as the URLs shown to users by online platforms, and engagement, defined as the URLs selected by users. However, owing to the challenges of obtaining ecologically valid exposure data (what real users were shown during their typical platform use), research in this vein typically relies on engagement data [4,8,11-16] or estimates of hypothetical exposure [17-23]. Studies involving ecological exposure have therefore been rare, and largely limited to social media platforms [7,24], leaving open questions about web search engines. To address these gaps, we conducted a two-wave study pairing surveys with ecologically valid measures of both exposure and engagement on Google Search during the 2018 and 2020 US elections. In both waves, we found more identity-congruent and unreliable news sources in participants' engagement choices, both within Google Search and overall, than they were exposed to in their Google Search results. These results indicate that exposure to and engagement with partisan or unreliable news on Google Search are driven not primarily by algorithmic curation but by users' own choices.
DOI: 10.1038/s41586-023-06078-5
PubMed ID: 37225979
PubMed Central ID: 7936330