
Ruth Elisabeth Appel
Ph.D. Student in Communication, admitted Autumn 2019
Master's Student in Computer Science, admitted Autumn 2023
Honors & Awards
- SAP Stanford Graduate Fellowship in Science and Engineering, Stanford University (09/2019 - present)
- Research Fellowship, Siegel Family Endowment (09/2022 - 07/2023)
- PhD Research Fellowship, Stanford Center on Philanthropy and Civil Society (09/2022 - 06/2023)
- Summer Collaborative Research Fellowship, Stanford Impact Labs (06/2022 - 09/2022)
- American Democracy Fellowship, Stanford Center for American Democracy (02/2020 - 01/2021)
- Student Fellowship, German Academic Scholarship Foundation (01/2014 - 06/2019)
Professional Affiliations and Activities
- Member, Society for Personality and Social Psychology (SPSP) (2019 - present)
Current Research and Scholarly Interests
Ruth Appel combines insights and methods from psychology, political science, and computer science to develop and evaluate evidence-based personalized interventions that promote the social good. She is particularly passionate about preventing the spread of misinformation, encouraging political participation, promoting well-being and mental health, and addressing ethical challenges related to new technologies. Her current research projects include the 2020 Facebook Election Research Project and an online game to combat vaccine misinformation. She has also written about the ethics and privacy implications of new technologies.
Work Experience
- User Experience Research Intern, Google LLC (06/2020 - 09/2020), San Francisco, CA, USA
- Associate in Research, Health Division, Center for Advanced Hindsight, Duke University (02/2019 - 06/2019), Durham, NC, USA
- Intern, Strategy Department, Telekom Deutschland GmbH (04/2017 - 06/2017), Bonn, Germany
- Intern, Fifth Committee Section, Delegation of the European Union to the United Nations (09/2016 - 12/2016), New York, NY, USA
All Publications
- Partisan conflict over content moderation is more than disagreement about facts.
Science Advances
2023; 9 (44): eadg6799
Abstract
Social media companies have come under increasing pressure to remove misinformation from their platforms, but partisan disagreements over what should be removed have stymied efforts to deal with misinformation in the United States. Current explanations for these disagreements center on the "fact gap": differences in perceptions about what is misinformation. We argue that partisan differences could also be due to "party promotion" (a desire to leave misinformation online that promotes one's own party) or a "preference gap" (differences in internalized preferences about whether misinformation should be removed). Through an experiment where respondents are shown false headlines aligned with their own or the opposing party, we find some evidence of party promotion among Democrats and strong evidence of a preference gap between Democrats and Republicans. Even when Republicans agree that content is false, they are half as likely as Democrats to say that the content should be removed and more than twice as likely to consider removal as censorship.
View details for DOI 10.1126/sciadv.adg6799
View details for PubMedID 37922349
View details for PubMedCentralID PMC10624338
- Privacy and ethics in the age of Big Data
The psychology of technology: Social science research in the age of Big Data
American Psychological Association. 2022: 379-420
View details for DOI 10.1037/0000290-012
- Psychological targeting in the age of Big Data
Measuring and Modeling Persons and Situations
Elsevier. 2021: 193-222
View details for DOI 10.1016/b978-0-12-819200-9.00015-6
- Privacy in the age of psychological targeting
Current Opinion in Psychology
2020; 31: 116-121
View details for DOI 10.1016/j.copsyc.2019.08.010