Welcome to my page! I am an Interdisciplinary Ethics Fellow based jointly at the Center for Ethics in Society and the Institute for Human-Centered Artificial Intelligence. My research concerns the ethics of robotics and data-driven technologies. I completed my PhD at the University of Bristol in May 2020, on the ethics of automated vehicle decision-making, supervised by Richard Pettigrew and Brad Hooker. I previously worked as a Research Assistant at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge, examining how tools from the philosophy of science might inform disputes about explainable machine learning in medicine.

Professional Education

  • Doctor of Philosophy, University of Bristol (2020)
  • Master of Arts, University of Bristol (2017)
  • Bachelor of Science, London School of Economics (2016)

Research Interests

  • Data Sciences
  • Legal Issues
  • Philosophy
  • Technology and Education

All Publications

  • Why Trolley Problems Matter for the Ethics of Automated Vehicles. Science and Engineering Ethics. Keeling, G. 2020; 26 (1): 293–307


    This paper argues against the view that trolley cases are of little or no relevance to the ethics of automated vehicles. Four arguments for this view are outlined and rejected: the Not Going to Happen Argument, the Moral Difference Argument, the Impossible Deliberation Argument and the Wrong Question Argument. In making clear where these arguments go wrong, a positive account is developed of how trolley cases can inform the ethics of automated vehicles.

    DOI: 10.1007/s11948-019-00096-1 | Web of Science: 000511675100015 | PubMed: 30830593 | PubMed Central: PMC6978292

  • Four Perspectives on What Matters for the Ethics of Automated Vehicles. In Road Vehicle Automation 6. Keeling, G., Evans, K., Thornton, S. M., Mecacci, G., Santoni de Sio, F. Springer, Cham. 2019: 49–60
  • Autonomy, nudging and post-truth politics. Journal of Medical Ethics. Keeling, G. 2018; 44 (10): 721–22


    In his excellent essay, 'Nudges in a post-truth world', Neil Levy argues that 'nudges to reason', or nudges which aim to make us more receptive to evidence, are morally permissible. A strong argument against the moral permissibility of nudging is that nudges fail to respect the autonomy of the individuals affected by them. Levy argues that nudges to reason do respect individual autonomy, such that the standard autonomy objection fails against nudges to reason. In this paper, I argue that Levy fails to show that nudges to reason respect individual autonomy.

    DOI: 10.1136/medethics-2017-104616 | Web of Science: 000446526400014 | PubMed: 29146713

  • Legal Necessity, Pareto Efficiency & Justified Killing in Autonomous Vehicle Collisions. Ethical Theory and Moral Practice. Keeling, G. 2018; 21 (2): 413–27
  • The sensitivity argument against child euthanasia. Journal of Medical Ethics. Keeling, G. 2018; 44 (2): 143–44


    Is there a moral difference between euthanasia for terminally ill adults and euthanasia for terminally ill children? Luc Bovens considers five arguments to this effect, and argues that each is unsuccessful. In this paper, I argue that Bovens' dismissal of the sensitivity argument is unconvincing.

    DOI: 10.1136/medethics-2017-104221 | Web of Science: 000423506500017 | PubMed: 28381583

  • Against Leben's Rawlsian Collision Algorithm for Autonomous Vehicles. Keeling, G., Müller, V. C. Springer-Verlag Berlin. 2018: 259–72
  • Commentary: Using Virtual Reality to Assess Ethical Decisions in Road Traffic Scenarios: Applicability of Value-of-Life-Based Models and Influences of Time Pressure. Frontiers in Behavioral Neuroscience. Keeling, G. 2017; 11: 247

    DOI: 10.3389/fnbeh.2017.00247 | Web of Science: 000417715000001 | PubMed: 29311864 | PubMed Central: PMC5733039