
Nicole Martinez-Martin
Assistant Professor (Research) of Pediatrics (Biomedical Ethics)
Pediatrics - Center for Biomedical Ethics
Bio
Nicole Martinez-Martin received her JD from Harvard Law School and her doctorate in social sciences (comparative development/medical anthropology) from the University of Chicago. Her broader research interests concern the impact of new technologies on the treatment of vulnerable populations. Her graduate research included the study of cross-cultural approaches to mental health services in the Latine community and the use of neuroscience in criminal cases. Her recent work in bioethics and neuroethics has focused on the ethics of AI and digital health technology, such as digital phenotyping and computer vision, for medical and behavioral applications.
She has served as PI for research projects examining ethical issues regarding machine learning in health care, digital health technology, digital contact tracing, and digital phenotyping. She has examined policy and regulatory issues related to privacy and data governance, bias, and oversight of machine learning and digital health technology. Her K01 career development grant, funded through NIMH, focuses on the ethics of machine learning and digital mental health technology. Recent research has included examining bias, equity, and inclusion as they pertain to machine learning and digital health, as well as the social implications of privacy and data protections for marginalized groups.
Academic Appointments
- Assistant Professor (Research), Pediatrics - Center for Biomedical Ethics
- Member, Wu Tsai Neurosciences Institute
Boards, Advisory Committees, Professional Organizations
- Neuroethics Framework - Legal System Working Group, Co-Chair, IEEE (2020 - Present)
- Ethics Committee, International Society for Psychiatric Genetics (2019 - Present)
- Program Committee, Co-Chair, International Neuroethics Society (2020 - Present)
- Emerging Issues Task Force, Chair, International Neuroethics Society (2021 - Present)
- Diversity & Inclusion Task Force, International Neuroethics Society (2020 - Present)
Current Research and Scholarly Interests
NIH/National Institute of Mental Health
K01 MH118375-01A1
“Ethical, Legal and Social Implications in the Use of Digital Technology for Mental Health Applications”
Greenwall Foundation Making a Difference in Bioethics Grant
“Ethical, Legal and Social Implications of Digital Phenotyping”
2022-23 Courses
- Introduction to Science, Technology & Society
STS 1 (Win)
- Where Does it Hurt?: Medicine and Suffering in Global Context
COLLEGE 108 (Spr)
Independent Studies (1)
- Directed Readings in Public Policy
PUBLPOL 198 (Win)
Prior Year Courses
2021-22 Courses
- Introduction to Science, Technology & Society
CSRE 1T, STS 1 (Spr)
2020-21 Courses
- The Public Life of Science and Technology
CSRE 1T, STS 1 (Win)
2019-20 Courses
- The Public Life of Science and Technology
CSRE 1T, STS 1 (Win)
- Introduction to Science, Technology & Society
All Publications
- Dimensions of Research-Participant Interaction: Engagement is Not a Replacement for Consent.
The Journal of Law, Medicine & Ethics
2020; 48 (1): 183–84
DOI: 10.1177/1073110520917008
PubMedID: 32342787
- What Are Important Ethical Implications of Using Facial Recognition Technology in Health Care?
AMA Journal of Ethics
2019; 21 (2): E180–187
Abstract
Applications of facial recognition technology (FRT) in health care settings have been developed to identify and monitor patients as well as to diagnose genetic, medical, and behavioral conditions. The use of FRT in health care suggests the importance of informed consent, data input and analysis quality, effective communication about incidental findings, and potential influence on patient-clinician relationships. Privacy and data protection are thought to present challenges for the use of FRT for health applications.
DOI: 10.1001/amajethics.2019.180
PubMedID: 30794128
- Data mining for health: staking out the ethical territory of digital phenotyping
NPJ Digital Medicine
2018; 1
DOI: 10.1038/s41746-018-0075-8
Web of Science ID: 000453910600001
- Is It Ethical to Use Prognostic Estimates from Machine Learning to Treat Psychosis?
AMA Journal of Ethics
2018; 20 (9): E804–811
Abstract
Machine learning is a method for predicting clinically relevant variables, such as opportunities for early intervention, potential treatment response, prognosis, and health outcomes. This commentary examines the following ethical questions about machine learning in a case of a patient with new onset psychosis: (1) When is clinical innovation ethically acceptable? (2) How should clinicians communicate with patients about the ethical issues raised by a machine learning predictive model?
PubMedID: 30242810
- Surveillance and Digital Health.
The American Journal of Bioethics
2018; 18 (9): 67–68
PubMedID: 30235099
- Ethical Issues for Direct-to-Consumer Digital Psychotherapy Apps: Addressing Accountability, Data Protection, and Consent
JMIR Mental Health
2018; 5 (2): e32
Abstract
This paper focuses on the ethical challenges presented by direct-to-consumer (DTC) digital psychotherapy services that do not involve oversight by a professional mental health provider. DTC digital psychotherapy services can potentially assist in improving access to mental health care for the many people who would otherwise not have the resources or ability to connect with a therapist. However, the lack of adequate regulation in this area exacerbates concerns over how safety, privacy, accountability, and other ethical obligations to protect an individual in therapy are addressed within these services. In the traditional therapeutic relationship, there are ethical obligations that serve to protect the interests of the client and provide warnings. In contrast, in a DTC therapy app, there are no clear lines of accountability or associated ethical obligations to protect the user seeking mental health services. The types of DTC services that present ethical challenges include apps that use a digital platform to connect users to minimally trained nonprofessional counselors, as well as services that provide counseling steered by artificial intelligence and conversational agents. There is a need for adequate oversight of DTC nonprofessional psychotherapy services and additional empirical research to inform policy that will provide protection to the consumer.
DOI: 10.2196/mental.9423
Web of Science ID: 000430917500002
PubMedID: 29685865
PubMedCentralID: PMC5938696