
Nicole Martinez-Martin
Assistant Professor (Research) of Pediatrics (Biomedical Ethics)
Pediatrics - Center for Biomedical Ethics
Bio
Nicole Martinez-Martin received her JD from Harvard Law School and her doctorate in social sciences (psychological anthropology) from the University of Chicago. Her broader research interests concern the impact of new technologies on the treatment of vulnerable populations. Her graduate research included the study of cross-cultural approaches to mental health services in the Latinx community and the use of neuroscience in criminal cases. Her work in bioethics and neuroethics has focused on the use of AI and digital health approaches for mental health applications.
Current Research and Scholarly Interests
NIH/National Institute of Mental Health
K01 MH118375-01A1
“Ethical, Legal and Social Implications in the Use of Digital Technology for Mental Health Applications”
Greenwall Foundation Making a Difference in Bioethics Grant
“Ethical, Legal and Social Implications of Digital Phenotyping”
2020-21 Courses
- The Public Life of Science and Technology
CSRE 1T, STS 1 (Win)
Prior Year Courses
2019-20 Courses
- The Public Life of Science and Technology
CSRE 1T, STS 1 (Win)
All Publications
- Dimensions of Research-Participant Interaction: Engagement is Not a Replacement for Consent.
The Journal of Law, Medicine & Ethics
2020; 48 (1): 183–84
DOI: 10.1177/1073110520917008
PubMedID: 32342787
- What Are Important Ethical Implications of Using Facial Recognition Technology in Health Care?
AMA Journal of Ethics
2019; 21 (2): E180–187
Abstract
Applications of facial recognition technology (FRT) in health care settings have been developed to identify and monitor patients as well as to diagnose genetic, medical, and behavioral conditions. The use of FRT in health care suggests the importance of informed consent, data input and analysis quality, effective communication about incidental findings, and potential influence on patient-clinician relationships. Privacy and data protection are thought to present challenges for the use of FRT for health applications.
DOI: 10.1001/amajethics.2019.180
PubMedID: 30794128
- Data mining for health: staking out the ethical territory of digital phenotyping
npj Digital Medicine
2018; 1
DOI: 10.1038/s41746-018-0075-8
Web of Science ID: 000453910600001
- Is It Ethical to Use Prognostic Estimates from Machine Learning to Treat Psychosis?
AMA Journal of Ethics
2018; 20 (9): E804–811
Abstract
Machine learning is a method for predicting clinically relevant variables, such as opportunities for early intervention, potential treatment response, prognosis, and health outcomes. This commentary examines the following ethical questions about machine learning in a case of a patient with new onset psychosis: (1) When is clinical innovation ethically acceptable? (2) How should clinicians communicate with patients about the ethical issues raised by a machine learning predictive model?
PubMedID: 30242810
- Surveillance and Digital Health.
The American Journal of Bioethics
2018; 18 (9): 67–68
PubMedID: 30235099
- Ethical Issues for Direct-to-Consumer Digital Psychotherapy Apps: Addressing Accountability, Data Protection, and Consent
JMIR Mental Health
2018; 5 (2): e32
Abstract
This paper focuses on the ethical challenges presented by direct-to-consumer (DTC) digital psychotherapy services that do not involve oversight by a professional mental health provider. DTC digital psychotherapy services can potentially assist in improving access to mental health care for the many people who would otherwise not have the resources or ability to connect with a therapist. However, the lack of adequate regulation in this area exacerbates concerns over how safety, privacy, accountability, and other ethical obligations to protect an individual in therapy are addressed within these services. In the traditional therapeutic relationship, there are ethical obligations that serve to protect the interests of the client and provide warnings. In contrast, in a DTC therapy app, there are no clear lines of accountability or associated ethical obligations to protect the user seeking mental health services. The types of DTC services that present ethical challenges include apps that use a digital platform to connect users to minimally trained nonprofessional counselors, as well as services that provide counseling steered by artificial intelligence and conversational agents. There is a need for adequate oversight of DTC nonprofessional psychotherapy services and additional empirical research to inform policy that will provide protection to the consumer.
DOI: 10.2196/mental.9423
Web of Science ID: 000430917500002
PubMedID: 29685865
PubMedCentralID: PMC5938696