- Emergency Medicine
- Machine Learning
- Population Health
- Network Science
Assistant Professor - University Medical Line, Emergency Medicine
Member, Wu Tsai Neurosciences Institute
Boards, Advisory Committees, Professional Organizations
Member, Society of Academic Emergency Medicine (2016 - Present)
Member, American College of Emergency Physicians (2016 - Present)
Board Certification: American Board of Emergency Medicine, Emergency Medicine (2021)
Residency: Stanford University Emergency Medicine Residency (2020) CA
Medical Education: Harvard Medical School (2016) MA
MD, Harvard Medical School (2016)
PhD, Harvard Graduate School of Arts & Sciences (2015)
BSc, University of Toronto (2007)
A framework for integrating artificial intelligence for clinical care with continuous therapeutic monitoring.
Nature biomedical engineering
The complex relationships between continuously monitored health signals and therapeutic regimens can be modelled via machine learning. However, the clinical implementation of the models will require changes to clinical workflows. Here we outline ClinAIOps ('clinical artificial-intelligence operations'), a framework that integrates continuous therapeutic monitoring and the development of artificial intelligence (AI) for clinical care. ClinAIOps leverages three feedback loops to enable the patient to make treatment adjustments using AI outputs, the clinician to oversee patient progress with AI assistance, and the AI developer to receive continuous feedback from both the patient and the clinician. We lay out the central challenges and opportunities in the deployment of ClinAIOps by means of examples of its application in the management of blood pressure, diabetes and Parkinson's disease. By enabling more frequent and accurate measurements of a patient's health and more timely adjustments to their treatment, ClinAIOps may substantially improve patient outcomes.
View details for DOI 10.1038/s41551-023-01115-0
View details for PubMedID 37932379
Development and external validation of a pretrained deep learning model for the prediction of non-accidental trauma.
NPJ digital medicine
2023; 6 (1): 131
Non-accidental trauma (NAT) is deadly and difficult to predict. Transformer models pretrained on large datasets have recently produced state-of-the-art performance on diverse prediction tasks, but the optimal pretraining strategies for diagnostic predictions are not known. Here we report the development and external validation of Pretrained and Adapted BERT for Longitudinal Outcomes (PABLO), a transformer-based deep learning model with multitask clinical pretraining, to identify patients who will receive a diagnosis of NAT in the next year. We develop a clinical interface to visualize patient trajectories, model predictions, and individual risk factors. In two comprehensive statewide databases, approximately 1% of patients experience NAT within one year of prediction. PABLO predicts NAT events with area under the receiver operating characteristic curve (AUROC) of 0.844 (95% CI 0.838-0.851) in the California test set, and 0.849 (95% CI 0.846-0.851) on external validation in Florida, outperforming comparator models. Multitask pretraining significantly improves model performance. Attribution analysis shows substance use, psychiatric, and injury diagnoses, in the context of age and racial demographics, as influential predictors of NAT. As a clinical decision support system, PABLO can identify high-risk patients and patient-specific risk factors, which can be used to target secondary screening and preventive interventions at the point of care.
View details for DOI 10.1038/s41746-023-00875-y
View details for PubMedID 37468526
View details for PubMedCentralID PMC10356774
Maximizing Equity in Acute Coronary Syndrome Screening across Sociodemographic Characteristics of Patients.
Diagnostics (Basel, Switzerland)
2023; 13 (12)
We compared four methods to screen emergency department (ED) patients for an early electrocardiogram (ECG) to diagnose ST-elevation myocardial infarction (STEMI) in a 5-year retrospective cohort: observed practice, objective application of screening protocol criteria, a predictive model, and a model augmenting human practice. We measured screening performance by sensitivity, missed acute coronary syndrome (ACS) and STEMI, and the number of ECGs required. Our cohort of 279,132 ED visits included 1397 patients who had a diagnosis of ACS. We found that screening by observed practice augmented with the model delivered the highest sensitivity for detecting ACS (92.9%, 95% CI: 91.4-94.2%) and showed little variation across sex, race, ethnicity, language, and age, demonstrating equity. Although it missed a few cases of ACS (7.6%) and STEMI (4.4%), it required ECGs for an additional 11.1% of patients compared to current practice. Screening by protocol performed the worst, underdiagnosing young, Black, Native American, Alaskan or Hawaiian/Pacific Islander, and Hispanic patients. Thus, adding a predictive model to augment human practice improved the detection of ACS and STEMI and did so most equitably across the groups. Hence, combining human and model screening, rather than relying on either alone, may maximize ACS screening performance and equity.
View details for DOI 10.3390/diagnostics13122053
View details for PubMedID 37370948
View details for PubMedCentralID PMC10297640
Predicting patient decompensation from continuous physiologic monitoring in the emergency department.
NPJ digital medicine
2023; 6 (1): 60
Anticipation of clinical decompensation is essential for effective emergency and critical care. In this study, we develop a multimodal machine learning approach to predict the onset of new vital sign abnormalities (tachycardia, hypotension, hypoxia) in ED patients with normal initial vital signs. Our method combines standard triage data (vital signs, demographics, chief complaint) with features derived from a brief period of continuous physiologic monitoring, extracted via both conventional signal processing and transformer-based deep learning on ECG and PPG waveforms. We study 19,847 adult ED visits, divided into training (75%), validation (12.5%), and a chronologically sequential held-out test set (12.5%). The best-performing models use a combination of engineered and transformer-derived features, predicting in a 90-minute window new tachycardia with AUROC of 0.836 (95% CI, 0.800-0.870), new hypotension with AUROC 0.802 (95% CI, 0.747-0.856), and new hypoxia with AUROC 0.713 (95% CI, 0.680-0.745), in all cases significantly outperforming models using only standard triage data. Salient features include vital sign trends, PPG perfusion index, and ECG waveforms. This approach could improve the triage of apparently stable patients and be applied continuously for the prediction of near-term clinical deterioration.
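The abstract above reports model discrimination as AUROC with 95% confidence intervals on a held-out test set. As a rough illustration of that evaluation pattern (not the study's code; the data and parameters here are synthetic stand-ins), a percentile-bootstrap AUROC can be computed with the standard library alone:

```python
# Sketch: AUROC point estimate plus a percentile-bootstrap 95% CI,
# the style of evaluation used to compare predictive models on a test set.
# All data below are synthetic; nothing here reproduces the study itself.
import random

def auroc(y_true, y_score):
    """Probability that a random positive outscores a random negative
    (the Mann-Whitney U formulation of the AUROC)."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auroc_with_ci(y_true, y_score, n_boot=200, alpha=0.05, seed=0):
    """Percentile bootstrap: resample visits with replacement, recompute."""
    rng = random.Random(seed)
    point = auroc(y_true, y_score)
    n = len(y_true)
    stats = []
    while len(stats) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        yb = [y_true[i] for i in idx]
        if 0 < sum(yb) < n:  # need both classes in the resample
            stats.append(auroc(yb, [y_score[i] for i in idx]))
    stats.sort()
    lo = stats[int(len(stats) * alpha / 2)]
    hi = stats[int(len(stats) * (1 - alpha / 2)) - 1]
    return point, (lo, hi)

# Synthetic scores that separate outcomes imperfectly
rng = random.Random(1)
y = [rng.randint(0, 1) for _ in range(120)]
scores = [yi * 0.8 + rng.gauss(0, 0.7) for yi in y]
auc, (lo, hi) = auroc_with_ci(y, scores)
print(f"AUROC {auc:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

In practice a vectorized implementation (e.g. scikit-learn's `roc_auc_score`) would replace the quadratic pairwise count, but the bootstrap-over-visits structure is the same.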
View details for DOI 10.1038/s41746-023-00803-0
View details for PubMedID 37016152
View details for PubMedCentralID PMC6340461
Association of Physician Malpractice Claims Rates with Admissions for Low-Risk Chest Pain
American Journal of Medicine Open
View details for DOI 10.1016/j.ajmo.2023.100041
Factors Predicting Misidentification of Acute Ischemic Stroke and Large Vessel Occlusion by Paramedics.
Critical pathways in cardiology
2022; 21 (4): 172-175
The emergence of thrombectomy for large vessel occlusions has increased the importance of accurate prehospital identification and triage of acute ischemic stroke (AIS). Despite available clinical scores, prehospital identification is suboptimal. Our objective was to improve the sensitivity of prehospital AIS identification by combining dispatch information with paramedic impression. We performed a retrospective cohort review of emergency medical services and hospital records of all patients for whom a stroke alert was activated in one urban, academic emergency department from January 1, 2018, to December 31, 2019. Using admission diagnosis of acute stroke as the outcome, we calculated the sensitivity and specificity of dispatch and paramedic impression in identifying AIS and large vessel occlusion. We identified factors that, when included together, would improve the sensitivity of prehospital AIS identification. Two hundred twenty-six stroke alerts were activated by emergency department physicians after transport by Indianapolis emergency medical services. Forty-four percent (99/226) were female, median age was 58 years (interquartile range, 50-67 years), and median National Institutes of Health Stroke Scale score was 6 (interquartile range, 2-12). Paramedics demonstrated superior sensitivity (59% vs. 48%) but inferior specificity (56% vs. 73%) for detection of stroke as compared with dispatch. A strategy incorporating a dispatch code of stroke, or paramedic impression of altered mental status or weakness in addition to stroke, would be 84% sensitive and 27% specific for identification of stroke. To optimize rapid and sensitive stroke detection, prehospital systems should consider inclusion of patients with a dispatch code of stroke and provider impression of altered mental status or generalized weakness.
View details for DOI 10.1097/HPC.0000000000000307
View details for PubMedID 36413394
- Natural language processing to classify electrocardiograms in patients with syncope: A preliminary study. Health science reports 2022; 5 (6): e904
Using a 29-mRNA Host Response Classifier To Detect Bacterial Coinfections and Predict Outcomes in COVID-19 Patients Presenting to the Emergency Department.
Clinicians in the emergency department (ED) face challenges in concurrently assessing patients with suspected COVID-19 infection, detecting bacterial coinfection, and determining illness severity since current practices require separate workflows. Here, we explore the accuracy of the IMX-BVN-3/IMX-SEV-3 29 mRNA host response classifiers in simultaneously detecting severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and bacterial coinfections and predicting clinical severity of COVID-19. A total of 161 patients with PCR-confirmed COVID-19 (52.2% female; median age, 50.0 years; 51% hospitalized; 5.6% deaths) were enrolled at the Stanford Hospital ED. RNA was extracted (2.5 mL whole blood in PAXgene blood RNA), and 29 host mRNAs in response to the infection were quantified using Nanostring nCounter. The IMX-BVN-3 classifier identified SARS-CoV-2 infection in 151 patients with a sensitivity of 93.8%. Six of 10 patients undetected by the classifier had positive COVID tests more than 9 days prior to enrollment, and the remaining patients oscillated between positive and negative results in subsequent tests. The classifier also predicted that 6 (3.7%) patients had a bacterial coinfection. Clinical adjudication confirmed that 5/6 (83.3%) of the patients had bacterial infections, i.e., Clostridioides difficile colitis (n = 1), urinary tract infection (n = 1), and clinically diagnosed bacterial infections (n = 3), for a specificity of 99.4%. Two of 101 (2.8%) patients in the IMX-SEV-3 "Low" severity classification and 7/60 (11.7%) in the "Moderate" severity classification died within 30 days of enrollment. IMX-BVN-3/IMX-SEV-3 classifiers accurately identified patients with COVID-19 and bacterial coinfections and predicted patients' risk of death. A point-of-care version of these classifiers, under development, could improve ED patient management, including more accurate treatment decisions and optimized resource utilization. 
IMPORTANCE We assay the utility of the single-test IMX-BVN-3/IMX-SEV-3 classifiers that require just 2.5 mL of patient blood in concurrently detecting viral and bacterial infections as well as predicting the severity and 30-day outcome from the infection. A point-of-care device, in development, will circumvent the need for blood culturing and drastically reduce the time needed to detect an infection. This will negate the need for empirical use of broad-spectrum antibiotics and allow for antibiotic use stewardship. Additionally, accurate classification of the severity of infection and the prediction of 30-day severe outcomes will allow for appropriate allocation of hospital resources.
View details for DOI 10.1128/spectrum.02305-22
View details for PubMedID 36250865
Development and Comparative Performance of Physiologic Monitoring Strategies in the Emergency Department.
JAMA network open
2022; 5 (9): e2233712
Importance: Accurate and timely documentation of vital signs affects all aspects of triage, diagnosis, and management. The adequacy of current patient monitoring practices and the potential to improve on them are poorly understood.
Objective: To develop measures of fit between documented and actual patient vital signs throughout the visit, as determined from continuous physiologic monitoring, and to compare the performance of actual practice with alternative patient monitoring strategies.
Design: This cross-sectional study evaluated 25,751 adult visits to continuously monitored emergency department (ED) beds between August 1, 2020, and December 31, 2021. A series of monitoring strategies for the documentation of vital signs (heart rate [HR], respiratory rate [RR], oxygen saturation by pulse oximetry [Spo2], mean arterial pressure [MAP]) was developed, and the strategies' ability to capture physiologic trends and vital sign abnormalities was simulated. Strategies included equal spacing of charting events, charting at variable intervals depending on the last observed values, and discrete optimization of charting events.
Main Outcomes and Measures: Coverage was defined as the proportion of monitor-derived vital sign measurements (at 1-minute resolution) that fall within the bounds of nursing-charted values over the course of an ED visit (HR ± 5 beats/min, RR ± 3 breaths/min, Spo2 ± 2%, and MAP ± 6 mm Hg). Capture was defined as the documentation of a vital sign abnormality detected by bedside monitor (tachycardia [HR >100 beats/min], bradycardia [HR <60 beats/min], hypotension [MAP <65 mm Hg], and hypoxia [Spo2 <95%]).
Results: Median patient age was 60 years (IQR, 43-75 years), and 13,329 visits (51.8%) were by women. Monitored visits had a median of 4 (IQR, 2-5) vital sign charting events per visit.
Compared with actual practice, a simple rule that observed vital signs more frequently when the last observation fell outside the bounds of the previous values, while using the same number of observations as actual practice, produced relative coverage improvements of 31.5% (95% CI, 30.5%-32.5%) for HR, 31.0% (95% CI, 30.0%-32.0%) for MAP, 16.8% (95% CI, 16.0%-17.6%) for RR, and 7.8% (95% CI, 7.3%-8.3%) for Spo2. The same strategy improved capture of abnormalities by 38.9% (95% CI, 26.8%-52.2%) for tachycardia, 38.1% (95% CI, 29.0%-47.9%) for bradycardia, 39.0% (95% CI, 24.2%-55.7%) for hypotension, and 123.1% (95% CI, 110.7%-136.3%) for hypoxia. Analysis of optimal coverage suggested additional scope for improvement through more sophisticated strategies.
Conclusions: In this cross-sectional study, actual documentation of ED vital signs was variable and incomplete, missing important trends and abnormalities. Alternative monitoring strategies may improve on current practice without increasing the overall frequency of patient monitoring.
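The "simple rule" described in this abstract, observing vital signs sooner when the last observation fell outside the bounds of the previous values, can be sketched in miniature. This is an assumption-laden illustration, not the study's simulation: the heart-rate trace, charting intervals, and tolerance are invented, and unlike the study the sketch does not hold the number of charting events fixed.

```python
# Sketch of the adaptive charting idea: chart at a long interval by default,
# but at a short interval after an observation deviates from the previously
# charted value. Coverage = fraction of minutes within ±5 bpm of the last
# charted value, mirroring the paper's coverage definition for HR.
import random

def coverage(trace, chart_times, tol=5):
    """Fraction of 1-minute samples within ±tol of the last charted value."""
    covered = 0
    times = iter(sorted(chart_times))
    next_chart = next(times, None)
    charted = trace[0]
    for t, v in enumerate(trace):
        while next_chart is not None and t >= next_chart:
            charted = trace[next_chart]  # a nurse charts the current value
            next_chart = next(times, None)
        covered += abs(v - charted) <= tol
    return covered / len(trace)

def adaptive_schedule(trace, base=120, fast=30, tol=5):
    """Chart every `base` minutes, but every `fast` minutes after a charted
    value deviates by more than ±tol from the previously charted value."""
    times, t, prev = [0], 0, trace[0]
    while True:
        v = trace[t]
        step = fast if abs(v - prev) > tol else base
        prev = v
        t += step
        if t >= len(trace):
            return times
        times.append(t)

random.seed(0)
# Synthetic HR trace: stable, then a gradual ramp into tachycardia
trace = [(80 if i < 120 else min(80 + 0.5 * (i - 120), 170))
         + random.gauss(0, 2) for i in range(360)]
fixed = [0, 120, 240]            # equally spaced charting events
adaptive = adaptive_schedule(trace)
print(f"fixed coverage    {coverage(trace, fixed):.2f}")
print(f"adaptive coverage {coverage(trace, adaptive):.2f}")
```

On a deteriorating trace like this one, the adaptive schedule charts more often during the ramp and covers it far better than equal spacing; the study's stronger result is that a comparable gain held even with the observation budget fixed.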
View details for DOI 10.1001/jamanetworkopen.2022.33712
View details for PubMedID 36169956
Predicting malnutrition from longitudinal patient trajectories with deep learning.
PLOS ONE
2022; 17 (7): e0271487
Malnutrition is common, morbid, and often correctable, but subject to missed and delayed diagnosis. Better screening and prediction could improve clinical, functional, and economic outcomes. This study aimed to assess the predictability of malnutrition from longitudinal patient records, and the external generalizability of a predictive model. Predictive models were developed and validated on statewide emergency department (ED) and hospital admission databases for California, Florida and New York, including visits from October 1, 2015 to December 31, 2018. Visit features included patient demographics, diagnosis codes, and procedure categories. Models included long short-term memory (LSTM) recurrent neural networks trained on longitudinal trajectories, and gradient-boosted tree and logistic regression models trained on cross-sectional patient data. The dataset used for model training and internal validation (California and Florida) included 62,811 patient trajectories (266,951 visits). Test sets included 63,997 (California), 63,112 (Florida), and 62,472 (New York) trajectories, such that each cohort's composition was proportional to the prevalence of malnutrition in that state. Trajectories contained seven patient characteristics and up to 2,008 diagnosis categories. Area under the receiver-operating characteristic (AUROC) and precision-recall curves (AUPRC) were used to characterize prediction of first malnutrition diagnoses in the test sets. Data analysis was performed from September 2020 to May 2021. Between 4.0% (New York) and 6.2% (California) of patients received malnutrition diagnoses. The longitudinal LSTM model produced the most accurate predictions of malnutrition, with comparable predictive performance in California (AUROC 0.854, AUPRC 0.258), Florida (AUROC 0.869, AUPRC 0.234), and New York (AUROC 0.869, AUPRC 0.190). 
Deep learning models can reliably predict malnutrition from existing longitudinal patient records, with better predictive performance and lower data-collection requirements than existing instruments. This approach may facilitate early nutritional intervention via automated screening at the point of care.
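As a hypothetical illustration of the kind of input a longitudinal model such as the LSTM above consumes (the paper's actual feature pipeline is not specified at this level of detail, and the category names below are invented), a patient trajectory can be encoded as a padded sequence of multi-hot diagnosis-category vectors:

```python
# Assumed encoding, for illustration only: each visit becomes a multi-hot
# vector over diagnosis categories; trajectories are padded to a fixed
# length, with a mask distinguishing real visits from padding.
VOCAB = {"malnutrition_risk": 0, "dysphagia": 1, "copd": 2, "fall": 3}  # toy categories

def encode_visit(codes):
    vec = [0.0] * len(VOCAB)
    for c in codes:
        vec[VOCAB[c]] = 1.0  # multi-hot: a visit can carry several diagnoses
    return vec

def encode_trajectory(visits, max_len=5):
    seq = [encode_visit(v) for v in visits[-max_len:]]  # keep most recent visits
    pad = [[0.0] * len(VOCAB)] * (max_len - len(seq))   # left-pad short records
    mask = [0] * len(pad) + [1] * len(seq)              # 1 marks real visits
    return pad + seq, mask

# A two-visit record: one visit for COPD, a later visit with a fall and dysphagia
traj, mask = encode_trajectory([["copd"], ["fall", "dysphagia"]])
print(mask)
print(traj[-1])  # most recent visit's multi-hot vector
```

A recurrent model would consume `traj` step by step and use `mask` to ignore padding; in the real pipeline the vocabulary would span the ~2,008 diagnosis categories mentioned above, with demographics appended.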
View details for DOI 10.1371/journal.pone.0271487
View details for PubMedID 35901027
Detection of bacterial co-infections and prediction of fatal outcomes in COVID-19 patients presenting to the emergency department using a 29 mRNA host response classifier.
medRxiv : the preprint server for health sciences
Objective: Clinicians in the emergency department (ED) face challenges in concurrently assessing patients with suspected COVID-19 infection, detecting bacterial co-infection, and determining illness severity, since current practices require separate workflows. Here we explore the accuracy of the IMX-BVN-3/IMX-SEV-3 29 mRNA host response classifiers in simultaneously detecting SARS-CoV-2 infection and bacterial co-infections and predicting the clinical severity of COVID-19.
Methods: 161 patients with PCR-confirmed COVID-19 (52.2% female, median age 50.0 years, 51% hospitalized, 5.6% deaths) were enrolled at the Stanford Hospital ED. RNA was extracted (2.5 mL whole blood in PAXgene Blood RNA) and 29 host mRNAs in response to the infection were quantified using Nanostring nCounter.
Results: The IMX-BVN-3 classifier identified SARS-CoV-2 infection in 151 patients with a sensitivity of 93.8%. Six of 10 patients undetected by the classifier had positive COVID tests more than 9 days prior to enrollment, and the remaining patients oscillated between positive and negative results in subsequent tests. The classifier also predicted that 6 (3.7%) patients had a bacterial co-infection. Clinical adjudication confirmed that 5/6 (83.3%) of the patients had bacterial infections, i.e., Clostridioides difficile colitis (n=1), urinary tract infection (n=1), and clinically diagnosed bacterial infections (n=3), for a specificity of 99.4%. 2/101 (2.8%) patients in the IMX-SEV-3 Low and 7/60 (11.7%) in the Moderate severity classifications died within thirty days of enrollment.
Conclusions: The IMX-BVN-3/IMX-SEV-3 classifiers accurately identified patients with COVID-19 and bacterial co-infections, and predicted patients' risk of death. A point-of-care version of these classifiers, under development, could improve ED patient management, including more accurate treatment decisions and optimized resource utilization.
View details for DOI 10.1101/2022.03.14.22272394
View details for PubMedID 35313598
- A Novel Emergency Medical Services Protocol To Improve Treatment Time For Large Vessel Occlusion Strokes. Lippincott Williams & Wilkins, 2022
A novel emergency medical services protocol to improve treatment time for large vessel occlusion strokes.
PLOS ONE
2022; 17 (2): e0264539
In many systems, patients with large vessel occlusion (LVO) strokes experience delays in transport to thrombectomy-capable centers. This pilot study examined use of a novel emergency medical services (EMS) protocol to expedite transfer of patients with LVOs to a comprehensive stroke center (CSC). From October 1, 2020 to February 22, 2021, Indianapolis EMS piloted a protocol in which paramedics, after transporting a patient with a possible stroke, remained at the patient's bedside until released by the emergency department or neurology physician. In patients with possible LVO, EMS providers remained at the bedside until the clinical assessment and CT angiography (CTA) were complete. If indicated, the paramedics at bedside transferred the patient, via the same ambulance, to a nearby thrombectomy-capable CSC with which an automatic transfer agreement had been arranged. This five-month mixed-methods study included case-control assessment of use of the protocol, number of transfers, safety during transport, and time saved in transfer compared to emergent transfers via conventional interfacility transfer agencies. In the qualitative analysis, EMS providers and ED physicians and neurologists at both sending and receiving institutions completed e-mail surveys on the process and offered suggestions for process improvement. Responses were coded with an inductive content analysis approach. The protocol was used 42 times during the study period; four patients were found to have LVOs and were transferred to the CSC. There were no adverse events. Median time from decision-to-transfer to arrival at the CSC was 27.5 minutes (IQR 24.5-29.0), compared to 314.5 minutes (IQR 204.0-459.3) for acute non-stroke transfers during the same period.
Major themes of provider impressions included: incomplete awareness of the protocol, smooth process, challenges when a stroke alert was activated after EMS left the hospital, greater involvement of EMS in patient care, and comments on communication and efficiency. This pilot study demonstrated the feasibility, safety, and efficiency of a novel approach to expedite endovascular therapy for patients with LVOs.
View details for DOI 10.1371/journal.pone.0264539
View details for PubMedID 35213646
Transfer learning enables prediction of myocardial injury from continuous single-lead electrocardiography
Journal of the American Medical Informatics Association
View details for DOI 10.1093/jamia/ocac135
Natural Language Processing to Classify Electrocardiograms in Patients With Syncope
Academic Emergency Medicine
2022; 29 (S1)
View details for DOI 10.1111/acem.14511
Predicting Myocardial Injury From Continuous Single-Lead Electrocardiography in the Emergency Department
Academic Emergency Medicine
2022; 29 (S1)
View details for DOI 10.1111/acem.14511
Association Between SARS-CoV-2 RNAemia and Postacute Sequelae of COVID-19.
Open forum infectious diseases
2022; 9 (2): ofab646
Determinants of Post-Acute Sequelae of COVID-19 are not known. Here we show that 83.3% of patients with viral RNA in blood (RNAemia) at presentation were symptomatic in the post-acute phase. RNAemia at presentation successfully predicted PASC, independent of patient demographics, worst disease severity, and length of symptoms.
View details for DOI 10.1093/ofid/ofab646
View details for PubMedID 35111870
View details for PubMedCentralID PMC8802799
SARS-CoV-2 RNAemia predicts clinical deterioration and extrapulmonary complications from COVID-19.
Clinical infectious diseases : an official publication of the Infectious Diseases Society of America
The determinants of COVID-19 disease severity and extrapulmonary complications (EPCs) are poorly understood. We characterized relationships between SARS-CoV-2 RNAemia and disease severity, clinical deterioration, and specific EPCs.
We used quantitative (qPCR) and digital (dPCR) PCR to quantify SARS-CoV-2 RNA from plasma in 191 patients presenting to the Emergency Department (ED) with COVID-19. We recorded patient symptoms, laboratory markers, and clinical outcomes, with a focus on oxygen requirements over time. We collected longitudinal plasma samples from a subset of patients. We characterized the role of RNAemia in predicting clinical severity and EPCs using elastic net regression.
23.0% (44/191) of SARS-CoV-2 positive patients had viral RNA detected in plasma by dPCR, compared to 1.4% (2/147) by qPCR. Most patients with serial measurements had undetectable RNAemia within 10 days of symptom onset, reached maximum clinical severity within 16 days, and achieved symptom resolution within 33 days. Initially RNAemic patients were more likely to manifest severe disease (OR 6.72 [95% CI, 2.45-19.79]), worsening of disease severity (OR 2.43 [95% CI, 1.07-5.38]), and EPCs (OR 2.81 [95% CI, 1.26-6.36]). RNA load correlated with maximum severity (r = 0.47 [95% CI, 0.20-0.67]).
dPCR is more sensitive than qPCR for the detection of SARS-CoV-2 RNAemia, which is a robust predictor of eventual COVID-19 severity and oxygen requirements, as well as EPCs. Since many COVID-19 therapies are initiated on the basis of oxygen requirements, RNAemia on presentation might serve to direct early initiation of appropriate therapies for the patients most likely to deteriorate.
View details for DOI 10.1093/cid/ciab394
View details for PubMedID 33949665
Malpractice Claim Rates Are Associated with Admission of Low-Risk Chest Pain
Academic Emergency Medicine
2021; 28 (S1)
View details for DOI 10.1111/acem.14249
Association Between SARS-CoV-2 RNAemia and Post-Acute Sequelae of COVID-19.
medRxiv : the preprint server for health sciences
Determinants of Post-Acute Sequelae of COVID-19 are not known. Here we show that 75% of patients with viral RNA in blood (RNAemia) at presentation were symptomatic in the post-acute phase. RNAemia at presentation successfully predicted PASC, independent of patient demographics, initial disease severity, and length of symptoms.
View details for DOI 10.1101/2021.09.03.21262934
View details for PubMedID 34518843
View details for PubMedCentralID PMC8437320
Social Determinants of Hallway Bed Use.
The western journal of emergency medicine
2020; 21 (4): 949–58
INTRODUCTION: Hallway beds in the emergency department (ED) produce lower patient satisfaction and inferior care. We sought to determine whether socioeconomic factors influence which visits are assigned to hallway beds, independent of clinical characteristics at triage.
METHODS: We studied 332,919 visits, across 189,326 patients, to two academic EDs from 2013-2016. We estimated a logistic model of hallway bed assignment, conditioning on payor, demographics, triage acuity, chief complaint, patient visit frequency, and ED volume. Because payor is not generally known at the time of triage, we interpreted it as a proxy for other observable characteristics that may influence bed assignment. We estimated a Cox proportional hazards model of hallway bed assignment on length of stay.
RESULTS: Median patient age was 53 years, and 54.0% of visits were by women. 42.1% of visits were paid primarily by private payors, 37.1% by Medicare, and 20.7% by Medicaid. A total of 16.2% of visits were assigned to hallway beds. Hallway bed assignment was more likely for frequent ED visitors, for lower acuity presentations, and for psychiatric, substance use, and musculoskeletal chief complaints, which were more common among visits paid primarily by Medicaid. In a logistic model controlling for these factors, as well as for other patient demographics and for the volume of recent ED arrivals, Medicaid status was nevertheless associated with 22% greater odds of assignment to a hallway bed (odds ratio 1.22 [95% confidence interval, CI, 1.18-1.26]), compared to private insurance. Visits assigned to hallway beds had longer lengths of stay than roomed visits of comparable acuity (hazard ratio for departure 0.91 [95% CI, 0.90-0.92]).
CONCLUSION: We find evidence of social determinants of hallway bed use, likely involving epidemiologic, clinical, and operational factors.
Even after accounting for different distributions of chief complaints and for more frequent ED use by the Medicaid population, as well as for other visit characteristics known at the time of triage, visits paid primarily by Medicaid retain a disproportionate association with hallway bed assignment. Further research is needed to eliminate potential bias in the use of hallway beds. [West J Emerg Med. 2020;21(4)949-958.].
View details for DOI 10.5811/westjem.2020.4.45976
View details for PubMedID 32726269
- Rates of Co-infection Between SARS-CoV-2 and Other Respiratory Pathogens. JAMA 2020
SARS-CoV-2 RNAaemia predicts clinical deterioration and extrapulmonary complications from COVID-19.
medRxiv : the preprint server for health sciences
The determinants of COVID-19 disease severity and extrapulmonary complications (EPCs) are poorly understood. We characterise the relationships between SARS-CoV-2 RNAaemia and disease severity, clinical deterioration, and specific EPCs.
We used quantitative (qPCR) and digital (dPCR) PCR to quantify SARS-CoV-2 RNA from nasopharyngeal swabs and plasma in 191 patients presenting to the Emergency Department (ED) with COVID-19. We recorded patient symptoms, laboratory markers, and clinical outcomes, with a focus on oxygen requirements over time. We collected longitudinal plasma samples from a subset of patients. We characterised the role of RNAaemia in predicting clinical severity and EPCs using elastic net regression.
23.0% (44/191) of SARS-CoV-2 positive patients had viral RNA detected in plasma by dPCR, compared to 1.4% (2/147) by qPCR. Most patients with serial measurements had undetectable RNAaemia 10 days after onset of symptoms, but took 16 days to reach maximum severity, and 33 days for symptoms to resolve. Initially RNAaemic patients were more likely to manifest severe disease (OR 6.72 [95% CI, 2.45-19.79]), worsening of disease severity (OR 2.43 [95% CI, 1.07-5.38]), and EPCs (OR 2.81 [95% CI, 1.26-6.36]). RNA load correlated with maximum severity (r = 0.47 [95% CI, 0.20-0.67]).
dPCR is more sensitive than qPCR for the detection of SARS-CoV-2 RNAaemia, which is a robust predictor of eventual COVID-19 severity and oxygen requirements, as well as EPCs. Since many COVID-19 therapies are initiated on the basis of oxygen requirements, RNAaemia on presentation might serve to direct early initiation of appropriate therapies for the patients most likely to deteriorate.
Funding: NIH/NIAID (Grants R01A153133, R01AI137272, and 3U19AI057229-17W1 COVID SUPP #2) and a donation from Eva Grove.
Evidence before this study: The varied clinical manifestations of COVID-19 have directed attention to the distribution of SARS-CoV-2 in the body. Although most concentrated and tested for in the nasopharynx, SARS-CoV-2 RNA has been found in blood, stool, and numerous tissues, raising questions about dissemination of viral RNA throughout the body, and the role of this process in disease severity and extrapulmonary complications. Recent studies have detected low levels of SARS-CoV-2 RNA in blood using either quantitative reverse transcriptase real-time PCR (qPCR) or droplet digital PCR (dPCR), and have associated RNAaemia with disease severity and biomarkers of dysregulated immune response.
Added value of this study: We quantified SARS-CoV-2 RNA in the nasopharynx and plasma of patients presenting to the Emergency Department with COVID-19, and found an array-based dPCR platform to be markedly more sensitive than qPCR for detection of SARS-CoV-2 RNA, with a simplified workflow well suited to clinical adoption. We collected serial plasma samples during patients' course of illness, and showed that SARS-CoV-2 RNAaemia peaks early, while clinical condition often continues to worsen. Our findings confirm the association between RNAaemia and disease severity, and additionally demonstrate a role for RNAaemia in predicting future deterioration and specific extrapulmonary complications.
Implications of all the available evidence: Variation in SARS-CoV-2 RNAaemia may help explain disparities in disease severity and extrapulmonary complications from COVID-19. Testing for RNAaemia with dPCR early in the course of illness may help guide patient triage and management.
View details for DOI 10.1101/2020.12.19.20248561
View details for PubMedID 33398290
View details for PubMedCentralID PMC7781329
Validation of the NUE rule to predict futile resuscitation of out-of-hospital cardiac arrest.
Prehospital emergency care: official journal of the National Association of EMS Physicians and the National Association of State EMS Directors
We validated the NUE rule, using three criteria (Non-shockable initial rhythm, Unwitnessed arrest, Eighty years or older) to predict futile resuscitation of patients with out-of-hospital cardiac arrest (OHCA).

We performed a retrospective cohort analysis of all recorded OHCA in Marion County, Indiana, from January 1, 2014 to December 31, 2019. We described patient, arrest, and emergency medical services (EMS) response characteristics, and assessed the performance of the NUE rule in identifying patients unlikely to survive to hospital discharge.

From 2014 to 2019, EMS responded to 4370 patients who sustained OHCA. We excluded 329 (7.5%) patients with incomplete data. Median patient age was 62 years (IQR 49-73), 1599 (39.6%) patients were female, and 1728 (42.8%) arrests were witnessed. The NUE rule identified 290 (7.2%) arrests, of whom none survived to hospital discharge.

In external validation, the NUE rule (Non-shockable initial rhythm, Unwitnessed arrest, Eighty years or older) correctly identified 7.2% of OHCA patients unlikely to survive to hospital discharge. The NUE rule could be used in EMS protocols and policies to identify OHCA patients very unlikely to benefit from aggressive resuscitation.
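Because the NUE rule is the conjunction of three binary criteria, it reduces to a one-line predicate; a minimal sketch (function and argument names are illustrative, not from the paper):

```python
def nue_rule(initial_rhythm_shockable: bool, witnessed: bool, age: int) -> bool:
    """Return True when all three NUE criteria are met, i.e. resuscitation
    is predicted futile: Non-shockable initial rhythm, Unwitnessed arrest,
    Eighty years or older."""
    return (not initial_rhythm_shockable) and (not witnessed) and age >= 80
```

The rule fires only when every criterion holds; a shockable rhythm, a witnessed arrest, or age under 80 each individually rules out the futility prediction.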
View details for DOI 10.1080/10903127.2020.1831666
View details for PubMedID 33026273
A body bag can save your life: a novel method of cold water immersion for heat stroke treatment.
Journal of the American College of Emergency Physicians open
2020; 1 (1): 49–52
Non-exertional heat stroke is a life-threatening condition characterized by passive exposure to high ambient heat, a core body temperature of 40°C (104°F) or greater, and central nervous system dysfunction. Rapid cooling is imperative to minimize mortality and morbidity. Although evaporative and convective measures are often used for cooling heat stroke patients, cold water immersion produces the fastest cooling. However, logistical difficulties make cold water immersion challenging to implement in the emergency department (ED). To our knowledge, there is no documented case utilizing a body bag (i.e., human remains pouch) as a cold water immersion tank for rapid resuscitation of heat stroke. During a regional heat wave, an elderly woman was found unconscious in a parking lot with an oral temperature of 40°C (104°F) and altered mental status. She was cooled to 38.4°C (101.1°F) in 10 minutes by immersion in an ice- and water-filled body bag. The patient rapidly regained normal mentation and was discharged home from the ED. This case highlights a novel method for efficient and convenient cold water immersion for heat stroke treatment in the emergency department.
View details for DOI 10.1002/emp2.12007
View details for PubMedID 33000014
View details for PubMedCentralID PMC7493529
- An Interpretable Deep Learning Model for the Prevention of Self-Harm and Suicide MOSBY-ELSEVIER. 2019: S6
- Reply to "the futility of resuscitating an out-of-hospital cardiac arrest cannot be summarized by three simple criteria." Resuscitation 2019
A simple decision rule predicts futile resuscitation of out-of-hospital cardiac arrest.
Resuscitation
Resuscitation of cardiac arrest involves invasive and traumatic interventions and places a large burden on limited EMS resources. Our aim was to identify prehospital cardiac arrests for which resuscitation is extremely unlikely to result in survival to hospital discharge.

We performed a retrospective cohort analysis of all cardiac arrests in San Mateo County, California, for which paramedics were dispatched, from January 1, 2015 to December 31, 2018, using the Cardiac Arrest Registry to Enhance Survival (CARES) database. We described characteristics of patients, arrests, and EMS responses, and used recursive partitioning to develop decision rules to identify arrests unlikely to survive to hospital discharge, or to survive with good neurologic function.

From 2015 to 2018, 1750 patients received EMS dispatch for cardiac arrest in San Mateo County. We excluded 44 patients for whom resuscitation was terminated due to DNR directives. Median age was 69 years (IQR 57-81), 563 (33.0%) patients were female, 816 (47.8%) had witnessed arrests, 651 (38.2%) received bystander CPR, 421 (24.7%) had an initial shockable rhythm, and 1178 (69.1%) arrested at home. A simple rule (non-shockable initial rhythm, unwitnessed arrest, and age 80 or greater) excludes 223 (13.1%) arrests, of whom none survived to hospital discharge.

A simple decision rule (non-shockable rhythm, unwitnessed arrest, age ≥ 80) identifies arrests for which resuscitation is futile. If validated, this rule could be applied by EMS policymakers to identify cardiac arrests for which the trauma and expense of resuscitation are extremely unlikely to result in survival.
View details for DOI 10.1016/j.resuscitation.2019.06.011
View details for PubMedID 31228547
- Predicting First Episodes of Non-Accidental Trauma With Machine Learning MOSBY-ELSEVIER. 2018: S145
Exposure, hazard, and survival analysis of diffusion on social networks
STATISTICS IN MEDICINE
2018; 37 (17): 2561–85
Sociologists, economists, epidemiologists, and others recognize the importance of social networks in the diffusion of ideas and behaviors through human societies. To measure the flow of information on real-world networks, researchers often conduct comprehensive sociometric mapping of social links between individuals and then follow the spread of an "innovation" from reports of adoption or change in behavior over time. The innovation is introduced to a small number of individuals who may also be encouraged to spread it to their network contacts. In conjunction with the known social network, the pattern of adoptions gives researchers insight into the spread of the innovation in the population and factors associated with successful diffusion. Researchers have used widely varying statistical tools to estimate these quantities, and there is disagreement about how to analyze diffusion on fully observed networks. Here, we describe a framework for measuring features of diffusion processes on social networks using the epidemiological concepts of exposure and competing risks. Given a realization of a diffusion process on a fully observed network, we show that classical survival regression models can be adapted to estimate the rate of diffusion, and actor/edge attributes associated with successful transmission or adoption, while accounting for the topology of the social network. We illustrate these tools by applying them to a randomized network intervention trial conducted in Honduras to estimate the rate of adoption of two health-related interventions (multivitamins and chlorine bleach for water purification) and determine factors associated with successful social transmission.
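The exposure construction described above (for each node, counting already-adopted network contacts at each discrete time step until the node itself adopts or observation ends) can be sketched as a person-time record builder suitable for discrete-time hazard regression; the names and record layout are illustrative, not the paper's code:

```python
from collections import defaultdict

def exposure_records(edges, adoption_time, horizon):
    """Build discrete-time exposure records for diffusion on a fully
    observed, undirected network. For each node and each time step up to
    its adoption (or the end of observation), record the number of
    neighbours that had already adopted ('exposure') and whether the
    node adopts at that step."""
    nbrs = defaultdict(set)
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)
    records = []  # (node, t, exposure, adopted_at_t)
    for node in nbrs:
        t_adopt = adoption_time.get(node, horizon + 1)  # never-adopters are censored
        for t in range(1, min(t_adopt, horizon) + 1):
            exposure = sum(1 for n in nbrs[node]
                           if adoption_time.get(n, horizon + 1) < t)
            records.append((node, t, exposure, t == t_adopt))
    return records
```

Each row is one unit of person-time; a node contributes rows only while still at risk of adopting, which is exactly the structure that classical survival regression consumes.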
View details for PubMedID 29707798
Social connectedness is associated with fibrinogen level in a human social network
PROCEEDINGS OF THE ROYAL SOCIETY B-BIOLOGICAL SCIENCES
2016; 283 (1837)
Socially isolated individuals face elevated rates of illness and death. Conventional measures of social connectedness reflect an individual's perceived network and can be subject to bias and variation in reporting. In this study of a large human social network, we find that greater indegree, a sociocentric measure of friendship and familial ties identified by a subject's social connections rather than by the subject, predicts significantly lower concentrations of fibrinogen (a biomarker of inflammation and cardiac risk), after adjusting for demographics, education, medical history and known predictors of cardiac risk. The association between fibrinogen and social isolation, as measured by low indegree, is comparable to the effect of smoking, and greater than that of low education, a conventional measure of socioeconomic disadvantage. By contrast, outdegree, which reflects an individual's perceived connectedness, displays a significantly weaker association with fibrinogen concentrations.
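Indegree and outdegree differ only in which endpoint of a directed friendship nomination is counted; a minimal sketch, assuming an edge list of (nominator, nominee) pairs:

```python
from collections import Counter

def degree_measures(nominations):
    """From directed friendship nominations (nominator, nominee):
    outdegree counts ties the subject reports (perceived connectedness);
    indegree counts how often the subject is named by others, a
    sociocentric measure less prone to reporting bias."""
    outdeg = Counter(src for src, _ in nominations)
    indeg = Counter(dst for _, dst in nominations)
    return indeg, outdeg
```

A node named by many others can still report few ties itself, which is why the two measures can diverge in their associations with biomarkers.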
View details for DOI 10.1098/rspb.2016.0958
View details for PubMedID 27559060
Social network targeting to maximise population behaviour change: a cluster randomised controlled trial.
Lancet (London, England)
2015; 386 (9989): 145-53
Information and behaviour can spread through interpersonal ties. By targeting influential individuals, health interventions that harness the distributive properties of social networks could be made more effective and efficient than those that do not. Our aim was to assess which targeting methods produce the greatest cascades or spillover effects and hence maximise population-level behaviour change.

In this cluster randomised trial, participants were recruited from villages of the Department of Lempira, Honduras. We blocked villages on the basis of network size, socioeconomic status, and baseline rates of water purification, for delivery of two public health interventions: chlorine for water purification and multivitamins for micronutrient deficiencies. We then randomised villages, separately for each intervention, to one of three targeting methods, introducing the interventions to 5% samples composed of either: randomly selected villagers (n=9 villages for each intervention); villagers with the most social ties (n=9); or nominated friends of random villagers (n=9; the last strategy exploiting the so-called friendship paradox of social networks). Participants and data collectors were not aware of the targeting methods. Primary endpoints were the proportions of available products redeemed by the entire population under each targeting method. This trial is registered with ClinicalTrials.gov, number NCT01672580.

Between Aug 4 and Aug 14, 2012, 32 villages in rural Honduras (25-541 participants each; total study population of 5773) received public health interventions. For each intervention, nine villages (each with 1-20 initial target individuals) were randomised, using a blocked design, to each of the three targeting methods. In nomination-targeted villages, 951 (74·3%) of 1280 available multivitamin tickets were redeemed compared with 940 (66·2%) of 1420 in randomly targeted villages and 744 (61·0%) of 1220 in indegree-targeted villages. All pairwise differences in redemption rates were significant (p<0·01) after correction for multiple comparisons. Targeting nominated friends increased adoption of the nutritional intervention by 12·2% compared with random targeting (95% CI 6·9-17·9). Targeting the most highly connected individuals, by contrast, produced no greater adoption of either intervention, compared with random targeting.

Introduction of a health intervention to the nominated friends of random individuals can enhance that intervention's diffusion by exploiting intrinsic properties of human social networks. This method has the additional advantage of scalability because it can be implemented without mapping the network. Deployment of certain types of health interventions via network targeting, without increasing the number of individuals targeted or the resources used, could enhance the adoption and efficiency of those interventions, thereby improving population health.

Funding: National Institutes of Health, The Bill & Melinda Gates Foundation, Star Family Foundation, and the Canadian Institutes of Health Research.
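The nomination strategy can be sketched in a few lines; a minimal illustration of friendship-paradox targeting (the toy network and names are hypothetical, and the sketch uses a known adjacency map for convenience, whereas in the field one simply asks each sampled person to name a friend):

```python
import random

def nomination_targets(nbrs, k, rng):
    """Friendship-paradox targeting: sample k individuals at random,
    then target one randomly nominated friend of each. Because
    well-connected individuals are named by many others, the nominated
    friends tend to have higher degree than the nominators."""
    nominators = rng.sample(sorted(nbrs), k)
    return [rng.choice(sorted(nbrs[p])) for p in nominators]

# Toy star network: every leaf's only friend is the hub
nbrs = {"hub": {"l1", "l2", "l3"},
        "l1": {"hub"}, "l2": {"hub"}, "l3": {"hub"}}
targets = nomination_targets(nbrs, 4, random.Random(0))
```

In the star network every leaf nominates the hub, so nomination targeting concentrates on the highest-degree node even though the initial sample was uniform.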
View details for DOI 10.1016/S0140-6736(15)60095-2
View details for PubMedID 25952354
View details for PubMedCentralID PMC4638320
In Bad Taste: Evidence for the Oral Origins of Moral Disgust
SCIENCE
2009; 323 (5918): 1222–26
In common parlance, moral transgressions "leave a bad taste in the mouth." This metaphor implies a link between moral disgust and more primitive forms of disgust related to toxicity and disease, yet convincing evidence for this relationship is still lacking. We tested directly the primitive oral origins of moral disgust by searching for similarity in the facial motor activity evoked by gustatory distaste (elicited by unpleasant tastes), basic disgust (elicited by photographs of contaminants), and moral disgust (elicited by unfair treatment in an economic game). We found that all three states evoked activation of the levator labii muscle region of the face, characteristic of an oral-nasal rejection response. These results suggest that immorality elicits the same disgust as disease vectors and bad tastes.
View details for DOI 10.1126/science.1165565
View details for Web of Science ID 000263687600041
View details for PubMedID 19251631