Jeremy Goldhaber-Fiebert
Professor of Health Policy
Bio
Jeremy Goldhaber-Fiebert, PhD, is a Professor of Health Policy and a Core Faculty Member in the Centers for Health Policy and Primary Care and Outcomes Research. His research focuses on complex policy decisions surrounding the prevention and management of increasingly common chronic diseases and the life-course impact of exposure to their risk factors. In both developing and developed countries, including the US, India, China, and South Africa, he has examined chronic conditions including type 2 diabetes and cardiovascular diseases, human papillomavirus and cervical cancer, tuberculosis, and hepatitis C, as well as risk factors including smoking, physical activity, obesity, malnutrition, and other diseases themselves. He combines simulation modeling methods and cost-effectiveness analyses with econometric approaches and behavioral economic studies to address these issues. Dr. Goldhaber-Fiebert graduated magna cum laude from Harvard College in 1997 with an A.B. in the History and Literature of America. After working as a software engineer and consultant, he conducted a year-long public health research program in Costa Rica with his wife in 2001. He won the Lee B. Lusted Prize for Outstanding Student Research from the Society for Medical Decision Making in 2006 and 2008, and he completed his PhD in Health Policy, concentrating in Decision Science, at Harvard University in 2008. He was elected a Trustee of the Society for Medical Decision Making in 2011 and Secretary/Treasurer in 2021.
Past and current research topics:
- Type 2 diabetes and cardiovascular risk factors: Randomized and observational studies in Costa Rica examining the impact of community-based lifestyle interventions and the relationships among gender, risk factors, and care utilization.
- Cervical cancer: Model-based cost-effectiveness analyses and costing methods studies that examine policy issues relating to cervical cancer screening and human papillomavirus vaccination in countries including the United States, Brazil, India, Kenya, Peru, South Africa, Tanzania, and Thailand.
- Measles, Haemophilus influenzae type b, and other childhood infectious diseases: Longitudinal regression analyses of country-level data from middle- and upper-income countries that examine the link between vaccination, sustained reductions in mortality, and evidence of herd immunity.
- Patient adherence: Studies in both developing and developed countries of the costs and effectiveness of measures to increase successful adherence. Adherence to cervical cancer screening as well as to disease management programs targeting depression and obesity is examined from both a decision-analytic and a behavioral economics perspective.
- Simulation modeling methods: Research examining model calibration and validation, the appropriate representation of uncertainty in projected outcomes, the use of models to examine plausible counterfactuals at the biological and epidemiological level, and the reflection of population and spatial heterogeneity.
Academic Appointments
- Professor, Health Policy
- Member, Stanford Cancer Institute
- Affiliate, Stanford Woods Institute for the Environment
2024-25 Courses
- Advanced Decision Science Methods and Modeling in Health: HRP 263 (Win)
- Analysis of Costs, Risks, and Benefits of Health Care: BIOMEDIN 432, HRP 392 (Aut)
- Why College? Your Education and the Good Life: COLLEGE 101 (Aut)
Independent Studies (11)
- Directed Reading in Environment and Resources: ENVRES 398 (Aut, Win, Spr, Sum)
- Directed Reading in Health Research and Policy: HRP 299 (Aut, Win, Spr, Sum)
- Directed Reading in Medicine: MED 299 (Aut, Win, Spr, Sum)
- Directed Research in Environment and Resources: ENVRES 399 (Aut, Win, Spr, Sum)
- Early Clinical Experience in Medicine: MED 280 (Aut, Win, Spr, Sum)
- Graduate Research: HRP 399 (Aut, Win, Spr, Sum)
- Graduate Research: MED 399 (Aut, Win, Spr, Sum)
- Medical Scholars Research: HRP 370 (Aut, Win, Spr, Sum)
- Medical Scholars Research: MED 370 (Aut, Win, Spr, Sum)
- Second Year Health Policy PHD Tutorial: HRP 800 (Aut, Win, Spr)
- Undergraduate Research: MED 199 (Aut, Win, Spr, Sum)
Prior Year Courses
2023-24 Courses
2022-23 Courses
- Advanced Decision Science Methods and Modeling in Health: HRP 263, MED 263 (Win)
- Analysis of Costs, Risks, and Benefits of Health Care: BIOMEDIN 432, HRP 392 (Aut)
- Models for Understanding and Controlling Global Infectious Diseases: HRP 204, HUMBIO 154D (Spr)
2021-22 Courses
- Analysis of Costs, Risks, and Benefits of Health Care: BIOMEDIN 432, HRP 392 (Aut)
- Advanced Decision Science Methods and Modeling in Health
Stanford Advisees
- Doctoral Dissertation Reader (AC): Gabi Basel
- Doctoral Dissertation Advisor (AC): Valeria Gracia Olvera, Hannah Thomas
- Postdoctoral Research Mentor: Mauricio Lopez Mendez
All Publications
-
State and National Estimates of the Cost of Emergency Department Pediatric Readiness and Lives Saved.
JAMA network open
2024; 7 (11): e2442154
Abstract
High emergency department (ED) pediatric readiness is associated with improved survival among children receiving emergency care, but state and national costs to reach high ED readiness and the resulting number of lives that may be saved are unknown. To estimate the state and national annual costs of raising all EDs to high pediatric readiness and the resulting number of pediatric lives that may be saved each year. This cohort study used data from EDs in 50 US states and the District of Columbia from 2012 through 2022. Eligible children were ages 0 to 17 years receiving emergency services in US EDs and requiring admission, transfer to another hospital for admission, or dying in the ED (collectively termed at-risk children). Data were analyzed from October 2023 to May 2024. EDs considered to have high readiness, with a weighted pediatric readiness score of 88 or above (range 0 to 100, with higher numbers representing higher readiness). Annual hospital expenditures to reach high ED readiness from current levels and the resulting number of pediatric lives that may be saved through universal high ED readiness. A total of 842 of 4840 EDs (17.4%; range, 2.9% to 100% by state) had high pediatric readiness. The annual US cost for all EDs to reach high pediatric readiness from current levels was $207 335 302 (95% CI, $188 401 692-$226 268 912), ranging from $0 to $11.84 per child by state. Of the 7619 child deaths occurring annually after presentation, 2143 (28.1%; 95% CI, 678-3608) were preventable through universal high ED pediatric readiness, with population-adjusted state estimates ranging from 0 to 69 pediatric lives per year. In this cohort study, raising all EDs to high pediatric readiness was estimated to prevent more than one-quarter of deaths among children receiving emergency services, with modest financial investment. State and national policies that raise ED pediatric readiness may save thousands of children's lives each year.
View details for DOI 10.1001/jamanetworkopen.2024.42154
View details for PubMedID 39485354
-
Populationwide Screening for Chronic Kidney Disease: A Cost-Effectiveness Analysis.
JAMA health forum
2024; 5 (11): e243892
Abstract
Sodium-glucose cotransporter-2 (SGLT2) inhibitors have changed clinical management of chronic kidney disease (CKD) and made populationwide screening for CKD a viable strategy. Optimal age of screening initiation has yet to be evaluated. To compare the clinical benefits, costs, and cost-effectiveness of population-wide CKD screening at different initiation ages and screening frequencies. This cost-effectiveness study used a previously published decision-analytic Markov cohort model that simulated progression of CKD among US adults from age 35 years and older and was calibrated to population-level data from the National Health and Nutrition Examination Survey (NHANES). Effectiveness of SGLT2 inhibitors was derived from the Dapagliflozin and Prevention of Adverse Outcomes in Chronic Kidney Disease (DAPA-CKD) trial. Mortality, quality-of-life weights, and cost estimates were obtained from published cohort studies, randomized clinical trials, and US Centers for Medicare & Medicaid Services data. Analyses were performed from June 2023 through September 2024. One-time or periodic (every 10 or 5 years) screening for albuminuria, initiated at ages between 35 and 75 years, with and without addition of SGLT2 inhibitors to conventional CKD therapy (angiotensin-converting enzyme inhibitors/angiotensin receptor blockers). Cumulative incidence of kidney failure requiring kidney replacement therapy (KRT); life years, quality-adjusted life years (QALYs), lifetime health care costs (2024 US currency), and incremental cost-effectiveness ratios discounted at 3% annually. For those aged 35 years, starting screening at age 55 years, and continuing every 5 years through age 75 years, combined with SGLT2 inhibitors, decreased the cumulative incidence of kidney failure requiring KRT from 2.4% to 1.9%, increased life expectancy by 0.13 years, and cost $128 400 per QALY gained. Although initiation of screening every 5 years at age 35 or 45 years yielded greater gains in population-wide health benefits, these strategies cost more than $200 000 per additional QALY gained. The comparative values of starting screening at different ages were sensitive to the cost and effectiveness of SGLT2 inhibitors; if SGLT2 inhibitor prices drop due to patent expirations, screening at age 55 years continued to be cost-effective even if SGLT2 inhibitor effectiveness were 30% lower than in the base case. This study found that, based on conventional benchmarks for cost-effectiveness in medicine, initiating population-wide CKD screening with SGLT2 inhibitors at age 55 years would be cost-effective.
View details for DOI 10.1001/jamahealthforum.2024.3892
View details for PubMedID 39514193
-
Mass incarceration as a driver of the tuberculosis epidemic in Latin America and projected effects of policy alternatives: a mathematical modelling study.
The Lancet. Public health
2024
Abstract
Tuberculosis incidence is increasing in Latin America, where the incarcerated population has nearly quadrupled since 1990. We aimed to quantify the impact of historical and future incarceration policies on the tuberculosis epidemic, accounting for effects in and beyond prisons. In this modelling study, we calibrated dynamic compartmental transmission models to historical and contemporary data from Argentina, Brazil, Colombia, El Salvador, Mexico, and Peru, which comprise approximately 80% of the region's incarcerated population and tuberculosis burden. The model was fit independently for each country to incarceration and tuberculosis data from 1990 to 2023 (specific dates were country dependent). The model does not include HIV, drug resistance, gender or sex, or age structure. Using historical counterfactual scenarios, we estimated the transmission population attributable fraction (tPAF) for incarceration and the excess population-level burden attributable to increasing incarceration prevalence since 1990. We additionally projected the effect of alternative incarceration policies on future population tuberculosis incidence. Population tuberculosis incidence in 2019 was 29·4% (95% uncertainty interval [UI] 23·9-36·8) higher than expected without the rise in incarceration since 1990, corresponding to 34 393 (28 295-42 579) excess incident cases across countries. The incarceration tPAF in 2019 was 27·2% (20·9-35·8), exceeding estimates for other risk factors like HIV, alcohol use disorder, and undernutrition. Compared with a scenario where incarceration rates remain stable at current levels, a gradual 50% reduction in prison admissions and duration of incarceration by 2034 would reduce population tuberculosis incidence by over 10% in all countries except Mexico. The historical rise in incarceration in Latin America has resulted in a large excess tuberculosis burden that has been under-recognised to date. International health agencies, ministries of justice, and national tuberculosis programmes should collaborate to address this health crisis with comprehensive strategies, including decarceration. National Institutes of Health.
View details for DOI 10.1016/S2468-2667(24)00192-0
View details for PubMedID 39419058
-
Cost-Effectiveness And Health Impact Of Increasing Emergency Department Pediatric Readiness In The US.
Health affairs (Project Hope)
2024; 43 (10): 1370-1378
Abstract
The quality of emergency department (ED) care for children in the US is highly variable. The National Pediatric Readiness Project aims to improve survival for children receiving emergency services. We conducted a cost-effectiveness analysis of increasing ED pediatric readiness, using a decision-analytic simulation model. Previously published primary analyses of a nationally representative, population-based cohort of children receiving emergency services at 747 EDs in eleven states provided clinical and cost parameters. From a health care sector perspective, we used a 3 percent annual discount rate and quantified lifetime costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). We performed probabilistic, one-way, and subgroup sensitivity analyses. Increasing ED pediatric readiness yields 69,100 QALYs for the eleven-state cohort, costing $9,300 per QALY gained. Achieving high readiness nationally yields 179,000 QALYs at the same ICER (with implementation costs of approximately $260 million). Implementing high ED pediatric readiness for all EDs in the US is highly cost-effective.
View details for DOI 10.1377/hlthaff.2023.01489
View details for PubMedID 39374456
-
Effects of a Triage Checklist to Optimize Insomnia Treatment Outcomes and Reduce Hypnotic Use: The RESTING Study.
Sleep
2024
Abstract
Evaluate a triaged stepped-care strategy among adults 50 and older with insomnia disorder. Participants (N=245) were classified at baseline by a Triage-Checklist. Those projected to do better if they start treatment with therapist versus digitally delivered CBT-I (tCBT-I versus dCBT-I) constituted the YES stratum (n=137); the rest constituted the NO stratum (n=108). Participants were randomized within stratum to a strategy that utilized only dCBT-I (ONLN) or to a strategy that prospectively allocated the first step of care to dCBT-I or tCBT-I based on the Triage-Checklist and switched dCBT-I non-responders at 2 months to tCBT-I (STEP). Co-primary outcomes were the insomnia severity index (ISI) and the average nightly amount of prescription hypnotic medications used (MEDS), assessed at 2, 4, 6, 9, and 12 months post-randomization. Mixed effects models revealed that, compared to ONLN, participants in STEP had greater reductions in ISI (p=0.001; η2=0.01) and MEDS (p=0.019, η2=0.01). Within the YES stratum, compared to ONLN, those in STEP had greater reductions in ISI (p=0.0001, η2=0.023) and MEDS (p=0.018, η2=0.01). Within the ONLN arm, compared to the YES stratum, those in the NO stratum had greater reductions in ISI (p=0.015, η2=0.01) but not in MEDS. Results did not change with treatment-dose covariate adjustment. Triaged stepped care can help guide allocation of limited CBT-I treatment resources to promote effective and safe treatment of chronic insomnia among middle-aged and older adults. Further refinement of the Triage-Checklist and optimization of the timing and switching criteria may improve the balance between effectiveness and use of resources.
View details for DOI 10.1093/sleep/zsae182
View details for PubMedID 39115347
-
Mass incarceration as a driver of the tuberculosis epidemic in Latin America and projected impacts of policy alternatives: A mathematical modeling study.
medRxiv : the preprint server for health sciences
2024
Abstract
Tuberculosis incidence is increasing in Latin America, where the incarcerated population has nearly quadrupled since 1990. The full impact of incarceration on the tuberculosis epidemic, accounting for effects beyond prisons, has never been quantified. We calibrated dynamic compartmental transmission models to historical and contemporary data from Argentina, Brazil, Colombia, El Salvador, Mexico, and Peru, which comprise approximately 80% of the region's incarcerated population and tuberculosis burden. Using historical counterfactual scenarios, we estimated the transmission population attributable fraction (tPAF) for incarceration and the excess population-level burden attributable to increasing incarceration prevalence since 1990. We additionally projected the impact of alternative incarceration policies on future population tuberculosis incidence. Population tuberculosis incidence in 2019 was 29.4% (95% UI, 23.9-36.8) higher than expected without the rise in incarceration since 1990, corresponding to 34,393 (95% UI, 28,295-42,579) excess incident cases across countries. The incarceration tPAF in 2019 was 27.2% (95% UI, 20.9-35.8), exceeding estimates for other risk factors like HIV, alcohol use disorder, and undernutrition. Compared to a scenario where incarceration rates remain stable at current levels, a gradual 50% reduction in prison admissions and duration of incarceration by 2034 would reduce population tuberculosis incidence by over 10% in all countries except Mexico. The historical rise in incarceration in Latin America has resulted in a large excess tuberculosis burden that has been under-recognized to date. International health agencies, ministries of justice, and national tuberculosis programs should collaborate to address this health crisis with comprehensive strategies, including decarceration. National Institutes of Health.
View details for DOI 10.1101/2024.04.23.24306238
View details for PubMedID 39108530
View details for PubMedCentralID PMC11302613
-
Changes in Emergency Department Pediatric Readiness and Mortality.
JAMA network open
2024; 7 (7): e2422107
Abstract
High emergency department (ED) pediatric readiness is associated with improved survival, but the impact of changes to ED readiness is unknown. To evaluate the association of changes in ED pediatric readiness at US trauma centers between 2013 and 2021 with pediatric mortality. This retrospective cohort study was performed from January 1, 2012, through December 31, 2021, at EDs of trauma centers in 48 states and the District of Columbia. Participants included injured children younger than 18 years with admission or injury-related death at a participating trauma center, including transfers to other trauma centers. Data analysis was performed from May 2023 to January 2024. Change in ED pediatric readiness, measured using the weighted Pediatric Readiness Score (wPRS, range 0-100, with higher scores denoting greater readiness) from national assessments in 2013 and 2021. Change groups included high-high (wPRS ≥93 on both assessments), low-high (wPRS <93 in 2013 and wPRS ≥93 in 2021), high-low (wPRS ≥93 in 2013 and wPRS <93 in 2021), and low-low (wPRS <93 on both assessments). The primary outcome was lives saved vs lost, according to ED and in-hospital mortality. The risk-adjusted association between changes in ED readiness and mortality was evaluated using a hierarchical, mixed-effects logistic regression model based on a standardized risk-adjustment model for trauma, with a random slope-random intercept to account for clustering by the initial ED. The primary sample included 467 932 children (300 024 boys [64.1%]; median [IQR] age, 10 [4 to 15] years; median [IQR] Injury Severity Score, 4 [4 to 15]) at 417 trauma centers. Observed mortality by ED readiness change group was 3838 deaths of 144 136 children (2.7%) in the low-low ED group, 1804 deaths of 103 767 children (1.7%) in the high-low ED group, 1288 deaths of 64 544 children (2.0%) in the low-high ED group, and 2614 deaths of 155 485 children (1.7%) in the high-high ED group. After risk adjustment, high-readiness EDs (persistent or change to) had 643 additional lives saved (95% CI, -328 to 1599 additional lives saved). Low-readiness EDs (persistent or change to) had 729 additional preventable deaths (95% CI, -373 to 1831 preventable deaths). Secondary analysis suggested that a threshold of wPRS 90 or higher may optimize the number of lives saved. Among 716 trauma centers that took both assessments, the median (IQR) wPRS decreased from 81 (63 to 94) in 2013 to 77 (64 to 93) in 2021 because of reductions in care coordination and quality improvement. Although the findings of this study of injured children in US trauma centers were not statistically significant, they suggest that trauma centers should increase their level of ED pediatric readiness to reduce mortality and increase the number of pediatric lives saved after injury.
View details for DOI 10.1001/jamanetworkopen.2024.22107
View details for PubMedID 39037816
View details for PubMedCentralID PMC11265139
-
Likelihood of COVID-19 Outbreaks in US Immigration and Customs Enforcement (ICE) Detention Centers, 2020‒2021.
American journal of public health
2024: e1-e4
Abstract
Objectives. To determine facility-level factors associated with COVID-19 outbreaks in US Immigration and Customs Enforcement (ICE) detention centers. Methods. We obtained COVID-19 case counts at 88 ICE detention facilities from May 6, 2020, through June 21, 2021, from the COVID Prison Project. We obtained information about facility population size, facility type (dedicated to immigrants or mixed with other incarcerated populations), and facility operator (public vs private contractor) from third-party sources. We defined the threshold for a COVID-19 outbreak as a cumulative 3-week incidence of 10% or more of the detained population. Results. Sixty-three facilities (72%) had at least 1 outbreak. Facilities with any outbreak were significantly more likely to be privately operated (P < .001), to have larger populations (113 vs 37; P = .002), and to have greater changes in their population size over the study period (‒56% vs -26%; P < .001). Conclusions. Several facility-level factors were associated with the occurrence of COVID-19 outbreaks in ICE facilities. Public Health Implications. Structural and organizational factors that promote respiratory infection spread in ICE facilities must be addressed to protect detainee health. (Am J Public Health. Published online ahead of print June 20, 2024:e1-e4. https://doi.org/10.2105/AJPH.2024.307704).
View details for DOI 10.2105/AJPH.2024.307704
View details for PubMedID 38900981
-
The hospital costs of high emergency department pediatric readiness.
Journal of the American College of Emergency Physicians open
2024; 5 (3): e13179
Abstract
We estimate annual hospital expenditures to achieve high emergency department (ED) pediatric readiness (HPR), that is, weighted Pediatric Readiness Score (wPRS) ≥ 88 (0-100 scale) across EDs with different pediatric volumes of children, overall and after accounting for current levels of readiness. We calculated the annual hospital costs of HPR based on two components: (1) ED pediatric equipment and supplies and (2) labor costs required for a Pediatric Emergency Care Coordinator (PECC) to perform pediatric readiness tasks. Data sources to generate labor cost estimates included: 2021 national salary information from U.S. Bureau of Labor Statistics, detailed patient and readiness data from 983 EDs in 11 states, the 2021 National Pediatric Readiness Project assessment; a national PECC survey; and a regional PECC survey. Data sources for equipment and supply costs included: purchasing costs from seven healthcare organizations and equipment usage per ED pediatric volume. We excluded costs of day-to-day ED operations (ie, direct clinical care and routine ED supplies). The total annual hospital costs for HPR ranged from $77,712 (95% CI 54,719-100,694) for low volume EDs to $279,134 (95% CI 196,487-362,179) for very high volume EDs; equipment costs accounted for 0.9-5.0% of expenses. The total annual cost-per-patient ranged from $3/child (95% CI 2-4/child) to $222/child (95% CI 156-288/child). After accounting for current readiness levels, the cost to reach HPR ranged from $23,775 among low volume EDs to $145,521 among high volume EDs, with costs per patient of $4/child to $48/child. Annual hospital costs for HPR are modest, particularly when considered per child.
View details for DOI 10.1002/emp2.13179
View details for PubMedID 38835787
View details for PubMedCentralID PMC11147684
-
Cost-effectiveness and public health impact of typhoid conjugate vaccine introduction strategies in Bangladesh.
Vaccine
2024
Abstract
Typhoid fever causes substantial morbidity and mortality in Bangladesh. The government of Bangladesh plans to introduce typhoid conjugate vaccines (TCV) in its expanded program on immunization (EPI) schedule. However, the optimal introduction strategy in addition to the costs and benefits of such a program are unclear. We extended an existing mathematical model of typhoid transmission to integrate cost data, clinical incidence data, and recently conducted serosurveys in urban, semi-urban, and rural areas. In our primary analysis, we evaluated the status quo (i.e., no vaccination) and eight vaccine introduction strategies including routine and 1-time campaign strategies, which differed by age groups targeted and geographic focus. Model outcomes included clinical incidence, seroincidence, deaths, costs, disability-adjusted life years (DALYs), and incremental cost-effectiveness ratios (ICERs) for each strategy. We adopted a societal perspective, 10-year model time horizon, and 3 % annual discount rate. We performed probabilistic, one-way, and scenario sensitivity analyses including adopting a healthcare perspective and alternate model time horizons. We projected that all TCV strategies would be cost saving compared to the status quo. The preferred strategy was a nationwide introduction of TCV at 9-12 months of age with a single catch-up campaign for children ages 1-15, which was cost saving compared to all other strategies and the status quo. In the 10 years following implementation, we projected this strategy would avert 3.77 million cases (95 % CrI: 2.60 - 5.18), 11.31 thousand deaths (95 % CrI: 3.77 - 23.60), and save $172.35 million (95 % CrI: -14.29 - 460.59) compared to the status quo. Our findings were broadly robust to changes in parameter values and willingness-to-pay thresholds. We projected that nationwide TCV introduction with a catch-up campaign would substantially reduce typhoid incidence and very likely be cost saving in Bangladesh.
View details for DOI 10.1016/j.vaccine.2024.03.035
View details for PubMedID 38531727
-
Estimated effectiveness and cost-effectiveness of opioid use disorder treatment under proposed U.S. regulatory relaxations: A model-based analysis.
Drug and alcohol dependence
2024; 256: 111112
Abstract
AIM: To assess the effectiveness and cost-effectiveness of buprenorphine and methadone treatment in the U.S. if exemptions expanding coverage for substance use disorder services via telehealth and allowing opioid treatment programs to supply a greater number of take-home doses of medications for opioid use disorder (OUD) continue (Notice of Proposed Rule Making, NPRM).DESIGN SETTING AND PARTICIPANTS: Model-based analysis of buprenorphine and methadone treatment for a cohort of 100,000 individuals with OUD, varying treatment retention and overdose risk among individuals receiving and not receiving methadone treatment compared to the status quo (no NPRM).INTERVENTION: Buprenorphine and methadone treatment under NPRM.MEASUREMENTS: Fatal and nonfatal overdoses and deaths over five years, discounted lifetime per person QALYs and costs.FINDINGS: For buprenorphine treatment under the status quo, 1.21 QALYs are gained at a cost of $19,200/QALY gained compared to no treatment; with 20% higher treatment retention, 1.28 QALYs are gained at a cost of $17,900/QALY gained compared to no treatment, and the strategy dominates the status quo. For methadone treatment under the status quo, 1.11 QALYs are gained at a cost of $17,900/QALY gained compared to no treatment. In all scenarios, methadone provision cost less than $20,000/QALY gained compared to no treatment, and less than $50,000/QALY gained compared to status quo methadone treatment.CONCLUSIONS: Buprenorphine and methadone OUD treatment under NPRM are likely to be effective and cost-effective. Increases in overdose risk with take-home methadone would reduce health benefits. Clinical and technological strategies could mitigate this risk.
View details for DOI 10.1016/j.drugalcdep.2024.111112
View details for PubMedID 38335797
-
Resource Utilization and Costs Associated with Approaches to Identify Infants with Early-Onset Sepsis.
MDM policy & practice
2024; 9 (1): 23814683231226129
Abstract
Objective. To compare resource utilization and costs associated with 3 alternative screening approaches to identify early-onset sepsis (EOS) in infants born at ≥35 wk of gestational age, as recommended by the American Academy of Pediatrics (AAP) in 2018. Study Design. Decision tree-based cost analysis of the 3 AAP-recommended approaches: 1) categorical risk assessment (categorization by chorioamnionitis exposure status), 2) neonatal sepsis calculator (a multivariate prediction model based on perinatal risk factors), and 3) enhanced clinical observation (assessment based on serial clinical examinations). We evaluated resource utilization and direct costs (2022 US dollars) to the health system. Results. Categorical risk assessment led to the greatest neonatal intensive care unit usage (210 d per 1,000 live births) and antibiotic exposure (6.8%) compared with the neonatal sepsis calculator (112 d per 1,000 live births and 3.6%) and enhanced clinical observation (99 d per 1,000 live births and 3.1%). While the per-live birth hospital costs of the 3 approaches were similar (categorical risk assessment cost $1,360, the neonatal sepsis calculator cost $1,317, and enhanced clinical observation cost $1,310), the cost of infants receiving intervention under categorical risk assessment was approximately twice that of the other 2 strategies. Results were robust to variations in data parameters. Conclusion. The neonatal sepsis calculator and enhanced clinical observation approaches may be preferred to categorical risk assessment as they reduce the number of infants receiving intervention and thus antibiotic exposure and associated costs. All 3 approaches have similar costs over all live births, and prior literature has indicated similar health outcomes. Inclusion of downstream effects of antibiotic exposure in the neonatal period should be evaluated within a cost-effectiveness analysis. Of the 3 approaches recommended by the American Academy of Pediatrics in 2018 to identify early-onset sepsis in infants born at ≥35 weeks, the categorical risk assessment approach leads to about twice as many infants receiving evaluation to rule out early-onset sepsis compared with the neonatal sepsis calculator and enhanced clinical observation approaches. While the hospital costs of the 3 approaches were similar over the entire population of live births, the neonatal sepsis calculator and enhanced clinical observation approaches reduce antibiotic exposure, neonatal intensive care unit admission, and hospital costs associated with interventions as part of the screening approach compared with the categorical risk assessment approach.
View details for DOI 10.1177/23814683231226129
View details for PubMedID 38293656
View details for PubMedCentralID PMC10826394
-
Population-Wide Screening for Chronic Kidney Disease.
Annals of internal medicine
2024; 177 (1): eL230370
View details for DOI 10.7326/L23-0370
View details for PubMedID 38224602
-
Bias-Adjusted Predictions of County-Level Vaccination Coverage from the COVID-19 Trends and Impact Survey.
Medical decision making : an international journal of the Society for Medical Decision Making
2023: 272989X231218024
Abstract
BACKGROUND: The potential for selection bias in nonrepresentative, large-scale, low-cost survey data can limit their utility for population health measurement and public health decision making. We developed an approach to bias adjust county-level COVID-19 vaccination coverage predictions from the large-scale US COVID-19 Trends and Impact Survey. DESIGN: We developed a multistep regression framework to adjust for selection bias in predicted county-level vaccination coverage plateaus. Our approach included poststratification to the American Community Survey, adjusting for differences in observed covariates, and secondary normalization to an unbiased reference indicator. As a case study, we prospectively applied this framework to predict county-level long-run vaccination coverage among children ages 5 to 11y. We evaluated our approach against an interim observed measure of 3-mo coverage for children ages 5 to 11y and used long-term coverage estimates to monitor equity in the pace of vaccination scale up. RESULTS: Our predictions suggested a low ceiling on long-term national vaccination coverage (46%), detected substantial geographic heterogeneity (ranging from 11% to 91% across counties in the United States), and highlighted widespread disparities in the pace of scale up in the 3 mo following Emergency Use Authorization of COVID-19 vaccination for 5- to 11-y-olds. LIMITATIONS: We relied on historical relationships between vaccination hesitancy and observed coverage, which may not capture rapid changes in the COVID-19 policy and epidemiologic landscape. CONCLUSIONS: Our analysis demonstrates an approach to leverage differing strengths of multiple sources of information to produce estimates on the time scale and geographic scale necessary for proactive decision making. IMPLICATIONS: Designing integrated health measurement systems that combine sources with different advantages across the spectrum of timeliness, spatial resolution, and representativeness can maximize the benefits of data collection relative to costs. HIGHLIGHTS: The COVID-19 pandemic catalyzed massive survey data collection efforts that prioritized timeliness and sample size over population representativeness. The potential for selection bias in these large-scale, low-cost, nonrepresentative data has led to questions about their utility for population health measurement. We developed a multistep regression framework to bias adjust county-level vaccination coverage predictions from the largest public health survey conducted in the United States to date: the US COVID-19 Trends and Impact Survey. Our study demonstrates the value of leveraging differing strengths of multiple data sources to generate estimates on the time scale and geographic scale necessary for proactive public health decision making.
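The poststratification step described above can be written compactly; the notation here ($N_g$, $\hat{\theta}_g$) is illustrative and not taken from the paper:

$$\hat{\theta}_{\text{post}} \;=\; \sum_{g} \frac{N_g}{N}\,\hat{\theta}_g,$$

where $\hat{\theta}_g$ is the survey-based coverage estimate for demographic cell $g$, $N_g$ is that cell's population count from the American Community Survey, and $N=\sum_g N_g$ is the county population. As the abstract notes, the reweighted estimate is then further adjusted by normalizing against an unbiased reference indicator.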
View details for DOI 10.1177/0272989X231218024
View details for PubMedID 38159263
-
Effects of Mitigation and Control Policies in Realistic Epidemic Models Accounting for Household Transmission Dynamics.
Medical decision making : an international journal of the Society for Medical Decision Making
2023: 272989X231205565
Abstract
Compartmental infectious disease (ID) models are often used to evaluate nonpharmaceutical interventions (NPIs) and vaccines. Such models rarely separate within-household and community transmission, potentially introducing biases in situations in which multiple transmission routes exist. We formulated an approach that incorporates household structure into ID models, extending the work of House and Keeling. We developed a multicompartment susceptible-exposed-infectious-recovered-susceptible-vaccinated (MC-SEIRSV) modeling framework, allowing nonexponentially distributed duration in exposed and infectious compartments, that tracks within-household and community transmission. We simulated epidemics that varied by community and household transmission rates, waning immunity rate, household size (3 or 5 members), and numbers of exposed and infectious compartments (1-3 each). We calibrated otherwise identical models without household structure to the early phase of each parameter combination's epidemic curve. We compared each model pair in terms of epidemic forecasts and predicted NPI and vaccine impacts on the timing and magnitude of the epidemic peak and its total size. Meta-analytic regressions characterized the relationship between household structure inclusion and the size and direction of biases. Otherwise similar models with and without household structure produced equivalent early epidemic curves. However, forecasts from models without household structure were biased. Without intervention, they were upward biased on peak size and total epidemic size, with biases also depending on the number of exposed and infectious compartments. Model-estimated NPI effects of a 60% reduction in community contacts on peak time and size were systematically overestimated without household structure. Biases were smaller with a 20% reduction NPI. Because vaccination affected both community and household transmission, their biases were smaller. ID models without household structure can produce biased outcomes in settings in which within-household and community transmission differ. Infectious disease models rarely separate household transmission from community transmission. The pace of household transmission may differ from community transmission, depends on household size, and can accelerate epidemic growth. Many infectious disease models assume exponential duration distributions for infected states. However, the duration of most infections is not exponentially distributed, and distributional choice alters modeled epidemic dynamics and intervention effectiveness. We propose a mathematical framework for household and community transmission that allows for nonexponential duration times and a suite of interventions and quantified the effect of accounting for household transmission by varying household size and duration distributions of infected states on modeled epidemic dynamics. Failure to include household structure induces biases in the modeled overall course of an epidemic and the effects of interventions delivered differentially in community settings. Epidemic dynamics are faster and more intense in populations with larger household sizes and for diseases with nonexponentially distributed infectious durations. Modelers should consider explicitly incorporating household structure to quantify the effects of non-pharmaceutical interventions (e.g., shelter-in-place).
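The "multiple exposed and infectious compartments" device mentioned above is the standard linear-chain (Erlang) trick for nonexponential stage durations. The sketch below is illustrative only: it is not the authors' MC-SEIRSV code, it omits household structure, waning immunity, and vaccination, and every parameter value is an assumption made for the example.

```python
# Minimal sketch of the linear-chain (Erlang) trick for nonexponential stage
# durations in an SEIR-type model. Not the authors' MC-SEIRSV code; household
# structure, waning immunity, and vaccination are omitted, and all parameter
# values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def seir_erlang(t, y, beta, sigma, gamma, kE, kI):
    """SEIR ODEs with kE exposed and kI infectious sub-compartments."""
    S = y[0]
    E = y[1:1 + kE]
    I = y[1 + kE:1 + kE + kI]
    infectious = I.sum()
    dS = -beta * S * infectious
    dE = np.empty(kE)
    dE[0] = beta * S * infectious - kE * sigma * E[0]
    for j in range(1, kE):
        dE[j] = kE * sigma * (E[j - 1] - E[j])
    dI = np.empty(kI)
    dI[0] = kE * sigma * E[-1] - kI * gamma * I[0]
    for j in range(1, kI):
        dI[j] = kI * gamma * (I[j - 1] - I[j])
    dR = kI * gamma * I[-1]
    return np.concatenate(([dS], dE, dI, [dR]))

kE, kI = 3, 3                                   # 3 exposed, 3 infectious sub-stages
y0 = np.concatenate(([0.999, 0.001], np.zeros(kE - 1 + kI + 1)))
sol = solve_ivp(seir_erlang, (0.0, 180.0), y0,
                args=(0.4, 1 / 4, 1 / 6, kE, kI))  # beta, 1/latent, 1/infectious
peak_prevalence = sol.y[1 + kE:1 + kE + kI].sum(axis=0).max()
print(f"Peak infectious prevalence: {peak_prevalence:.3f}")
```

Splitting a stage into k sub-compartments, each left at rate k times the original rate, keeps the mean stage duration unchanged but shrinks its variance (the total duration becomes Erlang rather than exponential), which is the distributional choice the abstract says alters epidemic dynamics and intervention effects.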
View details for DOI 10.1177/0272989X231205565
View details for PubMedID 37953597
-
Establishing Costs for Commercial Chimeric Antigen Receptor T-Cell (Tisagenlecleucel; Kymriah) in Children and Young Adult B-Cell Acute Lymphoblastic Leukemia; A Merged Analysis from the Prwcc and PHIS
AMER SOC HEMATOLOGY. 2023
View details for DOI 10.1182/blood-2023-187462
View details for Web of Science ID 001159900806281
-
Pricing Treatments Cost-Effectively when They Have Multiple Indications: Not Just a Simple Threshold Analysis.
Medical decision making : an international journal of the Society for Medical Decision Making
2023: 272989X231197772
Abstract
Economic evaluations of treatments increasingly employ price-threshold analyses. When a treatment has multiple indications, standard price-threshold analyses can be overly simplistic. We examine how rules governing indication-specific prices and reimbursement decisions affect value-based price analyses. We analyze a 2-stage game between 2 players: the therapy's manufacturer and the payer purchasing it for patients. First, the manufacturer selects a price(s) that may be indication specific. Then, the payer decides whether to provide reimbursement at the offered price(s). We assume known indication-specific demand. The manufacturer seeks to maximize profit. The payer seeks to maximize total population incremental net monetary benefit and will not pay more than their willingness-to-pay threshold. We consider game variants defined by constraints on the manufacturer's ability to price and payer's ability to provide reimbursement differentially by indication. When both the manufacturer and payer can make indication-specific decisions, the problem simplifies to multiple single-indication price-threshold analyses, and the manufacturer captures all the consumer surplus. When the manufacturer is restricted to one price and the payer must make an all-or-nothing reimbursement decision, the selected price is a weighted average of indication-specific threshold prices such that reimbursement of more valuable indications subsidizes reimbursement of less valuable indications. With a single price and indication-specific coverage decisions, the manufacturer may select a high price where fewer patients receive treatment because the payer restricts reimbursement to the set of indications providing value commensurate with the high price. However, the manufacturer may select a low price, resulting in reimbursement for more indications and positive consumer surplus. When treatments have multiple indications, economic evaluations including price-threshold analyses should carefully consider jurisdiction-specific rules regarding pricing and reimbursement decisions. With treatment prices rising, economic evaluations increasingly employ price-threshold analyses to identify value-based prices. Standard price-threshold analyses can be overly simplistic when treatments have multiple indications. Jurisdiction-specific rules governing indication-specific prices and reimbursement decisions affect value-based price analyses. When the manufacturer is restricted to one price for all indications and the payer must make an all-or-nothing reimbursement decision, the selected price is a weighted average of indication-specific threshold prices such that reimbursement of the more valuable indications subsidizes reimbursement of the less valuable indications. With a single price and indication-specific coverage decisions, the manufacturer may select a high price with fewer patients treated than in the first-best solution. There are also cases in which the manufacturer selects a lower price, resulting in reimbursement for more indications and positive consumer surplus.
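For the single-price, all-or-nothing variant described above, the payer's break-even condition makes the weighted average explicit. The notation ($d_i$, $p_i^{*}$) is introduced here for illustration and is not the paper's:

$$p^{*} \;=\; \frac{\sum_i d_i\, p_i^{*}}{\sum_i d_i},$$

where $d_i$ is the known demand for indication $i$ and $p_i^{*}$ is the indication-specific threshold price at which that indication's incremental net monetary benefit is exactly zero. At $p^{*}$ the payer's total population incremental net monetary benefit is zero, so indications with $p_i^{*} > p^{*}$ effectively cross-subsidize those with $p_i^{*} < p^{*}$, as the abstract notes.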
View details for DOI 10.1177/0272989X231197772
View details for PubMedID 37698120
-
Emergency Department Pediatric Readiness and Disparities in Mortality Based on Race and Ethnicity.
JAMA network open
2023; 6 (9): e2332160
Abstract
Presentation to emergency departments (EDs) with high levels of pediatric readiness is associated with improved pediatric survival. However, it is unclear whether children of all races and ethnicities benefit equitably from increased levels of such readiness. To evaluate the association of ED pediatric readiness with in-hospital mortality among children of different races and ethnicities with traumatic injuries or acute medical emergencies. This cohort study of children requiring emergency care in 586 EDs across 11 states was conducted from January 1, 2012, through December 31, 2017. Eligible participants included children younger than 18 years who were hospitalized for an acute medical emergency or traumatic injury. Data analysis was conducted between November 2022 and April 2023. Hospitalization for acute medical emergency or traumatic injury. The primary outcome was in-hospital mortality. ED pediatric readiness was measured through the weighted Pediatric Readiness Score (wPRS) from the 2013 National Pediatric Readiness Project assessment and categorized by quartile. Multivariable, hierarchical, mixed-effects logistic regression was used to evaluate the association of race and ethnicity with in-hospital mortality. The cohort included 633 536 children (median [IQR] age 4 [0-12] years). There were 557 537 children (98 504 Black [17.7%], 167 838 Hispanic [30.1%], 311 157 White [55.8%], and 147 876 children of other races or ethnicities [26.5%]) who were hospitalized for acute medical emergencies, of whom 5158 (0.9%) died; 75 999 children (12 727 Black [16.7%], 21 604 Hispanic [28.4%], 44 203 White [58.2%]; and 21 609 of other races and ethnicities [27.7%]) were hospitalized for traumatic injuries, of whom 1339 (1.8%) died. Adjusted mortality of Black children with acute medical emergencies was significantly greater than that of Hispanic children, White children, and of children of other races and ethnicities (odds ratio [OR], 1.69; 95% CI, 1.59-1.79) across all quartile levels of ED pediatric readiness; but there were no racial or ethnic disparities in mortality when comparing Black children with traumatic injuries with Hispanic children, White children, and children of other races and ethnicities with traumatic injuries (OR 1.01; 95% CI, 0.89-1.15). When compared with hospitals in the lowest quartile of ED pediatric readiness, children who were treated at hospitals in the highest quartile had significantly lower mortality in both the acute medical emergency cohort (OR 0.24; 95% CI, 0.16-0.36) and traumatic injury cohort (OR, 0.39; 95% CI, 0.25-0.61). The greatest survival advantage associated with high pediatric readiness was experienced for Black children in the acute medical emergency cohort. In this study, racial and ethnic disparities in mortality existed among children treated for acute medical emergencies but not traumatic injuries. Increased ED pediatric readiness was associated with reduced disparities; it was estimated that increasing the ED pediatric readiness levels of hospitals in the 3 lowest quartiles would result in an estimated 3-fold reduction in disparity for pediatric mortality. However, increased pediatric readiness did not eliminate disparities, indicating that organizations and initiatives dedicated to increasing ED pediatric readiness should consider formal integration of health equity into efforts to improve pediatric emergency care.
View details for DOI 10.1001/jamanetworkopen.2023.32160
View details for PubMedID 37669053
-
The cost of emergency care for children across differing levels of emergency department pediatric readiness.
Health affairs scholar
2023; 1 (1): qxad015
Abstract
High emergency department (ED) pediatric readiness is associated with improved survival in children, but the cost is unknown. We evaluated the costs of emergency care for children across quartiles of ED pediatric readiness. This was a retrospective cohort study of children aged 0-17 years receiving emergency services in 747 EDs in 9 states from January 1, 2012, through December 31, 2017. We measured ED pediatric readiness using the weighted Pediatric Readiness Score (range: 0-100). The primary outcome was the total cost of acute care (ED and inpatient) in 2022 dollars, adjusted for ED case mix and hospital characteristics. A total of 15 138 599 children received emergency services, including 27.6% with injuries and 72.4% with acute medical illness. The average adjusted per-patient cost by quartile of ED pediatric readiness ranged from $991 (quartile 1) to $1064 (quartile 4) for injured children and $1104-$1217 for medical children. The resulting cost differences were $72 (95% CI: -$6 to $151) and $113 (95% CI: $20-$206), respectively. Receiving emergency care in high-readiness EDs was not associated with marked increases in the cost of delivering services.
View details for DOI 10.1093/haschl/qxad015
View details for PubMedID 38756836
View details for PubMedCentralID PMC10986251
-
Cost-effectiveness of trastuzumab deruxtecan in HER2 low metastatic breast cancer
LIPPINCOTT WILLIAMS & WILKINS. 2023
View details for Web of Science ID 001053772000312
-
Population-Wide Screening for Chronic Kidney Disease : A Cost-Effectiveness Analysis.
Annals of internal medicine
2023
Abstract
BACKGROUND: Sodium-glucose cotransporter-2 (SGLT2) inhibitors have the potential to alter the natural history of chronic kidney disease (CKD), and they should be included in cost-effectiveness analyses of screening for CKD. OBJECTIVE: To determine the cost-effectiveness of adding population-wide screening for CKD. DESIGN: Markov cohort model. DATA SOURCES: NHANES (National Health and Nutrition Examination Survey), U.S. Centers for Medicare & Medicaid Services data, cohort studies, and randomized clinical trials, including the DAPA-CKD (Dapagliflozin and Prevention of Adverse Outcomes in Chronic Kidney Disease) trial. TARGET POPULATION: Adults. TIME HORIZON: Lifetime. PERSPECTIVE: Health care sector. INTERVENTION: Screening for albuminuria with and without adding SGLT2 inhibitors to the current standard of care for CKD. OUTCOME MEASURES: Costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs), all discounted at 3% annually. RESULTS OF BASE-CASE ANALYSIS: One-time CKD screening at age 55 years had an ICER of $86,300 per QALY gained by increasing costs from $249,800 to $259,000 and increasing QALYs from 12.61 to 12.72; this was accompanied by a decrease in the incidence of kidney failure requiring dialysis or kidney transplant of 0.29 percentage points and an increase in life expectancy from 17.29 to 17.45 years. Other options were also cost-effective. During ages 35 to 75 years, screening once prevented dialysis or transplant in 398,000 people and screening every 10 years until age 75 years cost less than $100,000 per QALY gained. RESULTS OF SENSITIVITY ANALYSIS: When SGLT2 inhibitors were 30% less effective, screening every 10 years during ages 35 to 75 years cost between $145,400 and $182,600 per QALY gained, and price reductions would be required for screening to be cost-effective. LIMITATION: The efficacy of SGLT2 inhibitors was derived from a single randomized controlled trial. CONCLUSION: Screening adults for albuminuria to identify CKD could be cost-effective in the United States. PRIMARY FUNDING SOURCE: Agency for Healthcare Research and Quality, Veterans Affairs Office of Academic Affiliations, and National Institute of Diabetes and Digestive and Kidney Diseases.
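As a quick check of the base-case figure reported above, the ICER is simply the incremental cost divided by the incremental QALYs; plugging in the rounded values shown here gives a number close to the published one, with the small gap plausibly reflecting rounding of the reported inputs:

$$\mathrm{ICER}=\frac{\Delta C}{\Delta Q}=\frac{\$259{,}000-\$249{,}800}{12.72-12.61}\approx \$84{,}000 \text{ per QALY gained},$$

versus the reported $86,300 per QALY.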
View details for DOI 10.7326/M22-3228
View details for PubMedID 37216661
-
Deceased Donor Kidney Transplantation for Older Transplant Candidates: A New Microsimulation Model for Determining Risks and Benefits.
Medical decision making : an international journal of the Society for Medical Decision Making
2023: 272989X231172169
Abstract
Under the current US kidney allocation system, older candidates receive a disproportionately small share of deceased donor kidneys despite a reserve of potentially usable kidneys that could shorten their wait times. To consider potential health gains from increasing access to kidneys for these candidates, we developed and calibrated a microsimulation model of the transplantation process and long-term outcomes for older deceased donor kidney transplant candidates. We estimated risk equations for transplant outcomes using the Scientific Registry of Transplant Recipients (SRTR), which contains data on all US transplants (2010-2019). A microsimulation model combined these equations to account for competing events. We calibrated the model to key transplant outcomes and used acceptance sampling, retaining the best-fitting 100 parameter sets. We then examined life expectancy gains from allocating kidneys even of lower quality across patient subgroups defined by age and designated race/ethnicity. The best-fitting 100 parameter sets (among 4,000,000 sampled) enabled our model to closely match key transplant outcomes. The model demonstrated clear survival benefits for those who receive a deceased donor kidney, even a lower quality one, compared with remaining on the waitlist where there is a risk of removal. The expected gain in survival from receiving a lower quality donor kidney was consistent across age and race/ethnic subgroups. Limited available data on socioeconomic factors. Our microsimulation model accurately replicates a range of key kidney transplant outcomes among older candidates and demonstrates that older candidates may derive substantial benefits from transplantation with lower quality kidneys. This model can be used to evaluate policies that have been proposed to address concerns that the current system disincentivizes deceased donor transplants for older patients. The microsimulation model was consistent with the data after calibration and accurately simulated the transplantation process for older deceased donor kidney transplant candidates. There are clear survival benefits for older transplant candidates who receive deceased donor kidneys, even lower quality ones, compared with remaining on the waitlist. This model can be used to evaluate policies aimed at increasing transplantation among older candidates.
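The calibration step described above (draw many candidate parameter sets, score each against calibration targets, keep the best-fitting 100) can be sketched in a few lines. This is an illustrative stand-in rather than the authors' model: the toy run_model function, the target values, the parameter ranges, and the reduced number of draws are all placeholder assumptions.

```python
# Illustrative acceptance-sampling calibration: draw candidate parameter sets,
# score each against calibration targets, and retain the 100 best-fitting sets.
# The model, targets, and ranges below are placeholders, not the paper's; the
# paper sampled 4,000,000 sets, reduced here so the example runs quickly.
import numpy as np

rng = np.random.default_rng(42)
N_DRAWS, N_KEEP = 100_000, 100

targets = np.array([0.62, 0.15])          # hypothetical calibration targets

def run_model(theta):
    """Toy stand-in for the microsimulation: maps parameters to modeled outcomes."""
    return np.array([0.9 * theta[0] + 0.1 * theta[1], 0.5 * theta[1]])

draws = rng.uniform(low=[0.0, 0.0], high=[1.0, 1.0], size=(N_DRAWS, 2))
modeled = np.stack([run_model(t) for t in draws])      # shape (N_DRAWS, 2)
scores = ((modeled - targets) ** 2).sum(axis=1)        # goodness-of-fit per draw
best_sets = draws[np.argsort(scores)[:N_KEEP]]         # keep the top 100
print(best_sets.shape)                                 # (100, 2)
```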
View details for DOI 10.1177/0272989X231172169
View details for PubMedID 37170943
-
State Perinatal Quality Collaborative for Reducing Severe Maternal Morbidity From Hemorrhage: A Cost-Effectiveness Analysis.
Obstetrics and gynecology
2023
Abstract
To evaluate the cost effectiveness of California's statewide perinatal quality collaborative for reducing severe maternal morbidity (SMM) from hemorrhage. A decision-analytic model using open source software (Amua 0.30) compared outcomes and costs within a simulated cohort of 480,000 births to assess the annual effect in the state of California. Our model captures both the short-term costs and outcomes that surround labor and delivery and long-term effects over a person's remaining lifetime. Previous studies that evaluated the effectiveness of the CMQCC's (California Maternal Quality Care Collaborative) statewide perinatal quality collaborative initiative (reduction of hemorrhage-related SMM by increasing recognition, measurement, and timely response to postpartum hemorrhage) provided estimates of intervention effectiveness. Primary cost data received from select hospitals within the study allowed for the estimation of collaborative costs, with all other model inputs derived from literature. Costs were inflated to 2021 dollars with a cost-effectiveness threshold of $100,000 per quality-adjusted life-year (QALY) gained. Various sensitivity analyses were performed including one-way, scenario-based, and probabilistic sensitivity (Monte Carlo) analysis. The collaborative was cost effective, exhibiting strong dominance when compared with the baseline or standard of care. In a theoretical cohort of 480,000 births, collaborative implementation added 182 QALYs (0.000379/birth) by averting 913 cases of SMM, 28 emergency hysterectomies, and one maternal mortality. Additionally, it saved $9 million ($17.78/birth) due to averted SMM costs. Although sensitivity analyses across parameter uncertainty ranges provided cases where the intervention was not cost saving, it remained cost effective throughout all analyses. Additionally, scenario-based sensitivity analysis found the intervention cost effective regardless of birth volume and implementation costs. California's statewide perinatal quality collaborative initiative to reduce SMM from hemorrhage was cost effective, representing an inexpensive quality-improvement initiative that reduces the incidence of maternal morbidity and mortality, and potentially provides cost savings to the majority of birthing hospitals.
View details for DOI 10.1097/AOG.0000000000005060
View details for PubMedID 36649352
-
Emergency Department Pediatric Readiness and Short-term and Long-term Mortality Among Children Receiving Emergency Care.
JAMA network open
2023; 6 (1): e2250941
Abstract
Emergency departments (EDs) with high pediatric readiness (coordination, personnel, quality improvement, safety, policies, and equipment) are associated with lower mortality among children with critical illness and those admitted to trauma centers, but the benefit among children with more diverse clinical conditions is unknown. To evaluate the association between ED pediatric readiness, in-hospital mortality, and 1-year mortality among injured and medically ill children receiving emergency care in 11 states. This is a retrospective cohort study of children receiving emergency care at 983 EDs in 11 states from January 1, 2012, through December 31, 2017, with follow-up for a subset of children through December 31, 2018. Participants included children younger than 18 years admitted, transferred to another hospital, or dying in the ED, stratified by injury vs medical conditions. Data analysis was performed from November 1, 2021, through June 30, 2022. ED pediatric readiness of the initial ED, measured through the weighted Pediatric Readiness Score (wPRS; range, 0-100) from the 2013 National Pediatric Readiness Project assessment. The primary outcome was in-hospital mortality, with a secondary outcome of time to death to 1 year among children in 6 states. There were 796 937 children, including 90 963 (11.4%) in the injury cohort (mean [SD] age, 9.3 [5.8] years; median [IQR] age, 10 [4-15] years; 33 516 [36.8%] female; 1820 [2.0%] deaths) and 705 974 (88.6%) in the medical cohort (mean [SD] age, 5.8 [6.1] years; median [IQR] age, 3 [0-12] years; 329 829 [46.7%] female, 7688 [1.1%] deaths). Among the 983 EDs, the median (IQR) wPRS was 73 (59-87). Compared with EDs in the lowest quartile of ED readiness (quartile 1, wPRS of 0-58), initial care in a quartile 4 ED (wPRS of 88-100) was associated with 60% lower in-hospital mortality among injured children (adjusted odds ratio, 0.40; 95% CI, 0.26-0.60) and 76% lower mortality among medical children (adjusted odds ratio, 0.24; 95% CI, 0.17-0.34). Among 545 921 children followed to 1 year, the adjusted hazard ratio of death in quartile 4 EDs was 0.59 (95% CI, 0.42-0.84) for injured children and 0.34 (95% CI, 0.25-0.45) for medical children. If all EDs were in the highest quartile of pediatric readiness, an estimated 288 injury deaths (95% CI, 281-297 injury deaths) and 1154 medical deaths (95% CI, 1150-1159 medical deaths) may have been prevented. These findings suggest that children with injuries and medical conditions treated in EDs with high pediatric readiness had lower mortality during hospitalization and to 1 year.
View details for DOI 10.1001/jamanetworkopen.2022.50941
View details for PubMedID 36637819
-
Protection against Omicron from Vaccination and Previous Infection
NEW ENGLAND JOURNAL OF MEDICINE
2022
View details for Web of Science ID 000912237600002
-
Protection against Omicron from Vaccination and Previous Infection. Reply.
The New England journal of medicine
2022; 388 (1)
View details for DOI 10.1056/NEJMc2214627
View details for PubMedID 36546676
-
Estimates of Quality-Adjusted Life-Year Loss for Injuries in the United States.
Medical decision making : an international journal of the Society for Medical Decision Making
2022: 272989X221141454
Abstract
The goal of this study is to develop an approach for estimating nationally representative quality-adjusted life-year (QALY) loss from injury and poisoning conditions using data collected in the Medical Expenditures Panel Survey (MEPS) and the National Health Interview Survey (NHIS). This study uses data from the 2002-2015 NHIS and MEPS surveys. Injuries were identified in the MEPS medical events file and through self-reporting of medical conditions. We restricted our model to 163,731 adults, for which we predict a total of 294,977 EQ-5D scores using responses to the self-administered questionnaire. EQ-5D scores were modeled using age, sex, comorbidities, and binary indicators of the presence and duration of injury at the time of the health status questionnaire. These models consider nonlinearity over time during the first 3 y following the injury event. Injuries are identified in MEPS using medical events that provide a reasonable proxy for the date of injury occurrence. Health-related quality of life (HRQL) decrements can be estimated using binary indicators of injury during different time periods. When grouped into 29 injury categories, most categories were statistically significant predictors of HRQL scores in the first year after injury. For these groups of injuries, mean first-year QALY loss estimates range from 0.005 (sprains and strains of joints and adjacent muscles, n = 7067) to 0.109 (injury to nerves and spinal cord, n = 71). Fewer estimates are significant in the second and third years after injury, which may reflect a return to baseline HRQL. This research presents both a framework for estimating QALY loss for short-lived medical conditions and a catalog of nationally representative, community-based, preference-based EQ-5D score decrements associated with surviving a wide variety of injury and poisoning conditions, based on patient-reported health status. This article also presents a novel methodology for assessing quality-of-life impacts for acute conditions by calculating the time elapsed between injury and health status elicitation. Researchers may explore adapting these methods to study other short-lived conditions and health states, such as COVID-19 or chemotherapy.
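A minimal sketch of the kind of regression described here, written with statsmodels; the data frame and column names (EQ-5D score, demographics, comorbidity count, and year-since-injury indicators) are hypothetical placeholders, not the study's actual variable names or code:

```python
# Sketch only: EQ-5D scores regressed on demographics, comorbidities, and binary
# indicators for the presence of injury in each year since the event.
# Column names are hypothetical; 'df' stands in for the linked MEPS/NHIS panel.
import pandas as pd
import statsmodels.formula.api as smf

def fit_hrql_decrement_model(df: pd.DataFrame):
    model = smf.ols(
        "eq5d ~ age + female + n_comorbidities"
        " + injury_year1 + injury_year2 + injury_year3",
        data=df,
    ).fit()
    # The coefficient on injury_year1 approximates the first-year HRQL decrement.
    return model

# usage (hypothetical data frame):
# results = fit_hrql_decrement_model(df)
# print(results.params["injury_year1"])
```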
View details for DOI 10.1177/0272989X221141454
View details for PubMedID 36482721
-
Protection against Omicron from Vaccination and Previous Infection in a Prison System.
The New England journal of medicine
2022
Abstract
Information regarding the protection conferred by vaccination and previous infection against infection with the B.1.1.529 (omicron) variant of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is limited. We evaluated the protection conferred by mRNA vaccines and previous infection against infection with the omicron variant in two high-risk populations: residents and staff in the California state prison system. We used a retrospective cohort design to analyze the risk of infection during the omicron wave using data collected from December 24, 2021, through April 14, 2022. Weighted Cox models were used to compare the effectiveness (measured as 1 minus the hazard ratio) of vaccination and previous infection across combinations of vaccination history (stratified according to the number of mRNA doses received) and infection history (none or infection before or during the period of B.1.617.2 [delta]-variant predominance). A secondary analysis used a rolling matched-cohort design to evaluate the effectiveness of three vaccine doses as compared with two doses. Among 59,794 residents and 16,572 staff, the estimated effectiveness of previous infection against omicron infection among unvaccinated persons who had been infected before or during the period of delta predominance ranged from 16.3% (95% confidence interval [CI], 8.1 to 23.7) to 48.9% (95% CI, 41.6 to 55.3). Depending on previous infection status, the estimated effectiveness of vaccination (relative to being unvaccinated and without previous documented infection) ranged from 18.6% (95% CI, 7.7 to 28.1) to 83.2% (95% CI, 77.7 to 87.4) with two vaccine doses and from 40.9% (95% CI, 31.9 to 48.7) to 87.9% (95% CI, 76.0 to 93.9) with three vaccine doses. Incremental effectiveness estimates of a third (booster) dose (relative to two doses) ranged from 25.0% (95% CI, 16.6 to 32.5) to 57.9% (95% CI, 48.4 to 65.7) among persons who either had not had previous documented infection or had been infected before the period of delta predominance. Our findings in two high-risk populations suggest that mRNA vaccination and previous infection were effective against omicron infection, with lower estimates among those infected before the period of delta predominance. Three vaccine doses offered significantly more protection than two doses, including among previously infected persons.
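The effectiveness measure used here is one minus the hazard ratio from the weighted Cox models; for example, the highest three-dose estimate corresponds to the following implied hazard ratio:

```latex
% Effectiveness as defined in the abstract (1 minus the hazard ratio)
\[
\widehat{VE} = 1 - \widehat{HR}, \qquad
\widehat{VE} = 87.9\% \;\Longleftrightarrow\; \widehat{HR} = 0.121
\quad \text{(three doses, infection before delta predominance).}
\]
```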
View details for DOI 10.1056/NEJMoa2207082
View details for PubMedID 36286260
-
Comparative inequalities in child dental caries across four countries: Examination of international birth cohorts and implications for oral health policy.
PloS one
2022; 17 (8): e0268899
Abstract
Child dental caries (i.e., cavities) are a major preventable health problem in most high-income countries. The aim of this study was to compare the extent of inequalities in child dental caries across four high-income countries alongside their child oral health policies. Coordinated analyses of data were conducted across four prospective population-based birth cohorts (Australia, n = 4085, born 2004; Québec, Canada, n = 1253, born 1997; Rotterdam, the Netherlands, n = 6690, born 2002; Southeast Sweden, n = 7445, born 1997), which enabled a high degree of harmonization. Risk ratios (adjusted) and slope indexes of inequality were estimated to quantify social gradients in child dental caries according to maternal education and household income. Children in the least advantaged quintile for income were at greater risk of caries, compared to the most advantaged quintile (Australia: AdjRR = 1.18, 95%CI = 1.04-1.34; Québec: AdjRR = 1.69, 95%CI = 1.36-2.10; Rotterdam: AdjRR = 1.67, 95%CI = 1.36-2.04; Southeast Sweden: AdjRR = 1.37, 95%CI = 1.10-1.71). There was a higher risk of caries for children of mothers with the lowest level of education, compared to the highest (Australia: AdjRR = 1.18, 95%CI = 1.01-1.38; Southeast Sweden: AdjRR = 2.31, 95%CI = 1.81-2.96; Rotterdam: AdjRR = 1.98, 95%CI = 1.71-2.30; Québec: AdjRR = 1.16, 95%CI = 0.98-1.37). The extent of inequalities varied in line with jurisdictional policies for provision of child oral health services and preventive public health measures. Clear gradients of social inequalities in child dental caries are evident in high-income countries. Policy-related mechanisms may contribute to the differences in the extent of these inequalities. Lesser gradients in settings with combinations of universal dental coverage and/or fluoridation suggest these provisions may ameliorate inequalities through additional benefits for socio-economically disadvantaged groups of children.
View details for DOI 10.1371/journal.pone.0268899
View details for PubMedID 36044409
-
Dynamics of Respiratory Infectious Diseases in Incarcerated and Free-Living Populations: A Simulation Modeling Study.
Medical decision making : an international journal of the Society for Medical Decision Making
2022: 272989X221115364
Abstract
Historically, correctional facilities have had large outbreaks of respiratory infectious diseases like COVID-19. Hence, importation and exportation of such diseases from correctional facilities raise substantial concern. We developed a stochastic simulation model of transmission of respiratory infectious diseases within and between correctional facilities and the community. We investigated the infection dynamics, key governing factors, and relative importance of different infection routes (e.g., incarcerations and releases versus correctional staff). We also developed machine-learning meta-models of the simulation model, which allowed us to examine how our findings depended on different disease, correctional facility, and community characteristics. We find a magnification-reflection dynamic: a small outbreak in the community can cause a larger outbreak in the correctional facility, which can then cause a second, larger outbreak in the community. This dynamic is strongest when the community size is relatively small compared with the size of the correctional population, the initial community R-effective is near 1, and the initial prevalence of immunity in the correctional population is low. The timing of the correctional magnification and community reflection peaks in infection prevalence is primarily governed by the initial R-effective for each setting. Because release rates from prisons are low, our model suggests correctional staff may be a more important infection entry route into prisons than incarcerations and releases; in jails, where incarceration and release rates are much higher, our model suggests the opposite. We find that across many combinations of respiratory pathogens, correctional settings, and communities, there can be substantial magnification-reflection dynamics, governed by these key factors. Our goal was to derive theoretical insights relevant to many contexts; our findings should be interpreted accordingly. For public health decision makers considering which contexts are most susceptible to this dynamic, the factors above (relative community size, initial R-effective near 1, and low initial immunity in the correctional population) are the primary drivers, and the relative importance of staff versus incarcerations and releases as entry routes may differ between prisons and jails. For modelers, we combine simulation modeling, machine-learning meta-modeling, and interpretable machine learning to examine how our findings depend on different disease, correctional facility, and community characteristics; we find they are generally robust.
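A minimal sketch of the meta-modeling step described here: fit a machine-learning regressor that maps simulation inputs to a simulated outcome so the mapping can be explored cheaply. The feature names, parameter ranges, and placeholder outcome below are assumptions for illustration, not the study's actual inputs or code:

```python
# Sketch only: a machine-learning meta-model (emulator) of a stochastic simulation.
# X holds simulation input parameters (hypothetical names/ranges); y stands in for a
# simulated outcome such as peak community infection prevalence.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_runs = 2_000
X = np.column_stack([
    rng.uniform(0.8, 2.0, n_runs),   # initial community R-effective (assumed range)
    rng.uniform(0.8, 4.0, n_runs),   # initial facility R-effective (assumed range)
    rng.uniform(0.0, 0.8, n_runs),   # initial immunity in the correctional population
])
# Placeholder outcome: in practice y would come from the stochastic simulation runs.
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] - 0.25 * X[:, 2] + rng.normal(0, 0.05, n_runs)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
meta_model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"Held-out R^2 of the meta-model: {meta_model.score(X_test, y_test):.2f}")
```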
View details for DOI 10.1177/0272989X221115364
View details for PubMedID 35904128
-
Household income and maternal education in early childhood and activity-limiting chronic health conditions in late childhood: findings from birth cohort studies from six countries.
Journal of epidemiology and community health
2022
Abstract
BACKGROUND: We examined absolute and relative relationships between household income and maternal education during early childhood (<5 years) with activity-limiting chronic health conditions (ALCHC) during later childhood in six longitudinal, prospective cohorts from high-income countries (UK, Australia, Canada, Sweden, Netherlands, USA). METHODS: Relative inequality (risk ratios, RR) and absolute inequality (Slope Index of Inequality) were estimated for ALCHC during later childhood by maternal education categories and household income quintiles in early childhood. Estimates were adjusted for mother ethnicity, maternal age at birth, child sex and multiple births, and were pooled using meta-regression. RESULTS: Pooled estimates, with over 42 000 children, demonstrated social gradients in ALCHC for high maternal education versus low (RR 1.54, 95% CI 1.28 to 1.85) and middle education (RR 1.24, 95% CI 1.11 to 1.38), as well as for high household income versus the lowest (RR 1.90, 95% CI 1.66 to 2.18) and middle quintiles (RR 1.34, 95% CI 1.17 to 1.54). Absolute inequality showed decreasing ALCHC in all cohorts from low to high education (range: -2.85% Sweden, -13.36% Canada) and income (range: -1.8% Sweden, -19.35% Netherlands). CONCLUSION: We found graded relative risk of ALCHC during later childhood by maternal education and household income during early childhood in all cohorts. Absolute differences in ALCHC were consistently observed between the highest and lowest maternal education and household income levels across cohort populations. Our results support a potential role for generous, universal financial and childcare policies for families during early childhood in reducing the prevalence of activity-limiting chronic conditions in later childhood.
View details for DOI 10.1136/jech-2022-219228
View details for PubMedID 35863874
-
Household income and maternal education in early childhood and risk of overweight and obesity in late childhood: Findings from seven birth cohort studies in six high-income countries.
International journal of obesity (2005)
2022
Abstract
BACKGROUND/OBJECTIVES: This study analysed the relationship between early childhood socioeconomic status (SES), measured by maternal education and household income, and the subsequent development of childhood overweight and obesity. SUBJECTS/METHODS: Data came from seven population-representative prospective child cohorts in six high-income countries: United Kingdom, Australia, the Netherlands, Canada (one national cohort and one from the province of Quebec), USA, and Sweden. Children were included at birth or within the first 2 years of life. Pooled estimates relate to a total of N = 26,565 included children. Overweight and obesity were defined using International Obesity Task Force (IOTF) cut-offs and measured in late childhood (8-11 years). Risk ratios (RRs) and pooled risk estimates were adjusted for potential confounders (maternal age, ethnicity, child sex). Slope Indexes of Inequality (SII) were estimated to quantify absolute inequality for maternal education and household income. RESULTS: Prevalence ranged from 15.0% overweight and 2.4% obese in the Swedish cohort to 37.6% overweight and 15.8% obese in the US cohort. Overall, across cohorts, social gradients were observed for risk of obesity for both low maternal education (pooled RR: 2.99, 95% CI: 2.07, 4.31) and low household income (pooled RR: 2.69, 95% CI: 1.68, 4.30); between-cohort heterogeneity ranged from negligible to moderate (p: 0.300 to <0.001). The RR of obesity by income was lower in Sweden than in the other cohorts. CONCLUSIONS: There was a social gradient by maternal education on the risk of childhood obesity in all included cohorts. The SES associations measured by income were more heterogeneous and differed between Sweden and the other national cohorts; these findings may be attributable to policy differences, including preschool policies, maternity leave, a ban on advertising to children, and universal free school meals.
View details for DOI 10.1038/s41366-022-01171-7
View details for PubMedID 35821522
-
Protection against Omicron conferred by mRNA primary vaccine series, boosters, and prior infection.
medRxiv : the preprint server for health sciences
2022
Abstract
Background: Prisons and jails are high-risk settings for Covid-19 transmission, morbidity, and mortality. We evaluate protection conferred by prior infection and vaccination against the SARS-CoV-2 Omicron variant within the California state prison system. Methods: We employed a test-negative design to match resident and staff cases during the Omicron wave (December 24, 2021-April 14, 2022) to controls according to a case's test-week as well as demographic, clinical, and carceral characteristics. We estimated protection against infection using conditional logistic regression, with exposure status defined by vaccination, stratified by number of mRNA doses received, and prior infection, stratified by periods before or during Delta variant predominance. Results: We matched 15,783 resident and 8,539 staff cases to 180,169 resident and 90,409 staff controls. Among cases, 29.7% and 2.2% were infected before or during the emergence of the Delta variant, respectively; 30.6% and 36.3% were vaccinated with two or three doses, respectively. Estimated protection from Omicron infection for two and three doses was 14.9% (95% Confidence Interval [CI], 12.3-19.7%) and 43.2% (42.2-47.4%) for those without known prior infections, 47.8% (95% CI, 46.6-52.8%) and 61.3% (95% CI, 60.7-64.8%) for those infected before the emergence of Delta, and 73.1% (95% CI, 69.8-80.1%) and 86.8% (95% CI, 82.1-92.7%) for those infected during the period of Delta predominance. Conclusion: A third mRNA dose provided significant, additional protection over two doses, including among individuals with prior infection. Our findings suggest that vaccination should remain a priority, even in settings with high levels of transmission and prior infection.
View details for DOI 10.1101/2022.05.26.22275639
View details for PubMedID 35665013
-
Author Response to "Optimal Sample Size Calculation for Clinical Research under a Budget Constraint".
Medical decision making : an international journal of the Society for Medical Decision Making
2022; 42 (4): 419-420
View details for DOI 10.1177/0272989X221091567
View details for PubMedID 35412339
-
RCT of the effectiveness of stepped-care sleep therapy in general practice: The RESTING study protocol.
Contemporary clinical trials
2022: 106749
Abstract
Cognitive behavioral therapy for insomnia (CBT-I) is an effective, non-pharmacological intervention, designated by the American College of Physicians as the first-line treatment of insomnia disorder. The current randomized controlled study uses a Hybrid-Type-1 design to compare the effectiveness and implementation potential of two approaches to delivering CBT-I in primary care. One approach offers therapy to all patients through an automated, digital CBT-I program (ONLINE-ONLY). The other is a triaged STEPPED-CARE approach that uses a simple Decision Checklist to start patients in either digital or therapist-led treatment; patients making insufficient progress with digital treatment at 2 months are switched to therapist-led treatment. We will randomize 240 individuals (age 50 or older) with insomnia disorder to ONLINE-ONLY or STEPPED-CARE arms. The primary outcomes are insomnia severity and hypnotic medication use, assessed at baseline and at months 2, 4, 6, 9, and 12 after randomization. We hypothesize that STEPPED-CARE will be superior to ONLINE-ONLY in reducing insomnia severity and hypnotic use. We also aim to validate the Decision Checklist and explore moderators of outcome. Additionally, guided by the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework, we will use mixed methods to obtain data on the potential for future dissemination and implementation of each approach. This triaged stepped-care approach has the potential to improve sleep, reduce use of hypnotic medications, promote safety, offer convenient access to treatment, and support dissemination of CBT-I to a large number of patients currently facing barriers to accessing treatment. Clinical trial registration: NCT03532282.
View details for DOI 10.1016/j.cct.2022.106749
View details for PubMedID 35367385
-
Uptake of COVID-19 Vaccination Among Frontline Workers in California State Prisons
JAMA HEALTH FORUM
2022; 3 (3): e220099
Abstract
Prisons and jails are high-risk environments for COVID-19. Vaccination levels among workers in many such settings remain markedly lower than those of residents and members of surrounding communities. The situation is troubling because prison staff are a key vector for COVID-19 transmission. To assess patterns and timing of staff vaccination in California state prisons and identify individual-level and community-level factors associated with remaining unvaccinated, this cohort study used data from December 22, 2020, through June 30, 2021, to quantify the fractions of staff and incarcerated residents who remained unvaccinated among 23 472 custody and 7617 health care staff who worked in roles requiring direct contact with residents at 33 of the 35 prisons operated by the California Department of Corrections and Rehabilitation. Multivariable probit regressions assessed demographic, community, and peer factors associated with staff vaccination uptake. The primary outcome was remaining unvaccinated throughout the study period. Of 23 472 custody staff, 3751 (16%) were women, and 1454 (6%) were Asian/Pacific Islander individuals, 1571 (7%) Black individuals, 9008 (38%) Hispanic individuals, and 6666 (28%) White individuals. Of 7617 health care staff, 5434 (71%) were women, and 2148 (28%) were Asian/Pacific Islander individuals, 1201 (16%) Black individuals, 1409 (18%) Hispanic individuals, and 1771 (23%) White individuals. A total of 6103 custody staff (26%) and 3961 health care staff (52%) received 1 or more doses of a COVID-19 vaccine during the first 2 months vaccines were offered, but vaccination rates stagnated thereafter. By June 30, 2021, 14 317 custody staff (61%) and 2819 health care staff (37%) remained unvaccinated. In adjusted analyses, remaining unvaccinated was positively associated with younger age (custody staff: age, 18-29 years vs ≥60 years, 75% [95% CI, 73%-76%] vs 45% [95% CI, 42%-48%]; health care staff: 52% [95% CI, 48%-56%] vs 29% [95% CI, 27%-32%]), prior COVID-19 infection (custody staff: 67% [95% CI, 66%-68%] vs 59% [95% CI, 59%-60%]; health care staff: 44% [95% CI, 42%-47%] vs 36% [95% CI, 36%-36%]), residing in a community with relatively low rates of vaccination (custody staff: 75th vs 25th percentile, 63% [95% CI, 62%-63%] vs 60% [95% CI, 59%-60%]; health care staff: 40% [95% CI, 39%-41%] vs 34% [95% CI, 33%-35%]), and sharing shifts with coworkers who had relatively low rates of vaccination (custody staff: 75th vs 25th percentile, 64% [95% CI, 62%-66%] vs 59% [95% CI, 57%-61%]; health care staff: 38% [95% CI, 36%-41%] vs 35% [95% CI, 31%-39%]). This cohort study of California state prison custody and health care staff found that vaccination uptake plateaued at levels that posed ongoing risks of further outbreaks in the prisons and continuing transmission from prisons to surrounding communities. Prison staff decisions to forgo vaccination appear to be multifactorial, and vaccine mandates may be necessary to achieve adequate levels of immunity in this high-risk setting.
View details for DOI 10.1001/jamahealthforum.2022.0099
View details for Web of Science ID 000837243200004
View details for PubMedID 35977288
View details for PubMedCentralID PMC8917424
-
Cost-Effectiveness Analysis and Microsimulation of Serial Multiparametric Magnetic Resonance Imaging in Active Surveillance of Localized Prostate Cancer.
The Journal of urology
2022: 101097JU0000000000002490
Abstract
PURPOSE: Many localized prostate cancers will follow an indolent course. Management has shifted towards active surveillance (AS), yet an optimal regimen remains controversial, especially regarding expensive multiparametric magnetic resonance imaging (MRI). We aimed to assess the cost-effectiveness of MRI in AS protocols. MATERIALS AND METHODS: A probabilistic microsimulation modeled individual patient trajectories for men diagnosed with low-risk cancer. We assessed no surveillance, up-front treatment (surgery or radiation), and scheduled AS protocols incorporating transrectal ultrasound-guided (TRUS) biopsy or MRI-based regimens at serial intervals. Lifetime quality-adjusted life years (QALYs) and costs adjusted to 2020 US$ were used to calculate expected net monetary benefit (NMB) at $50,000/QALY and incremental cost-effectiveness ratios (ICERs). Uncertainty was assessed with probabilistic sensitivity analysis and linear regression metamodeling. RESULTS: Conservative management with AS outperformed up-front definitive treatment in a modeled cohort reflecting characteristics from a multi-institutional trial. Biopsy decision conditional on positive imaging (MRI triage) at 2-year intervals provided the highest expected NMB (ICER $44,576). Biopsy after both positive and negative imaging (MRI pathway) and TRUS-based regimens were not cost-effective. MRI triage resulted in fewer biopsies while reducing metastatic disease or cancer death. Results were sensitive to test performance and cost. MRI triage was the most likely cost-effective strategy on probabilistic sensitivity analysis. CONCLUSIONS: For men with low-risk prostate cancer, our modeling demonstrated that AS with sequential MRI triage is more cost-effective than biopsy regardless of imaging, TRUS biopsy alone, or immediate treatment. AS guidelines should specify the role of imaging, and prospective studies should be encouraged.
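For reference, the net monetary benefit criterion used in this analysis has the standard form, with the willingness-to-pay threshold stated in the abstract:

```latex
% Net monetary benefit at the willingness-to-pay threshold stated in the abstract
\[
\text{NMB} = \lambda \cdot \mathrm{E}[\text{QALYs}] - \mathrm{E}[\text{Costs}],
\qquad \lambda = \$50{,}000/\text{QALY},
\]
with the preferred strategy having the highest expected NMB, and
\[
\text{ICER} = \frac{\Delta \text{Costs}}{\Delta \text{QALYs}}
\]
comparing it against the next-best non-dominated strategy.
```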
View details for DOI 10.1097/JU.0000000000002490
View details for PubMedID 35212570
-
Cost-Effectiveness of Dapagliflozin for Non-diabetic Chronic Kidney Disease.
Journal of general internal medicine
2022
Abstract
BACKGROUND: In the USA, chronic kidney disease (CKD) affects 1 in 7 adults and costs $100 billion annually. The DAPA-CKD trial found dapagliflozin, a sodium glucose co-transporter 2 (SGLT2) inhibitor, to be effective in reducing CKD progression and mortality in patients with diabetic and non-diabetic CKD. Currently, SGLT2 inhibitors are not considered standard of care for patients with non-diabetic CKD. OBJECTIVE: Determine the cost-effectiveness of adding dapagliflozin to standard management of patients with non-diabetic CKD. DESIGN: Markov model with lifetime time horizon and US healthcare sector perspective. PATIENTS: Patients with non-diabetic CKD. INTERVENTION: Dapagliflozin plus standard care versus standard care only. MAIN MEASURES: Quality-adjusted life years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs), all discounted at 3% annually; total incidence of kidney failure on kidney replacement therapy; average years on kidney replacement therapy. KEY RESULTS: Adding dapagliflozin to standard care improved life expectancy by 2 years, increased discounted QALYs (from 6.75 to 8.06), and reduced the total incidence of kidney failure on kidney replacement therapy (KRT) (from 17.4% to 11.0%) and average years on KRT (from 0.77 to 0.43) over the lifetime of the cohort. Dapagliflozin plus standard care was more effective than standard care alone while increasing lifetime costs (from $245,900 to $324,890, or $60,000 per QALY gained). Results were robust to variations in assumptions about dapagliflozin's efficacy over time and by CKD stage, added costs of kidney replacement therapy, and expected population annual CKD progression rates, and were sensitive to the cost of dapagliflozin. The net 1-year budgetary implication of treating all US patients with non-diabetic CKD could be up to $21 billion. CONCLUSIONS: Dapagliflozin improved life expectancy and reduced progression of CKD, the proportion of patients requiring kidney replacement therapy, and time on kidney replacement therapy in patients with non-diabetic CKD. Use of dapagliflozin meets conventional criteria for cost-effectiveness.
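Using the abstract's reported lifetime totals, the implied ICER can be checked with simple arithmetic (the cost figure of $324,890 is inferred from the garbled original by consistency with the abstract's own "$60,000 per QALY gained"):

```python
# Check of the ICER implied by the abstract's reported lifetime costs and QALYs.
cost_soc, cost_dapa = 245_900, 324_890  # lifetime costs: standard care vs dapagliflozin + standard care
qaly_soc, qaly_dapa = 6.75, 8.06        # discounted QALYs per patient

icer = (cost_dapa - cost_soc) / (qaly_dapa - qaly_soc)
print(f"ICER: ${icer:,.0f} per QALY gained")  # ~$60,000/QALY, consistent with the abstract
```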
View details for DOI 10.1007/s11606-021-07311-5
View details for PubMedID 35137296
-
Association of Emergency Department Pediatric Readiness With Mortality to 1 Year Among Injured Children Treated at Trauma Centers.
JAMA surgery
2022: e217419
Abstract
Importance: There is substantial variability among emergency departments (EDs) in their readiness to care for acutely ill and injured children, including US trauma centers. While high ED pediatric readiness is associated with improved in-hospital survival among children treated at trauma centers, the association between high ED readiness and long-term outcomes is unknown. Objective: To evaluate the association between ED pediatric readiness and 1-year survival among injured children presenting to 146 trauma centers. Design, Setting, and Participants: In this retrospective cohort study, injured children younger than 18 years who were residents of 8 states with admission, transfer to, or injury-related death at one of 146 participating trauma centers were included. Children cared for in and outside their state of residence were included. Subgroups included those with an Injury Severity Score (ISS) of 16 or more; any Abbreviated Injury Scale (AIS) score of 3 or more; head AIS score of 3 or more; and need for early critical resources. Data were collected from January 2012 to December 2017, with follow-up to December 2018. Data were analyzed from January to July 2021. Exposures: ED pediatric readiness for the initial ED, measured using the weighted Pediatric Readiness Score (wPRS; range, 0-100) from the 2013 National Pediatric Readiness Project assessment. Main Outcomes and Measures: Time to death within 365 days. Results: Of 88 071 included children, 30 654 (34.8%) were female; 2114 (2.4%) were Asian, 16 730 (10.0%) were Black, and 49 496 (56.2%) were White; and the median (IQR) age was 11 (5-15) years. A total of 1974 (2.2%) died within 1 year of the initial ED visit, including 1768 (2.0%) during hospitalization and 206 (0.2%) following discharge. Subgroups included 12 752 (14.5%) with an ISS of 16 or more, 28 402 (32.2%) with any AIS score of 3 or more, 13 348 (15.2%) with a head AIS of 3 or more, and 9048 (10.3%) requiring early critical resources. Compared with EDs in the lowest wPRS quartile (32-69), children cared for in the highest wPRS quartile (95-100) had lower hazard of death to 1 year (adjusted hazard ratio [aHR], 0.70; 95% CI, 0.56-0.88). Supplemental analyses removing early deaths had similar results (aHR, 0.75; 95% CI, 0.56-0.996). Findings were consistent across subgroups and multiple sensitivity analyses. Conclusions and Relevance: Children treated in high-readiness trauma center EDs after injury had a lower risk of death that persisted to 1 year. High ED readiness is independently associated with long-term survival among injured children.
View details for DOI 10.1001/jamasurg.2021.7419
View details for PubMedID 35107579
-
Perks and Pitfalls of Performance-Linked Reimbursement for Novel Drugs: The Case of Sacubitril-Valsartan.
Circulation. Cardiovascular quality and outcomes
2022; 15 (1): e007993
Abstract
BACKGROUND: Rising drug costs have increased interest in performance-linked reimbursement (PLR) contracts that tie payment to patient outcomes. PLR is theoretically attractive to payers interested in reducing the risk of overpaying for expensive drugs, to manufacturers working to improve early drug adoption, and to patients seeking improved access. Multiple PLR contracts were developed for sacubitril-valsartan. We evaluated how the characteristics of a PLR contract influence its performance. METHODS: We used a published cost-effectiveness model of sacubitril-valsartan. We evaluated hypothetical PLR contracts that adjusted drug payment based on observed therapy effectiveness. Ideally, these contracts reduce the uncertainty around the value obtained with purchasing sacubitril-valsartan. By reducing the financial risk in covering an ineffective therapy, PLR incentivizes insurers to increase patient access. We measured the uncertainty in value as the SD of the incremental net monetary benefit (INMB), an estimate of therapy value incorporating costs and clinical benefits. We evaluated the change in INMB SD under a variety of different assumptions regarding contract design, therapy effectiveness, and population characteristics. RESULTS: Over 2 years, sacubitril-valsartan led to 0.042 additional quality-adjusted life-years at an incremental cost of $4916. Using a willingness-to-pay of $150,000 per quality-adjusted life-year, this led to a mean INMB across simulations of $1416 (SD, $1720). A PLR contract that adjusted payment based on cardiovascular mortality reduced the INMB SD moderately by 20.7%, while a contract based on all-cause mortality was more effective (INMB SD reduction of 27.3%). A contract based on heart failure hospitalization reduction was ineffective. PLR effectiveness increased with greater uncertainty regarding therapy effectiveness or in sicker cohorts (eg, New York Heart Association Class III/IV heart failure). Contracts required precise estimates of treatment effect in addition to trust or verifiability between manufacturers and payers concerning patient selection. CONCLUSIONS: The development of accurate prospective estimates of treatment effectiveness using actual enrollee characteristics will be critical for successful PLR. If able to meet these requirements, PLRs could incentivize insurers to expand access to expensive treatments by reducing financial risk.
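The INMB summarized here follows the usual definition; plugging in the abstract's rounded point estimates approximately reproduces the reported mean:

```latex
% Incremental net monetary benefit at the stated willingness-to-pay threshold
\[
\text{INMB} = \lambda \cdot \Delta\text{QALYs} - \Delta\text{Costs}
= \$150{,}000 \times 0.042 - \$4{,}916 \approx \$1{,}384,
\]
close to the reported mean of \$1{,}416 (the gap reflects rounding of the incremental QALYs).
```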
View details for DOI 10.1161/CIRCOUTCOMES.121.007993
View details for PubMedID 35041480
-
Social gradients in ADHD by household income and maternal education exposure during early childhood: Findings from birth cohort studies across six countries.
PloS one
2022; 17 (3): e0264709
Abstract
This study aimed to examine social gradients in ADHD during late childhood (age 9-11 years) using absolute and relative relationships with socioeconomic status exposure (household income, maternal education) during early childhood (<5 years) in seven cohorts from six industrialised countries (UK, Australia, Canada, The Netherlands, USA, Sweden). Secondary analyses were conducted for each birth cohort. Risk ratios, pooled risk estimates, and absolute inequality, measured by the Slope Index of Inequality (SII), were estimated to quantify social gradients in ADHD during late childhood by household income and maternal education measured during early childhood. Estimates were adjusted for child sex, mother age at birth, mother ethnicity, and multiple births. All cohorts demonstrated social gradients by household income and maternal education in early childhood, except for maternal education in Quebec. Pooled risk estimates, relating to 44,925 children, yielded expected gradients (income: low 1.83 (CI 1.38, 2.41), middle 1.42 (1.13, 1.79), high (reference); maternal education: low 2.13 (1.39, 3.25), middle 1.42 (1.13, 1.79)). Estimates of absolute inequality using SII showed that the largest differences in ADHD prevalence between the highest and lowest levels of maternal education were observed in Australia (4% lower) and Sweden (3% lower); for household income, the largest differences were observed in Quebec (6% lower) and Canada (all provinces: 5% lower). Findings indicate that children in families with high household income or maternal education are less likely to have ADHD at age 9-11. Absolute inequality, in combination with relative inequality, provides a more complete account of the socioeconomic status and ADHD relationship in different high-income countries. While the study design precludes causal inference, the linear relation between early childhood social circumstances and later ADHD suggests a potential role for policies that promote high levels of education, especially among women, and adequate levels of household income over children's early years in reducing risk of later ADHD.
View details for DOI 10.1371/journal.pone.0264709
View details for PubMedID 35294456
-
Effectiveness of COVID-19 vaccines among incarcerated people in California state prisons: retrospective cohort study.
Clinical infectious diseases : an official publication of the Infectious Diseases Society of America
2022
Abstract
Prisons and jails are high-risk settings for COVID-19. Vaccines may substantially reduce these risks, but evidence is needed on COVID-19 vaccine effectiveness for incarcerated people, who are confined in large, risky congregate settings. We conducted a retrospective cohort study to estimate effectiveness of mRNA vaccines, BNT162b2 (Pfizer-BioNTech) and mRNA-1273 (Moderna), against confirmed SARS-CoV-2 infections among incarcerated people in California prisons from December 22, 2020 through March 1, 2021. The California Department of Corrections and Rehabilitation provided daily data for all prison residents including demographic, clinical, and carceral characteristics, as well as COVID-19 testing, vaccination, and outcomes. We estimated vaccine effectiveness using multivariable Cox models with time-varying covariates, adjusted for resident characteristics and infection rates across prisons. Among 60,707 cohort members, 49% received at least one BNT162b2 or mRNA-1273 dose during the study period. Estimated vaccine effectiveness was 74% (95% confidence interval [CI], 64-82%) from day 14 after first dose until receipt of second dose and 97% (95% CI, 88-99%) from day 14 after second dose. Effectiveness was similar among the subset of residents who were medically vulnerable: 74% [95% CI, 62-82%] and 92% [95% CI, 74-98%] from 14 days after first and second doses, respectively. Consistent with results from randomized trials and observational studies in other populations, mRNA vaccines were highly effective in preventing SARS-CoV-2 infections among incarcerated people. Prioritizing incarcerated people for vaccination, redoubling efforts to boost vaccination, and continuing other ongoing mitigation practices are essential in preventing COVID-19 in this disproportionately affected population.
View details for DOI 10.1093/cid/ciab1032
View details for PubMedID 35083482
-
Prevention and control of dengue and Chikungunya in Colombia: A cost-effectiveness analysis.
PLoS neglected tropical diseases
2021; 15 (12): e0010086
Abstract
BACKGROUND: Chikungunya and dengue are emerging diseases that have caused large outbreaks in various regions of the world. Both are spread by Aedes aegypti and Aedes albopictus mosquitos. We developed a dynamic transmission model of chikungunya and dengue, calibrated to data from Colombia (June 2014-December 2017). METHODOLOGY/PRINCIPAL FINDINGS: We evaluated the health benefits and cost-effectiveness of residual insecticide treatment, long-lasting insecticide-treated nets, routine dengue vaccination for children aged 9, catch-up vaccination for individuals aged 10-19 or 10-29, and portfolios of these interventions. Model calibration resulted in 300 realistic transmission parameter sets that produced close matches to disease-specific incidence and deaths. Insecticide was the preferred intervention and was cost-effective. Insecticide averted an estimated 95 chikungunya cases and 114 dengue cases per 100,000 people, 61 deaths, and 4,523 disability-adjusted life years (DALYs). In sensitivity analysis, strategies that included dengue vaccination were cost-effective only when the vaccine cost was 14% of the current price. CONCLUSIONS/SIGNIFICANCE: Insecticide to prevent chikungunya and dengue in Colombia could generate significant health benefits and be cost-effective. Because of limits on diagnostic accuracy and vaccine efficacy, the cost of dengue testing and vaccination must decrease dramatically for such vaccination to be cost-effective in Colombia. The vectors for chikungunya and dengue have recently spread to new regions, highlighting the importance of understanding the effectiveness and cost-effectiveness of policies aimed at preventing these diseases.
View details for DOI 10.1371/journal.pntd.0010086
View details for PubMedID 34965277
-
Comparison of Strategies for Typhoid Conjugate Vaccine Introduction in India: A Cost-Effectiveness Modeling Study.
The Journal of infectious diseases
2021; 224 (Supplement_5): S612-S624
Abstract
Typhoid fever causes substantial global mortality, with almost half occurring in India. New typhoid vaccines are highly effective and recommended by the World Health Organization for high-burden settings. There is a need to determine whether and which typhoid vaccine strategies should be implemented in India. We assessed typhoid vaccination using a dynamic compartmental model, parameterized by and calibrated to disease and costing data from a recent multisite surveillance study in India. We modeled routine and 1-time campaign strategies that target different ages and settings. The primary outcome was cost-effectiveness, measured by incremental cost-effectiveness ratios (ICERs) benchmarked against India's gross national income per capita (US$2130). Both routine and campaign vaccination strategies were cost-saving compared to the status quo, due to averted costs of illness. The preferred strategy was a nationwide community-based catch-up campaign targeting children aged 1-15 years alongside routine vaccination, with an ICER of $929 per disability-adjusted life-year averted. Over the first 10 years of implementation, vaccination could avert 21-39 million cases and save $1.6-$2.2 billion. These findings were broadly consistent across willingness-to-pay thresholds, epidemiologic settings, and model input distributions. Despite high initial costs, routine and campaign typhoid vaccination in India could substantially reduce mortality and was highly cost-effective.
View details for DOI 10.1093/infdis/jiab150
View details for PubMedID 35238367
-
Optimal Patient Selection for Simultaneous Heart-Kidney Transplant: A Modified Cost-Effectiveness Analysis
LIPPINCOTT WILLIAMS & WILKINS. 2021
View details for Web of Science ID 000752020005412
-
Comparison of Strategies for Typhoid Conjugate Vaccine Introduction in India: A Cost-Effectiveness Modeling Study
JOURNAL OF INFECTIOUS DISEASES
2021; 224: S612-S624
View details for DOI 10.1093/infdis/jiab150
View details for Web of Science ID 000744683600017
-
Optimal patient selection for simultaneous heart-kidney transplant: a modified cost-effectiveness analysis.
American journal of transplantation : official journal of the American Society of Transplantation and the American Society of Transplant Surgeons
2021
Abstract
Increasing rates of simultaneous heart-kidney (SHK) transplant in the United States exacerbate the overall shortage of deceased donor kidneys (DDK). Current allocation policy does not impose constraints on SHK eligibility, and how best to do so remains unknown. We apply a decision-analytic model to evaluate options for heart transplant (HT) candidates with comorbid kidney dysfunction. We compare SHK with a "Safety Net" strategy, in which DDK transplant is performed six months after HT, only if native kidneys do not recover. We identify patient subsets for whom SHK using a DDK is efficient, considering the quality-adjusted life year (QALY) gains from DDKs instead allocated for kidney transplant-only. For an average-aged candidate with 50% probability of kidney recovery after HT-only, SHK produces 0.64 more QALYs than Safety Net at a cost of 0.58 more kidneys used. SHK is inefficient in this scenario, producing fewer QALYs per DDK used (1.1) than a DDK allocated for KT-only (2.2). SHK is preferred to Safety Net only for candidates with a lower probability of native kidney recovery (24-38%, varying by recipient age). This finding favors implementation of a Safety Net provision and should inform the establishment of objective criteria for SHK transplant eligibility.
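The efficiency comparison reported here is the ratio of incremental QALYs to incremental kidneys used; with the abstract's base-case numbers:

```latex
% QALYs produced per additional deceased donor kidney (base case from the abstract)
\[
\frac{\Delta\text{QALYs}}{\Delta\text{DDKs used}} = \frac{0.64}{0.58} \approx 1.1
\quad\text{versus}\quad
2.2 \ \text{QALYs per DDK allocated to kidney transplant alone,}
\]
so SHK is inefficient for the average-aged candidate with a 50\% probability of native kidney recovery.
```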
View details for DOI 10.1111/ajt.16888
View details for PubMedID 34741786
-
Quantifying and Benchmarking Disparities in COVID-19 Vaccination Rates by Race and Ethnicity.
JAMA network open
2021; 4 (10): e2130343
View details for DOI 10.1001/jamanetworkopen.2021.30343
View details for PubMedID 34668949
-
Providing more balanced information on the harms and benefits of cervical cancer screening: A randomized survey among US and Norwegian women.
Preventive medicine reports
2021; 23: 101452
Abstract
We aimed to identify how additional information about benefits and harms of cervical cancer (CC) screening impacted intention to participate in screening, what type of information on harms women preferred receiving, from whom, and whether it differed between two national healthcare settings. We conducted a survey that randomized screen-eligible women in the United States (n=1084) and Norway (n=1060) into four groups according to the timing of introducing additional information. We found that additional information did not significantly impact stated intentions-to-participate in screening or follow-up testing in either country; however, the proportion of Norwegian women stating uncertainty about seeking precancer treatment increased from 7.9% to 14.3% (p=0.012). Women reported strong system-specific preferences for sources of information: Norwegians (59%) preferred it come from a national public health agency, while Americans (59%) preferred it come from a specialist care provider. Regression models revealed that having a prior Pap test was the most important predictor of intentions-to-participate in both countries, while having lower income reduced the probabilities of intentions-to-follow-up and seek precancer treatment among U.S. women. These results suggest that additional information on harms is unlikely to reduce participation in CC screening but could increase decision uncertainty about seeking treatment. Providing unbiased information would improve on the ethical principle of respect for autonomy and self-determination. However, the clinical impact of additional information on women's understanding of the trade-offs involved with CC screening should be investigated. Future studies should also consider country-specific socioeconomic barriers to screening if communication re-design initiatives aim to improve CC screening participation.
View details for DOI 10.1016/j.pmedr.2021.101452
View details for PubMedID 34221852
-
Effectiveness of COVID-19 Vaccines among Incarcerated People in California State Prisons: A Retrospective Cohort Study.
medRxiv : the preprint server for health sciences
2021
Abstract
Background: Prisons and jails are high-risk settings for COVID-19 transmission, morbidity, and mortality. COVID-19 vaccines may substantially reduce these risks, but evidence is needed of their effectiveness for incarcerated people, who are confined in large, risky congregate settings. Methods: We conducted a retrospective cohort study to estimate effectiveness of mRNA vaccines, BNT162b2 (Pfizer-BioNTech) and mRNA-1273 (Moderna), against confirmed SARS-CoV-2 infections among incarcerated people in California prisons from December 22, 2020 through March 1, 2021. The California Department of Corrections and Rehabilitation provided daily data for all prison residents including demographic, clinical, and carceral characteristics, as well as COVID-19 testing, vaccination status, and outcomes. We estimated vaccine effectiveness using multivariable Cox models with time-varying covariates that adjusted for resident characteristics and infection rates across prisons. Findings: Among 60,707 residents in the cohort, 49% received at least one BNT162b2 or mRNA-1273 dose during the study period. Estimated vaccine effectiveness was 74% (95% confidence interval [CI], 64-82%) from day 14 after first dose until receipt of second dose and 97% (95% CI, 88-99%) from day 14 after second dose. Effectiveness was similar among the subset of residents who were medically vulnerable (74% [95% CI, 62-82%] and 92% [95% CI, 74-98%] from 14 days after first and second doses, respectively), as well as among the subset of residents who received the mRNA-1273 vaccine (71% [95% CI, 58-80%] and 96% [95% CI, 67-99%]). Conclusions: Consistent with results from randomized trials and observational studies in other populations, mRNA vaccines were highly effective in preventing SARS-CoV-2 infections among incarcerated people. Prioritizing incarcerated people for vaccination, redoubling efforts to boost vaccination, and continuing other ongoing mitigation practices are essential in preventing COVID-19 in this disproportionately affected population. Funding: Horowitz Family Foundation, National Institute on Drug Abuse, Centers for Disease Control and Prevention, National Science Foundation, Open Society Foundation, Advanced Micro Devices.
View details for DOI 10.1101/2021.08.16.21262149
View details for PubMedID 34426814
-
Simulating Study Data to Support Expected Value of Sample Information Calculations: A Tutorial.
Medical decision making : an international journal of the Society for Medical Decision Making
2021: 272989X211026292
Abstract
The expected value of sample information (EVSI) can be used to prioritize avenues for future research and design studies that support medical decision making and offer value for money spent. EVSI is calculated based on 3 key elements. Two of these, a probabilistic model-based economic evaluation and updating model uncertainty based on simulated data, have been frequently discussed in the literature. By contrast, the third element, simulating data from the proposed studies, has received little attention. This tutorial contributes to bridging this gap by providing a step-by-step guide to simulating study data for EVSI calculations. We discuss a general-purpose algorithm for simulating data and demonstrate its use to simulate 3 different outcome types. We then discuss how to induce correlations in the generated data, how to adjust for common issues in study implementation such as missingness and censoring, and how individual patient data from previous studies can be leveraged to undertake EVSI calculations. For all examples, we provide comprehensive code written in the R language and, where possible, Excel spreadsheets in the supplementary materials. This tutorial facilitates practical EVSI calculations and allows EVSI to be used to prioritize research and design studies.
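A minimal sketch of the core data-simulation step the tutorial describes: draw a parameter from its current (prior) uncertainty distribution, then simulate the dataset a proposed study would produce conditional on that draw. This is written in Python for illustration only (the tutorial itself provides R code), and the distributions and sample size below are arbitrary assumptions:

```python
# Sketch of simulating study data for EVSI: for each probabilistic-analysis draw of a
# model parameter (here, a response probability with a Beta prior), simulate the summary
# statistic a proposed trial of size n would produce. Distributions and n are assumed.
import numpy as np

rng = np.random.default_rng(42)
n_psa_draws = 10_000   # draws from the current (prior) uncertainty distribution
n_trial = 200          # sample size of the proposed study (assumed)

p_draws = rng.beta(8, 12, size=n_psa_draws)            # prior uncertainty about response probability
simulated_responders = rng.binomial(n_trial, p_draws)  # one simulated trial result per parameter draw

# Each simulated dataset would then be used to update the decision model's uncertainty
# (e.g., via conjugate updating or regression-based EVSI methods).
print(simulated_responders[:5])
```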
View details for DOI 10.1177/0272989X211026292
View details for PubMedID 34388954
-
Assessing Interventions That Prevent Multiple Infectious Diseases: Simple Methods for Multidisease Modeling.
Medical decision making : an international journal of the Society for Medical Decision Making
2021: 272989X211033287
Abstract
BACKGROUND: Many cost-effectiveness analyses (CEAs) only consider outcomes for a single disease when comparing interventions that prevent or treat 1 disease (e.g., vaccination) to interventions that prevent or treat multiple diseases (e.g., vector control to prevent mosquito-borne diseases). An intervention targeted to a single disease may be preferred to a broader intervention in a single-disease model, but this conclusion might change if outcomes from the additional diseases were included. However, multidisease models are often complex and difficult to construct. METHODS: We present conditions for when multiple diseases should be considered in such a CEA. We propose methods for estimating health outcomes and costs associated with control of additional diseases using parallel single-disease models. Parallel modeling can incorporate competing mortality and coinfection from multiple diseases while maintaining model simplicity. We illustrate our approach with a CEA that compares a dengue vaccine, a chikungunya vaccine, and mosquito control via insecticide and mosquito nets, which can prevent dengue, chikungunya, Zika, and yellow fever. RESULTS: The parallel models and the multidisease model generated similar estimates of disease incidence and deaths with much less complexity. When using this method in our case study, considering only chikungunya and dengue, the preferred strategy was insecticide. A broader strategy (insecticide plus long-lasting insecticide-treated nets) was not preferred when Zika and yellow fever were included, suggesting the conclusion is robust even without the explicit inclusion of all affected diseases. LIMITATIONS: Parallel modeling assumes independent probabilities of infection for each disease. CONCLUSIONS: When multidisease effects are important, our parallel modeling method can be used to model multiple diseases accurately while avoiding additional complexity.
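Under the independence assumption the authors note as a limitation, per-disease infection probabilities from parallel single-disease models combine in the usual way; a tiny sketch (the per-disease probabilities below are made-up placeholders, not study estimates):

```python
# Sketch: combining infection probabilities from parallel single-disease models,
# assuming independence across diseases (the limitation noted in the abstract).
from math import prod

p_infection = {"dengue": 0.03, "chikungunya": 0.02, "zika": 0.01, "yellow_fever": 0.005}

p_any_infection = 1 - prod(1 - p for p in p_infection.values())
expected_infections_per_person = sum(p_infection.values())  # ignoring coinfection adjustments

print(f"P(at least one infection): {p_any_infection:.4f}")
print(f"Expected infections per person: {expected_infections_per_person:.4f}")
```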
View details for DOI 10.1177/0272989X211033287
View details for PubMedID 34378462
-
Modeling the Cost-Effectiveness of Interventions to Prevent Plague in Madagascar.
Tropical medicine and infectious disease
2021; 6 (2)
Abstract
Plague (Yersinia pestis) remains endemic in certain parts of the world. We assessed the cost-effectiveness of plague control interventions recommended by the World Health Organization, with particular consideration to intervention coverage and timing. We developed a dynamic model of the spread of plague between interacting populations of humans, rats, and fleas and performed a cost-effectiveness analysis calibrated to a 2017 Madagascar outbreak. We assessed three interventions alone and in combination: expanded access to antibiotic treatment with doxycycline, mass distribution of doxycycline prophylaxis, and mass distribution of malathion. We varied intervention timing and coverage levels. We calculated costs, quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios from a healthcare perspective. The preferred intervention, using a cost-effectiveness threshold of $1350/QALY (GDP per capita in Madagascar), was expanded access to antibiotic treatment with doxycycline with 100% coverage starting immediately after the first reported case, gaining 543 QALYs at an incremental cost of $1023/QALY gained. Sensitivity analyses support expanded access to antibiotic treatment and leave open the possibility that mass distribution of doxycycline prophylaxis or mass distribution of malathion could be cost-effective. Our analysis highlights the potential for rapid expansion of access to doxycycline upon recognition of plague outbreaks to cost-effectively prevent future large-scale plague outbreaks and highlights the importance of intervention timing.
View details for DOI 10.3390/tropicalmed6020101
View details for PubMedID 34208006
-
Evaluation of Emergency Department Pediatric Readiness and Outcomes Among US Trauma Centers.
JAMA pediatrics
2021
Abstract
Importance: The National Pediatric Readiness Project is a US initiative to improve emergency department (ED) readiness to care for acutely ill and injured children. However, it is unclear whether high ED pediatric readiness is associated with improved survival in US trauma centers. Objective: To evaluate the association between ED pediatric readiness, in-hospital mortality, and in-hospital complications among injured children presenting to US trauma centers. Design, Setting, and Participants: A retrospective cohort study of 832 EDs in US trauma centers in 50 states and the District of Columbia was conducted using data from January 1, 2012, through December 31, 2017. Injured children younger than 18 years who were admitted, transferred, or with injury-related death in a participating trauma center were included in the analysis. Subgroups included children with an Injury Severity Score (ISS) of 16 or above, indicating overall serious injury (accounting for all injuries); any Abbreviated Injury Scale (AIS) score of 3 or above, indicating at least 1 serious injury; a head AIS score of 3 or above, indicating serious brain injury; and need for early use of critical resources. Exposures: Emergency department pediatric readiness for the initial ED visit, measured through the weighted Pediatric Readiness Score (range, 0-100) from the 2013 National Pediatric Readiness Project ED pediatric readiness assessment. Main Outcomes and Measures: In-hospital mortality, with a secondary composite outcome of in-hospital mortality or complication. For the primary measurement tools used, the possible range of the AIS is 0 to 6, with 3 or higher indicating a serious injury; the possible range of the ISS is 0 to 75, with 16 or higher indicating serious overall injury. The weighted Pediatric Readiness Score examines and scores 6 domains; in this study, the lowest quartile included scores of 29 to 62 and the highest quartile included scores of 93 to 100. Results: There were 372 004 injured children (239 273 [64.3%] boys; median age, 10 years [interquartile range, 4-15 years]), including 5700 (1.5%) who died in-hospital and 5018 (1.3%) who developed in-hospital complications. Subgroups included 50 440 children (13.6%) with an ISS of 16 or higher, 124 507 (33.5%) with any AIS score of 3 or higher, 57 368 (15.4%) with a head AIS score of 3 or higher, and 32 671 (8.8%) requiring early use of critical resources. Compared with EDs in the lowest weighted Pediatric Readiness Score quartile, children cared for in the highest ED quartile had lower in-hospital mortality (adjusted odds ratio [aOR], 0.58; 95% CI, 0.45-0.75), but not fewer complications (aOR for the composite outcome, 0.88; 95% CI, 0.74-1.04). These findings were consistent across subgroups, strata, and multiple sensitivity analyses. If all children cared for in the lowest-readiness quartiles (1-3) were treated in an ED in the highest quartile of readiness, an additional 126 lives (95% CI, 97-154 lives) might be saved each year in these trauma centers. Conclusions and Relevance: In this cohort study, injured children treated in high-readiness EDs had lower mortality compared with similar children in low-readiness EDs, but not fewer complications. These findings support national efforts to increase ED pediatric readiness in US trauma centers that care for children.
View details for DOI 10.1001/jamapediatrics.2021.1319
View details for PubMedID 34096991
-
Availability of Cost-effectiveness Studies for Drugs With High Medicare Part D Expenditures.
JAMA network open
2021; 4 (6): e2113969
Abstract
Importance: Prescription drug spending in the US requires policy intervention to control costs and improve the value obtained from pharmaceutical spending. One such intervention is to apply cost-effectiveness evidence to decisions regarding drug coverage and pricing, but this intervention depends on the existence of such evidence to guide decisions. Objective: To characterize the availability and quality of cost-effectiveness studies for prescription drugs with the greatest Medicare Part D spending. Design, Setting, and Participants: In this national cross-sectional analysis, publicly available 2016 Medicare drug spending records were merged with 2016 US Food & Drug Administration Orange Book data and the Tufts Medical Center Cost-Effectiveness Analysis (CEA) Registry. All studies published through 2015 that evaluated the cost-effectiveness of the 250 drugs for which Medicare Part D spending was the greatest in US-based adult patient populations were included. Data were analyzed from September 2018 to June 2020. Main Outcomes and Measures: The presence and quality of published cost-effectiveness analyses for the 250 drugs for which Medicare Part D spending was greatest in 2016 were assessed based on the inclusion of key cost-effectiveness analysis elements and global ratings by independent reviewers for the Tufts CEA Registry. Results: Medicare Part D spending on the 250 drugs in the sample totaled $122.8 billion in 2016 (84.1% of total spending). Of these 250 drugs, 91 (36.4%) had a generic equivalent and 159 (63.6%) retained some patent exclusivity. There were 280 unique cost-effectiveness analyses for these drugs, representing data on 135 (54.0%) of the 250 drugs included and 67.0% of Part D spending on the top 250 drugs. The 115 drugs (46.0%) without cost-effectiveness studies accounted for 33.0% of Part D spending on the top 250 drugs. Of the 280 available studies, 128 (45.7%) were industry sponsored. A large proportion of the studies (250 [89.3%]) did not meet the minimum quality requirements. Conclusions and Relevance: In this cross-sectional study, a substantial proportion of 2016 Medicare Part D spending was for drugs with absent or low-quality cost-effectiveness analyses. The lack of quality analyses may present a challenge in efforts to develop policies addressing drug spending in terms of value.
View details for DOI 10.1001/jamanetworkopen.2021.13969
View details for PubMedID 34143189
-
Cost-effectiveness of Dapagliflozin for Treatment of Patients With Heart Failure With Reduced Ejection Fraction.
JAMA cardiology
2021
Abstract
Importance: In the Dapagliflozin and Prevention of Adverse Outcomes in Heart Failure (DAPA-HF) trial, dapagliflozin was shown to reduce cardiovascular mortality and hospitalizations due to heart failure while improving patient-reported health status. However, the cost-effectiveness of adding dapagliflozin therapy to standard of care (SOC) is unknown. Objective: To estimate the cost-effectiveness of dapagliflozin therapy among patients with chronic heart failure with reduced ejection fraction (HFrEF). Design, Setting, and Participants: This Markov cohort cost-effectiveness model used estimates of therapy effectiveness, transition probabilities, and utilities from the DAPA-HF trial and other published literature. Costs were derived from published sources. Patients with HFrEF included subgroups based on diabetes status and health status impairment due to heart failure. We compiled parameters from the literature including DAPA-HF, on which our model is based, and many other sources from December 2019 to February 27, 2021. We performed our analysis in February 2021. Exposures: Dapagliflozin or SOC. Main Outcomes and Measures: Hospitalizations for heart failure, life-years, quality-adjusted life-years (QALYs), costs, and the cost per QALY gained (incremental cost-effectiveness ratio). Results: In the model, dapagliflozin therapy yielded a mean of 0.78 additional life-years and 0.46 additional QALYs compared with SOC at an incremental cost of $38 212, resulting in a cost per QALY gained of $83 650. The cost per QALY was similar for patients with or without diabetes and for patients with mild or moderate impairment of health status due to heart failure. The cost-effectiveness was most sensitive to estimates of the effect on mortality and duration of therapy effectiveness. If the cost of dapagliflozin decreased from $474 to $270 (43% decline), the cost per QALY gained would drop below $50 000. Conclusions and Relevance: These findings suggest that dapagliflozin provides intermediate value compared with SOC, based on American College of Cardiology/American Heart Association benchmarks. Additional data regarding the magnitude of mortality reduction would improve the precision of cost-effectiveness estimates.
View details for DOI 10.1001/jamacardio.2021.1437
View details for PubMedID 34037681
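For readers less familiar with the summary measure used in this and several other cost-effectiveness entries below, the cost per QALY gained is the incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs. A worked check using the rounded values reported in the abstract above (the small gap from the published $83 650 reflects rounding of the reported increments):

```latex
\mathrm{ICER}
  = \frac{\Delta\,\text{cost}}{\Delta\,\text{QALYs}}
  = \frac{\$38\,212}{0.46\ \text{QALYs}}
  \approx \$83{,}000 \text{ per QALY gained}
```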
-
Impact of Treatment Duration on Mortality Among Veterans with Opioid Use Disorder in the United States Veterans Health Administration.
Addiction (Abingdon, England)
2021
Abstract
BACKGROUND AND AIMS: While long-term medication-assisted treatment (MAT) using methadone or buprenorphine is associated with significantly lower all-cause mortality for individuals with opioid use disorder (OUD), periods of initiating or discontinuing treatment are associated with higher mortality risks relative to stable treatment. This study aimed to identify the OUD treatment durations necessary for the elevated mortality risks during treatment transitions to be balanced by reductions in mortality while receiving treatment. DESIGN: Simulation model based on a compartmental model of OUD diagnosis, MAT receipt, and all-cause mortality among Veterans with OUD in the United States Veterans Health Administration (VA) in 2017-2018. We simulated methadone and buprenorphine treatments of varying durations using parameters obtained through calibration and published meta-analyses of studies from North America, Europe, and Australia. SETTING: USA. PARTICIPANTS: Simulated cohorts of 10,000 individuals with OUD. MEASUREMENTS: All-cause mortality over 12 months. FINDINGS: Receiving methadone for 4 months or longer or buprenorphine for 2 months or longer resulted in 54 (95% CI: 5-90) and 65 (95% CI: 21-89) fewer deaths relative to not receiving MAT for the same duration, using VA-specific mortality rates. Using non-VA population literature estimates, the treatment durations needed to achieve a net mortality benefit were shorter: 2 months or longer for methadone and 1 month or longer for buprenorphine. Sensitivity analyses demonstrated that necessary treatment durations increased more with smaller mortality reductions on treatment than with larger relative risks during treatment transitions. CONCLUSIONS: Short periods (<6 months) of treatment with either methadone or buprenorphine are likely to yield net mortality benefits for people with opioid use disorder relative to receiving no medications, despite periods of elevated all-cause mortality risk during transitions into and out of treatment. Retaining people with opioid use disorder in treatment longer can increase these benefits.
View details for DOI 10.1111/add.15574
View details for PubMedID 33999485
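A minimal sketch of the break-even logic described in this entry, using made-up monthly mortality probabilities rather than the paper's calibrated VA parameters: mortality is assumed elevated during the first month on MAT and the first month after discontinuation and reduced during stable treatment, and the script reports net deaths averted for each treatment duration over a 12-month horizon.

```python
"""Illustrative sketch (NOT the paper's calibrated VA model): when does a
treatment episode of d months yield fewer 12-month deaths than no treatment,
given elevated mortality during the first month on treatment and the first
month after discontinuation? All rates are assumed monthly probabilities."""

COHORT = 10_000
P_OFF = 0.004        # monthly death probability, not on MAT (assumed)
P_ON_STABLE = 0.002  # monthly death probability, stable on MAT (assumed)
P_INDUCTION = 0.006  # first month on MAT (assumed elevated)
P_DISCONT = 0.008    # first month after stopping MAT (assumed elevated)
HORIZON = 12         # months

def deaths(duration_on_mat: int) -> float:
    """Expected deaths over the horizon for a cohort treated for
    `duration_on_mat` months and untreated thereafter."""
    alive, total_deaths = COHORT, 0.0
    for month in range(1, HORIZON + 1):
        if duration_on_mat == 0:
            p = P_OFF
        elif month == 1:
            p = P_INDUCTION
        elif month <= duration_on_mat:
            p = P_ON_STABLE
        elif month == duration_on_mat + 1:
            p = P_DISCONT
        else:
            p = P_OFF
        total_deaths += alive * p
        alive -= alive * p
    return total_deaths

no_mat = deaths(0)
for d in range(1, HORIZON + 1):
    diff = no_mat - deaths(d)
    print(f"{d:2d} months on MAT: {diff:+.1f} net deaths averted vs no MAT")
```

With these illustrative rates the net benefit turns positive at roughly 4 to 5 months, which shows the qualitative pattern the paper quantifies; the actual thresholds depend on the calibrated mortality rates and relative risks.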
-
Religion and Sanitation Practices
WORLD BANK ECONOMIC REVIEW
2021; 35 (2): 287-302
View details for DOI 10.1093/wber/lhz016
View details for Web of Science ID 000699403800001
-
COMPARISON OF STRATEGIES FOR TYPHOID CONJUGATE VACCINE INTRODUCTION IN INDIA: A GEOSPATIAL COST-EFFECTIVENESS MODELING STUDY
SAGE PUBLICATIONS INC. 2021: E240-E242
View details for Web of Science ID 000648637500189
-
A NEW METHOD FOR ESTIMATING THE CASE DETECTION FRACTION OF AN EMERGING EPIDEMIC AND AN APPLICATION TO COVID-19
SAGE PUBLICATIONS INC. 2021: E53-E55
View details for Web of Science ID 000648637500053
-
METHODS FOR CONSTRUCTING SUB-NATIONAL CONTACT MATRICES FOR TRANSMISSION MODELS OF RESPIRATORY VIRUSES LIKE SARS-COV-2 (COVID-19)
SAGE PUBLICATIONS INC. 2021: E62-E64
View details for Web of Science ID 000648637500059
-
POLICY ANALYSIS OF NON-PHARMACEUTICAL INTERVENTIONS AND RE-OPENING IN THE STATE OF HIDALGO, MEXICO
SAGE PUBLICATIONS INC. 2021: E200-E202
View details for Web of Science ID 000648637500163
-
ACCOUNTING FOR HOUSEHOLD TRANSMISSION DYNAMICS IN REALISTIC EPIDEMIC MODELS
SAGE PUBLICATIONS INC. 2021: E234-E236
View details for Web of Science ID 000648637500186
-
MODELING BIRTH COHORT TRENDS IN HEPATITIS C VIRUS INFECTION AND THEIR IMPLICATIONS IN CHINA
SAGE PUBLICATIONS INC. 2021: E249-E251
View details for Web of Science ID 000648637500195
-
POLICY COMPARISON OF NON-PHARMACEUTICAL INTERVENTIONS AND RE-OPENING IN MEXICO CITY, MEXICO: USING A NEAR-TERM VALIDATED MODEL TO CONTROL COVID-19 EPIDEMIC PEAKS AND REBOUNDS
SAGE PUBLICATIONS INC. 2021: E203-E204
View details for Web of Science ID 000648637500164
-
PREDICTING LIFE EXPECTANCY BASED ON TRIAL-REPORTED MEDIAN SURVIVAL WITH CANCER TREATMENTS: FEASIBILITY OF A NON-PARAMETRIC, REGISTRY-BASED APPROACH
SAGE PUBLICATIONS INC. 2021: E352-E353
View details for Web of Science ID 000648637500271
-
ASSESSING INTERVENTIONS THAT PREVENT MULTIPLE INFECTIOUS DISEASES: SIMPLE METHODS FOR MULTI-DISEASE MODELING
SAGE PUBLICATIONS INC. 2021: E335-E337
View details for Web of Science ID 000648637500258
-
MODELING INTERVENTIONS TO EXPAND MEDICATION-ASSISTED TREATMENT AMONG VETERANS WITH OPIOID USE DISORDER IN THE VETERANS HEALTH ADMINISTRATION
SAGE PUBLICATIONS INC. 2021: E189-E190
View details for Web of Science ID 000648637500155
-
COST-EFFECTIVENESS OF A STATE PERINATAL QUALITY COLLABORATIVE FOR REDUCING SEVERE MATERNAL MORBIDITY FROM HEMORRHAGE
SAGE PUBLICATIONS INC. 2021: E8-E9
View details for Web of Science ID 000648637500019
-
COST-EFFECTIVENESS OF A "WILD-CARD" PATIENT DESIGNATION POLICY IN DECEASED DONOR-KIDNEY TRANSPLANTS
SAGE PUBLICATIONS INC. 2021: E9-E10
View details for Web of Science ID 000648637500020
-
Predicting the Effectiveness of Endemic Infectious Disease Control Interventions: The Impact of Mass Action versus Network Model Structure.
Medical decision making : an international journal of the Society for Medical Decision Making
2021: 272989X211006025
Abstract
BACKGROUND: Analyses of the effectiveness of infectious disease control interventions often rely on dynamic transmission models to simulate intervention effects. We aim to understand how the choice of network or compartmental model can influence estimates of intervention effectiveness in the short and long term for an endemic disease with susceptible and infected states in which infection, once contracted, is lifelong. METHODS: We consider 4 disease models with different permutations of socially connected network versus unstructured contact (mass-action mixing) model and heterogeneous versus homogeneous disease risk. The models have susceptible and infected populations calibrated to the same long-term equilibrium disease prevalence. We consider a simple intervention with varying levels of coverage and efficacy that reduces transmission probabilities. We measure the rate of prevalence decline over the first 365 d after the intervention, long-term equilibrium prevalence, and long-term effective reproduction ratio at equilibrium. RESULTS: Prevalence declined up to 10% faster in homogeneous risk models than heterogeneous risk models. When the disease was not eradicated, the long-term equilibrium disease prevalence was higher in mass-action mixing models than in network models by 40% or more. This difference in long-term equilibrium prevalence between network versus mass-action mixing models was greater than that of heterogeneous versus homogeneous risk models (less than 30%); network models tended to have higher effective reproduction ratios than mass-action mixing models for given combinations of intervention coverage and efficacy. CONCLUSIONS: For interventions with high efficacy and coverage, mass-action mixing models could provide a sufficient estimate of effectiveness, whereas for interventions with low efficacy and coverage, or interventions in which outcomes are measured over short time horizons, predictions from network and mass-action models diverge, highlighting the importance of sensitivity analyses on model structure. HIGHLIGHTS: We calibrate 4 models (socially connected network versus unstructured contact [mass-action mixing] model and heterogeneous versus homogeneous disease risk) to 10% preintervention disease prevalence. We measure the short- and long-term intervention effectiveness of all models using the rate of prevalence decline, long-term equilibrium disease prevalence, and effective reproduction ratio. Generally, in the short term, prevalence declined faster in the homogeneous risk models than in the heterogeneous risk models. Generally, in the long term, equilibrium disease prevalence was higher in the mass-action mixing models than in the network models, and the effective reproduction ratio was higher in network models than in the mass-action mixing models.
View details for DOI 10.1177/0272989X211006025
View details for PubMedID 33899563
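As a point of reference for the mass-action arm of this comparison, below is a minimal susceptible-infected model with demographic turnover, calibrated so that the pre-intervention equilibrium prevalence is 10% (the calibration target named in the highlights) and then perturbed by an intervention that scales transmission by (1 - coverage x efficacy). The lifespan, time step, and intervention values are assumptions for illustration, and no network counterpart is implemented here.

```python
"""Minimal mass-action susceptible-infected model with demographic turnover,
calibrated to 10% pre-intervention prevalence (a sketch with assumed
parameters; the paper's network counterpart is not reproduced here)."""

import numpy as np

MU = 1.0 / 70.0 / 365.0      # daily background birth/death rate (assumed 70-year lifespan)
PREV0 = 0.10                 # pre-intervention equilibrium prevalence (calibration target)
BETA = MU / (1.0 - PREV0)    # transmission rate implied by the SI-with-turnover equilibrium

def simulate(coverage: float, efficacy: float, days: int = 365) -> np.ndarray:
    """Daily prevalence after an intervention scales transmission by (1 - coverage*efficacy)."""
    beta = BETA * (1.0 - coverage * efficacy)
    s, i = 1.0 - PREV0, PREV0        # start at the pre-intervention equilibrium (proportions)
    prev = np.empty(days)
    for t in range(days):
        new_inf = beta * s * i       # mass-action incidence
        s += MU - new_inf - MU * s   # births enter susceptible; deaths leave both states
        i += new_inf - MU * i
        prev[t] = i
    return prev

prev = simulate(coverage=0.8, efficacy=0.8)
print(f"prevalence after 365 d: {prev[-1]:.3f}")
print(f"long-run equilibrium:   {max(0.0, 1.0 - MU / (BETA * (1 - 0.8 * 0.8))):.3f}")
```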
-
Critical Appraisal of Systematic Reviews With Costs and Cost-Effectiveness Outcomes: An ISPOR Good Practices Task Force Report.
Value in health : the journal of the International Society for Pharmacoeconomics and Outcomes Research
2021; 24 (4): 463–72
View details for DOI 10.1016/j.jval.2021.01.002
View details for PubMedID 33840423
-
Cost-effectiveness of Treatments for Opioid Use Disorder.
JAMA psychiatry
2021
Abstract
Importance: Opioid use disorder (OUD) is a significant cause of morbidity and mortality in the US, yet many individuals with OUD do not receive treatment. Objective: To assess the cost-effectiveness of OUD treatments and association of these treatments with outcomes in the US. Design and Setting: This model-based cost-effectiveness analysis included a US population with OUD. Interventions: Medication-assisted treatment (MAT) with buprenorphine, methadone, or injectable extended-release naltrexone; psychotherapy (beyond standard counseling); overdose education and naloxone distribution (OEND); and contingency management (CM). Main Outcomes and Measures: Fatal and nonfatal overdoses and deaths throughout 5 years, discounted lifetime quality-adjusted life-years (QALYs), and costs. Results: In the base case, in the absence of treatment, 42 717 overdoses (4132 fatal, 38 585 nonfatal) and 12 660 deaths were estimated to occur in a cohort of 100 000 patients over 5 years, and 11.58 discounted lifetime QALYs were estimated to be experienced per person. An estimated reduction in overdoses was associated with MAT with methadone (10.7%), MAT with buprenorphine or naltrexone (22.0%), and when combined with CM and psychotherapy (range, 21.0%-31.4%). An estimated reduction in deaths was associated with MAT with methadone (6%), MAT with buprenorphine or naltrexone (13.9%), and when combined with CM, OEND, and psychotherapy (16.9%). MAT yielded discounted gains of 1.02 to 1.07 QALYs per person. Including only health care sector costs, methadone cost $16 000/QALY gained compared with no treatment, followed by methadone with OEND ($22 000/QALY gained), then by buprenorphine with OEND and CM ($42 000/QALY gained), and then by buprenorphine with OEND, CM, and psychotherapy ($250 000/QALY gained). MAT with naltrexone was dominated by other treatment alternatives. When criminal justice costs were included, all forms of MAT (with buprenorphine, methadone, and naltrexone) were associated with cost savings compared with no treatment, yielding savings of $25 000 to $105 000 in lifetime costs per person. The largest cost savings were associated with methadone plus CM. Results were qualitatively unchanged over a wide range of sensitivity analyses. An analysis using demographic and cost data for Veterans Health Administration patients yielded similar findings. Conclusions and Relevance: In this cost-effectiveness analysis, expanded access to MAT, combined with OEND and CM, was associated with cost-saving reductions in morbidity and mortality from OUD. Lack of widespread MAT availability limits access to a cost-saving medical intervention that reduces morbidity and mortality from OUD. Opioid overdoses in the US likely reached a record high in 2020 because of COVID-19 increasing substance use, exacerbating stress and social isolation, and interfering with opioid treatment. It is essential to understand the cost-effectiveness of alternative forms of MAT to treat OUD.
View details for DOI 10.1001/jamapsychiatry.2021.0247
View details for PubMedID 33787832
-
Patterns of heavy drinking behaviour over age and birth cohorts among Chinese men: a Markov model.
BMJ open
2021; 11 (3): e043261
Abstract
OBJECTIVES: To estimate the age patterns and cohort trends in heavy drinking among Chinese men from 1993 to 2011 and to project the future burden of heavy drinking through 2027. DESIGN: We constructed a Markov cohort model that simulates age-specific heavy drinking behaviours for a series of cohorts of Chinese men born between 1922 and 1993 and fitted the model to longitudinal data on drinking patterns (1993-2015). We projected male prevalence of heavy drinking from 2015 through 2027 with and without modification of heavy drinking behaviours. PARTICIPANTS: A cohort of Chinese men who were born between 1922 and 1993. MAIN OUTCOME MEASURES: Outcomes included age-specific and birth cohort-specific rates of initiating, quitting and reinitiating heavy drinking from 1993 through 2011, projected prevalence of heavy drinking from 2015 to 2027, and total reduction in prevalence and total averted deaths with hypothetical elimination of heavy drinking behaviours. RESULTS: Across multiple birth cohorts, middle-aged Chinese men have consistently higher risks of starting and resuming heavy drinking and lower probabilities of quitting their current heavy drinking than men in other age groups. From 1993 to 2011, the risk of starting or resuming heavy drinking continued to decrease over generations. Our model projected that the prevalence of heavy drinking among Chinese men will decrease by 33% (95% CI 11.5% to 54.6%) between 2015 and the end of 2027. Hypothetical elimination of heavy drinking behaviours among Chinese men could accelerate this decrease by 12 percentage points (95% CI 7.8 to 18.2) and avert 377,000 deaths (95% CI 228,000 to 577,000) in total from 2015 to 2027. CONCLUSION: Heavy drinking prevalence will continue to decrease through 2027 if current age-specific and birth cohort-specific patterns of starting, quitting and resuming heavy drinking continue. Effective mitigation policy should consider age-specific patterns in heavy drinking behaviours to further reduce the burden of heavy drinking.
View details for DOI 10.1136/bmjopen-2020-043261
View details for PubMedID 33653752
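A minimal three-state Markov cohort sketch (never, current, and former heavy drinker) of the start/quit/reinitiate structure described in the abstract. The age-specific transition probabilities are illustrative placeholders, not the calibrated, birth-cohort-specific values estimated in the paper.

```python
"""Three-state Markov cohort sketch of heavy-drinking dynamics
(never / current / former heavy drinker). Transition probabilities are
illustrative placeholders, not the calibrated values from the paper."""

import numpy as np

# annual transition probabilities by age band (assumed for illustration)
#                 start  quit  reinitiate
RATES = {
    (18, 35): (0.020, 0.10, 0.05),
    (35, 55): (0.035, 0.06, 0.09),   # middle age: more starting/reinitiating, less quitting
    (55, 80): (0.010, 0.12, 0.03),
}

def transition_matrix(age: int) -> np.ndarray:
    for (lo, hi), (start, quit, reinit) in RATES.items():
        if lo <= age < hi:
            # states: 0 = never, 1 = current heavy drinker, 2 = former
            return np.array([
                [1 - start, start,    0.0       ],
                [0.0,       1 - quit, quit      ],
                [0.0,       reinit,   1 - reinit],
            ])
    raise ValueError("age outside modeled range")

# follow one birth cohort from age 18 to 79 and record heavy-drinking prevalence
state = np.array([1.0, 0.0, 0.0])   # everyone starts as a never-heavy-drinker
prevalence = {}
for age in range(18, 80):
    state = state @ transition_matrix(age)
    prevalence[age] = state[1]

for age in (25, 40, 55, 70):
    print(f"age {age}: heavy-drinking prevalence {prevalence[age]:.1%}")
```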
-
How simulation modeling can support the public health response to the opioid crisis in North America: Setting priorities and assessing value (The OUD Modeling Writing Group)
INTERNATIONAL JOURNAL OF DRUG POLICY
2021; 88
View details for DOI 10.1016/j.drugpo.2020.102726
View details for Web of Science ID 000635180000009
-
Using RE-AIM to examine the potential public health impact of an integrated collaborative care intervention for weight and depression management in primary care: Results from the RAINBOW trial.
PloS one
2021; 16 (3): e0248339
Abstract
BACKGROUND: An integrated collaborative care intervention was used to treat primary care patients with comorbid obesity and depression in a randomized clinical trial. To increase wider uptake and dissemination, information is needed on translational potential. METHODS: The trial collected longitudinal, qualitative data at baseline, 6 months (end of intensive treatment), 12 months (end of maintenance treatment), and 24 months (end of follow-up). Semi-structured interviews (n = 142) were conducted with 54 out of 409 randomly selected trial participants and 37 other stakeholders, such as recruitment staff, intervention staff, and clinicians. Using a Framework Analysis approach, we examined themes across time and stakeholder groups according to the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework. RESULTS: At baseline, participants and other stakeholders reported being skeptical of the collaborative care approach related to some RE-AIM dimensions. However, over time they indicated greater confidence regarding the potential for future public health impact. They also provided information on barriers and actionable information to enhance program reach, effectiveness, adoption, implementation, and maintenance. CONCLUSIONS: RE-AIM provided a useful framework for understanding how to increase the impact of a collaborative and integrative approach for treating comorbid obesity and depression. It also demonstrates the utility of using the framework as a planning tool early in the evidence-generation pipeline.
View details for DOI 10.1371/journal.pone.0248339
View details for PubMedID 33705465
-
Outbreaks of COVID-19 variants in US prisons: a mathematical modelling analysis of vaccination and reopening policies.
The Lancet. Public health
2021
Abstract
Residents of prisons have experienced disproportionate COVID-19-related health harms. To control outbreaks, many prisons in the USA restricted in-person activities, which are now resuming even as viral variants proliferate. This study aims to use mathematical modelling to assess the risks and harms of COVID-19 outbreaks in prisons under a range of policies, including resumption of activities. We obtained daily resident-level data for all California state prisons from Jan 1, 2020, to May 15, 2021, describing prison layouts, housing status, sociodemographic and health characteristics, participation in activities, and COVID-19 testing, infection, and vaccination status. We developed a transmission-dynamic stochastic microsimulation parameterised by the California data and published literature. After an initial infection is introduced to a prison, the model evaluates the effect of various policy scenarios on infections and hospitalisations over 200 days. Scenarios vary by vaccine coverage, baseline immunity (0%, 25%, or 50%), resumption of activities, and use of non-pharmaceutical interventions (NPIs) that reduce transmission by 75%. We simulated five prison types that differ by residential layout and demographics, and estimated outcomes with and without repeated infection introductions over the 200 days. If a viral variant is introduced into a prison that has resumed pre-2020 contact levels, has moderate vaccine coverage (ranging from 36% to 76% among residents, dependent on age, with 40% coverage for staff), and has no baseline immunity, 23-74% of residents are expected to be infected over 200 days. High vaccination coverage (90%) coupled with NPIs reduces cumulative infections to 2-54%. Even in prisons with low room occupancies (ie, no more than two occupants) and low levels of cumulative infections (ie, <10%), hospitalisation risks are substantial when these prisons house medically vulnerable populations. Risks of large outbreaks (>20% of residents infected) are substantially higher if infections are repeatedly introduced. Balancing benefits of resuming activities against risks of outbreaks presents challenging trade-offs. After achieving high vaccine coverage, prisons with mostly one-to-two-person cells that have higher baseline immunity from previous outbreaks can resume in-person activities with low risk of a widespread new outbreak, provided they maintain widespread NPIs, continue testing, and take measures to protect the medically vulnerable. Funding: Horowitz Family Foundation, National Institute on Drug Abuse, Centers for Disease Control and Prevention, National Science Foundation, Open Society Foundation, Advanced Micro Devices.
View details for DOI 10.1016/S2468-2667(21)00162-6
View details for PubMedID 34364404
-
Health Disparities And COVID-19: The Authors Reply.
Health affairs (Project Hope)
2021; 40 (9): 1514
View details for DOI 10.1377/hlthaff.2021.01203
View details for PubMedID 34495717
-
Covid-19 in the California State Prison System: An Observational Study of Decarceration, Ongoing Risks, and Risk Factors.
medRxiv : the preprint server for health sciences
2021
Abstract
Correctional institutions nationwide are seeking to mitigate Covid-19-related risks. To quantify changes to California's prison population since the pandemic began and identify risk factors for Covid-19 infection. We described residents' demographic characteristics, health status, Covid-19 risk scores, room occupancy, and labor participation. We used Cox proportional hazard models to estimate the association between rates of Covid-19 infection and room occupancy and out-of-room labor, respectively. California state prisons (March 1-October 10, 2020). Residents of California state prisons. Changes in the incarcerated population's size, composition, housing, and activities. For the risk factor analysis, the exposure variables were room type (cells vs dormitories) and labor participation (any room occupant participating in the prior 2 weeks) and the outcome variable was incident Covid-19 case rates. The incarcerated population decreased 19.1% (119,401 to 96,623) during the study period. On October 10, 2020, 11.5% of residents were aged ≥60, 18.3% had high Covid-19 risk scores, 31.0% participated in out-of-room labor, and 14.8% lived in rooms with ≥10 occupants. Nearly 40% of residents with high Covid-19 risk scores lived in dormitories. In 9 prisons with major outbreaks (6,928 rooms; 21,750 residents), dormitory residents had higher infection rates than cell residents (adjusted hazard ratio [AHR], 2.51; 95% CI, 2.25-2.80) and residents of rooms with labor participation had higher rates than residents of other rooms (AHR, 1.56; 95% CI, 1.39-1.74). Inability to measure density of residents' living conditions or contact networks among residents and staff. Despite reductions in room occupancy and mixing, California prisons still house many medically vulnerable residents in risky settings. Reducing risks further requires a combination of strategies, including rehousing, decarceration, and vaccination. Funding: Horowitz Family Foundation; National Institute on Drug Abuse; National Science Foundation Graduate Research Fellowship; Open Society Foundations.
View details for DOI 10.1101/2021.03.04.21252942
View details for PubMedID 33758868
View details for PubMedCentralID PMC7987024
-
COVID-19 in the California State Prison System: an Observational Study of Decarceration, Ongoing Risks, and Risk Factors.
Journal of general internal medicine
2021
Abstract
Correctional institutions nationwide are seeking to mitigate COVID-19-related risks. To quantify changes to California's prison population since the pandemic began and identify risk factors for COVID-19 infection. For California state prisons (March 1-October 10, 2020), we described residents' demographic characteristics, health status, COVID-19 risk scores, room occupancy, and labor participation. We used Cox proportional hazard models to estimate the association between rates of COVID-19 infection and room occupancy and out-of-room labor, respectively. Residents of California state prisons. Changes in the incarcerated population's size, composition, housing, and activities. For the risk factor analysis, the exposure variables were room type (cells vs. dormitories) and labor participation (any room occupant participating in the prior 2 weeks) and the outcome variable was incident COVID-19 case rates. The incarcerated population decreased 19.1% (119,401 to 96,623) during the study period. On October 10, 2020, 11.5% of residents were aged ≥60, 18.3% had high COVID-19 risk scores, 31.0% participated in out-of-room labor, and 14.8% lived in rooms with ≥10 occupants. Nearly 40% of residents with high COVID-19 risk scores lived in dormitories. In 9 prisons with major outbreaks (6,928 rooms; 21,750 residents), dormitory residents had higher infection rates than cell residents (adjusted hazard ratio [AHR], 2.51; 95% CI, 2.25-2.80) and residents of rooms with labor participation had higher rates than residents of other rooms (AHR, 1.56; 95% CI, 1.39-1.74). Despite reductions in room occupancy and mixing, California prisons still house many medically vulnerable residents in risky settings. Reducing risks further requires a combination of strategies, including rehousing, decarceration, and vaccination.
View details for DOI 10.1007/s11606-021-07022-x
View details for PubMedID 34291377
-
Effectiveness of the mRNA-1273 Vaccine during a SARS-CoV-2 Delta Outbreak in a Prison.
The New England journal of medicine
2021
View details for DOI 10.1056/NEJMc2114089
View details for PubMedID 34670040
-
Dependence of COVID-19 Policies on End-of-Year Holiday Contacts in Mexico City Metropolitan Area: A Modeling Study.
MDM policy & practice
2021; 6 (2): 23814683211049249
Abstract
Background. Mexico City Metropolitan Area (MCMA) has the largest number of COVID-19 (coronavirus disease 2019) cases in Mexico and is at risk of exceeding its hospital capacity in early 2021. Methods. We used the Stanford-CIDE Coronavirus Simulation Model (SC-COSMO), a dynamic transmission model of COVID-19, to evaluate the effect of policies considering increased contacts during the end-of-year holidays, intensification of physical distancing, and school reopening on projected confirmed cases and deaths, hospital demand, and hospital capacity exceedance. Model parameters were derived from primary data and published literature and were calibrated to time series of incident confirmed cases, deaths, and hospital occupancy. Results. Following high levels of holiday contacts even with no in-person schooling, MCMA will have 0.9 million (95% prediction interval 0.3-1.6) additional COVID-19 cases between December 7, 2020, and March 7, 2021, and hospitalizations will peak at 26,000 (8,300-54,500) on January 25, 2021, with a 97% chance of exceeding COVID-19-specific capacity (9,667 beds). If MCMA were to control holiday contacts, the city could reopen in-person schools, provided physical distancing is increased, with 0.5 million (0.2-0.9) additional cases and hospitalizations peaking at 12,000 (3,700-27,000) on January 19, 2021 (60% chance of exceedance). Conclusion. MCMA must increase COVID-19 hospital capacity under all scenarios considered. MCMA's ability to reopen schools in early 2021 depends on sustaining physical distancing and on controlling contacts during the end-of-year holiday.
View details for DOI 10.1177/23814683211049249
View details for PubMedID 34660906
View details for PubMedCentralID PMC8512280
-
Covid-19 Vaccine Acceptance in California State Prisons.
The New England journal of medicine
2021
View details for DOI 10.1056/NEJMc2105282
View details for PubMedID 33979505
-
Racial/Ethnic Disparities In COVID-19 Exposure Risk, Testing, And Cases At The Subcounty Level In California.
Health affairs (Project Hope)
2021
Abstract
With a population of forty million and substantial geographic variation in sociodemographics and health services, California is an important setting in which to study disparities. Its population (37.5 percent White, 39.1 percent Latino, 5.3 percent Black, and 14.4 percent Asian) experienced 59,258 COVID-19 deaths through April 14, 2021, the most of any state. We analyzed California's racial/ethnic disparities in COVID-19 exposure risks, testing rates, test positivity, and case rates through October 2020, combining data from 15.4 million SARS-CoV-2 tests with subcounty exposure risk estimates from the American Community Survey. We defined "high-exposure-risk" households as those with one or more essential workers and fewer rooms than inhabitants. Latino people in California are 8.1 times more likely to live in high-exposure-risk households than White people (23.6 percent versus 2.9 percent), are overrepresented in cumulative cases (3,784 versus 1,112 per 100,000 people), and are underrepresented in cumulative testing (35,635 versus 48,930 per 100,000 people). These risks and outcomes were worse for Latino people than for members of other racial/ethnic minority groups. Subcounty disparity analyses can inform targeting of interventions and resources, including community-based testing and vaccine access measures. Tracking COVID-19 disparities and developing equity-focused public health programming that mitigates the effects of systemic racism can help improve health outcomes among California's populations of color.
View details for DOI 10.1377/hlthaff.2021.00098
View details for PubMedID 33979192
-
Nationwide Cost-Effectiveness Analysis of Surgical Stabilization of Rib Fractures by Flail Chest Status and Age Groups
Journal of Trauma and Acute Care Surgery
2021
View details for DOI 10.1097/TA.0000000000003021
-
Mapping Inequality in SARS-CoV-2 Household Exposure and Transmission Risk in the USA.
Journal of general internal medicine
2021
View details for DOI 10.1007/s11606-021-06603-0
View details for PubMedID 33604818
View details for PubMedCentralID PMC7891469
-
School Reopenings and the Community During the COVID-19 Pandemic.
JAMA health forum
2020; 1 (10): e201294
View details for DOI 10.1001/jamahealthforum.2020.1294
View details for PubMedID 36218562
-
Optimal Allocation of Research Funds under a Budget Constraint.
Medical decision making : an international journal of the Society for Medical Decision Making
2020; 40 (6): 797–814
Abstract
Purpose. Health economic evaluations that include the expected value of sample information support implementation decisions as well as decisions about further research. However, just as decision makers must consider portfolios of implementation spending, they must also identify the optimal portfolio of research investments. Methods. Under a fixed research budget, a decision maker determines which studies to fund; additional budget allocated to one study to increase the study sample size implies less budget available to collect information to reduce decision uncertainty in other implementation decisions. We employ a budget-constrained portfolio optimization framework in which the decisions are whether to invest in a study and at what sample size. The objective is to maximize the sum of the studies' population expected net benefit of sampling (ENBS). We show how to determine the optimal research portfolio and study-specific levels of investment. We demonstrate our framework with a stylized example to illustrate solution features and a real-world application using 6 published cost-effectiveness analyses. Results. Among the studies selected for nonzero investment, the optimal sample size occurs at the point at which the marginal population ENBS divided by the marginal cost of additional sampling is the same for all studies. Compared with standard ENBS optimization without a research budget constraint, optimal budget-constrained sample sizes are typically smaller but allow more studies to be funded. Conclusions. The budget constraint for research studies directly implies that the optimal sample size for additional research is not the point at which the ENBS is maximized for individual studies. A portfolio optimization approach can yield higher total ENBS. Ultimately, there is a maximum willingness to pay for incremental information that determines optimal sample sizes.
View details for DOI 10.1177/0272989X20944875
View details for PubMedID 32845233
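The equal-marginal-return condition described in this paper's Results can be illustrated with a simple greedy allocation: repeatedly assign the next sample-size increment to whichever study offers the largest marginal population ENBS per marginal dollar, until the budget (or the value of further sampling) is exhausted. The ENBS curves, per-subject costs, and fixed costs below are invented for illustration, and the greedy rule is a heuristic sketch, not the paper's optimization of its six published case studies.

```python
"""Greedy sketch of budget-constrained research portfolio optimization:
repeatedly give the next sample-size increment to the study with the largest
marginal population ENBS per marginal cost. ENBS curves and per-subject costs
are illustrative, not from the paper's case studies."""

import math

# population ENBS as a concave function of sample size n (assumed curves)
STUDIES = {
    "study_A": {"enbs": lambda n: 4.0e6 * (1 - math.exp(-n / 300)), "cost_per_subject": 2_000, "fixed_cost": 50_000},
    "study_B": {"enbs": lambda n: 2.5e6 * (1 - math.exp(-n / 150)), "cost_per_subject": 1_000, "fixed_cost": 30_000},
    "study_C": {"enbs": lambda n: 1.0e6 * (1 - math.exp(-n / 500)), "cost_per_subject": 500,   "fixed_cost": 20_000},
}
BUDGET = 1_000_000
STEP = 10  # subjects added per increment

n = {name: 0 for name in STUDIES}
spent = 0.0

while True:
    best, best_ratio, best_cost = None, 0.0, 0.0
    for name, s in STUDIES.items():
        # cost of the next increment (pay the fixed cost when the study is first funded)
        inc_cost = STEP * s["cost_per_subject"] + (s["fixed_cost"] if n[name] == 0 else 0.0)
        if spent + inc_cost > BUDGET:
            continue
        inc_enbs = s["enbs"](n[name] + STEP) - s["enbs"](n[name])
        ratio = inc_enbs / inc_cost
        if ratio > best_ratio:
            best, best_ratio, best_cost = name, ratio, inc_cost
    if best is None or best_ratio <= 1.0:   # stop when marginal ENBS no longer exceeds marginal cost
        break
    n[best] += STEP
    spent += best_cost

for name, size in n.items():
    print(f"{name}: n = {size}")
print(f"budget spent: ${spent:,.0f}")
```

At the stopping point, the marginal ENBS per marginal dollar is approximately equalized across the funded studies, which is the optimality condition stated in the abstract.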
-
Cost-effectiveness of first-line therapy for advanced renal cell carcinoma in the immunotherapy era.
LIPPINCOTT WILLIAMS & WILKINS. 2020
View details for Web of Science ID 000560368307310
-
THE POTENTIAL PUBLIC HEALTH IMPACT OF A DEPRESSION AND WEIGHT MANAGEMENT INTERVENTION: LESSONS FROM THE RAINBOW TRIAL
OXFORD UNIV PRESS INC. 2020: S505
View details for Web of Science ID 000546262401233
-
First-Year Economic and Quality of Life Effects of the RAINBOW Intervention to Treat Comorbid Obesity and Depression.
Obesity (Silver Spring, Md.)
2020
Abstract
OBJECTIVE: Obesity and depression are prevalent and often co-occurring conditions in the United States. The Research Aimed at Improving Both Mood and Weight (RAINBOW) randomized trial demonstrated the effectiveness of an integrated intervention for adults with both conditions. Characterizing the intervention's economic effects is important for broader dissemination and implementation. METHODS: This study evaluated the cost (2018 US dollars) and health-related quality of life (HRQoL) impacts during RAINBOW's first year, comparing intervention (n=204) and usual-care groups (n=205). Outcomes included intervention delivery costs, differential changes in antidepressant medication spending compared with the pretrial year, differential changes in medical services spending compared with the pretrial year, and HRQoL changes from baseline using EuroQol-5D US utility weights. RESULTS: RAINBOW's 1-year delivery cost per person was $2,251. Compared with usual care, annual antidepressant medication days increased more (38 days [95% CI: 4 to 72]; P=0.027). Annual antidepressant medication spending had a larger, nonsignificant increase ($89 [95% CI: -$20 to $197]; P=0.109). Annual spending on medical care services had a smaller, nonsignificant decrease (-$54 [95% CI: -$832 to $941]; P=0.905). HRQoL had a nonsignificant increase (0.011 [95% CI: -0.025 to 0.047]; P=0.546). CONCLUSIONS: The RAINBOW intervention's economic value will depend on how its 1-year improvements in obesity and depression translate into long-term reduced morbidity, delayed mortality, or averted costs.
View details for DOI 10.1002/oby.22805
View details for PubMedID 32320533
-
Effect of an Intervention for Obesity and Depression on Patient-Centered Outcomes: An RCT.
American journal of preventive medicine
2020
Abstract
INTRODUCTION: An integrated collaborative care intervention was successful for treating comorbid obesity and depression. The effects of the integrated intervention on secondary outcomes of quality of life and psychosocial functioning were examined, as well as whether improvements in these secondary outcomes were correlated with improvements in the primary outcomes of weight and depressive symptoms. STUDY DESIGN: This RCT compared an integrated collaborative care intervention for obesity and depression to usual care. Data were analyzed in 2018. SETTING/PARTICIPANTS: Adult primary care patients (n=409) with a BMI ≥30 (≥27 if Asian) and 9-Item Patient Health Questionnaire score ≥10 were recruited from September 30, 2014 to January 12, 2017 from primary care clinics in Northern California. INTERVENTION: The 12-month intervention integrated a behavioral weight loss program and problem-solving therapy with as-needed antidepressant medications for depression. MAIN OUTCOME MEASURES: A priori secondary outcomes included health-related quality of life (Short Form-8 Health Survey), obesity-specific quality of life (Obesity-Related Problems Scale), sleep disturbance and sleep-related impairment (Patient-Reported Outcomes Measurement Information System), and functional disability (Sheehan Disability Scale) at baseline and 6 and 12 months. RESULTS: Participants randomized to the intervention experienced significantly greater improvements in obesity-specific problems, mental health-related quality of life, sleep disturbance, sleep-related impairment, and functional disability at 6 months but not 12 months. Improvements in obesity-related problems (beta=0.01, 95% CI=0.01, 0.02) and sleep disturbance (beta= -0.02, 95% CI= -0.04, 0) were associated with lower BMI. Improvements in the physical (beta= -0.01, 95% CI= -0.01, 0) and mental health components (beta= -0.02, 95% CI= -0.03, -0.02) of the Short Form-8 Health Survey as well as sleep disturbance (beta=0.01, 95% CI=0.01, 0.02) and sleep-related impairment (beta=0.01, 95% CI=0, 0.01) were associated with fewer depressive symptoms. CONCLUSIONS: An integrated collaborative care intervention for obesity and depression that was shown previously to improve weight and depressive symptoms may also confer benefits for quality of life and psychosocial functioning over 6 months. TRIAL REGISTRATION: This study is registered at clinicaltrials.gov NCT02246413.
View details for DOI 10.1016/j.amepre.2019.11.005
View details for PubMedID 32067873
-
COMPARING METHODS FOR MODEL CALIBRATION WITH HIGH UNCERTAINTY: MODELING CHOLERA IN BANGLADESH
SAGE PUBLICATIONS INC. 2020: E186–E187
View details for Web of Science ID 000509275600163
-
How do Covid-19 policy options depend on end-of-year holiday contacts in Mexico City Metropolitan Area? A Modeling Study.
medRxiv : the preprint server for health sciences
2020
Abstract
With more than 20 million residents, Mexico City Metropolitan Area (MCMA) has the largest number of Covid-19 cases in Mexico and is at risk of exceeding its hospital capacity in late December 2020. We used SC-COSMO, a dynamic compartmental Covid-19 model, to evaluate scenarios considering combinations of increased contacts during the holiday season, intensification of social distancing, and school reopening. Model parameters were derived from primary data from MCMA and published literature and were calibrated to time-series of incident confirmed cases, deaths, and hospital occupancy. Outcomes included projected confirmed cases and deaths, hospital demand, and magnitude of hospital capacity exceedance. Following high levels of holiday contacts even with no in-person schooling, we predict that MCMA will have 1·0 million (95% prediction interval 0·5 - 1·7) additional Covid-19 cases between December 7, 2020 and March 7, 2021 and that hospitalizations will peak at 35,000 (14,700 - 67,500) on January 27, 2021, with a >99% chance of exceeding Covid-19-specific capacity (9,667 beds). If holiday contacts can be controlled, MCMA can reopen in-person schools provided social distancing is increased, with 0·5 million (0·2 - 1·0) additional cases and hospitalizations peaking at 14,900 (5,600 - 32,000) on January 23, 2021 (77% chance of exceedance). MCMA must substantially increase Covid-19 hospital capacity under all scenarios considered. MCMA's ability to reopen schools in mid-January 2021 depends on sustaining social distancing and on contacts during the end-of-year holiday being well controlled. Funding: Society for Medical Decision Making, Gordon and Betty Moore Foundation, and Wadhwani Institute for Artificial Intelligence Foundation. Evidence before this study: As of mid-December 2020, Mexico has the twelfth highest incidence of confirmed cases of Covid-19 worldwide and its epidemic is currently growing. Mexico's case fatality ratio (CFR) of 9·1% is the second highest in the world. With more than 20 million residents, Mexico City Metropolitan Area (MCMA) has the highest number and incidence rate of Covid-19 confirmed cases in Mexico and a CFR of 8·1%. MCMA is nearing its current hospital capacity even as it faces the prospect of increased social contacts during the 2020 end-of-year holidays. There is limited Mexico-specific evidence available on the epidemic, such as parameters governing time-dependent mortality, hospitalization, and transmission. Literature searches required supplementation through primary data analysis and model calibration to support the first realistic model-based Covid-19 policy evaluation for Mexico, which makes this analysis relevant and timely. Added value of this study: Study strengths include the use of detailed primary data provided by MCMA; the Bayesian model calibration to enable evaluation of projections and their uncertainty; and consideration of both epidemic and health system outcomes. The model projects that failure to limit social contacts during the end-of-year holidays will substantially accelerate MCMA's epidemic (1·0 million (95% prediction interval 0·5 - 1·7) additional cases by early March 2021). Hospitalization demand could reach 35,000 (14,700 - 67,500), with a >99% chance of exceeding current capacity (9,667 beds). Controlling social contacts during the holidays could enable MCMA to reopen in-person schooling without greatly exacerbating the epidemic provided social distancing in both schools and the community were maintained. Under all scenarios and policies, current hospital capacity appears insufficient, highlighting the need for rapid capacity expansion. Implications of all the available evidence: MCMA officials should prioritize rapid hospital capacity expansion. MCMA's ability to reopen schools in mid-January 2021 depends on sustaining social distancing and on contacts during the end-of-year holiday being well controlled.
View details for DOI 10.1101/2020.12.21.20248597
View details for PubMedID 33398301
View details for PubMedCentralID PMC7781344
-
Computing the Expected Value of Sample Information Efficiently: Practical Guidance and Recommendations for Four Model-Based Methods.
Value in health : the journal of the International Society for Pharmacoeconomics and Outcomes Research
2020; 23 (6): 734–42
Abstract
Value of information (VOI) analyses can help policy makers make informed decisions about whether to conduct and how to design future studies. Historically, the computational expense of estimating the expected value of sample information (EVSI) restricted the use of VOI to simple decision models and study designs. Recently, 4 EVSI approximation methods have made such analyses more feasible and accessible. Members of the Collaborative Network for Value of Information (ConVOI) compared the inputs, the analyst's expertise and skills, and the software required for the 4 recently developed EVSI approximation methods. Our report provides practical guidance and recommendations to help inform the choice between the 4 efficient EVSI estimation methods. More specifically, this report provides: (1) a step-by-step guide to the methods' use, (2) the expertise and skills required to implement the methods, and (3) method recommendations based on the features of decision-analytic problems.
View details for DOI 10.1016/j.jval.2020.02.010
View details for PubMedID 32540231
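For context on why efficient approximations matter, below is the standard nested Monte Carlo EVSI estimator that the report's four methods are designed to approximate, applied to a deliberately simple two-arm adoption decision with a normal prior on the per-patient incremental net monetary benefit. All parameter values, the population size, and the trial design are assumptions for illustration; none of the four ConVOI approximation methods is implemented here.

```python
"""Standard nested Monte Carlo EVSI estimator for a simple two-arm adoption
decision. The prior on the incremental net monetary benefit (INB) is normal
and the trial summary is a normal sample mean, so the posterior is available
in closed form; the inner loop still samples from it to mimic the generic
nested structure. All numbers are illustrative."""

import numpy as np

rng = np.random.default_rng(2021)

MU0, SD0 = 500.0, 800.0      # prior on per-patient INB of the new strategy ($, assumed)
SIGMA_E = 4_000.0            # per-patient sampling SD of observed INB in the trial (assumed)
N_TRIAL = 200                # proposed trial size
POP = 100_000                # patients affected by the adoption decision (assumed)
N_OUTER, N_INNER = 5_000, 2_000

def posterior(xbar: float, n: int) -> tuple[float, float]:
    """Normal-normal conjugate update for the mean INB."""
    prec = 1 / SD0**2 + n / SIGMA_E**2
    mean = (MU0 / SD0**2 + n * xbar / SIGMA_E**2) / prec
    return mean, (1 / prec) ** 0.5

value_current = max(0.0, MU0)            # per-patient value of the best decision under current information
value_with_data = 0.0
for _ in range(N_OUTER):
    theta = rng.normal(MU0, SD0)                          # draw a "true" INB from the prior
    xbar = rng.normal(theta, SIGMA_E / np.sqrt(N_TRIAL))  # simulate the trial summary statistic
    post_mean, post_sd = posterior(xbar, N_TRIAL)
    inner = rng.normal(post_mean, post_sd, N_INNER)       # inner posterior samples
    value_with_data += max(0.0, inner.mean())             # adopt only if posterior mean INB > 0
value_with_data /= N_OUTER

evsi_per_patient = value_with_data - value_current
print(f"per-patient EVSI: ${evsi_per_patient:,.0f}")
print(f"population EVSI:  ${POP * evsi_per_patient:,.0f}")
```

Even in this toy setting the two nested loops dominate the run time; for realistic decision models that cost is what the approximation methods compared in the report are meant to avoid.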
-
Cost-Effectiveness of a Pharmacogenomic Test for Stratified Isoniazid Dosing in Treatment of Active Tuberculosis.
Clinical infectious diseases : an official publication of the Infectious Diseases Society of America
2020
Abstract
There is marked interindividual variability in metabolism and resulting toxicity and effectiveness of drugs used for tuberculosis treatment. For isoniazid, mutations in the N-acetyltransferase-2 (NAT2) gene explain over 88% of pharmacokinetic variability. However, weight-based dosing remains the norm globally. The potential clinical impact and cost-effectiveness of pharmacogenomic-guided therapy (PGT) are unknown. We constructed a decision tree model to project lifetime costs and benefits of isoniazid PGT for drug-susceptible tuberculosis in Brazil, South Africa, and India. PGT was modeled to reduce isoniazid toxicity among slow NAT2 acetylators and reduce treatment failure among rapid acetylators. The genotyping test was assumed to cost the same as the GeneXpert test. The main outcomes were costs (2018 USD), quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios. In Brazil, PGT gained 19 discounted life years (23 QALYs) and cost $11,064 per 1,000 patients, a value of $476 per QALY gained. In South Africa, PGT gained 15 life years (19 QALYs) and cost $33,182 per 1,000 patients, a value of $1,780 per QALY gained. In India, PGT gained 20 life years (24 QALYs) and cost $13,195 per 1,000 patients, a value of $546 per QALY gained. One-way sensitivity analyses showed the cost-effectiveness to be robust to all input parameters. In probabilistic sensitivity analyses, the cost per QALY gained remained below per capita GDP in all three countries in 99% of simulations. Isoniazid PGT improves health outcomes and would be cost-effective in the treatment of drug-susceptible tuberculosis in Brazil, South Africa, and India.
View details for DOI 10.1093/cid/ciz1212
View details for PubMedID 31905381
-
The household secondary attack rate of SARS-CoV-2: A rapid review.
Clinical infectious diseases : an official publication of the Infectious Diseases Society of America
2020
Abstract
Although much of the public health effort to combat COVID-19 has focused on disease control strategies in public settings, transmission of SARS-CoV-2 within households remains an important problem. The nature and determinants of household transmission are poorly understood. To address this gap, we gathered and analyzed data from 22 published and pre-published studies from 10 countries (20,291 household contacts) that were available through September 2, 2020. Our goal was to combine estimates of the SARS-CoV-2 household secondary attack rate (SAR) and explore variation in estimates of the household SAR. The overall pooled random-effects estimate of the household SAR was 17.1% (95% CI: 13.7-21.2%). In study-level, random-effects meta-regressions stratified by testing frequency (1 test, 2 tests, >2 tests), SAR estimates were 9.2% (95% CI: 6.7-12.3%), 17.5% (95% CI: 13.9-21.8%), and 21.3% (95% CI: 13.8-31.3%), respectively. Household SAR tended to be higher among older adult contacts and among contacts of symptomatic cases. These findings suggest that SAR reported using a single follow-up test may be underestimated and that testing household contacts of COVID-19 cases on multiple occasions may increase the yield for identifying secondary cases.
View details for DOI 10.1093/cid/ciaa1558
View details for PubMedID 33045075
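The pooled household SAR in this review comes from a random-effects meta-analysis. Below is a generic DerSimonian-Laird sketch that pools study-level attack rates on the logit scale; the four (cases, contacts) pairs are invented for illustration and do not correspond to the 22 studies or the exact estimator used in the review.

```python
"""Generic DerSimonian-Laird random-effects pooling of study-level household
secondary attack rates on the logit scale. The (cases, contacts) pairs below
are made up for illustration; they are not the studies in the review."""

import math

# (secondary cases, household contacts) per study -- illustrative only
studies = [(30, 200), (55, 310), (12, 150), (80, 420)]

y, v = [], []                      # logit SAR and its approximate variance per study
for cases, contacts in studies:
    p = cases / contacts
    y.append(math.log(p / (1 - p)))
    v.append(1 / cases + 1 / (contacts - cases))   # variance of the logit of a proportion

# fixed-effect (inverse-variance) quantities
w = [1 / vi for vi in v]
ybar_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2
q = sum(wi * (yi - ybar_fe) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)

# random-effects pooled logit SAR and 95% CI, back-transformed to a proportion
w_re = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se = (1 / sum(w_re)) ** 0.5
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

expit = lambda x: 1 / (1 + math.exp(-x))
print(f"pooled household SAR: {expit(pooled):.1%} (95% CI {expit(lo):.1%}-{expit(hi):.1%})")
```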
-
Calculating the Expected Value of Sample Information in Practice: Considerations from 3 Case Studies.
Medical decision making : an international journal of the Society for Medical Decision Making
2020: 272989X20912402
Abstract
Background. Investing efficiently in future research to improve policy decisions is an important goal. Expected value of sample information (EVSI) can be used to select the specific design and sample size of a proposed study by assessing the benefit of a range of different studies. Estimating EVSI with the standard nested Monte Carlo algorithm has a notoriously high computational burden, especially when using a complex decision model or when optimizing over study sample sizes and designs. Recently, several more efficient EVSI approximation methods have been developed. However, these approximation methods have not been compared, and therefore their comparative performance across different examples has not been explored. Methods. We compared 4 EVSI methods using 3 previously published health economic models. The examples were chosen to represent a range of real-world contexts, including situations with multiple study outcomes, missing data, and data from an observational rather than a randomized study. The computational speed and accuracy of each method were compared. Results. In each example, the approximation methods took minutes or hours to achieve reasonably accurate EVSI estimates, whereas the traditional Monte Carlo method took weeks. Specific methods are particularly suited to problems where we wish to compare multiple proposed sample sizes, when the proposed sample size is large, or when the health economic model is computationally expensive. Conclusions. As all the evaluated methods gave estimates similar to those given by traditional Monte Carlo, we suggest that EVSI can now be efficiently computed with confidence in realistic examples. No systematically superior EVSI computation method exists as the properties of the different methods depend on the underlying health economic model, data generation process, and user expertise.
View details for DOI 10.1177/0272989X20912402
View details for PubMedID 32297840
-
Cost-Effectiveness of Initial Versus Delayed Lanreotide for Treatment of Metastatic Enteropancreatic Neuroendocrine Tumors.
Journal of the National Comprehensive Cancer Network : JNCCN
2020; 18 (9): 1200–1209
Abstract
The Controlled Study of Lanreotide Antiproliferative Response in Neuroendocrine Tumors (CLARINET) trial showed prolonged progression-free survival in patients initially treated with lanreotide versus placebo. We evaluated the cost-effectiveness of upfront lanreotide versus active surveillance with lanreotide administered after progression in patients with metastatic enteropancreatic neuroendocrine tumors (NETs), both of which are treatment options recommended in NCCN Clinical Practice Guidelines in Oncology for Neuroendocrine and Adrenal Tumors. We developed a Markov model calibrated to the CLARINET trial and its extension. We based the active surveillance strategy on the CLARINET placebo arm. We calculated incremental cost-effectiveness ratios (ICERs) in dollars per quality-adjusted life-year (QALY). We modeled lanreotide's cost at $7,638 per 120 mg (average sales price plus 6%), used published utilities (stable disease, 0.77; progressed disease, 0.61), adopted a healthcare sector perspective and lifetime time horizon, and discounted costs and benefits at 3% annually. We examined sensitivity to survival extrapolation and modeled octreotide long-acting release (LAR) ($6,183 per 30 mg). We conducted one-way, multiway, and probabilistic sensitivity analyses. Upfront lanreotide led to 5.21 QALYs and a cost of $804,600. Active surveillance followed by lanreotide after progression led to 4.84 QALYs and a cost of $590,200, giving an ICER of $578,500/QALY gained. Reducing lanreotide's price by 95% (to $370) or 85% (to $1,128) per 120 mg would allow upfront lanreotide to reach ICERs of $100,000/QALY or $150,000/QALY. Across a range of survival curve extrapolation scenarios, pricing lanreotide at $370 to $4,000 or $1,130 to $5,600 per 120 mg would reach ICERs of $100,000/QALY or $150,000/QALY, respectively. Our findings were robust to extensive sensitivity analyses. The ICER modeling octreotide LAR is $482,700/QALY gained. At its current price, lanreotide is not cost-effective as initial therapy for patients with metastatic enteropancreatic NETs and should be reserved for postprogression treatment. To be cost-effective as initial therapy, the price of lanreotide would need to be lowered by 48% to 95% or 27% to 86% to reach ICERs of $100,000/QALY or $150,000/QALY, respectively.
View details for DOI 10.6004/jnccn.2020.7563
View details for PubMedID 32886901
-
Public Health Interventions with Harms and Benefits: A Graphical Framework for Evaluating Tradeoffs.
Medical decision making : an international journal of the Society for Medical Decision Making
2020: 272989X20960458
Abstract
Evaluations of public health interventions typically report benefits and harms aggregated over the population. However, benefits and harms are not always evenly distributed. Examining disaggregated outcomes enables decision makers to consider health benefits and harms accruing to both intended intervention recipients and others in the population. We provide a graphical framework for categorizing and comparing public health interventions that examines the distribution of benefit and harm between and within population subgroups for a single intervention and compares distributions of harm and benefit for multiple interventions. We demonstrate the framework through a case study of a hypothetical increase in the price of meat (5%, 10%, 25%, or 50%) that, via elasticity of demand, reduces consumption and consequently reduces body mass index. We examine how inequalities in benefits and harms (measured by quality-adjusted life-years) are distributed across a population of white and black males and females. A 50% meat price increase would yield the greatest net benefit to the population. However, because of reduced consumption among low-weight individuals, black males would bear disproportionate harm relative to the benefit they receive. With increasing meat price, the distribution of harm relative to benefit becomes less "internal" to those receiving benefit and more "distributed" to those not receiving commensurate benefit. When we segment the population by sex only, this result does not hold. Disaggregating harms and benefits to understand their differential impact on subgroups can strongly affect which decision alternative is deemed optimal, as can the approach to segmenting the population. Our framework provides a useful tool for illuminating key tradeoffs relevant to harm-averse decision makers and those concerned with both equity and efficiency.
View details for DOI 10.1177/0272989X20960458
View details for PubMedID 32996356
-
What Is the Optimal Primary Care Panel Size?: A Systematic Review.
Annals of internal medicine
2020
Abstract
Primary care for a panel of patients is a central component of population health, but the optimal panel size is unclear. To review evidence about the association of primary care panel size with health care outcomes and provider burnout. English-language searches of multiple databases from inception to October 2019 and Google searches performed in September 2019. English-language studies of any design, including simulation models, that assessed the association between primary care panel size and safety, efficacy, patient-centeredness, timeliness, efficiency, equity, or provider burnout. Independent, dual-reviewer extraction; group consensus rating of certainty of evidence. Sixteen hypothesis-testing studies and 12 simulation modeling studies met inclusion criteria. All but 1 hypothesis-testing study were cross-sectional assessments of association. Three studies each provided low-certainty evidence that increasing panel size was associated with no or modestly adverse effects on patient-centered and effective care. Eight studies provided low-certainty evidence that increasing panel size was associated with variable effects on timely care. No studies assessed the effect of panel size on safety, efficiency, or equity. One study provided very-low-certainty evidence of an association between increased panel size and provider burnout. The 12 simulation studies evaluated 5 models; all used access as the only outcome of care. Five and 2 studies, respectively, provided moderate-certainty evidence that adjusting panel size for case mix and adding clinical conditions to the case mix resulted in better access. No studies had concurrent comparison groups, and published and unpublished studies may have been missed. Evidence is insufficient to make evidence-based recommendations about the optimal primary care panel size for achieving beneficial health outcomes. Funding source: Veterans Affairs Quality Enhancement Research Initiative.
View details for DOI 10.7326/M19-2491
View details for PubMedID 31958814
-
Methods for Model Calibration under High Uncertainty: Modeling Cholera in Bangladesh.
Medical decision making : an international journal of the Society for Medical Decision Making
2020: 272989X20938683
Abstract
Background. Published data on a disease do not always correspond directly to the parameters needed to simulate natural history. Several calibration methods have been applied to computer-based disease models to extract needed parameters that make a model's output consistent with available data. Objective. To assess 3 calibration methods and evaluate their performance in a real-world application. Methods. We calibrated a model of cholera natural history in Bangladesh, where a lack of active surveillance biases available data. We built a cohort state-transition cholera natural history model that includes case hospitalization to reflect the passive surveillance data-generating process. We applied 3 calibration techniques: incremental mixture importance sampling, sampling importance resampling, and random search with rejection sampling. We adapted these techniques to the context of wide prior uncertainty and many degrees of freedom. We evaluated the resulting posterior parameter distributions using a range of metrics and compared predicted cholera burden estimates. Results. All 3 calibration techniques produced posterior distributions with a higher likelihood and better fit to calibration targets as compared with prior distributions. Incremental mixture importance sampling resulted in the highest likelihood and largest number of unique parameter sets to better inform joint parameter uncertainty. Compared with naïve uncalibrated parameter sets, calibrated models of cholera in Bangladesh project substantially more cases, many of which are not detected by passive surveillance, and fewer deaths. Limitations. Calibration cannot completely overcome poor data quality, which can leave some parameters less well informed than others. Calibration techniques may perform differently under different circumstances. Conclusions. Incremental mixture importance sampling, when adapted to the context of high uncertainty, performs well. By accounting for biases in data, calibration can improve model projections of disease burden.
View details for DOI 10.1177/0272989X20938683
View details for PubMedID 32639859
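To make the comparison above concrete, here is a minimal, generic sketch of one of the three techniques, sampling importance resampling, applied to a toy model with a single calibration target. Everything in it (the stand-in model, prior ranges, target, and likelihood) is hypothetical and is not the authors' cholera model or code.

```python
# Illustrative sampling importance resampling (SIR) calibration, not the study's cholera model.
# Prior ranges, the calibration target, and the toy model below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def model_predicted_incidence(beta, reporting_fraction):
    """Toy stand-in for a natural-history model's predicted *reported* incidence."""
    true_incidence = 1000.0 * beta                 # hypothetical relationship
    return true_incidence * reporting_fraction     # passive surveillance sees only a fraction

# 1) Draw from wide priors (many degrees of freedom, high uncertainty).
n_prior = 100_000
beta = rng.uniform(0.05, 1.0, n_prior)
reporting = rng.uniform(0.05, 0.8, n_prior)

# 2) Weight each draw by its likelihood against the calibration target.
target_reported, target_sd = 150.0, 30.0           # hypothetical surveillance target
pred = model_predicted_incidence(beta, reporting)
log_lik = -0.5 * ((pred - target_reported) / target_sd) ** 2
weights = np.exp(log_lik - log_lik.max())
weights /= weights.sum()

# 3) Resample proportional to weight to approximate the joint posterior.
idx = rng.choice(n_prior, size=5_000, replace=True, p=weights)
posterior = np.column_stack([beta[idx], reporting[idx]])
print("posterior means (beta, reporting):", posterior.mean(axis=0).round(3))
```

Incremental mixture importance sampling follows the same weight-and-resample logic but iteratively refines the sampling distribution around high-weight regions, which is one reason it recovered more unique, high-likelihood parameter sets in the study.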
-
OPTIMAL ALLOCATION OF CLINICAL TRIAL SAMPLE SIZE TO SUBPOPULATIONS WITH CORRELATED PARAMETERS
SAGE PUBLICATIONS INC. 2020: E260–E261
View details for Web of Science ID 000509275600223
-
PRACTICAL CONSIDERATIONS FOR THE EFFICIENT COMPUTATION OF THE EXPECTED VALUE OF SAMPLE INFORMATION TO PRIORITIZE RESEARCH IN HEALTH CARE
SAGE PUBLICATIONS INC. 2020: E63–E64
View details for Web of Science ID 000509275600063
-
ESTIMATES OF FIRST-YEAR QUALITY-ADJUSTED LIFE YEAR LOSS FOR INJURIES AND POISONINGS IN THE UNITED STATES
SAGE PUBLICATIONS INC. 2020: E277–E278
View details for Web of Science ID 000509275600236
-
CALIBRATION TO CROSS-SECTIONAL DATA WHEN BIRTH COHORT TRENDS EXIST AS EXEMPLIFIED BY MODELS OF HEAVY DRINKING IN CHINA: CONSISTENCY WITH CALIBRATION TARGETS DESPITE MARKEDLY DIFFERENT PARAMETERS AND FUTURE PROJECTIONS
SAGE PUBLICATIONS INC. 2020: E125–E126
View details for Web of Science ID 000509275600110
-
CALIBRATION TO CROSS-SECTIONAL DATA WHEN BIRTH COHORT TRENDS EXIST AS EXEMPLIFIED BY MODELS OF HEAVY DRINKING IN CHINA: CONSISTENCY WITH CALIBRATION TARGETS DESPITE MARKEDLY DIFFERENT PARAMETERS AND FUTURE PROJECTIONS
SAGE PUBLICATIONS INC. 2020: E378–E379
View details for Web of Science ID 000509275600315
-
PRICING TREATMENTS COST-EFFECTIVELY WHEN THEY HAVE MULTIPLE INDICATIONS: NOT JUST A SIMPLE THRESHOLD ANALYSIS
SAGE PUBLICATIONS INC. 2020: E248
View details for Web of Science ID 000509275600214
-
AGE- AND TIME-TRENDS IN HARMFUL ALCOHOL USE IN CHINA: PROJECTING BURDEN AND THE POTENTIAL FOR INTERVENTION BENEFIT
SAGE PUBLICATIONS INC. 2020: E49–E50
View details for Web of Science ID 000509275600053
-
OPTIMIZING PORTFOLIOS OF RESEARCH STUDIES DESIGNED TO INFORM ECONOMIC EVALUATIONS
SAGE PUBLICATIONS INC. 2020: E253–E254
View details for Web of Science ID 000509275600219
-
CALCULATING THE EXPECTED VALUE OF SAMPLE INFORMATION IN PRACTICE: CONSIDERATIONS FROM THREE CASE STUDIES
SAGE PUBLICATIONS INC. 2020: E337–E338
View details for Web of Science ID 000509275600285
-
PREDICTING THE EFFECTIVENESS OF INTERVENTIONS FOR INFECTIOUS DISEASE CONTROL: THE ROLE OF MODEL STRUCTURE
SAGE PUBLICATIONS INC. 2020: E377
View details for Web of Science ID 000509275600314
-
PREDICTING THE EFFECTIVENESS OF INTERVENTIONS FOR INFECTIOUS DISEASE CONTROL: THE ROLE OF MODEL STRUCTURE
SAGE PUBLICATIONS INC. 2020: E118
View details for Web of Science ID 000509275600105
-
COST-EFFECTIVENESS OF INTERVENTIONS TO PREVENT PLAGUE IN MADAGASCAR
SAGE PUBLICATIONS INC. 2020: E116–E117
View details for Web of Science ID 000509275600104
-
ESTIMATED MORTALITY RATES AMONG TREATED AND UNTREATED VETERANS WITH OPIOID USE DISORDER IN THE VETERANS HEALTH ADMINISTRATION
SAGE PUBLICATIONS INC. 2020: E105–E106
View details for Web of Science ID 000509275600095
-
Cost-Effectiveness of Transitional Care Services After Hospitalization With Heart Failure.
Annals of internal medicine
2020
Abstract
Background: Patients with heart failure (HF) discharged from the hospital are at high risk for death and rehospitalization. Transitional care service interventions attempt to mitigate these risks. Objective: To assess the cost-effectiveness of 3 types of postdischarge HF transitional care services and standard care. Design: Decision analytic microsimulation model. Data Sources: Randomized controlled trials, clinical registries, cohort studies, Centers for Disease Control and Prevention life tables, Centers for Medicare & Medicaid Services data, and National Inpatient Sample (Healthcare Cost and Utilization Project) data. Patients: Patients with HF who were aged 75 years at hospital discharge. Time Horizon: Lifetime. Perspective: Health care sector. Interventions: Disease management clinics, nurse home visits (NHVs), and nurse case management. Outcome Measures: Quality-adjusted life-years (QALYs), costs, net monetary benefits, and incremental cost-effectiveness ratios (ICERs). Results: All 3 transitional care interventions examined were more costly and effective than standard care, with NHVs dominating the other 2 interventions. Compared with standard care, NHVs increased QALYs (2.49 vs. 2.25) and costs ($81 327 vs. $76 705), resulting in an ICER of $19 570 per QALY gained. Results were largely insensitive to variations in in-hospital mortality, age at baseline, or costs of rehospitalization. Probabilistic sensitivity analysis confirmed that transitional care services were preferred over standard care in nearly all 10 000 samples, at willingness-to-pay thresholds of $50 000 or more per QALY gained. Limitations: Transitional care service designs and implementations are heterogeneous, leading to uncertainty about intervention effectiveness and costs when applied in particular settings. Conclusions: In older patients with HF, transitional care services are economically attractive, with NHVs being the most cost-effective strategy in many situations. Transitional care services should become the standard of care for postdischarge management of patients with HF. Primary Funding Source: Swiss National Science Foundation, Research Council of Norway, and an Intermountain-Stanford collaboration.
View details for DOI 10.7326/M19-1980
View details for PubMedID 31986526
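As a quick arithmetic companion to the result above, the snippet below recomputes the incremental cost-effectiveness ratio and the incremental net monetary benefit of nurse home visits versus standard care from the rounded figures quoted in the abstract. The published $19 570 per QALY comes from unrounded model output, so this back-of-envelope value lands slightly lower.

```python
# ICER and net monetary benefit from the rounded values quoted in the abstract above.
# The published ICER ($19,570/QALY) was computed from unrounded model output, so this
# back-of-envelope figure differs slightly.

cost_nhv, cost_standard = 81_327.0, 76_705.0   # lifetime costs (USD)
qaly_nhv, qaly_standard = 2.49, 2.25           # lifetime QALYs

delta_cost = cost_nhv - cost_standard
delta_qaly = qaly_nhv - qaly_standard
icer = delta_cost / delta_qaly
print(f"ICER ~ ${icer:,.0f} per QALY gained")  # ~ $19,258 with rounded inputs

wtp = 50_000.0                                 # willingness-to-pay threshold (USD/QALY)
nmb = wtp * delta_qaly - delta_cost            # > 0 means NHVs are preferred at this threshold
print(f"Incremental net monetary benefit at ${wtp:,.0f}/QALY: ${nmb:,.0f}")
```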
-
PREVENTION AND CONTROL OF DENGUE AND CHIKUNGUNYA IN COLOMBIA: A COST-EFFECTIVENESS ANALYSIS
SAGE PUBLICATIONS INC. 2020: E85–E87
View details for Web of Science ID 000509275600080
-
Quantifying Positive Health Externalities of Disease Control Interventions: Modeling Chikungunya and Dengue.
Medical decision making : an international journal of the Society for Medical Decision Making
2019: 272989X19880554
Abstract
Purpose. Health interventions can generate positive externalities not captured in traditional, single-disease cost-effectiveness analyses (CEAs), potentially biasing results. We illustrate this with the example of mosquito-borne diseases. When a particular mosquito species can transmit multiple diseases, a single-disease CEA comparing disease-specific interventions (e.g., vaccination) with interventions targeting the mosquito population (e.g., insecticide) would underestimate the insecticide's full benefits (i.e., preventing other diseases). Methods. We developed three dynamic transmission models: chikungunya, dengue, and combined chikungunya and dengue, each calibrated to disease-specific incidence and deaths in Colombia (June 2014 to December 2017). We compared the models' predictions of the incremental benefits and cost-effectiveness of an insecticide (10% efficacy), hypothetical chikungunya and dengue vaccines (40% coverage, 95% efficacy), and combinations of these interventions. Results. Model calibration yielded realistic parameters that produced close matches to disease-specific incidence and deaths. The chikungunya model predicted that vaccine would decrease the incidence of chikungunya and avert more total deaths than insecticide. The dengue model predicted that insecticide and the dengue vaccine would reduce dengue incidence and deaths, with no effect for the chikungunya vaccine. In the combined model, insecticide was more effective than either vaccine in reducing the incidence of and deaths from both diseases. In all models, the combined strategy was at least as effective as the most effective single strategy. In an illustrative CEA, the most frequently preferred strategy was vaccine in the chikungunya model, the status quo in the dengue model, and insecticide in the combined model. Limitations. There is uncertainty in the target calibration data. Conclusions. Failure to capture positive externalities can bias CEA results, especially when evaluating interventions that affect multiple diseases. Multidisease modeling is a reasonable alternative for addressing such biases.
View details for DOI 10.1177/0272989X19880554
View details for PubMedID 31642362
-
The Costs of Hepatitis C by Liver Disease Stage: Estimates from the Veterans Health Administration
APPLIED HEALTH ECONOMICS AND HEALTH POLICY
2019; 17 (4): 513–21
View details for DOI 10.1007/s40258-019-00468-5
View details for Web of Science ID 000475518500007
-
Cost-effectiveness of Screening for Nasopharyngeal Carcinoma among Asian American Men in the United States
OTOLARYNGOLOGY-HEAD AND NECK SURGERY
2019; 161 (1): 82–90
View details for DOI 10.1177/0194599819832593
View details for Web of Science ID 000473507100011
-
Operative Versus Nonoperative Management of Appendicitis: A Long-Term Cost Effectiveness Analysis.
MDM policy & practice
2019; 4 (2): 2381468319866448
Abstract
Background. Recent clinical trials suggest that nonoperative management (NOM) of patients with acute, uncomplicated appendicitis is an acceptable alternative to surgery. However, limited data exist comparing the long-term cost-effectiveness of nonoperative treatment strategies. Design. We constructed a Markov model comparing the cost-effectiveness of three treatment strategies for uncomplicated appendicitis: 1) laparoscopic appendectomy, 2) inpatient NOM, and 3) outpatient NOM. The model assessed lifetime costs and outcomes from a third-party payer perspective. The preferred strategy was the one yielding the greatest utility without exceeding a $50,000 willingness-to-pay threshold. Results. Outpatient NOM cost $233,700 over a lifetime; laparoscopic appendectomy cost $2500 more while inpatient NOM cost $7300 more. Outpatient NOM generated 24.9270 quality-adjusted life-years (QALYs), while laparoscopic appendectomy and inpatient NOM yielded 0.0709 and 0.0005 additional QALYs, respectively. Laparoscopic appendectomy was cost-effective compared with outpatient NOM (incremental cost-effectiveness ratio $32,300 per QALY gained); inpatient NOM was dominated by laparoscopic appendectomy. In one-way sensitivity analyses, the preferred strategy changed when varying perioperative mortality, probability of appendiceal malignancy or recurrent appendicitis after NOM, probability of a complicated recurrence, and appendectomy cost. A two-way sensitivity analysis showed that the rates of NOM failure and appendicitis recurrence described in randomized trials exceeded the values required for NOM to be preferred. Limitations. There are limited NOM data to generate long-term model probabilities. Health state utilities were often drawn from single studies and may significantly influence model outcomes. Conclusion. Laparoscopic appendectomy is a cost-effective treatment for acute uncomplicated appendicitis over a lifetime time horizon. Inpatient NOM was never the preferred strategy in the scenarios considered here. These results emphasize the importance of considering long-term costs and outcomes when evaluating NOM.
View details for DOI 10.1177/2381468319866448
View details for PubMedID 31453362
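Using only the rounded costs and QALYs reported in the abstract, the sketch below shows how the "dominated" label for inpatient NOM falls out of a simple strong-dominance check before any ICER is computed. It is an illustration of the standard bookkeeping, not the authors' model.

```python
# Strong-dominance check across the three appendicitis strategies, using the rounded
# lifetime costs and QALYs quoted in the abstract above (illustrative, not the authors' code).

strategies = {
    "outpatient NOM":            {"cost": 233_700.0, "qaly": 24.9270},
    "laparoscopic appendectomy": {"cost": 236_200.0, "qaly": 24.9979},  # +$2,500, +0.0709 QALY
    "inpatient NOM":             {"cost": 241_000.0, "qaly": 24.9275},  # +$7,300, +0.0005 QALY
}

def strongly_dominated(name: str) -> bool:
    """A strategy is strongly dominated if some alternative costs no more and yields no fewer QALYs,
    with at least one strict inequality."""
    s = strategies[name]
    return any(o["cost"] <= s["cost"] and o["qaly"] >= s["qaly"]
               and (o["cost"] < s["cost"] or o["qaly"] > s["qaly"])
               for other, o in strategies.items() if other != name)

for name in strategies:
    print(f"{name}: {'dominated' if strongly_dominated(name) else 'on the efficient frontier'}")
```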
-
Cost Effectiveness of Chimeric Antigen Receptor T-Cell Therapy in Multiply Relapsed or Refractory Adult Large B-Cell Lymphoma.
Journal of clinical oncology : official journal of the American Society of Clinical Oncology
2019: JCO1802079
Abstract
Two anti-CD19 chimeric antigen receptor T-cell (CAR-T) therapies are approved for diffuse large B-cell lymphoma, axicabtagene ciloleucel (axi-cel) and tisagenlecleucel; each costs $373,000. We evaluated their cost effectiveness. We used a decision analytic Markov model informed by recent multicenter, single-arm trials to evaluate axi-cel and tisagenlecleucel in multiply relapsed/refractory, adult, diffuse large B-cell lymphoma from a US health payer perspective over a lifetime horizon. Under a range of plausible long-term effectiveness assumptions, each therapy was compared with salvage chemoimmunotherapy regimens and stem-cell transplantation. Main outcomes were undiscounted life years, discounted lifetime costs, discounted quality-adjusted life years (QALYs), and incremental cost-effectiveness ratio (3% annual discount rate). Sensitivity analyses explored uncertainty. In an optimistic scenario, assuming a 40% 5-year progression-free survival (PFS), axi-cel increased life expectancy by 8.2 years at $129,000/QALY gained (95% uncertainty interval, $90,000 to $219,000). At a 30% 5-year PFS, improvements in life expectancy were more modest (6.4 years) and expensive ($159,000/QALY gained [95% uncertainty interval, $105,000 to $284,000]). In an optimistic scenario, assuming a 35% 5-year PFS, tisagenlecleucel increased life expectancy by 4.6 years at $168,000/QALY gained (95% uncertainty interval, $105,000 to $414,000/QALY). At a 25% 5-year PFS, improvements in life expectancy were smaller (3.4 years) and more expensive ($223,000/QALY gained [95% uncertainty interval, $123,000 to $1,170,000/QALY]). Administering CAR-T to all indicated patients would increase US health care costs by approximately $10 billion over 5 years. Price reductions to $250,000 and $200,000, respectively, or payment only for initial complete response (at current prices) would allow axi-cel and tisagenlecleucel to cost less than $150,000/QALY, even at 25% PFS. At 2018 prices, it is possible that both CAR-T therapies meet a less than $150,000/QALY threshold. This depends on long-term outcomes compared with chemoimmunotherapy and stem-cell transplantation, which are uncertain. Widespread adoption would substantially increase non-Hodgkin lymphoma health care costs. Price reductions or payment for initial response would improve cost effectiveness, even with modest long-term outcomes.
View details for DOI 10.1200/JCO.18.02079
View details for PubMedID 31157579
-
Cost-effectiveness of chimeric antigen receptor T-cell therapy in multiply relapsed or refractory adult large B-cell lymphoma.
AMER SOC CLINICAL ONCOLOGY. 2019
View details for Web of Science ID 000487345806323
-
Building a tuberculosis-free world: The Lancet Commission on tuberculosis
LANCET
2019; 393 (10178): 1331–84
View details for DOI 10.1016/S0140-6736(19)30024-8
View details for Web of Science ID 000462967700030
-
Effect of Integrated Behavioral Weight Loss Treatment and Problem-Solving Therapy on Body Mass Index and Depressive Symptoms Among Patients With Obesity and Depression The RAINBOW Randomized Clinical Trial
JAMA-JOURNAL OF THE AMERICAN MEDICAL ASSOCIATION
2019; 321 (9): 869–79
View details for DOI 10.1001/jama.2019.0557
View details for Web of Science ID 000460351600017
-
Cost-effectiveness of Screening for Nasopharyngeal Carcinoma among Asian American Men in the United States.
Otolaryngology--head and neck surgery : official journal of American Academy of Otolaryngology-Head and Neck Surgery
2019: 194599819832593
Abstract
OBJECTIVE: Most patients with nasopharyngeal carcinoma (NPC) in the United States are diagnosed with stage III-IV disease. Screening for NPC in endemic areas results in earlier detection and improved outcomes. We examined the cost-effectiveness of screening for NPC with plasma Epstein-Barr virus DNA among Asian American men in the United States. STUDY DESIGN: We used a Markov cohort model to estimate discounted life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios for screening as compared with usual care without screening. SETTING: The base case analysis considered onetime screening for 50-year-old Asian American men. SUBJECTS AND METHODS: Confirmatory testing was magnetic resonance imaging and nasopharyngoscopy. Cancer-specific outcomes, health utility values, and costs were determined from cancer registries and the published literature. RESULTS: For Asian American men, usual care without screening resulted in the detection of NPC at stages I, II, III-IVB, and IVC among 6%, 29%, 54%, and 11% of those with cancer, respectively, whereas screening resulted in earlier detection with a stage distribution of 43%, 24%, 32%, and 1%. This corresponded to an additional 0.00055 QALYs gained at a cost of $63 per person: an incremental cost of $113,341 per QALY gained. In probabilistic sensitivity analysis, screening Asian American men was cost-effective at $100,000 per QALY gained in 35% of samples. CONCLUSION: Although screening for NPC with plasma Epstein-Barr virus DNA for 50-year-old Asian American men may result in earlier detection, in this study it was unlikely to be cost-effective. Screening may be reasonable for certain subpopulations at higher risk for NPC, but clinical studies are necessary before implementation.
View details for PubMedID 30832545
-
The Cost-Effectiveness of Initial vs Delayed Lanreotide for Treatment of Metastatic Enteropancreatic Neuroendocrine Tumors in the United States
LIPPINCOTT WILLIAMS & WILKINS. 2019: 429
View details for Web of Science ID 000462541800031
-
How does information on the harms and benefits of cervical cancer screening alter the intention to be screened?: a randomized survey of Norwegian women
EUROPEAN JOURNAL OF CANCER PREVENTION
2019; 28 (2): 87–95
View details for DOI 10.1097/CEJ.0000000000000436
View details for Web of Science ID 000480688900005
-
Cost-effectiveness of Canakinumab for Prevention of Recurrent Cardiovascular Events
JAMA CARDIOLOGY
2019; 4 (2): 128–35
View details for DOI 10.1001/jamacardio.2018.4566
View details for Web of Science ID 000459481900009
-
Cost-effectiveness of Canakinumab for Prevention of Recurrent Cardiovascular Events.
JAMA cardiology
2019
Abstract
Importance: In the Canakinumab Anti-inflammatory Thrombosis Outcome Study (CANTOS) trial, the anti-inflammatory monoclonal antibody canakinumab significantly reduced the risk of recurrent cardiovascular events in patients with previous myocardial infarction (MI) and high-sensitivity C-reactive protein (hs-CRP) levels of 2 mg/L or greater. Objective: To estimate the cost-effectiveness of adding canakinumab to standard of care for the secondary prevention of major cardiovascular events over a range of potential prices. Design, Setting, and Participants: A state-transition Markov model was constructed to estimate costs and outcomes over a lifetime horizon by projecting rates of recurrent MI, coronary revascularization, infection, and lung cancer with and without canakinumab treatment. We used a US health care sector perspective, and the base case used the current US market price of canakinumab of $73 000 per year. A hypothetical cohort of patients after MI aged 61 years with an hs-CRP level of 2 mg/L or greater was constructed. Interventions: Canakinumab, 150 mg, administered every 3 months plus standard of care compared with standard of care alone. Main Outcomes and Measures: Lifetime costs and quality-adjusted life-years (QALYs), discounted at 3% annually. Results: Adding canakinumab to standard of care increased life expectancy from 11.31 to 11.36 years, QALYs from 9.37 to 9.50, and costs from $242 000 to $1 074 000, yielding an incremental cost-effectiveness ratio of $6.4 million per QALY gained. The price would have to be reduced by more than 98% (to $1150 per year or less) to meet the $100 000 per QALY willingness-to-pay threshold. These results were generally robust across alternative assumptions, eg, substantially lower health-related quality of life after recurrent cardiovascular events, lower infection rates while receiving canakinumab, and reduced all-cause mortality while receiving canakinumab. Including a potential beneficial effect of canakinumab on lung cancer incidence improved the incremental cost-effectiveness ratio to $3.5 million per QALY gained. A strategy of continuing canakinumab selectively in patients with reduction in hs-CRP levels to less than 2 mg/L would have a cost-effectiveness ratio of $819 000 per QALY gained. Conclusions and Relevance: Canakinumab is not cost-effective at current US prices for prevention of recurrent cardiovascular events in patients with a prior MI. Substantial price reductions would be needed for canakinumab to be considered cost-effective.
View details for PubMedID 30649147
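The greater-than-98% price reduction quoted above can be sanity-checked with a back-of-envelope calculation from the abstract's rounded figures, under the simplifying (and not exactly true) assumption that essentially all of the incremental cost is canakinumab acquisition cost.

```python
# Back-of-envelope price-threshold check for canakinumab, using rounded figures from the
# abstract above and the simplifying assumption that the entire incremental cost is drug cost.

delta_cost = 1_074_000.0 - 242_000.0     # incremental lifetime cost (USD)
delta_qaly = 9.50 - 9.37                 # incremental QALYs
annual_price = 73_000.0                  # current US price per year
wtp = 100_000.0                          # willingness-to-pay threshold (USD/QALY)

years_on_therapy = delta_cost / annual_price            # ~11.4 years if cost is all drug
threshold_annual_price = wtp * delta_qaly / years_on_therapy
reduction = 1.0 - threshold_annual_price / annual_price

print(f"ICER at current price: ${delta_cost / delta_qaly:,.0f} per QALY")   # ~ $6.4 million
print(f"Approximate break-even annual price: ${threshold_annual_price:,.0f} "
      f"(a {reduction:.0%} reduction, consistent with the >98% cut reported)")
```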
-
Defining a willingness-to-transplant threshold in an era of organ scarcity: Simultaneous liver-kidney transplant as a case example.
Transplantation
2019
Abstract
Organ scarcity continues in solid organ transplantation, such that the availability of organs limits the number of people able to benefit from transplantation. Medical advancements in managing end-stage organ disease have led to an increasing demand for multi-organ transplant, wherein a patient with multi-organ disease receives more than one organ from the same donor. Current allocation schemes give priority to multi-organ recipients over single-organ transplant recipients, which raises ethical questions regarding equity and utility. We use simultaneous liver-kidney (SLK) transplant, a type of multi-organ transplant, as a case study to examine the tension between equity and utility in multi-organ allocation. We adapt the health economics willingness-to-pay threshold to a solid organ transplant setting by coining a new metric: the willingness-to-transplant (WTT) threshold. We demonstrate how the WTT threshold can be used to evaluate different SLK allocation strategies by synthesizing utility and equity perspectives. We submit that this new framework enables us to distill the question of SLK allocation down to: what is the minimum amount of benefit we require from a deceased donor kidney to allocate it for a particular indication? Addressing the above question will prove helpful to devising a rational system of SLK allocation and is applicable to other transplant settings.
View details for DOI 10.1097/TP.0000000000002788
View details for PubMedID 31107820
-
Building a tuberculosis-free world: The Lancet Commission on tuberculosis.
Lancet (London, England)
2019
View details for PubMedID 30904263
-
The Costs of Hepatitis C by Liver Disease Stage: Estimates from the Veterans Health Administration.
Applied health economics and health policy
2019
Abstract
The release of highly effective but costly medications for the treatment of hepatitis C virus, combined with a doubling in the incidence of hepatitis C virus, has posed substantial financial challenges for many healthcare systems. We provide estimates of the cost of treating patients with hepatitis C virus that can inform the triage of pharmaceutical care in systems with limited healthcare resources. We conducted an observational study using a national US cohort of 206,090 veterans with laboratory-identified hepatitis C virus followed from Fiscal Year 2010 to 2014. We estimated the cost of: non-advanced Fibrosis-4; advanced Fibrosis-4; hepatocellular carcinoma; liver transplant; and post-liver transplant. The former two stages were ascertained using laboratory result data; the latter stages were ascertained using administrative data. Costs were obtained from the Veterans Health Administration's activity-based cost accounting system and more closely represent the actual costs of providing care, an improvement on the charge data that generally characterizes the hepatitis C virus cost literature. Generalized estimating equations were used to estimate and predict costs per liver disease stage. Missing data were multiply imputed. Annual costs of care increased as patients progressed from non-advanced Fibrosis-4 to advanced Fibrosis-4, hepatocellular carcinoma, and liver transplant (all p < 0.001). Post-liver transplant, costs decreased significantly (p < 0.001). In simulations, patients were estimated to incur the following annual costs: US $17,556 for non-advanced Fibrosis-4; US $20,791 for advanced Fibrosis-4; US $46,089 for liver cancer; US $261,959 in the year of the liver transplant; and US $18,643 per year after the liver transplant. Cost differences of treating non-advanced and advanced Fibrosis-4 are relatively small. The greatest cost savings would be realized from avoiding progression to liver cancer and transplant.
View details for PubMedID 31030359
-
Effect of Integrated Behavioral Weight Loss Treatment and Problem-Solving Therapy on Body Mass Index and Depressive Symptoms Among Patients With Obesity and Depression: The RAINBOW Randomized Clinical Trial.
JAMA
2019; 321 (9): 869–79
Abstract
Importance: Coexisting obesity and depression exacerbate morbidity and disability, but effective treatments remain elusive. Objective: To test the hypothesis that an integrated collaborative care intervention would significantly improve both obesity and depression at 12 months compared with usual care. Design, Setting, and Participants: The Research Aimed at Improving Both Mood and Weight (RAINBOW) randomized clinical trial enrolled 409 adults with body mass indices (BMIs) of 30 or greater (≥27 for Asian adults) and 9-item Patient Health Questionnaire (PHQ-9) scores of 10 or greater. Primary care patients at a health system in Northern California were recruited from September 30, 2014, to January 12, 2017; the date of final 12-month follow-up was January 17, 2018. Interventions: All participants randomly assigned to the intervention (n=204) or the usual care control group (n=205) received medical care from their personal physicians as usual, received information on routine services for obesity and depression at their clinic, and received wireless physical activity trackers. Intervention participants also received a 12-month intervention that integrated a Diabetes Prevention Program-based behavioral weight loss treatment with problem-solving therapy for depression and, if indicated, antidepressant medications. Main Outcomes and Measures: The co-primary outcome measures were BMI and 20-item Depression Symptom Checklist (SCL-20) scores (range, 0 [best] to 4 [worst]) at 12 months. Results: Among 409 participants randomized (mean age of 51.0 years [SD, 12.1 years]; 70% were women; mean BMI of 36.7 [SD, 6.4]; mean PHQ-9 score of 13.8 [SD, 3.1]; and mean SCL-20 score of 1.5 [SD, 0.5]), 344 (84.1%) completed 12-month follow-up. At 12 months, mean BMI declined from 36.7 (SD, 6.9) to 35.9 (SD, 7.1) among intervention participants compared with a change in mean BMI from 36.6 (SD, 5.8) to 36.6 (SD, 6.0) among usual care participants (between-group mean difference, -0.7 [95% CI, -1.1 to -0.2]; P=.01). Mean SCL-20 score declined from 1.5 (SD, 0.5) to 1.1 (SD, 1.0) at 12 months among intervention participants compared with a change in mean SCL-20 score from 1.5 (SD, 0.6) to 1.4 (SD, 1.3) among usual care participants (between-group mean difference, -0.2 [95% CI, -0.4 to 0]; P=.01). There were 47 adverse events or serious adverse events that involved musculoskeletal injuries (27 in the intervention group and 20 in the usual care group). Conclusions and Relevance: Among adults with obesity and depression, a collaborative care intervention integrating behavioral weight loss treatment, problem-solving therapy, and as-needed antidepressant medications significantly improved weight loss and depressive symptoms at 12 months compared with usual care; however, the effect sizes were modest and of uncertain clinical importance. Trial Registration: ClinicalTrials.gov Identifier: NCT02246413.
View details for PubMedID 30835308
-
How does information on the harms and benefits of cervical cancer screening alter the intention to be screened?: a randomized survey of Norwegian women.
European journal of cancer prevention : the official journal of the European Cancer Prevention Organisation (ECP)
2019; 28 (2): 87–95
Abstract
Cervical cancer (CC) is the 13th most frequent cancer among women in Norway, but the third most common among women aged 25-49 years. The national screening program sends information letters to promote screening participation. We aimed to evaluate how women's stated intention to participate in screening and pursue treatment changed with the provision of additional information on harms associated with screening, and to assess women's preferences on the timing and source of such information. We administered a web-based questionnaire to a panel of Norwegian women aged 25-69 years and randomized into three groups on the basis of when in the screening process additional information was introduced: (i) invited for routine screening, (ii) recommended an additional test following detection of cellular abnormalities, and (iii) recommended precancer treatment. A fourth (control) group did not receive any additional information. Results show that among 1060 respondents, additional information did not significantly alter women's stated intentions to screen. However, it created decision uncertainty on when treatment was recommended (8.76-9.09 vs. 9.40; 10-point Likert scale; P=0.004). Over 80% of women favored receiving information on harms and 59% preferred that information come from a qualified public health authority. Nearly 90% of women in all groups overestimated women's lifetime risk of CC. In conclusion, additional information on harms did not alter Norwegian women's stated intention to screen for CC; yet, it resulted in greater decision uncertainty to undergo precancer treatment. Incorporating information on harms into invitation letters is warranted as it would increase women's ability to make informed choices.
View details for PubMedID 29595751
-
Cost Effectiveness of Chimeric Antigen Receptor T-Cell Therapy in Multiply Relapsed or Refractory Adult Large B-Cell Lymphoma
Journal of Clinical Oncology
2019
View details for DOI 10.1200/JCO.18.02079
-
Optimal timing of drug sensitivity testing for patients on first-line tuberculosis treatment
HEALTH CARE MANAGEMENT SCIENCE
2018; 21 (4): 632–46
Abstract
Effective treatment for tuberculosis (TB) patients on first-line treatment involves triaging those with drug-resistant (DR) TB to appropriate treatment alternatives. Patients likely to have DR TB are identified using results from repeated inexpensive sputum-smear (SS) tests and expensive but definitive drug sensitivity tests (DST). Early DST may lead to high costs and unnecessary testing; late DST may lead to poor health outcomes and disease transmission. We use a partially observable Markov decision process (POMDP) framework to determine optimal DST timing. We develop policy-relevant structural properties of the POMDP model. We apply our model to TB in India to identify the patterns of SS test results that should prompt DST if transmission costs remain at status-quo levels. Unlike previous analyses of personalized treatment policies, we take a societal perspective and consider the effects of disease transmission. The inclusion of such effects can significantly alter the optimal policy. We find that an optimal DST policy could save India approximately $1.9 billion annually.
View details for PubMedID 28861650
View details for PubMedCentralID PMC5832607
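The optimization itself is beyond a short example, but the belief-update step at the heart of any POMDP policy of this kind can be sketched compactly: after each sputum-smear result, Bayes' rule updates the probability that the patient harbors drug-resistant TB. The prior and the smear test characteristics below are hypothetical, and the optimal DST-ordering threshold is not computed.

```python
# Bayesian belief update over drug-resistant (DR) TB after repeated sputum-smear results.
# This is only the observation-update step of a POMDP; the test characteristics and prior
# below are hypothetical, and the optimal stopping rule for ordering DST is not implemented.

def update_belief(prob_dr: float, smear_positive: bool,
                  p_pos_given_dr: float = 0.80,        # hypothetical P(smear+ | DR TB on first-line treatment)
                  p_pos_given_ds: float = 0.30) -> float:  # hypothetical P(smear+ | drug-sensitive TB)
    """Posterior P(DR TB) after one smear result, via Bayes' rule."""
    if smear_positive:
        like_dr, like_ds = p_pos_given_dr, p_pos_given_ds
    else:
        like_dr, like_ds = 1.0 - p_pos_given_dr, 1.0 - p_pos_given_ds
    numerator = like_dr * prob_dr
    return numerator / (numerator + like_ds * (1.0 - prob_dr))

belief = 0.05                      # hypothetical prior probability of DR TB at treatment start
for month, result in enumerate([True, True, False, True], start=1):
    belief = update_belief(belief, result)
    print(f"after month {month} smear ({'+' if result else '-'}): P(DR TB) = {belief:.3f}")
# A POMDP policy would order DST once this belief crosses an optimized, cost-dependent threshold.
```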
-
Cost-Effectiveness of Screening for Nasopharyngeal Carcinoma with Plasma Epstein-Barr Virus DNA
ELSEVIER SCIENCE INC. 2018: E401
View details for DOI 10.1016/j.ijrobp.2018.07.1184
View details for Web of Science ID 000447811601207
-
Optimal Information Collection Policies in a Markov Decision Process Framework
MEDICAL DECISION MAKING
2018; 38 (7): 797–809
Abstract
The cost-effectiveness and value of additional information about a health technology or program may change over time because of trends affecting patient cohorts and/or the intervention. Delaying information collection even for parameters that do not change over time may be optimal. We present a stochastic dynamic programming approach to simultaneously identify the optimal intervention and information collection policies. We use our framework to evaluate birth cohort hepatitis C virus (HCV) screening. We focus on how the presence of a time-varying parameter (HCV prevalence) affects the optimal information collection policy for a parameter assumed constant across birth cohorts: liver fibrosis stage distribution for screen-detected diagnosis at age 50. We prove that it may be optimal to delay information collection until a time when the information more immediately affects decision making. For the example of HCV screening, given initial beliefs, the optimal policy (at 2010) was to continue screening and collect information about the distribution of liver fibrosis at screen-detected diagnosis in 12 years, increasing the expected incremental net monetary benefit (INMB) by $169.5 million compared to current guidelines. The option to delay information collection until the information is sufficiently likely to influence decisions can increase efficiency. A dynamic programming framework enables an assessment of the marginal value of information and determines the optimal policy, including when and how much information to collect.
View details for PubMedID 30179585
-
Cost Effectiveness of Chimeric Antigen Receptor T-Cell Therapy in Relapsed or Refractory Pediatric B-Cell Acute Lymphoblastic Leukemia.
Journal of clinical oncology : official journal of the American Society of Clinical Oncology
2018: JCO2018790642
Abstract
Purpose The anti-CD19 chimeric antigen receptor T-cell therapy tisagenlecleucel was recently approved to treat relapsed or refractory pediatric acute lymphoblastic leukemia. With a one-time infusion cost of $475,000, tisagenlecleucel is currently the most expensive oncologic therapy. We aimed to determine whether tisagenlecleucel is cost effective compared with currently available treatments. Methods Markov modeling was used to evaluate tisagenlecleucel in pediatric relapsed or refractory acute lymphoblastic leukemia from a US health payer perspective over a lifetime horizon. The model was informed by recent multicenter, single-arm clinical trials. Tisagenlecleucel (under a range of plausible long-term effectiveness) was compared with blinatumomab, clofarabine combination therapy (clofarabine, etoposide, and cyclophosphamide), and clofarabine monotherapy. Scenario and probabilistic sensitivity analyses were used to explore uncertainty. Main outcomes were life-years, discounted lifetime costs, discounted quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratio (3% discount rate). Results With an assumption of a 40% 5-year relapse-free survival rate, tisagenlecleucel increased life expectancies by 12.1 years and cost $61,000/QALY gained. However, at a 20% 5-year relapse-free survival rate, life-expectancies were more modest (3.8 years) and expensive ($151,000/QALY gained). At a 0% 5-year relapse-free survival rate and with use as a bridge to transplant, tisagenlecleucel increased life expectancies by 5.7 years and cost $184,000/QALY gained. Reduction of the price of tisagenlecleucel to $200,000 or $350,000 would allow it to meet a $100,000/QALY or $150,000/QALY willingness-to-pay threshold in all scenarios. Conclusion The long-term effectiveness of tisagenlecleucel is a critical but uncertain determinant of its cost effectiveness. At its current price, tisagenlecleucel represents reasonable value if it can keep a substantial fraction of patients in remission without transplantation; however, if all patients ultimately require a transplantation to remain in remission, it will not be cost effective at generally accepted thresholds. Price reductions would favorably influence cost effectiveness even if long-term clinical outcomes are modest.
View details for DOI 10.1200/JCO.2018.79.0642
View details for PubMedID 30212291
-
Cost-effectiveness of ibrutinib as first-line therapy for chronic lymphocytic leukemia in older adults without deletion 17p
BLOOD ADVANCES
2018; 2 (15): 1946–56
Abstract
Ibrutinib is a novel oral therapy that has shown significant efficacy as initial treatment of chronic lymphocytic leukemia (CLL). It is a high-cost continuous therapy differing from other regimens that are given for much shorter courses. Our objective was to evaluate the cost-effectiveness of ibrutinib for first-line treatment of CLL in patients older than age 65 years without a 17p deletion. We developed a semi-Markov model to analyze the cost-effectiveness of ibrutinib vs a comparator therapy from a US Medicare perspective. No direct comparison between ibrutinib and the best available treatment alternative, obinutuzumab plus chlorambucil (chemoimmunotherapy), exists. Therefore, we compared ibrutinib to a theoretical treatment alternative, which was modeled to confer the effectiveness of an inferior treatment (chlorambucil alone) and the costs and adverse events of chemoimmunotherapy, which would provide ibrutinib with the best chance of being cost-effective. Even so, the incremental cost-effectiveness ratio of ibrutinib vs the modeled comparator was $189 000 per quality-adjusted life-year (QALY) gained. To reach a willingness-to-pay threshold (WTP) of $150 000 per QALY, the monthly cost of ibrutinib would have to be at most $6800, $1700 less than the modeled cost of $8500 per month (a reduction of $20 400 per year). When the comparator efficacy is increased to more closely match that seen in trials evaluating chemoimmunotherapy, ibrutinib costs more than $262 000 per QALY gained, and the monthly cost of ibrutinib would need to be lowered to less than $5000 per month to be cost-effective. Ibrutinib is not cost-effective as initial therapy at a WTP threshold of $150 000 per QALY gained.
View details for PubMedID 30097461
View details for PubMedCentralID PMC6093732
-
Cost-effectiveness of canakinumab to prevent recurrent cardiovascular events
OXFORD UNIV PRESS. 2018: 503–4
View details for Web of Science ID 000459824001584
-
Which patients receive surgery in for-profit and non-profit hospitals in a universal health system? An explorative register-based study in Norway
BMJ OPEN
2018; 8 (6): e019780
Abstract
Objectives: To compare the socioeconomic status (SES) and case-mix among day surgical patients treated at private for-profit hospitals (PFPs) and non-profit hospitals (NPs) in Norway, and to explore whether the use of PFPs in a universal health system has compromised the principle of equal access regardless of SES. Design: A retrospective, exploratory study comparing hospital types using the Norwegian Patient Register linked with socioeconomic data from Statistics Norway by using Norwegian citizens' personal identification numbers. Setting: The Norwegian healthcare system. Participants: All publicly financed patients in five Norwegian metropolitan areas having day surgery for meniscus (34 100 patients), carpal tunnel syndrome (15 010), benign breast hypertrophy (6297) or hallux valgus (2135) from 2009 to 2014. Outcome: Having surgery at a PFP or NP. Results: Across four unique procedures, the adjusted odds ratios (aORs) for using PFPs were generally lower for the lowest educational level (0.77-0.87) and the lowest income level (0.68-0.89), though aORs were not always significant. Likewise, comorbidity and previous hospitalisation had lower aORs (0.62-0.95; 0.44-0.97, respectively) for having surgery at PFPs across procedures, though again aORs were not always significant. No clear patterns emerged with respect to age, gender or higher levels of income and education. Conclusions: The evidence from our study of four procedures suggests that equal access to PFPs compared with NPs for those patients at the lowest education and income levels may be compromised, though further investigations are needed to generalise these findings across more procedures and probe causal mechanisms and appropriate policy remedies. The finding that comorbidity and previous hospitalisation had lower odds of treatment at PFPs indicates that NPs play an essential role for more complex patients, but raises questions about patient preference and cream skimming.
View details for PubMedID 29886441
-
Matching Microsimulation Risk Factor Correlations to Cross-sectional Data: The Shortest Distance Method
MEDICAL DECISION MAKING
2018; 38 (4): 452–64
Abstract
Microsimulation models often compute the distribution of a simulated cohort's risk factors and medical outcomes over time using repeated waves of cross-sectional data. We sought to develop a strategy to simulate how risk factor values remain correlated over time within individuals, and compare it to available alternative methods. We developed a method using shortest-distance matching for modeling changes in risk factors in individuals over time, which preserves both the cohort distribution of each risk factor as well as the cross-sectional correlation between risk factors observed in repeated cross-sectional data. We compared the performance of the method with rank stability and regression methods, using both synthetic data and data from the Framingham Offspring Heart Study (FOHS) to simulate a cohort's atherosclerotic cardiovascular disease (ASCVD) risk. The correlation between risk factors was better preserved using the shortest distance method than with rank stability or regression (root mean squared difference = 0.077 with shortest distance, v. 0.126 with rank stability and 0.146 with regression in FOHS, and 0.052, 0.426 and 0.352, respectively, in the synthetic data). The shortest distance method generated population ASCVD risk estimate distributions indistinguishable from the true distribution in over 99.8% of cases (Kolmogorov-Smirnov, P > 0.05), outperforming some existing regression methods, which produced ASCVD distributions statistically distinguishable from the true one at the 5% level around 15% of the time. None of the methods considered could predict individual longitudinal trends without error. The shortest-distance method was not statistically inferior to rank stability or regression methods for predicting individual risk factor values over time in the FOHS. A shortest distance method may assist in preserving risk factor correlations in microsimulations informed by cross-sectional data.
View details for PubMedID 29185378
View details for PubMedCentralID PMC5913001
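As a rough, generic illustration of the idea (not the authors' implementation), the sketch below updates each simulated individual's risk-factor vector by matching it to the nearest record, in standardized Euclidean distance, from the next cross-sectional wave. The risk factors, means, standard deviations, and correlation structure are invented for the example.

```python
# Generic shortest-distance matching between two cross-sectional waves (illustrative only;
# not the authors' implementation). Risk factors, means, SDs, and correlations are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

means_t  = np.array([130.0, 120.0, 28.0])   # e.g., SBP, LDL, BMI at wave t (hypothetical)
means_t1 = np.array([132.0, 118.0, 29.0])   # wave t+1 means (hypothetical)
sds  = np.array([15.0, 25.0, 4.0])
corr = np.array([[1.0, 0.4, 0.3],
                 [0.4, 1.0, 0.2],
                 [0.3, 0.2, 1.0]])
cov = corr * np.outer(sds, sds)
wave_t  = rng.multivariate_normal(means_t,  cov, size=500)
wave_t1 = rng.multivariate_normal(means_t1, cov, size=500)

# Standardize with the wave t+1 mean/SD so no single risk factor dominates the distance.
mu, sd = wave_t1.mean(axis=0), wave_t1.std(axis=0)
z_t, z_t1 = (wave_t - mu) / sd, (wave_t1 - mu) / sd

# Assign each individual the closest wave t+1 record (with replacement, for brevity;
# matching without replacement would preserve the wave t+1 marginals exactly).
dists = np.linalg.norm(z_t[:, None, :] - z_t1[None, :, :], axis=2)   # 500 x 500 distances
updated = wave_t1[dists.argmin(axis=1)]

print("wave t+1 correlations:\n", np.corrcoef(wave_t1, rowvar=False).round(2))
print("matched-update correlations:\n", np.corrcoef(updated, rowvar=False).round(2))
```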
-
Comparing Simultaneous Liver-Kidney Transplant Strategies: A Modified Cost-Effectiveness Analysis
TRANSPLANTATION
2018; 102 (5): E219–E228
View details for DOI 10.1097/TP.0000000000002148
View details for Web of Science ID 000431423600006
-
Comparing Simultaneous Liver-Kidney Transplant Strategies: A Modified Cost-Effectiveness Analysis.
Transplantation
2018
Abstract
BACKGROUND: The proportion of patients with kidney failure at time of liver transplantation is at an historic high in the United States. The optimal timing of kidney transplantation with respect to the liver transplant is unknown. METHODS: We used a modified cost-effectiveness analysis to compare four strategies: the old system ("pre-OPTN"), the new Organ Procurement Transplant Network (OPTN) system since August 10, 2017 ("OPTN"), and two strategies which restrict simultaneous liver-kidney transplants ("safety net" and "stringent"). We measured "cost" by deployment of deceased donor kidneys (DDKs) to liver transplant recipients and effectiveness by life years (LYs) and quality-adjusted life years (QALYs) in liver transplant recipients. We validated our model against Scientific Registry for Transplant Recipients data. RESULTS: The OPTN, safety net and stringent strategies were on the efficient frontier. By rank order, OPTN > safety net > stringent strategy in terms of LY, QALY and DDK deployment. The pre-OPTN system was dominated, or outperformed, by all alternative strategies. The incremental LY per DDK between the strategies ranged from 1.30 to 1.85. The incremental QALY per DDK ranged from 1.11 to 2.03. CONCLUSION: These estimates quantify the "organ"-effectiveness of various kidney allocation strategies for liver transplant candidates. The OPTN system will likely deliver better liver transplant outcomes at the expense of more frequent deployment of DDKs to liver transplant recipients.
View details for PubMedID 29554056
-
Cost-effectiveness of multidisciplinary care in mild to moderate chronic kidney disease in the United States: A modeling study
PLOS MEDICINE
2018; 15 (3): e1002532
Abstract
Multidisciplinary care (MDC) programs have been proposed as a way to alleviate the cost and morbidity associated with chronic kidney disease (CKD) in the US. We assessed the cost-effectiveness of a theoretical Medicare-based MDC program for CKD compared to usual CKD care in Medicare beneficiaries with stage 3 and 4 CKD between 45 and 84 years old in the US. The program used nephrologists, advanced practitioners, educators, dieticians, and social workers. From Medicare claims and published literature, we developed a novel deterministic Markov model for CKD progression and calibrated it to long-term risks of mortality and progression to end-stage renal disease. We then used the model to project accrued discounted costs and quality-adjusted life years (QALYs) over patients' remaining lifetime. We estimated the incremental cost-effectiveness ratio (ICER) of MDC, or the cost of the intervention per QALY gained. MDC added 0.23 (95% CI: 0.08, 0.42) QALYs over usual care, costing $51,285 per QALY gained (net monetary benefit of $23,100 at a threshold of $150,000 per QALY gained; 95% CI: $6,252, $44,323). In all subpopulations analyzed, ICERs ranged from $42,663 to $72,432 per QALY gained. MDC was generally more cost-effective in patients with higher urine albumin excretion. Although ICERs were higher in younger patients, MDC could yield greater improvements in health in younger than older patients. MDC remained cost-effective when we decreased its effectiveness to 25% of the base case or increased the cost 5-fold. The program cost less than $70,000 per QALY in 95% of probabilistic sensitivity analyses and less than $87,500 per QALY in 99% of analyses. Limitations of our study include its theoretical nature and being less generalizable to populations at low risk for progression to ESRD. We did not study the potential impact of MDC on hospitalization (cardiovascular or other). Our model estimates that a Medicare-funded MDC program could reduce the need for dialysis, prolong life expectancy, and meet conventional cost-effectiveness thresholds in middle-aged to elderly patients with mild to moderate CKD.
View details for PubMedID 29584720
-
Cost-Effectiveness of Chimeric Antigen Receptor T-Cell Therapy in Relapsed or Refractory Pediatric B-Cell Acute Lymphoblastic Leukemia
Journal of Clinical Oncology
2018
View details for DOI 10.1200/JCO.2018.79.0642
-
Population Health and Cost-Effectiveness Implications of a "Treat All" Recommendation for HCV: A Review of the Model-Based Evidence.
MDM policy & practice
2018; 3 (1): 2381468318776634
Abstract
The World Health Organization HCV Guideline Development Group is considering a "treat all" recommendation for persons infected with hepatitis C virus (HCV). We reviewed the model-based evidence of cost-effectiveness and population health impacts comparing expanded treatment policies to more limited treatment access policies, focusing primarily on evaluations of all-oral directly acting antivirals published after 2012. Searching PubMed, we identified 2,917 unique titles. Sequentially reviewing titles and abstracts identified 226 potentially relevant articles for full-text review. Sixty-nine articles met all inclusion criteria-42 cost-effectiveness analyses and 30 models of population-health impacts, with 3 articles presenting both types of analysis. Cost-effectiveness studies for many countries concluded that expanding treatment to people with mild liver fibrosis, who inject drugs (PWID), or who are incarcerated is generally cost-effective compared to more restrictive treatment access policies at country-specific prices. For certain patient subpopulations in some countries-for example, elderly individuals without fibrosis-treatment is only cost-effective at lower prices. A frequent limitation is the omission of benefits and consequences of HCV transmission (i.e., treatment as prevention; risks of reinfection), which may underestimate or overestimate the cost-effectiveness of a "treat all" policy. Epidemiologic modeling studies project that through a combination of prevention, aggressive screening and diagnosis, and prompt treatment for all fibrosis stages, it may be possible to virtually eliminate HCV in many countries. Studies show that if resources are not available to diagnose and treat all HCV-infected individuals, treatment prioritization may be needed, with alternative prioritization strategies resulting in tradeoffs between reducing mortality or reducing incidence. Notably, because most new HCV infections are among PWID in many settings, HCV elimination requires unrestricted treatment access combined with injection transmission disruption strategies. The model-based evidence suggests that a properly constructed strategy that substantially expands HCV treatment could achieve cost-effective improvements in population health in many countries.
View details for PubMedID 30288448
-
Economically Efficient Hepatitis C Virus Treatment Prioritization Improves Health Outcomes: Hepatitis C Virus Treatment Prioritization.
Medical decision making : an international journal of the Society for Medical Decision Making
2018: 272989X18792284
Abstract
The total cost of treating the 3 million Americans chronically infected with hepatitis C virus (HCV) represents a substantial affordability challenge requiring treatment prioritization. This study compares the health and economic outcomes of alternative treatment prioritization schedules. We developed a multiyear HCV treatment budget allocation model to evaluate the tradeoffs of 7 prioritization strategies. We used optimization to identify the priority schedule that maximizes population net monetary benefit (NMB). We compared prioritization schedules in terms of the number of individuals treated, the number of individuals who progress to end-stage liver disease (ESLD), and population total quality-adjusted life years (QALYs). We applied the model to the population of treatment-naive patients with a total annual HCV treatment budget of US$8.6 billion. First-come, first-served (FCFS) treats the fewest people with advanced fibrosis, prevents the fewest cases of ESLD, and gains the fewest QALYs. A schedule developed from optimizing population NMB prioritizes treatment in the first year to patients with moderate to severe fibrosis who are younger than 65 years, followed by older individuals with moderate to severe fibrosis. While this strategy yields the greatest population QALYs, prioritization by disease severity alone prevents more cases of ESLD. Sensitivity analysis indicated that the differences between prioritization schedules are greater when the budget is smaller. A 10% annual treatment price reduction enabled treatment 1 year sooner to several patient subgroups, specifically older patients and those with less severe liver fibrosis. In the absence of a sufficient budget to treat all patients, explicit prioritization targeting younger people with more severe disease first provides the greatest health benefits. We provide our spreadsheet model so that decision makers can compare health tradeoffs of different budget levels and various prioritization strategies with inputs tailored to their population.
View details for PubMedID 30132410
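The toy sketch below illustrates the basic mechanics behind a budget-constrained prioritization schedule like the one described above: rank patient subgroups by per-patient net monetary benefit and treat greedily until the annual budget runs out. It is not the authors' spreadsheet model, and the subgroup sizes, per-patient costs, and QALY gains are hypothetical.

```python
# Toy greedy prioritization of HCV treatment under a fixed annual budget (illustrative only;
# subgroup sizes, per-patient costs, and per-patient QALY gains below are hypothetical).

WTP = 100_000.0          # USD per QALY
BUDGET = 8.6e9           # annual treatment budget (USD), as in the abstract's example

# (subgroup, patients, treatment cost per patient, QALYs gained per patient treated)
subgroups = [
    ("age <65, moderate-severe fibrosis",  300_000, 45_000.0, 2.0),
    ("age >=65, moderate-severe fibrosis", 200_000, 45_000.0, 1.2),
    ("age <65, mild fibrosis",             500_000, 45_000.0, 0.8),
    ("age >=65, mild fibrosis",            400_000, 45_000.0, 0.4),
]

# Rank by per-patient net monetary benefit (NMB = WTP * QALY gain - cost).
ranked = sorted(subgroups, key=lambda s: WTP * s[3] - s[2], reverse=True)

remaining = BUDGET
for name, n, cost, qaly_gain in ranked:
    treated = min(n, int(remaining // cost))
    remaining -= treated * cost
    print(f"{name}: treat {treated:,} of {n:,} "
          f"(per-patient NMB ${WTP * qaly_gain - cost:,.0f})")
print(f"budget left this year: ${remaining:,.0f}")
```

A severity-only rule would reorder the list to put both moderate-to-severe fibrosis groups first, which, as the abstract notes, prevents more end-stage liver disease at the cost of fewer total population QALYs.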
-
Effect of Interferon-Free Regimens on Disparities in Hepatitis C Treatment of US Veterans.
Value in health : the journal of the International Society for Pharmacoeconomics and Outcomes Research
2018; 21 (8): 921–30
Abstract
To determine whether implementation of interferon-free treatment for hepatitis C virus (HCV) reached groups less likely to benefit from earlier therapies, including patients with genotype 1 virus or contraindications to interferon treatment, and groups that faced treatment disparities: African Americans, patients with HIV co-infection, and those with drug use disorder. Electronic medical records of the US Veterans Health Administration (VHA) were used to characterize patients with chronic HCV infection and the treatments they received. Initiation of treatment in 206,544 patients with chronic HCV characterized by viral genotype, demographic characteristics, and comorbid medical and mental illness was studied using a competing events Cox regression over 6 years. With the advent of interferon-free regimens, the proportion treated increased from 2.4% in 2010 to 18.1% in 2015, an absolute increase of 15.7%. Patients with genotype 1 virus, poor response to previous treatment, and liver disease had the greatest increase. Large absolute increases in the proportion treated were observed in patients with HIV co-infection (18.6%), alcohol use disorder (11.9%), and drug use disorder (12.6%) and in African American (13.7%) and Hispanic (13.5%) patients, groups that were less likely to receive interferon-containing treatment. The VHA spent $962 million on interferon-free treatments in 2015, 1.5% of its operating budget. The proportion of patients with HCV treated in VHA increased sevenfold. The VHA was successful in implementing interferon-free treatment in previously undertreated populations, and this may become the community standard of care.
View details for PubMedID 30098669
-
Profiles of sociodemographic, behavioral, clinical and psychosocial characteristics among primary care patients with comorbid obesity and depression.
Preventive medicine reports
2017; 8: 42–50
Abstract
The objective of this study is to characterize profiles of obese depressed participants using baseline data collected from October 2014 through December 2016 for an ongoing randomized controlled trial (n=409) in Bay Area, California, USA. Four comorbidity severity categories were defined by interaction of the binary levels of body mass index (BMI) and depression Symptom Checklist 20 (SCL20) scores. Sociodemographic, behavioral, clinical and psychosocial characteristics were measured. Mean (SD) age was 51 (12.1) years, BMI 36.7 (6.4) kg/m2, and SCL20 1.5 (0.5). Participants in the 4 comorbidity severity categories had similar sociodemographic characteristics, but differed significantly in the other characteristics. Two statistically significant canonical dimensions were identified. Participants with BMI≥35 and SCL20≥1.5 differed significantly from those with BMI<35 and SCL20<1.5 on dimension 1, which primarily featured high physical health (e.g., central obesity, high blood pressure and impaired sleep) and mental health comorbidities (e.g., post-traumatic stress and anxiety), poor health-related quality of life (in general and problems specifically with obesity, anxiety, depression, and usual daily activities), and an avoidance problem-solving style. Participants with BMI<35 and SCL20≥1.5 differed significantly from those with BMI≥35 and SCL20<1.5 on dimension 2, which primarily included fewer Hispanics, less central obesity, and more leisure-time physical activity, but greater anxiety and post-traumatic stress and poorer obesity- or mental health-related quality of life. In conclusion, patients with comorbid obesity and depression of varying severity have different profiles of behavioral, clinical and psychosocial characteristics. This insight may inform analysis of treatment heterogeneity and development of targeted intervention strategies. Trial registration:ClinicalTrials.gov #NCT02246413.
View details for PubMedID 28840096
-
Estimation of the cost-effectiveness of HIV prevention portfolios for people who inject drugs in the United States: A model-based analysis.
PLoS medicine
2017; 14 (5)
Abstract
The risks of HIV transmission associated with the opioid epidemic make cost-effective programs for people who inject drugs (PWID) a public health priority. Some of these programs have benefits beyond prevention of HIV-a critical consideration given that injection drug use is increasing across most United States demographic groups. To identify high-value HIV prevention program portfolios for US PWID, we consider combinations of four interventions with demonstrated efficacy: opioid agonist therapy (OAT), needle and syringe programs (NSPs), HIV testing and treatment (Test & Treat), and oral HIV pre-exposure prophylaxis (PrEP). We adapted an empirically calibrated dynamic compartmental model and used it to assess the discounted costs (in 2015 US dollars), health outcomes (HIV infections averted, change in HIV prevalence, and discounted quality-adjusted life years [QALYs]), and incremental cost-effectiveness ratios (ICERs) of the four prevention programs, considered singly and in combination over a 20-y time horizon. We obtained epidemiologic, economic, and health utility parameter estimates from the literature, previously published models, and expert opinion. We estimate that expansions of OAT, NSPs, and Test & Treat implemented singly up to 50% coverage levels can be cost-effective relative to the next highest coverage level (low, medium, and high at 40%, 45%, and 50%, respectively) and that OAT, which we assume to have immediate and direct health benefits for the individual, has the potential to be the highest value investment, even under scenarios where it prevents fewer infections than other programs. Although a model-based analysis can provide only estimates of health outcomes, we project that, over 20 y, 50% coverage with OAT could avert up to 22,000 (95% CI: 5,200, 46,000) infections and cost US$18,000 (95% CI: US$14,000, US$24,000) per QALY gained, 50% NSP coverage could avert up to 35,000 (95% CI: 8,900, 43,000) infections and cost US$25,000 (95% CI: US$7,000, US$76,000) per QALY gained, 50% Test & Treat coverage could avert up to 6,700 (95% CI: 1,200, 16,000) infections and cost US$27,000 (95% CI: US$15,000, US$48,000) per QALY gained, and 50% PrEP coverage could avert up to 37,000 (22,000, 58,000) infections and cost US$300,000 (95% CI: US$162,000, US$667,000) per QALY gained. When coverage expansions are allowed to include combined investment with other programs and are compared to the next best intervention, the model projects that scaling OAT coverage up to 50%, then scaling NSP coverage to 50%, then scaling Test & Treat coverage to 50% can be cost-effective, with each coverage expansion having the potential to cost less than US$50,000 per QALY gained relative to the next best portfolio. In probabilistic sensitivity analyses, 59% of portfolios prioritized the addition of OAT and 41% prioritized the addition of NSPs, while PrEP was not likely to be a priority nor a cost-effective addition. Our findings are intended to be illustrative, as data on achievable coverage are limited and, in practice, the expansion scenarios considered may exceed feasible levels. We assumed independence of interventions and constant returns to scale. Extensive sensitivity analyses allowed us to assess parameter sensitivity, but the use of a dynamic compartmental model limited the exploration of structural sensitivities. We estimate that OAT, NSPs, and Test & Treat, implemented singly or in combination, have the potential to effectively and cost-effectively prevent HIV in US PWID. PrEP is not likely to be cost-effective in this population, based on the scenarios we evaluated. While local budgets or policy may constrain feasible coverage levels for the various interventions, our findings suggest that investments in combined prevention programs can substantially reduce HIV transmission and improve health outcomes among PWID.
View details for DOI 10.1371/journal.pmed.1002312
View details for PubMedID 28542184
-
Evaluation of a social franchising and telemedicine programme and the care provided for childhood diarrhoea and pneumonia, Bihar, India.
Bulletin of the World Health Organization
2017; 95 (5): 343-352E
Abstract
To evaluate the impact on the quality of the care provided for childhood diarrhoea and pneumonia in Bihar, India, of a large-scale, social franchising and telemedicine programme - the World Health Partners' Sky Program. We investigated changes associated with the programme in the knowledge and performance of health-care providers by carrying out 810 assessments in a representative sample of providers in areas where the programme was and was not implemented. Providers were assessed using hypothetical patient vignettes and the standardized patient method both before and after programme implementation, in 2011 and 2014, respectively. Differences in providers' performance between implementation and nonimplementation areas were assessed using multivariate difference-in-difference linear regression models. The programme did not significantly improve health-care providers' knowledge or performance with regard to childhood diarrhoea or pneumonia in Bihar. There was a persistent large gap between knowledge of appropriate care and the care actually delivered. Social franchising has received attention globally as a model for delivering high-quality care in rural areas in the developing world but supporting data are scarce. Our findings emphasize the need for sound empirical evidence before social franchising programmes are scaled up.
View details for DOI 10.2471/BLT.16.179556
View details for PubMedID 28479635
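The difference-in-differences comparison described above contrasts the change in provider performance in implementation areas with the change in non-implementation areas. A minimal sketch of that estimator, assuming a hypothetical provider-level data set (columns knowledge_score, program_area, post, and district; none of these are the study's actual variables), might look like this in Python with statsmodels:

```python
import pandas as pd
import statsmodels.formula.api as smf

def did_estimate(df: pd.DataFrame):
    """Difference-in-differences with cluster-robust standard errors by district."""
    fit = smf.ols("knowledge_score ~ program_area * post", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["district"]}
    )
    # The coefficient on the interaction term is the DiD estimate of the
    # programme's effect on provider knowledge or performance.
    return fit.params["program_area:post"], fit
```

Additional provider covariates, as in the multivariate models described above, would simply be added to the formula.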
-
A Markov Model for Transplant & Renal Outcomes After Liver & Simultaneous Liver-Kidney Transplants.
WILEY. 2017: 806
View details for Web of Science ID 000404515704585
-
How Much Is a Kidney Worth? Comparing Kidney Transplant Strategies in Liver Transplant Candidates
WILEY. 2017: 325
View details for Web of Science ID 000404515702336
-
Optimizing patient treatment decisions in an era of rapid technological advances: the case of hepatitis C treatment
HEALTH CARE MANAGEMENT SCIENCE
2017; 20 (1): 16-32
Abstract
How long should a patient with a treatable chronic disease wait for more effective treatments before accepting the best available treatment? We develop a framework to guide optimal treatment decisions for a deteriorating chronic disease when treatment technologies are improving over time. We formulate an optimal stopping problem using a discrete-time, finite-horizon Markov decision process. The goal is to maximize a patient's quality-adjusted life expectancy. We derive structural properties of the model and analytically solve a three-period treatment decision problem. We illustrate the model with the example of treatment for chronic hepatitis C virus (HCV). Chronic HCV affects 3-4 million Americans and has been historically difficult to treat, but increasingly effective treatments have been commercialized in the past few years. We show that the optimal treatment decision is more likely to be to accept currently available treatment-despite expectations for future treatment improvement-for patients who have high-risk history, who are older, or who have more comorbidities. Insights from this study can guide HCV treatment decisions for individual patients. More broadly, our model can guide treatment decisions for curable chronic diseases by finding the optimal treatment policy for individual patients in a heterogeneous population.
View details for DOI 10.1007/s10729-015-9330-6
View details for Web of Science ID 000398100100002
View details for PubMedCentralID PMC4718905
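The optimal stopping formulation above can be made concrete with a small backward-induction sketch. Everything numeric below is a made-up illustration (horizon, mortality risk while waiting, quality weight, and the trajectory of treatment efficacy), not parameters from the paper; the point is only to show how "treat now versus wait for better therapy" is solved recursively in a finite-horizon Markov decision process.

```python
import numpy as np

T = 10                                   # decision periods (e.g., years)
p_die = 0.05                             # per-period risk of dying/progressing while waiting
q_sick = 0.70                            # quality weight for a period lived with untreated disease
efficacy = np.linspace(0.60, 0.95, T)    # cure probability of the best available treatment, improving over time
qale_cured, qale_failed = 20.0, 8.0      # expected QALYs after cure vs. failed treatment

def treat_value(t: int) -> float:
    """Expected QALYs from accepting the best treatment available in period t."""
    return efficacy[t] * qale_cured + (1.0 - efficacy[t]) * qale_failed

V = np.zeros(T)                          # V[t]: optimal expected QALYs entering period t
policy = [""] * T
V[T - 1], policy[T - 1] = treat_value(T - 1), "treat"   # must treat by the last period
for t in range(T - 2, -1, -1):
    wait = q_sick + (1.0 - p_die) * V[t + 1]            # live one period untreated, then continue if alive
    now = treat_value(t)
    if now >= wait:
        V[t], policy[t] = now, "treat"
    else:
        V[t], policy[t] = wait, "wait"

print(list(zip(range(T), policy)))       # the treat/wait threshold depends entirely on the assumed numbers
```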
-
Cost-effectiveness of Intensive Blood Pressure Management-Is There an Additional Price to Pay?-Reply.
JAMA cardiology
2017
View details for DOI 10.1001/jamacardio.2016.5837
View details for PubMedID 28199457
-
Cost-Effectiveness of Left Ventricular Assist Devices in Ambulatory Patients With Advanced Heart Failure.
JACC. Heart failure
2017; 5 (2): 110-119
Abstract
This study assessed the cost-effectiveness of left ventricular assist devices (LVADs) as destination therapy in ambulatory patients with advanced heart failure. LVADs improve survival and quality of life in inotrope-dependent heart failure, but data are limited as to their value in less severely ill patients. We determined costs of care among Medicare beneficiaries before and after LVAD implantation from 2009 to 2010. We used these costs and efficacy data from published studies in a Markov model to project the incremental cost-effectiveness ratio (ICER) of destination LVAD therapy compared with that of medical management. We discounted costs and benefits at 3% annually and report costs as 2016 U.S. dollars. The mean cost of LVAD implantation was $175,420. The mean cost of readmission was lower before LVAD than after ($12,377 vs. $19,465, respectively; p < 0.001), while monthly outpatient costs were similar ($3,364 vs. $2,974, respectively; p = 0.54). In the lifetime simulation model, LVAD increased quality-adjusted life-years (QALYs) (4.41 vs. 2.67, respectively), readmissions (13.03 vs. 6.35, respectively), and costs ($726,200 vs. $361,800, respectively) compared with medical management, yielding an ICER of $209,400 per QALY gained and $597,400 per life-year gained. These results were sensitive to LVAD readmission rates and outpatient care costs; the ICER would be $86,900 if these parameters were 50% lower. LVADs in non-inotrope-dependent heart failure patients improved quality of life but substantially increased lifetime costs because of frequent readmissions and costly follow-up care. LVADs may provide good value if outpatient costs and adverse events can be reduced.
View details for DOI 10.1016/j.jchf.2016.09.008
View details for PubMedID 28017351
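The headline ratio in the entry above is simply incremental cost divided by incremental QALYs. Taking the rounded base-case values reported in the abstract at face value:

\[
\text{ICER} = \frac{C_{\text{LVAD}} - C_{\text{med}}}{E_{\text{LVAD}} - E_{\text{med}}} = \frac{\$726{,}200 - \$361{,}800}{4.41 - 2.67\ \text{QALYs}} = \frac{\$364{,}400}{1.74\ \text{QALYs}} \approx \$209{,}400\ \text{per QALY gained},
\]

which reproduces the published figure up to rounding of the inputs.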
-
Cost-effectiveness of Stereotactic Body Radiation Therapy versus Radiofrequency Ablation for Hepatocellular Carcinoma: A Markov Modeling Study.
Radiology
2017: 161509
Abstract
Purpose: To assess the cost-effectiveness of stereotactic body radiation therapy (SBRT) versus radiofrequency ablation (RFA) for patients with inoperable localized hepatocellular carcinoma (HCC) who are eligible for both SBRT and RFA. Materials and Methods: A decision-analytic Markov model was developed for patients with inoperable, localized HCC who were eligible for both RFA and SBRT to evaluate the cost-effectiveness of the following treatment strategies: (a) SBRT as initial treatment followed by SBRT for local progression (SBRT-SBRT), (b) RFA followed by RFA for local progression (RFA-RFA), (c) SBRT followed by RFA for local progression (SBRT-RFA), and (d) RFA followed by SBRT for local progression (RFA-SBRT). Probabilities of disease progression, treatment characteristics, and mortality were derived from published studies. Outcomes included health benefits expressed as discounted quality-adjusted life years (QALYs), costs in U.S. dollars, and cost-effectiveness expressed as an incremental cost-effectiveness ratio. Deterministic and probabilistic sensitivity analysis was performed to assess the robustness of the findings. Results: In the base case, SBRT-SBRT yielded the most QALYs (1.565) and cost $197 557. RFA-SBRT yielded 1.558 QALYs and cost $193 288. SBRT-SBRT was not cost-effective, at $558 679 per QALY gained relative to RFA-SBRT. RFA-SBRT was the preferred strategy, because RFA-RFA and SBRT-RFA were less effective and more costly. In all evaluated scenarios, SBRT was preferred as salvage therapy for local progression after RFA. Probabilistic sensitivity analysis showed that at a willingness-to-pay threshold of $100 000 per QALY gained, RFA-SBRT was preferred in 65.8% of simulations. Conclusion: SBRT for initial treatment of localized, inoperable HCC is not cost-effective. However, SBRT is the preferred salvage therapy for local progression after RFA.
View details for DOI 10.1148/radiol.2016161509
View details for PubMedID 28045603
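Decision-analytic Markov cohort models like the one above push a cohort through health states with a transition matrix each cycle and accumulate discounted costs and QALYs per strategy. The sketch below is a generic, self-contained illustration of that machinery with invented states, probabilities, costs, and utilities; it is not the authors' model of SBRT and RFA.

```python
import numpy as np

def markov_trace(P, utilities, costs, cycles=120, disc=0.03 / 12):
    """Run a monthly Markov cohort trace and return (discounted QALYs, discounted costs)."""
    dist = np.array([1.0, 0.0, 0.0])          # everyone starts in the first state
    qalys = total_cost = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + disc) ** t           # approximate per-cycle (monthly) discount factor
        qalys += d * (dist @ utilities) / 12  # utilities are annual; cycles are monthly
        total_cost += d * (dist @ costs)
        dist = dist @ P                       # advance the cohort one cycle
    return qalys, total_cost

states = ["stable", "progressed", "dead"]     # illustrative 3-state structure
# Hypothetical monthly transition matrices for two strategies (rows sum to 1).
P_a = np.array([[0.96, 0.03, 0.01], [0.00, 0.93, 0.07], [0.0, 0.0, 1.0]])
P_b = np.array([[0.94, 0.04, 0.02], [0.00, 0.93, 0.07], [0.0, 0.0, 1.0]])
u = np.array([0.80, 0.60, 0.0])               # annual utility weights by state
c_a = np.array([1500.0, 2500.0, 0.0])         # monthly costs by state, strategy A
c_b = np.array([900.0, 2500.0, 0.0])          # monthly costs by state, strategy B

qa, ca = markov_trace(P_a, u, c_a)
qb, cb = markov_trace(P_b, u, c_b)
print("ICER of A vs B:", (ca - cb) / (qa - qb))   # meaningful only when A costs more and yields more QALYs
```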
-
Risk stratification in compartmental epidemic models: Where to draw the line?
Journal of theoretical biology
2017; 428: 1–17
Abstract
Economic evaluations of infectious disease control interventions frequently use dynamic compartmental epidemic models. Such models capture heterogeneity in risk of infection by stratifying the population into discrete risk groups, thus approximating what is typically continuous variation in risk. An important open question is whether and how different risk stratification choices influence model predictions of intervention effects. We develop equivalent Susceptible-Infected-Susceptible (SIS) dynamic transmission models: an unstratified model, a model stratified into a high-risk and low-risk group, and a model with an arbitrary number of risk groups. Absent intervention, the models produce the same overall prevalence of infected individuals in steady state. We consider an intervention that either reduces the contact rate or increases the disease clearance rate. We develop analytical and numerical results characterizing the models and the effects of the intervention. We find that there exist multiple feasible choices of risk stratification, contact distribution, and within- and between-group contact rates for models that stratify risk. We show analytically and empirically that these choices can generate different estimates of intervention effectiveness, and that these differences can be significant enough to alter conclusions from cost-effectiveness analyses and change policy recommendations. We conclude that the choice of how to discretize risk in compartmental epidemic models can influence predicted effectiveness of interventions. Therefore, analysts should examine multiple alternatives and report the range of results.
View details for PubMedID 28606751
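A minimal version of the stratified model discussed above is a two-group susceptible-infected-susceptible (SIS) system with proportionate mixing, where group-specific forces of infection are weighted by contact rates. The parameters below are arbitrary illustrations, not values from the paper; the paper's point is that how such stratification is chosen can change predicted intervention effects.

```python
import numpy as np
from scipy.integrate import odeint

beta = 0.05                      # transmission probability per contact (assumed)
gamma = 0.2                      # clearance rate back to susceptible (assumed)
c = np.array([20.0, 2.0])        # contact rates: high-risk, low-risk group
n = np.array([0.10, 0.90])       # population fraction in each group

def sis_two_group(I, t):
    """dI/dt for a risk-stratified SIS model with proportionate mixing."""
    S = n - I
    lam = beta * c * (c @ I) / (c @ n)   # group-specific force of infection
    return lam * S - gamma * I

t = np.linspace(0, 500, 2001)
I = odeint(sis_two_group, 0.01 * n, t)   # start with 1% of each group infected
print("steady-state prevalence within each group:", I[-1] / n)
print("overall steady-state prevalence:", I[-1].sum())
```

Re-running the model with, say, a reduced contact-rate vector mimics the kind of intervention comparison the paper analyzes across alternative stratifications.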
-
Some Health States Are Better Than Others: Using Health State Rank Order to Improve Probabilistic Analyses
MEDICAL DECISION MAKING
2016; 36 (8): 927-940
Abstract
Probabilistic sensitivity analyses (PSA) may lead policy makers to take nonoptimal actions due to misestimates of decision uncertainty caused by ignoring correlations. We developed a method to establish joint uncertainty distributions of quality-of-life (QoL) weights exploiting ordinal preferences over health states. Our method takes as inputs independent, univariate marginal distributions for each QoL weight and a preference ordering. It establishes a correlation matrix between QoL weights intended to preserve the ordering. It samples QoL weight values from their distributions, ordering them with the correlation matrix. It calculates the proportion of samples violating the ordering, iteratively adjusting the correlation matrix until this proportion is below an arbitrarily small threshold. We compare our method with the uncorrelated method and other methods for preserving rank ordering in terms of violation proportions and fidelity to the specified marginal distributions along with PSA and expected value of partial perfect information (EVPPI) estimates, using 2 models: 1) a decision tree with 2 decision alternatives and 2) a chronic hepatitis C virus (HCV) Markov model with 3 alternatives. All methods make tradeoffs between violating preference orderings and altering marginal distributions. For both models, our method simultaneously performed best, with largest performance advantages when distributions reflected wider uncertainty. For PSA, larger changes to the marginal distributions induced by existing methods resulted in differing conclusions about which strategy was most likely optimal. For EVPPI, both preference order violations and altered marginal distributions caused existing methods to misestimate the maximum value of seeking additional information, sometimes concluding that there was no value. Analysts can characterize the joint uncertainty in QoL weights to improve PSA and value-of-information estimates using Open Source implementations of our method.
View details for DOI 10.1177/0272989X15605091
View details for Web of Science ID 000385499700001
View details for PubMedID 26377369
View details for PubMedCentralID PMC4794424
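One simple way to see the core idea above is to sample QoL weights through a Gaussian copula: marginal distributions are preserved exactly, while a correlation matrix controls how often random draws violate the stated preference ordering. The snippet below uses made-up Beta marginals and a single common correlation; it illustrates the violation-rate calculation only, not the paper's iterative adjustment of the full correlation matrix or its Open Source implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical marginals for three QoL weights with intended ordering w0 >= w1 >= w2.
marginals = [stats.beta(40, 10), stats.beta(30, 20), stats.beta(20, 30)]

def sample_weights(rho, n=20000):
    """Draw correlated QoL weights via a Gaussian copula with common correlation rho."""
    k = len(marginals)
    corr = np.full((k, k), rho)
    np.fill_diagonal(corr, 1.0)
    z = rng.multivariate_normal(np.zeros(k), corr, size=n)
    u = stats.norm.cdf(z)                                  # correlated uniforms
    return np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

def violation_rate(w):
    """Proportion of samples in which the ordering w0 >= w1 >= w2 is violated."""
    return float(np.mean(~np.all(np.diff(w, axis=1) <= 0, axis=1)))

for rho in (0.0, 0.5, 0.9):
    print(rho, round(violation_rate(sample_weights(rho)), 3))
```

Raising the correlation until the violation rate falls below a chosen threshold is the same kind of loop the abstract describes, applied there to a full correlation matrix rather than a single parameter.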
-
Cost-effectiveness and resource implications of aggressive action on tuberculosis in China, India, and South Africa: a combined analysis of nine models.
The Lancet. Global health
2016; 4 (11): e816-e826
Abstract
The post-2015 End TB Strategy sets global targets of reducing tuberculosis incidence by 50% and mortality by 75% by 2025. We aimed to assess resource requirements and cost-effectiveness of strategies to achieve these targets in China, India, and South Africa. We examined intervention scenarios developed in consultation with country stakeholders, which scaled up existing interventions to high but feasible coverage by 2025. Nine independent modelling groups collaborated to estimate policy outcomes, and we estimated the cost of each scenario by synthesising service use estimates, empirical cost data, and expert opinion on implementation strategies. We estimated health effects (ie, disability-adjusted life-years averted) and resource implications for 2016-35, including patient-incurred costs. To assess resource requirements and cost-effectiveness, we compared scenarios with a base case representing continued current practice. Incremental tuberculosis service costs differed by scenario and country, and in some cases they more than doubled existing funding needs. In general, expansion of tuberculosis services substantially reduced patient-incurred costs and, in India and China, produced net cost savings for most interventions under a societal perspective. In all three countries, expansion of access to care produced substantial health gains. Compared with current practice and conventional cost-effectiveness thresholds, most intervention approaches seemed highly cost-effective. Expansion of tuberculosis services seems cost-effective for high-burden countries and could generate substantial health and economic benefits for patients, although substantial new funding would be required. Further work to determine the optimal intervention mix for each country is necessary. Funding: Bill & Melinda Gates Foundation.
View details for DOI 10.1016/S2214-109X(16)30265-0
View details for PubMedID 27720689
-
Providers' knowledge of diagnosis and treatment of tuberculosis using vignettes: evidence from rural Bihar, India.
BMJ global health
2016; 1 (4)
Abstract
Almost 25% of all new cases of tuberculosis (TB) worldwide are in India, where drug resistance and low quality of care remain key challenges. We conducted an observational, cross-sectional study of healthcare providers' knowledge of diagnosis and treatment of TB in rural Bihar, India, from June to September 2012. Using data from vignette-based interviews with the 395 most commonly visited healthcare providers in study areas, we scored providers' knowledge and used multivariable regression models to examine their relationship to providers' characteristics. 80% of the 395 providers had no formal medical qualifications. Overall, providers demonstrated low levels of knowledge: 64.9% (95% CI 59.8% to 69.8%) diagnosed correctly, and 21.7% (CI 16.8% to 27.1%) recommended correct treatment. Providers seldom asked diagnostic questions such as fever (31.4%, CI 26.8% to 36.2%) and bloody sputum (11.1%, CI 8.2% to 14.7%), or results from sputum microscopy (20.0%, CI 16.2% to 24.3%). After controlling for whether providers treat TB, MBBS providers were not significantly different from unqualified providers or those with alternative medical qualifications on knowledge score or offering correct treatment. MBBS providers were, however, more likely to recommend referrals relative to complementary medicine and unqualified providers (23.2 and 37.7 percentage points, respectively). Healthcare providers in rural areas in Bihar, India, have low levels of knowledge regarding TB diagnosis and treatment. Our findings highlight the need for policies to improve training, incentives, task shifting and regulation to improve knowledge and performance of existing providers. Further, more research is needed on the incentives providers face and the role of information on quality to help patients select providers who offer higher quality care.
View details for DOI 10.1136/bmjgh-2016-000155
View details for PubMedID 28588984
-
Feasibility of achieving the 2025 WHO global tuberculosis targets in South Africa, China, and India: a combined analysis of 11 mathematical models.
The Lancet. Global health
2016; 4 (11): e806-e815
Abstract
The post-2015 End TB Strategy proposes targets of 50% reduction in tuberculosis incidence and 75% reduction in mortality from tuberculosis by 2025. We aimed to assess whether these targets are feasible in three high-burden countries with contrasting epidemiology and previous programmatic achievements. 11 independently developed mathematical models of tuberculosis transmission projected the epidemiological impact of currently available tuberculosis interventions for prevention, diagnosis, and treatment in China, India, and South Africa. Models were calibrated with data on tuberculosis incidence and mortality in 2012. Representatives from national tuberculosis programmes and the advocacy community provided distinct country-specific intervention scenarios, which included screening for symptoms, active case finding, and preventive therapy. Aggressive scale-up of any single intervention scenario could not achieve the post-2015 End TB Strategy targets in any country. However, the models projected that, in the South Africa national tuberculosis programme scenario, a combination of continuous isoniazid preventive therapy for individuals on antiretroviral therapy, expanded facility-based screening for symptoms of tuberculosis at health centres, and improved tuberculosis care could achieve a 55% reduction in incidence (range 31-62%) and a 72% reduction in mortality (range 64-82%) compared with 2015 levels. For India, and particularly for China, full scale-up of all interventions in the tuberculosis-programme scenarios fell short of the 2025 targets, despite preventing a cumulative 3.4 million cases. The advocacy scenarios illustrated the high impact of detecting and treating latent tuberculosis. Major reductions in tuberculosis burden seem possible with current interventions. However, additional interventions, adapted to country-specific tuberculosis epidemiology and health systems, are needed to reach the post-2015 End TB Strategy targets at country level. Funding: Bill and Melinda Gates Foundation.
View details for DOI 10.1016/S2214-109X(16)30199-1
View details for PubMedID 27720688
-
Cost-effectiveness of Intensive Blood Pressure Management.
JAMA cardiology
2016; 1 (8): 872-879
Abstract
Among high-risk patients with hypertension, targeting a systolic blood pressure of 120 mm Hg reduces cardiovascular morbidity and mortality compared with a higher target. However, intensive blood pressure management incurs additional costs from treatment and from adverse events. To evaluate the incremental cost-effectiveness of intensive blood pressure management compared with standard management. This cost-effectiveness analysis conducted from September 2015 to August 2016 used a Markov cohort model to estimate cost-effectiveness of intensive blood pressure management among 68-year-old high-risk adults with hypertension but not diabetes. We used the Systolic Blood Pressure Intervention Trial (SPRINT) to estimate treatment effects and adverse event rates. We used Centers for Disease Control and Prevention Life Tables to project age- and cause-specific mortality, calibrated to rates reported in SPRINT. We also used population-based observational data to model development of heart failure, myocardial infarction, stroke, and subsequent mortality. Costs were based on published sources, Medicare data, and the National Inpatient Sample. Treatment of hypertension to a systolic blood pressure goal of 120 mm Hg (intensive management) or 140 mm Hg (standard management). Lifetime costs and quality-adjusted life-years (QALYs), discounted at 3% annually. Standard management yielded 9.6 QALYs and accrued $155 261 in lifetime costs, while intensive management yielded 10.5 QALYs and accrued $176 584 in costs. Intensive blood pressure management cost $23 777 per QALY gained. In a sensitivity analysis, serious adverse events would need to occur at 3 times the rate observed in SPRINT and be 3 times more common in the intensive management arm to prefer standard management. Intensive blood pressure management is cost-effective at typical thresholds for value in health care and remains so even with substantially higher adverse event rates.
View details for DOI 10.1001/jamacardio.2016.3517
View details for PubMedID 27627731
-
Cost-Effectiveness of Treatments for Genotype 1 Hepatitis C Virus Infection in non-VA and VA Populations.
MDM policy & practice
2016; 1
Abstract
Chronic hepatitis C viral (HCV) infection affects millions of Americans. Healthcare systems face complex choices between multiple highly efficacious, costly treatments. This study assessed the cost-effectiveness of HCV treatments for chronic, genotype 1 HCV monoinfected, treatment-naïve individuals in the Department of Veterans Affairs (VA) and general U.S. healthcare systems. We conducted a decision-analytic Markov model-based cost-effectiveness analysis, employing appropriate payer perspectives and time horizons, and discounting benefits and costs at 3% annually. Interventions included: Sofosbuvir/ledipasvir (SOF-LDV); ombitasvir/paritaprevir/ritonavir/dasabuvir (3D); sofosbuvir/simeprevir (SOF-SMV); sofosbuvir/pegylated interferon/ribavirin (SOF-RBV-PEG); boceprevir/pegylated interferon/ribavirin (BOC-RBV-PEG); and pegylated interferon/ribavirin (PEG-RBV). Outcomes were sustained virologic response (SVR), advanced liver disease, costs, quality adjusted life years (QALYs), and incremental cost-effectiveness. SOF-LDV and 3D achieve higher SVR rates compared to older regimens and reduce advanced liver disease (>20% relative to no treatment), increasing QALYs by over 2 years per person. For the non-VA population, at current prices ($5,040 per week for SOF-LDV and $4,796 per week for 3D), SOF-LDV's lifetime cost ($293,370) is $18,000 lower than 3D's because of its shorter treatment duration in subgroups. SOF-LDV costs $17,100 per QALY gained relative to no treatment. 3D costs $208,000 per QALY gained relative to SOF-LDV. Both dominate other treatments and are even more cost-effective for the VA, though VA aggregate treatment costs still exceed $4 billion at SOF-LDV prices of $3,308 per week. Drug prices strongly determine relative cost-effectiveness for SOF-LDV and 3D; with sufficient price reductions (approximately 20-30% depending on the health system), 3D could be cost-effective relative to SOF-LDV. Limitations include the lack of long-term head-to-head regimen effectiveness trials. New HCV treatments are cost-effective in multiple healthcare systems if trial-estimated efficacy is achieved in practice, though, at current prices, total expenditures could present substantial challenges.
View details for DOI 10.1177/2381468316671946
View details for PubMedID 29756049
View details for PubMedCentralID PMC5942888
-
Risk of self-reported symptoms or diagnosis of active tuberculosis in relationship to low body mass index, diabetes and their co-occurrence.
Tropical medicine & international health
2016; 21 (10): 1272-1281
Abstract
Globally, tuberculosis prevalence has declined, but its risk factors have varied across place and time - low body mass index (BMI) has persisted while diabetes has increased. Using India's National Family Health Survey (NFHS), wave 3 and World Health Survey (WHS) data, we examined their relationships to support projection of future trends and targeted control efforts. Multivariate logistic regressions at the individual level with and without diabetes/BMI interactions assessed the relationship between tuberculosis, diabetes and low BMI and the importance of risk factor co-occurrence. Population-level analyses examined how tuberculosis incidence and prevalence varied with diabetes/low BMI co-occurrence. In NFHS, diabetic individuals had higher predicted tuberculosis risks (diabetic vs. non-diabetic: 2.50% vs. 0.63% at low BMI; 0.81% vs. 0.20% at normal BMI; 0.37% vs. 0.09% at high BMI), which were not significantly different when modelled independently or allowing for risk modification with diabetes/low BMI co-occurrence. WHS findings were generally consistent. Population-level analysis found that diabetes/low BMI co-occurrence may be associated with elevated tuberculosis risk, although its predicted effect on tuberculosis incidence/prevalence was generally ≤0.2 percentage points and not robustly statistically significant. Concerns about the additional elevation of tuberculosis risk from diabetes/low BMI co-occurrence and hence the need to coordinate tuberculosis control efforts around the nexus of co-occurring diabetes and low BMI may be premature. However, study findings robustly support the importance of individually targeting low BMI and diabetes as part of ongoing tuberculosis control efforts.
View details for DOI 10.1111/tmi.12763
View details for PubMedID 27495971
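The with- and without-interaction comparison described above can be written as a pair of logistic regressions in which the product term tests whether diabetes and low BMI jointly confer more risk than their independent effects imply. The sketch assumes a hypothetical individual-level data frame (columns tb, diabetes, low_bmi, age, female), not the NFHS or WHS microdata.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_tb_risk_models(df: pd.DataFrame):
    """Logistic models for self-reported TB with and without a diabetes x low-BMI interaction."""
    base = smf.logit("tb ~ diabetes + low_bmi + age + female", data=df).fit(disp=False)
    interact = smf.logit("tb ~ diabetes * low_bmi + age + female", data=df).fit(disp=False)
    lr_stat = 2.0 * (interact.llf - base.llf)   # likelihood-ratio statistic for the interaction term
    return base, interact, lr_stat
```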
-
Cost-Effectiveness of Local Therapies for Inoperable, Localized Hepatocellular Carcinoma
ELSEVIER SCIENCE INC. 2016: E138
View details for DOI 10.1016/j.ijrobp.2016.06.938
View details for Web of Science ID 000387655802337
-
Effect Of A Large-Scale Social Franchising And Telemedicine Program On Childhood Diarrhea And Pneumonia Outcomes In India.
Health affairs
2016; 35 (10): 1800-1809
Abstract
Despite the rapid growth of social franchising, there is little evidence on its population impact in the health sector. Similar in many ways to private-sector commercial franchising, social franchising can be found in sectors with a social objective, such as health care. This article evaluates the World Health Partners (WHP) Sky program, a large-scale social franchising and telemedicine program in Bihar, India. We studied appropriate treatment for childhood diarrhea and pneumonia and associated health care outcomes. We used multivariate difference-in-differences models to analyze data on 67,950 children ages five and under in 2011 and 2014. We found that the WHP-Sky program did not improve rates of appropriate treatment or disease prevalence. Both provider participation and service use among target populations were low. Our results do not imply that social franchising cannot succeed; instead, they underscore the importance of understanding factors that explain variation in the performance of social franchises. Our findings also highlight, for donors and governments in particular, the importance of conducting rigorous impact evaluations of new and potentially innovative health care delivery programs before investing in scaling them up.
View details for PubMedID 27702952
-
Reply to R. Colomer et al.
Journal of clinical oncology
2016; 34 (26): 3227-3228
View details for DOI 10.1200/JCO.2016.68.4084
View details for PubMedID 27432936
-
Cost-Effectiveness of HIV Preexposure Prophylaxis for People Who Inject Drugs in the United States
ANNALS OF INTERNAL MEDICINE
2016; 165 (1): 10
View details for DOI 10.7326/M15-2634
View details for Web of Science ID 000379215800003
-
Cost-Effectiveness of Implantable Pulmonary Artery Pressure Monitoring in Chronic Heart Failure
JACC-HEART FAILURE
2016; 4 (5): 368-375
Abstract
This study aimed to evaluate the cost-effectiveness of the CardioMEMS (CardioMEMS Heart Failure System, St Jude Medical Inc, Atlanta, Georgia) device in patients with chronic heart failure. The CardioMEMS device, an implantable pulmonary artery pressure monitor, was shown to reduce hospitalizations for heart failure and improve quality of life in the CHAMPION (CardioMEMS Heart Sensor Allows Monitoring of Pressure to Improve Outcomes in NYHA Class III Heart Failure Patients) trial. We developed a Markov model to determine the hospitalization, survival, quality of life, cost, and incremental cost-effectiveness ratio of CardioMEMS implantation compared with usual care among a CHAMPION trial cohort of patients with heart failure. We obtained event rates and utilities from published trial data; we used costs from literature estimates and Medicare reimbursement data. We performed subgroup analyses of preserved and reduced ejection fraction and an exploratory analysis in a lower-risk cohort on the basis of the CHARM (Candesartan in Heart failure: Reduction in Mortality and Morbidity) trials. CardioMEMS reduced lifetime hospitalizations (2.18 vs. 3.12), increased quality-adjusted life-years (QALYs) (2.74 vs. 2.46), and increased costs ($176,648 vs. $156,569), thus yielding a cost of $71,462 per QALY gained and $48,054 per life-year gained. The cost per QALY gained was $82,301 in patients with reduced ejection fraction and $47,768 in those with preserved ejection fraction. In the lower-risk CHARM cohort, the device would need to reduce hospitalizations for heart failure by 41% to cost <$100,000 per QALY gained. The cost-effectiveness was most sensitive to the device's durability. In populations similar to that of the CHAMPION trial, the CardioMEMS device is cost-effective if the trial effectiveness is sustained over long periods. Post-marketing surveillance data on durability will further clarify its value.
View details for DOI 10.1016/j.jchf.2015.12.015
View details for PubMedID 26874380
-
COST-EFFECTIVENESS OF INTENSIVE BLOOD PRESSURE CONTROL
SPRINGER. 2016: S170
View details for Web of Science ID 000392201600150
-
Cost-Effectiveness of Pertuzumab in Human Epidermal Growth Factor Receptor 2-Positive Metastatic Breast Cancer.
Journal of clinical oncology
2016; 34 (9): 902-909
Abstract
The Clinical Evaluation of Pertuzumab and Trastuzumab (CLEOPATRA) study showed a 15.7-month survival benefit with the addition of pertuzumab to docetaxel and trastuzumab (THP) as first-line treatment for patients with human epidermal growth factor receptor 2 (HER2)-overexpressing metastatic breast cancer. We performed a cost-effectiveness analysis to assess the value of adding pertuzumab. We developed a decision-analytic Markov model to evaluate the cost effectiveness of docetaxel plus trastuzumab (TH) with or without pertuzumab in US patients with metastatic breast cancer. The model followed patients weekly over their remaining lifetimes. Health states included stable disease, progressing disease, hospice, and death. Transition probabilities were based on the CLEOPATRA study. Costs reflected the 2014 Medicare rates. Health state utilities were the same as those used in other recent cost-effectiveness studies of trastuzumab and pertuzumab. Outcomes included health benefits expressed as discounted quality-adjusted life-years (QALYs), costs in US dollars, and cost effectiveness expressed as an incremental cost-effectiveness ratio. One- and multiway deterministic and probabilistic sensitivity analyses explored the effects of specific assumptions. Modeled median survival was 39.4 months for TH and 56.9 months for THP. The addition of pertuzumab resulted in an additional 1.82 life-years gained, or 0.64 QALYs, at a cost of $713,219 per QALY gained. Deterministic sensitivity analysis showed that THP is unlikely to be cost effective even under the most favorable assumptions, and probabilistic sensitivity analysis predicted 0% chance of cost effectiveness at a willingness to pay of $100,000 per QALY gained. THP in patients with metastatic HER2-positive breast cancer is unlikely to be cost effective in the United States.
View details for DOI 10.1200/JCO.2015.62.9105
View details for PubMedID 26351332
-
Cost-effectiveness of pertuzumab in HER2+metastatic breast cancer
AMER ASSOC CANCER RESEARCH. 2016
View details for DOI 10.1158/1538-7445.SABCS15-P6-11-01
View details for Web of Science ID 000375622403068
-
An Efficient, Noniterative Method of Identifying the Cost-Effectiveness Frontier
MEDICAL DECISION MAKING
2016; 36 (1): 132-136
Abstract
Cost-effectiveness analysis aims to identify treatments and policies that maximize benefits subject to resource constraints. However, the conventional process of identifying the efficient frontier (i.e., the set of potentially cost-effective options) can be algorithmically inefficient, especially when considering a policy problem with many alternative options or when performing an extensive suite of sensitivity analyses for which the efficient frontier must be found for each. Here, we describe an alternative one-pass algorithm that is conceptually simple, easier to implement, and potentially faster for situations that challenge the conventional approach. Our algorithm accomplishes this by exploiting the relationship between the net monetary benefit and the cost-effectiveness plane. To facilitate further evaluation and use of this approach, we also provide scripts in R and Matlab that implement our method and can be used to identify efficient frontiers for any decision problem.
View details for DOI 10.1177/0272989X15583496
View details for Web of Science ID 000366910300012
View details for PubMedID 25926282
View details for PubMedCentralID PMC4626430
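The key relationship the abstract exploits is that a strategy lies on the efficient frontier exactly when it maximizes net monetary benefit for some willingness-to-pay. A compact way to recover the same frontier is a single pass over strategies sorted by cost, dropping strongly and extendedly dominated options as you go. The sketch below is an illustrative Python rendering of that general idea, not a port of the authors' released R and Matlab scripts.

```python
def efficient_frontier(strategies):
    """Return the cost-effectiveness frontier from (name, cost, effect) tuples.

    A strategy is kept only if it maximizes net monetary benefit (lambda * effect - cost)
    for some willingness-to-pay lambda >= 0.
    """
    ordered = sorted(strategies, key=lambda s: (s[1], -s[2]))   # by cost, then higher effect first
    frontier = []
    for name, cost, effect in ordered:
        if frontier and effect <= frontier[-1][2]:
            continue                                            # strongly dominated: costs more, no extra effect
        # Remove extendedly dominated strategies: ICERs along the frontier must increase.
        while len(frontier) >= 2:
            c1, e1 = frontier[-1][1], frontier[-1][2]
            c0, e0 = frontier[-2][1], frontier[-2][2]
            if (cost - c1) / (effect - e1) <= (c1 - c0) / (e1 - e0):
                frontier.pop()
            else:
                break
        frontier.append((name, cost, effect))
    return frontier

print(efficient_frontier([("A", 0, 0.0), ("B", 10_000, 1.0), ("C", 12_000, 1.1), ("D", 20_000, 3.0)]))
# [('A', 0, 0.0), ('D', 20000, 3.0)]  -- B and C are dominated in this toy example
```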
-
Evaluating Cost-effectiveness of Interventions That Affect Fertility and Childbearing: How Health Effects Are Measured Matters.
Medical decision making
2015; 35 (7): 818-846
Abstract
Current guidelines for economic evaluations of health interventions define relevant outcomes as those accruing to individuals receiving interventions. Little consensus exists on counting health impacts on current and future fertility and childbearing. Our objective was to characterize current practices for counting such health outcomes. We developed a framework characterizing health interventions with direct and/or indirect effects on fertility and childbearing and how such outcomes are reported. We identified interventions spanning the framework and performed a targeted literature review for economic evaluations of these interventions. For each article, we characterized how the potential health outcomes from each intervention were considered, focusing on quality-adjusted life-years (QALYs) associated with fertility and childbearing. We reviewed 108 studies, identifying 7 themes: 1) Studies were heterogeneous in reporting outcomes. 2) Studies often selected outcomes for inclusion that tend to bias toward finding the intervention to be cost-effective. 3) Studies often avoided the challenges of assigning QALYs for pregnancy and fertility by instead considering cost per intermediate outcome. 4) Even for the same intervention, studies took heterogeneous approaches to outcome evaluation. 5) Studies used multiple, competing rationales for whether and how to include fertility-related QALYs and whose QALYs to include. 6) Studies examining interventions with indirect effects on fertility typically ignored such QALYs. 7) Even recent studies had these shortcomings. Limitations include that the review was targeted rather than systematic. Economic evaluations inconsistently consider QALYs from current and future fertility and childbearing in ways that frequently appear biased toward the interventions considered. As the Panel on Cost-Effectiveness in Health and Medicine updates its guidelines, making the practice of cost-effectiveness analysis more consistent is a priority. Our study contributes to harmonizing methods in this respect.
View details for DOI 10.1177/0272989X15583845
View details for PubMedID 25926281
View details for PubMedCentralID PMC4418217
-
Cost-Effectiveness of Adding Cardiac Resynchronization Therapy to an Implantable Cardioverter-Defibrillator Among Patients With Mild Heart Failure.
Annals of internal medicine
2015; 163 (6): 417-426
Abstract
Background: Cardiac resynchronization therapy (CRT) reduces mortality and heart failure hospitalizations in patients with mild heart failure. Objective: To estimate the cost-effectiveness of adding CRT to an implantable cardioverter-defibrillator (CRT-D) compared with implantable cardioverter-defibrillator (ICD) alone among patients with left ventricular systolic dysfunction, prolonged intraventricular conduction, and mild heart failure. Design: Markov decision model. Data Sources: Clinical trials, clinical registries, claims data from Centers for Medicare & Medicaid Services, and Centers for Disease Control and Prevention life tables. Target Population: Patients aged 65 years or older with a left ventricular ejection fraction (LVEF) of 30% or less, QRS duration of 120 milliseconds or more, and New York Heart Association (NYHA) class I or II symptoms. Time Horizon: Lifetime. Perspective: Societal. Intervention: CRT-D or ICD alone. Outcome Measures: Life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). Results: Use of CRT-D increased life expectancy (9.8 years versus 8.8 years), QALYs (8.6 years versus 7.6 years), and costs ($286 500 versus $228 600), yielding a cost per QALY gained of $61 700. The cost-effectiveness of CRT-D was most dependent on the degree of mortality reduction: when the risk ratio for death was 0.95, the ICER increased to $119 600 per QALY. More expensive CRT-D devices, shorter CRT-D battery life, and older age also made the cost-effectiveness of CRT-D less favorable. Limitations: The estimated mortality reduction for CRT-D was largely based on a single trial. Data on patients with NYHA class I symptoms were limited. The cost-effectiveness of CRT-D in patients with NYHA class I symptoms remains uncertain. Conclusion: In patients with an LVEF of 30% or less, QRS duration of 120 milliseconds or more, and NYHA class II symptoms, CRT-D appears to be economically attractive relative to ICD alone when a reduction in mortality is expected. Funding: National Institutes of Health, University of Copenhagen, U.S. Department of Veterans Affairs.
View details for DOI 10.7326/M14-1804
View details for PubMedID 26301323
-
Will Divestment from Employment-Based Health Insurance Save Employers Money? The Case of State and Local Governments
JOURNAL OF EMPIRICAL LEGAL STUDIES
2015; 12 (3): 343-394
View details for DOI 10.1111/jels.12076
View details for Web of Science ID 000360209700001
-
Cost-effectiveness of improvements in diagnosis and treatment accessibility for tuberculosis control in India
INTERNATIONAL JOURNAL OF TUBERCULOSIS AND LUNG DISEASE
2015; 19 (9): 1115-1124
Abstract
Inaccurate diagnosis and inaccessibility of care undercut the effectiveness of high-quality anti-tuberculosis treatment and select for resistance. Rapid diagnostic systems, such as Xpert(®) MTB/RIF for tuberculosis (TB) diagnosis and drug susceptibility testing (DST), and programs that provide high-quality DOTS anti-tuberculosis treatment to patients in the unregulated private sector (public-private mix [PPM]), may help address these challenges, albeit at increased cost. We extended a microsimulation model of TB in India calibrated to demographic, epidemiologic, and care trends to evaluate 1) replacing DST with Xpert; 2) replacing microscopy and culture with Xpert to diagnose multidrug-resistant TB (MDR-TB) and non-MDR-TB; 3) implementing nationwide PPM; and combinations of (3) with (1) or (2). PPM (assuming costs of $38/person) and Xpert improved health and increased costs relative to the status quo. PPM alone or with Xpert cost <1 gross domestic product/capita per quality-adjusted life-year gained relative to the next best intervention, and dominated Xpert interventions excluding PPM. While both PPM and Xpert are promising tools for combatting TB in India, PPM should be prioritized over Xpert, as private sector engagement is more cost-effective than Xpert alone and, if sufficient resources are available, would substantially increase the value of Xpert if both interventions are implemented together.
View details for DOI 10.5588/ijtld.15.0158
View details for Web of Science ID 000359894400022
View details for PubMedID 26260835
-
Quantifying demographic and socioeconomic transitions for computational epidemiology: an open-source modeling approach applied to India
POPULATION HEALTH METRICS
2015; 13
Abstract
Demographic and socioeconomic changes such as increasing urbanization, migration, and female education shape population health in many low- and middle-income countries. These changes are rarely reflected in computational epidemiological models, which are commonly used to understand population health trends and evaluate policy interventions. Our goal was to create a "backbone" simulation modeling approach to allow computational epidemiologists to explicitly reflect changing demographic and socioeconomic conditions in population health models. We developed, evaluated, and "open-sourced" a generalized approach to incorporate longitudinal, commonly available demographic and socioeconomic data into epidemiological simulations, illustrating the feasibility and utility of our approach with data from India. We constructed a series of nested microsimulations of increasing complexity, calibrating each model to longitudinal sociodemographic and vital registration data. We then selected the model that was most consistent with the data (i.e., greater accuracy) while containing the fewest parameters (i.e., greater parsimony). We validated the selected model against additional data sources not used for calibration. We found that standard computational epidemiology models that do not incorporate demographic and socioeconomic trends quickly diverged from past mortality and population size estimates, while our approach remained consistent with observed data over decadal time courses. Our approach additionally enabled the examination of complex relations between demographic, socioeconomic and health parameters, such as the relationship between changes in educational attainment or urbanization and changes in fertility, mortality, and migration rates. Incorporating demographic and socioeconomic trends in computational epidemiology is feasible through the "open source" approach, and could critically alter population health projections and model-based evaluations of health policy interventions in unintuitive ways.
View details for DOI 10.1186/s12963-015-0053-1
View details for Web of Science ID 000358760100001
View details for PubMedCentralID PMC4521358
-
Research aimed at improving both mood and weight (RAINBOW) in primary care: A type 1 hybrid design randomized controlled trial.
Contemporary clinical trials
2015; 43: 260-278
Abstract
Effective interventions targeting comorbid obesity and depression are critical given the increasing prevalence and worsened outcomes for patients with both conditions. RAINBOW is a type 1 hybrid design randomized controlled trial. The objective is to evaluate the clinical and cost effectiveness and implementation potential of an integrated, technology-enhanced, collaborative care model for treating comorbid obesity and depression in primary care. Obese and depressed adults (n=404) will be randomized to usual care enhanced with the provision of a pedometer and information about the health system's services for mood or weight management (control) or with the Integrated Coaching for Better Mood and Weight (I-CARE) program (intervention). The 12-month I-CARE program synergistically integrates two proven behavioral interventions: problem-solving therapy with as-needed intensification of pharmacotherapy for depression (PEARLS) and standardized behavioral treatment for obesity (Group Lifestyle Balance™). It utilizes traditional (e.g., office visits and phone consults) and emerging care delivery modalities (e.g., patient web portal and mobile applications). Follow-up assessments will occur at 6, 12, 18, and 24 months. We hypothesize that compared with controls, I-CARE participants will have greater improvements in weight and depression severity measured by the 20-item Depression Symptom Checklist at 12 months, which will be sustained at 24 months. We will also assess I-CARE's cost-effectiveness and use mixed methods to examine its potential for reach, adoption, implementation, and maintenance. This study offers the potential to change how obese and depressed adults are treated-through a new model of accessible and integrative lifestyle medicine and mental health expertise-in primary care.
View details for DOI 10.1016/j.cct.2015.06.010
View details for PubMedID 26096714
View details for PubMedCentralID PMC4537656
-
Computing Expected Value of Partial Sample Information from Probabilistic Sensitivity Analysis Using Linear Regression Metamodeling
MEDICAL DECISION MAKING
2015; 35 (5): 584-595
Abstract
Decision makers often desire both guidance on the most cost-effective interventions given current knowledge and also the value of collecting additional information to improve the decisions made (i.e., from value of information [VOI] analysis). Unfortunately, VOI analysis remains underused due to the conceptual, mathematical, and computational challenges of implementing Bayesian decision-theoretic approaches in models of sufficient complexity for real-world decision making. In this study, we propose a novel practical approach for conducting VOI analysis using a combination of probabilistic sensitivity analysis, linear regression metamodeling, and unit normal loss integral function-a parametric approach to VOI analysis. We adopt a linear approximation and leverage a fundamental assumption of VOI analysis, which requires that all sources of prior uncertainties be accurately specified. We provide examples of the approach and show that the assumptions we make do not induce substantial bias but greatly reduce the computational time needed to perform VOI analysis. Our approach avoids the need to analytically solve or approximate joint Bayesian updating, requires only one set of probabilistic sensitivity analysis simulations, and can be applied in models with correlated input parameters.
View details for DOI 10.1177/0272989X15578125
View details for Web of Science ID 000356431100004
View details for PubMedID 25840900
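In the regression-metamodeling approach described above, the PSA samples themselves are reused: the net monetary benefit of each strategy is regressed on the parameter subset of interest, and the fitted values stand in for conditional expectations. The sketch below shows that core computation with a plain linear metamodel in the style of regression-based EVPPI estimation; the paper's extension to the expected value of partial sample information via the unit normal loss integral is not reproduced, and all inputs are assumed to come from an existing PSA.

```python
import numpy as np

def evppi_linear_metamodel(nmb: np.ndarray, params: np.ndarray) -> float:
    """Regression-based value-of-information estimate from PSA output.

    nmb:    (n_samples, n_strategies) net monetary benefit for each PSA draw
    params: (n_samples, n_params) draws of the parameter subset of interest
    """
    n, n_strategies = nmb.shape
    X = np.column_stack([np.ones(n), params])        # linear metamodel design matrix
    fitted = np.empty_like(nmb)
    for s in range(n_strategies):
        coef, *_ = np.linalg.lstsq(X, nmb[:, s], rcond=None)
        fitted[:, s] = X @ coef                      # approx. E[NMB_s | params]
    value_current_info = nmb.mean(axis=0).max()      # pick the single best strategy now
    value_partial_info = fitted.max(axis=1).mean()   # pick the per-draw best strategy given the parameters
    return value_partial_info - value_current_info
```

Because the metamodel is linear, one regression per strategy on a single set of PSA simulations is all that is required, which is what makes the approach computationally cheap.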
-
Uptake and utilization of directly acting antiviral medications for hepatitis C infection in US veterans
JOURNAL OF VIRAL HEPATITIS
2015; 22 (5): 489-495
Abstract
New drug therapies have revolutionized the treatment of hepatitis C virus (HCV) infection. The objectives of this study were to evaluate uptake and utilization of boceprevir and telaprevir in the Department of Veterans Affairs (VA). We evaluated whether therapies conformed to response-guided protocols, whether they replaced standard interferon plus ribavirin treatment, and whether IL-28B was used to guide treatment. We performed an administrative data-based analysis of all patients receiving pharmacologic treatment for HCV in VA from October 2009 to July 2013. There were 12 737 new HCV prescriptions in VA during this time, with 5564 boceprevir or telaprevir prescriptions (44%) and 7173 prescriptions (56%) written for standard interferon plus ribavirin treatment. Prescriptions for the new treatments heavily favoured boceprevir vs telaprevir (83% vs 17%). Sixty-two percent (62%) of boceprevir-treated patients completed their minimum-specified protocol, while 69.2% of telaprevir-treated patients completed their minimum-specified protocol. From October 2010 to July 2012, 4090 patients had an IL-28B test; less than 16% of these tests guided subsequent HCV prescriptions. Uptake of boceprevir and telaprevir was rapid; the number of patients initiating treatment approximately doubled in the period after their introduction. While new prescriptions favor boceprevir or telaprevir over standard interferon plus ribavirin therapy, there appears to still be a strong role of interferon plus ribavirin in treating HCV patients. This work can inform our understanding of how other new effective HCV therapies will be used, their diffusion, and the timing of their diffusion in actual clinical practice.
View details for DOI 10.1111/jvh.12344
View details for PubMedID 25417805
-
The Know-Do Gap in Quality of Health Care for Childhood Diarrhea and Pneumonia in Rural India
JAMA PEDIATRICS
2015; 169 (4): 349-357
Abstract
In rural India, as in many developing countries, childhood mortality remains high and the quality of health care available is low. Improving care in such settings, where most health care practitioners do not have formal training, requires an assessment of the practitioners' knowledge of appropriate care and the actual care delivered (the know-do gap).To assess the knowledge of local health care practitioners and the quality of care provided by them for childhood diarrhea and pneumonia in rural Bihar, India.We conducted an observational, cross-sectional study of the knowledge and practice of 340 health care practitioners concerning the diagnosis and treatment of childhood diarrhea and pneumonia in Bihar, India, from June 29 through September 8, 2012. We used data from vignette interviews and unannounced standardized patients (SPs).For SPs and vignettes, practitioner performance was measured using the numbers of key diagnostic questions asked and examinations conducted. The know-do gap was calculated by comparing fractions of practitioners asking key diagnostic questions on each method. Multivariable regressions examined the relation among diagnostic performance, prescription of potentially harmful treatments, and the practitioners' characteristics. We also examined correct treatment recommended by practitioners with both methods.Practitioners asked a mean of 2.9 diagnostic questions and suggested a mean of 0.3 examinations in the diarrhea vignette; mean numbers were 1.4 and 0.8, respectively, for the pneumonia vignette. Although oral rehydration salts, the correct treatment for diarrhea, are commonly available, only 3.5% of practitioners offered them in the diarrhea vignette. With SPs, no practitioner offered the correct treatment for diarrhea, and 13.0% of practitioners offered the correct treatment for pneumonia. Diarrhea treatment has a large know-do gap; practitioners asked diagnostic questions more frequently in vignettes than for SPs. Although only 20.9% of practitioners prescribed treatments that were potentially harmful in the diarrhea vignettes, 71.9% offered them to SPs (P < .001). Unqualified practitioners were more likely to prescribe potentially harmful treatments for diarrhea (adjusted odds ratio, 5.11 [95% CI, 1.24-21.13]). Higher knowledge scores were associated with better performance for treating diarrhea but not pneumonia.Practitioners performed poorly with vignettes and SPs, with large know-do gaps, especially for childhood diarrhea. Efforts to improve health care for major causes of childhood mortality should emphasize strategies that encourage pediatric health care practitioners to diagnose and manage these conditions correctly through better monitoring and incentives in addition to practitioner training initiatives.
View details for DOI 10.1001/jamapediatrics.2014.3445
View details for Web of Science ID 000354995600018
View details for PubMedID 25686357
-
Modeling and calibration for exposure to time-varying, modifiable risk factors: the example of smoking behavior in India.
Medical decision making
2015; 35 (2): 196-210
Abstract
Risk factors increase the incidence and severity of chronic disease. To examine future trends and develop policies addressing chronic diseases, it is important to capture the relationship between exposure and disease development, which is challenging given limited data.To develop parsimonious risk factor models embeddable in chronic disease models, which are useful when longitudinal data are unavailable.The model structures encode relevant features of risk factors (e.g., time-varying, modifiable) and can be embedded in chronic disease models. Calibration captures time-varying exposures for the risk factor models using available cross-sectional data. We illustrate feasibility with the policy-relevant example of smoking in India.The model is calibrated to the prevalence of male smoking in 12 Indian regions estimated from the 2009-2010 Indian Global Adult Tobacco Survey. Nelder-Mead searches (250,000 starting locations) identify distributions of starting, quitting, and restarting rates that minimize the difference between modeled and observed age-specific prevalence. We compare modeled life expectancies to estimates in the absence of time-varying risk exposures and consider gains from hypothetical smoking cessation programs delivered for 1 to 30 years.Calibration achieves concordance between modeled and observed outcomes. Probabilities of starting to smoke rise and fall with age, while quitting and restarting probabilities fall with age. Accounting for time-varying smoking exposures is important, as not doing so produces smaller estimates of life expectancy losses. Estimated impacts of smoking cessation programs delivered for different periods depend on the fact that people who have been induced to abstain from smoking longer are less likely to restart.The approach described is feasible for important risk factors for numerous chronic diseases. Incorporating exposure-change rates can improve modeled estimates of chronic disease outcomes and of the long-term effects of interventions targeting risk factors.
View details for DOI 10.1177/0272989X13518272
View details for PubMedID 24477078
View details for PubMedCentralID PMC4115057
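As a minimal illustration of the calibration step described above, the sketch below uses a Nelder-Mead search to choose start, quit, and restart parameters so that a toy three-state cohort model reproduces a target age-specific smoking prevalence curve. The model structure, the synthetic target curve, and the parameter names are assumptions for illustration, not the calibrated India model.
```python
# Illustrative sketch (assumptions throughout; not the calibrated India model):
# Nelder-Mead search for smoking start/quit/restart parameters so that a toy
# three-state cohort model reproduces a target age-specific prevalence curve.
import numpy as np
from scipy.optimize import minimize

ages = np.arange(15, 70)
target = 0.35 * np.exp(-(((ages - 40) / 25.0) ** 2))   # assumed observed prevalence

def modeled_prevalence(params):
    start_peak, quit_base, restart_base = params
    never, current, former = 1.0, 0.0, 0.0
    prevalence = []
    for a in ages:
        # Starting probability rises then falls with age; quit/restart fall with age.
        p_start = np.clip(start_peak * np.exp(-(((a - 20) / 10.0) ** 2)), 0, 1)
        p_quit = np.clip(quit_base / (1 + 0.05 * (a - 15)), 0, 1)
        p_restart = np.clip(restart_base / (1 + 0.05 * (a - 15)), 0, 1)
        starts, quits, restarts = never * p_start, current * p_quit, former * p_restart
        never -= starts
        current += starts + restarts - quits
        former += quits - restarts
        prevalence.append(current)
    return np.array(prevalence)

def loss(params):
    return float(np.sum((modeled_prevalence(params) - target) ** 2))

fit = minimize(loss, x0=[0.10, 0.05, 0.02], method="Nelder-Mead")
print("calibrated parameters:", np.round(fit.x, 3), "goodness of fit:", round(fit.fun, 4))
```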
-
Quantifying demographic and socioeconomic transitions for computational epidemiology: an open-source modeling approach applied to India.
Population health metrics
2015; 13: 19-?
Abstract
Demographic and socioeconomic changes such as increasing urbanization, migration, and female education shape population health in many low- and middle-income countries. These changes are rarely reflected in computational epidemiological models, which are commonly used to understand population health trends and evaluate policy interventions. Our goal was to create a "backbone" simulation modeling approach to allow computational epidemiologists to explicitly reflect changing demographic and socioeconomic conditions in population health models.We developed, evaluated, and "open-sourced" a generalized approach to incorporate longitudinal, commonly available demographic and socioeconomic data into epidemiological simulations, illustrating the feasibility and utility of our approach with data from India. We constructed a series of nested microsimulations of increasing complexity, calibrating each model to longitudinal sociodemographic and vital registration data. We then selected the model that was most consistent with the data (i.e., greater accuracy) while containing the fewest parameters (i.e., greater parsimony). We validated the selected model against additional data sources not used for calibration.We found that standard computational epidemiology models that do not incorporate demographic and socioeconomic trends quickly diverged from past mortality and population size estimates, while our approach remained consistent with observed data over decadal time courses. Our approach additionally enabled the examination of complex relations between demographic, socioeconomic and health parameters, such as the relationship between changes in educational attainment or urbanization and changes in fertility, mortality, and migration rates.Incorporating demographic and socioeconomic trends in computational epidemiology is feasible through the "open source" approach, and could critically alter population health projections and model-based evaluations of health policy interventions in unintuitive ways.
View details for DOI 10.1186/s12963-015-0053-1
View details for PubMedID 26236157
View details for PubMedCentralID PMC4521358
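The divergence described above, between projections that hold demographic rates fixed and projections that let them trend, can be seen even in a deliberately minimal sketch like the one below. The rates, trends, and single-population structure are invented for illustration and are not the open-source backbone model.
```python
# Minimal sketch with invented rates (not the open-source backbone model):
# compare a projection with fixed crude rates to one in which fertility and
# mortality decline over time, as urbanization and education rise.
def project(years, pop0, birth_rate0, death_rate0, trend=0.0):
    """Project total population with crude rates that decline by `trend` per year."""
    pop, trajectory = pop0, []
    for t in range(years):
        birth_rate = birth_rate0 * (1 - trend) ** t
        death_rate = death_rate0 * (1 - trend) ** t
        pop += pop * (birth_rate - death_rate)
        trajectory.append(pop)
    return trajectory

static = project(30, pop0=1_000_000, birth_rate0=0.025, death_rate0=0.010)
trending = project(30, pop0=1_000_000, birth_rate0=0.025, death_rate0=0.010, trend=0.02)
print(f"year 30, static rates:   {static[-1]:,.0f}")
print(f"year 30, trending rates: {trending[-1]:,.0f}")
```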
-
New, Expensive Treatments for Chronic Hepatitis C: Insuring Good Outcomes?
Digestive diseases and sciences
2015; 60 (11): 3153–54
View details for PubMedID 26386855
-
Sofosbuvir-Based Treatment Regimens for Chronic, Genotype 1 Hepatitis C Virus Infection in US Incarcerated Populations A Cost-Effectiveness Analysis
ANNALS OF INTERNAL MEDICINE
2014; 161 (8): 546-553
Abstract
Prevalence of chronic hepatitis C virus (HCV) infection is high among incarcerated persons in the United States. New, short-duration, high-efficacy therapies may expand treatment eligibility in this population.To assess the cost-effectiveness of sofosbuvir for HCV treatment in incarcerated populations.Markov model.Published literature and expert opinion.Treatment-naive men with chronic, genotype 1 HCV monoinfection.Lifetime.Societal.No treatment, 2-drug therapy (pegylated interferon and ribavirin), or 3-drug therapy with either boceprevir or sofosbuvir. For inmates with short remaining sentences (<1.5 years), only no treatment or sofosbuvir 3-drug therapy was feasible; for those with long sentences (≥1.5 years; mean, 10 years), all strategies were considered. After release, eligible persons could receive sofosbuvir 3-drug therapy.Discounted costs (in 2013 U.S. dollars), discounted quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios.The strategies yielded 13.12, 13.57, 14.43, and 15.18 QALYs, respectively, for persons with long sentences. Sofosbuvir produced the largest absolute reductions in decompensated cirrhosis (16%) and hepatocellular carcinoma (9%), resulting in 2.1 additional QALYs at an added cost exceeding $54,000 compared with no treatment. For persons with short sentences, sofosbuvir cost $25,700 per QALY gained compared with no treatment; for those with long sentences, it dominated other treatments, costing $28,800 per QALY gained compared with no treatment.High reinfection rates in prison attenuated cost-effectiveness for persons with long sentences.Data on sofosbuvir's long-term effectiveness and price are limited. The analysis did not consider women, Hispanic persons, or patients co-infected with HIV or hepatitis B virus.Sofosbuvir-based treatment is cost-effective for incarcerated persons, but affordability is an important consideration.National Institutes of Health.
View details for DOI 10.7326/M14-0602
View details for Web of Science ID 000343906800014
View details for PubMedCentralID PMC4313741
View details for PubMedID 25329202
-
A New Cost-effectiveness Microsimulation Model for Glatiramer Acetate and Dimethyl Fumarate
SAGE PUBLICATIONS LTD. 2014: 932–33
View details for Web of Science ID 000337854400120
-
OPTIMAL INFORMATION ACQUISITION POLICIES: APPLICATION TO HEPATITIS C SCREENING
ELSEVIER SCIENCE INC. 2014: A190–A191
View details for DOI 10.1016/j.jval.2014.03.1112
View details for Web of Science ID 000341082001172
-
Exploration and adoption of evidence-based practice by US child welfare agencies
CHILDREN AND YOUTH SERVICES REVIEW
2014; 39: 147-152
Abstract
To examine the extent to which child welfare agencies adopt new practices and to determine the barriers to and facilitators of adoption of new practices. Data came from telephone interviews with the directors of the 92 public child welfare agencies that constituted the probability sample for the first National Survey of Child and Adolescent Well-being (NSCAW I). In a semi-structured, 40-minute interview administered by a trained Research Associate, agency directors were asked about agency demographics, knowledge of evidence-based practices, use of technical assistance, and actual use of evidence-based practices. Of the 92 agencies, 83 (90%) agreed to be interviewed. Agencies reported that the majority of staff had a BA degree (53.45%) and that they either paid for (52.6%) or provided (80.7%) continuing education. Although agencies routinely collect standardized child outcomes (90%), they much less frequently collect measures of child functioning (30.9%). Almost all agencies (94%) had started a new program or practice, but only 24.8% of these were evidence-based, and strategies used to explore new programs or practices usually involved local or state contracts. Factors associated with program success included internal support for the innovation (27.3%) and an existing evidence base (23.5%). Directors of child welfare agencies frequently institute new programs or practices, but these are not often evidence-based. Because virtually all agencies provide some continuing education, adding discussions of evidence-based programs/practices may spur adoption. Reliance on local and state colleagues to explore new programs and practices suggests that developing well-informed social networks may be a way to increase the spread of evidence-based practices.
View details for DOI 10.1016/j.childyouth.2013.10.004
View details for Web of Science ID 000334135900018
View details for PubMedCentralID PMC3960081
-
Explaining variations in state foster care maintenance rates and the implications for implementing new evidence-based programs
CHILDREN AND YOUTH SERVICES REVIEW
2014; 39: 183-206
Abstract
U.S. Child Welfare systems are involved in the lives of millions of children, and total spending exceeds $26 billion annually. Out-of-home foster care is a critical and expensive Child Welfare service, a major component of which is the maintenance rate paid to support housing and caring for a foster child. Maintenance rates vary widely across states and over time, but reasons for this variation are not well understood. As evidence-based programs are disseminated to state Child Welfare systems, it is important to understand what may be the important drivers in the uptake of these practices including state spending on core system areas.We assembled a unique, longitudinal, state-level panel dataset (1990-2008) for all 50 states with annual data on foster care maintenance rates and measures of child population in need, poverty, employment, urbanicity, proportion minority, political party control of the state legislature and governorship, federal funding, and lawsuits involving state foster care systems. All monetary values were expressed in per-capita terms and inflation adjusted to 2008 dollars. We used longitudinal panel regressions with robust standard errors and state and year fixed effects to estimate the relationship between state foster care maintenance rates and the other factors in our dataset, lagging all factors by one year to mitigate the possibility that maintenance rates influenced their predictors. Exploratory analyses related maintenance rates to Child Welfare outcomes.State foster care maintenance rates have increased in nominal terms, but in many states, have not kept pace with inflation, leading to lower real rates in 2008 compared to those in 1991 for 54% of states for 2 year-olds, 58% for 9 year-olds, and 65% for 16 year-olds. In multivariate analyses including socioeconomic, demographic, and political factors, monthly foster care maintenance rates declined $15 for each 1% increase in state unemployment and declined $40 if a state's governorship and legislature became Republican, though significance was marginal. In analyses also examining state revenue, federal funding, and legal challenges, maintenance rates increased as the federal share of maximum TANF payments increased. However, >50% of variation in foster care maintenance rates was explained by unobserved state-level factors as measured by state fixed effects. These factors did not appear to be strongly related to 2008 Child Welfare outcomes like foster care placement stability and maltreatment which were also not correlated with foster care maintenance rates.Despite being part of a social safety net, foster care maintenance rates have declined in real terms since 1991 in many states, and there is no strong evidence that they increase in response to harsher economic climates or to federal programs or legal reviews. State variation in maintenance rates was not related to Child Welfare outcomes, though further analysis of this important relationship is needed. Variability in state foster care maintenance rates appears highly idiosyncratic, an important contextual factor to consider when designing and disseminating evidence-based services.
View details for DOI 10.1016/j.childyouth.2013.10.002
View details for Web of Science ID 000334135900023
View details for PubMedCentralID PMC3960086
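A minimal sketch of the kind of panel specification described above, with state and year fixed effects and standard errors clustered by state, fitted to simulated data; the variable names and the data-generating process are placeholders, not the study's dataset.
```python
# Sketch of a two-way fixed-effects panel regression with cluster-robust SEs,
# fitted to simulated placeholder data (not the study's dataset).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
states = [f"S{i:02d}" for i in range(50)]
years = list(range(1990, 2009))
df = pd.DataFrame([(s, y) for s in states for y in years], columns=["state", "year"])
df["unemployment"] = rng.normal(6.0, 1.5, len(df))
# Assumed data-generating process: a $15 decline in the monthly maintenance rate
# per 1-point rise in unemployment, plus state-specific levels and noise.
state_effect = dict(zip(states, rng.normal(500.0, 60.0, len(states))))
df["maintenance_rate"] = (df["state"].map(state_effect)
                          - 15.0 * df["unemployment"]
                          + rng.normal(0.0, 20.0, len(df)))
# Lag the predictor within state, as in the analysis described above.
df["unemployment_lag"] = df.groupby("state")["unemployment"].shift(1)
dfl = df.dropna()

model = smf.ols("maintenance_rate ~ unemployment_lag + C(state) + C(year)", data=dfl)
result = model.fit(cov_type="cluster", cov_kwds={"groups": dfl["state"]})
print("coefficient on lagged unemployment:", round(result.params["unemployment_lag"], 2),
      "(SE", round(result.bse["unemployment_lag"], 2), ")")
```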
-
Cost-effectiveness of tolvaptan in autosomal dominant polycystic kidney disease.
Annals of internal medicine
2014; 160 (2): 143-?
View details for DOI 10.7326/L14-5001-7
View details for PubMedID 24445704
View details for PubMedCentralID PMC4096316
-
Cost-Effectiveness of Treatment of Diabetic Macular Edema
ANNALS OF INTERNAL MEDICINE
2014; 160 (1): 18-29
Abstract
Macular edema is the most common cause of vision loss among patients with diabetes.To determine the cost-effectiveness of different treatments of diabetic macular edema (DME).Markov model.Published literature and expert opinion.Patients with clinically significant DME.Lifetime.Societal.Laser treatment, intraocular injections of triamcinolone or a vascular endothelial growth factor (VEGF) inhibitor, or a combination of both.Discounted costs, gains in quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs).All treatments except laser monotherapy substantially reduced costs, and all treatments except triamcinolone monotherapy increased QALYs. Laser treatment plus a VEGF inhibitor achieved the greatest benefit, gaining 0.56 QALYs at a cost of $6975 for an ICER of $12 410 per QALY compared with laser treatment plus triamcinolone. Monotherapy with a VEGF inhibitor achieved similar outcomes to combination therapy with laser treatment plus a VEGF inhibitor. Laser monotherapy and triamcinolone monotherapy were less effective and more costly than combination therapy.VEGF inhibitor monotherapy was sometimes preferred over laser treatment plus a VEGF inhibitor, depending on the reduction in quality of life with loss of visual acuity. When the VEGF inhibitor bevacizumab was as effective as ranibizumab, it was preferable because of its lower cost.Long-term outcome data for treated and untreated diseases are limited.The most effective treatment of DME is VEGF inhibitor injections with or without laser treatment. This therapy compares favorably with cost-effective interventions for other conditions.Agency for Healthcare Research and Quality.
View details for Web of Science ID 000330249700003
View details for PubMedCentralID PMC4020006
View details for DOI 10.7326/M13-0768
View details for PubMedID 24573663
-
Disease control implications of India's changing multi-drug resistant tuberculosis epidemic.
PloS one
2014; 9 (3)
Abstract
Multi-drug resistant tuberculosis (MDR TB) is a major health challenge in India that is gaining increasing public attention, but the implications of India's evolving MDR TB epidemic are poorly understood. As India's MDR TB epidemic is transitioning from a treatment-generated to a transmission-generated epidemic, we sought to evaluate the potential effectiveness of the following two disease control strategies in reducing the prevalence of MDR TB: a) improving treatment of non-MDR TB; b) shortening the infectious period between the activation of MDR TB and initiation of effective MDR treatment. We developed a dynamic transmission microsimulation model of TB in India. The model followed individuals by age, sex, TB status, drug resistance status, and treatment status and was calibrated to Indian demographic and epidemiologic TB time trends. The main effectiveness measure was the reduction in average MDR TB prevalence over the ten years after control strategy implementation. We find that improving non-MDR cure rates to avoid generating new MDR cases will provide substantial non-MDR TB benefits but will become less effective in reducing MDR TB prevalence over time because more cases will occur from direct transmission - by 2015, the model estimates 42% of new MDR cases are transmission-generated and this proportion continues to rise over time, assuming equal transmissibility of MDR and drug-susceptible TB. Strategies that disrupt MDR transmission by shortening the time between MDR activation and treatment are projected to provide greater reductions in MDR prevalence compared with improving non-MDR treatment quality: implementing MDR diagnostic improvements in 2017 is expected to reduce MDR prevalence by 39%, compared with an 11% reduction from improving non-MDR treatment quality. As transmission-generated MDR TB becomes a larger driver of the MDR TB epidemic in India, rapid and accurate MDR TB diagnosis and treatment will become increasingly effective in reducing MDR TB cases compared to non-MDR TB treatment improvements.
View details for DOI 10.1371/journal.pone.0089822
View details for PubMedID 24608234
View details for PubMedCentralID PMC3946521
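A stylized compartmental sketch of the dynamic highlighted above: new MDR cases arise either from failed treatment of drug-susceptible (DS) TB or from direct transmission of MDR strains, and the transmission-generated share grows as MDR prevalence rises. All parameter values and the compartment structure are illustrative assumptions, not the calibrated India microsimulation.
```python
# Stylized two-strain TB sketch (illustrative parameters, not the calibrated
# India microsimulation): track what share of new MDR cases is generated by
# failed DS-TB treatment versus by direct MDR transmission.
beta = 0.8                                  # effective contact rate per year (assumed equal for both strains)
acquire_mdr = 0.05                          # annual rate of acquiring MDR during DS-TB treatment (assumed)
cure_rate_ds, cure_rate_mdr = 0.70, 0.50    # annual effective cure rates (assumed)
dt, years = 0.1, 10
S, I_ds, I_mdr = 0.9895, 0.010, 0.0005      # population fractions

for year in range(1, years + 1):
    treatment_gen = transmission_gen = 0.0
    for _ in range(int(1 / dt)):
        inf_ds = beta * S * I_ds * dt        # new DS infections
        inf_mdr = beta * S * I_mdr * dt      # new MDR infections (transmission-generated)
        acquired = acquire_mdr * I_ds * dt   # MDR generated by failed DS treatment
        cures_ds, cures_mdr = cure_rate_ds * I_ds * dt, cure_rate_mdr * I_mdr * dt
        S += cures_ds + cures_mdr - inf_ds - inf_mdr
        I_ds += inf_ds - cures_ds - acquired
        I_mdr += inf_mdr + acquired - cures_mdr
        treatment_gen += acquired
        transmission_gen += inf_mdr
    if year in (1, 5, 10):
        share = transmission_gen / (treatment_gen + transmission_gen)
        print(f"year {year:2d}: MDR prevalence {I_mdr:.2%}, "
              f"transmission-generated share of new MDR cases {share:.0%}")
```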
-
Extended Abstract: Combining Statistical Analysis and Markov Models with Public Health Data to Infer Age-Specific Background Mortality Rates for Hepatitis C Infection in the US
International Conference for Smart Health (ICSH)
SPRINGER-VERLAG BERLIN. 2014: 148–149
View details for Web of Science ID 000348361400015
-
Tuberculosis treatment discontinuation and symptom persistence: an observational study of Bihar, India's public care system covering >100,000,000 inhabitants.
BMC public health
2014; 14: 418-?
Abstract
The effectiveness of India's TB control programs depends critically on patients completing appropriate treatment. Discontinuing treatment prior to completion can leave patients infectious and symptomatic. Developing strategies to reduce early discontinuation requires characterizing its patterns and their link to symptom persistence. The 2011 BEST-TB survey (360 clusters, 11 districts) sampled patients (n = 1007) from Bihar's public healthcare system who had initiated treatment >6 months prior to being interviewed, administering questionnaires to patients about TB treatment duration and symptoms, prior treatment, and sociodemographic characteristics. Multivariate logistic regression models estimated the risk of treatment discontinuation for these characteristics. Similar models estimated probabilities of symptom persistence to 25 weeks post-treatment initiation adjusting for the same predictors and treatment duration. All models included district fixed effects, robust standard errors, and adjustments for the survey sampling design. Treatment default timing and symptom persistence relied solely on self-report. 24% of patients discontinued treatment prior to 25 weeks. Higher likelihood of discontinuation occurred in those who had failed to complete previous TB treatment episodes (aOR: 4.77 [95% CI: 1.98-11.53]) and those seeing multiple providers (3.67 per provider [1.94-6.95]). Symptoms persisted in 42% of patients discontinuing treatment within 5 weeks versus 28% of those completing 25 weeks of treatment. Symptom persistence was more likely for those with prior TB treatment (aOR: 5.05 [1.90-13.38]); poorer patients (2.94 [1.51-5.72]); and women (1.79 [1.07-2.99]). Predictors for treatment discontinuation prior to 16 weeks were similar. Premature TB treatment discontinuation and symptom persistence are particularly high among individuals who have failed to complete treatment for a prior episode. Strategies to identify and promote treatment completion in this group appear promising. Likewise, effective TB regimens of shortened duration currently in trials may eventually help to achieve higher treatment completion rates.
View details for DOI 10.1186/1471-2458-14-418
View details for PubMedID 24886314
View details for PubMedCentralID PMC4041057
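A small sketch of the kind of multivariable logistic model used above, reporting adjusted odds ratios with 95% confidence intervals on simulated placeholder data; district fixed effects and survey-design adjustments are omitted for brevity.
```python
# Sketch of a multivariable logistic regression reporting adjusted odds ratios
# (simulated placeholder data; survey-design adjustments omitted for brevity).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1007
df = pd.DataFrame({
    "prior_incomplete_treatment": rng.binomial(1, 0.15, n),
    "providers_seen": rng.poisson(1.5, n) + 1,
    "female": rng.binomial(1, 0.4, n),
})
# Assumed data-generating process for the binary discontinuation outcome.
logit = (-1.6 + 1.5 * df["prior_incomplete_treatment"]
         + 0.6 * (df["providers_seen"] - 1) - 0.1 * df["female"])
df["discontinued"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("discontinued ~ prior_incomplete_treatment + providers_seen + female",
                data=df).fit(disp=False)
odds_ratios = pd.DataFrame({"aOR": np.exp(fit.params),
                            "2.5%": np.exp(fit.conf_int()[0]),
                            "97.5%": np.exp(fit.conf_int()[1])})
print(odds_ratios.round(2))
```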
-
Cost-Effectiveness of Same-Day Discharge After Elective Percutaneous Coronary intervention
LIPPINCOTT WILLIAMS & WILKINS. 2013
View details for Web of Science ID 000332162900413
-
ARGENTINA'S GENERIC DRUG LAW: WAS IT SUCCESSFUL?
ELSEVIER SCIENCE INC. 2013: A672
View details for DOI 10.1016/j.jval.2013.08.1957
View details for Web of Science ID 000326247603043
-
Cost-effectiveness of helicopter versus ground emergency medical services for trauma scene transport in the United States.
Annals of emergency medicine
2013; 62 (4): 351-364 e19
Abstract
STUDY OBJECTIVE: We determine the minimum mortality reduction that helicopter emergency medical services (EMS) should provide relative to ground EMS for the scene transport of trauma victims to offset higher costs, inherent transport risks, and inevitable overtriage of patients with minor injury. METHODS: We developed a decision-analytic model to compare the costs and outcomes of helicopter versus ground EMS transport to a trauma center from a societal perspective during a patient's lifetime. We determined the mortality reduction needed to make helicopter transport cost less than $100,000 and $50,000 per quality-adjusted life-year gained compared with ground EMS. Model inputs were derived from the National Study on the Costs and Outcomes of Trauma, National Trauma Data Bank, Medicare reimbursements, and literature. We assessed robustness with probabilistic sensitivity analyses. RESULTS: Helicopter EMS must provide a minimum of a 17% relative risk reduction in mortality (1.6 lives saved/100 patients with the mean characteristics of the National Study on the Costs and Outcomes of Trauma cohort) to cost less than $100,000 per quality-adjusted life-year gained and a reduction of at least 33% (3.7 lives saved/100 patients) to cost less than $50,000 per quality-adjusted life-year. Helicopter EMS becomes more cost-effective with significant reductions in patients with minor injury who are triaged to air transport or if long-term disability outcomes are improved. CONCLUSION: Helicopter EMS needs to provide at least a 17% mortality reduction or a measurable improvement in long-term disability to compare favorably with other interventions considered cost-effective. Given current evidence, it is not clear that helicopter EMS achieves this mortality or disability reduction. Reducing overtriage of patients with minor injury to helicopter EMS would improve its cost-effectiveness.
View details for DOI 10.1016/j.annemergmed.2013.02.025
View details for PubMedID 23582619
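The threshold logic in the abstract above can be written down directly: find the smallest relative mortality reduction at which the incremental cost per QALY gained falls below a willingness-to-pay ceiling. The incremental cost, baseline mortality, and QALYs-per-survivor values below are round placeholders, not the study's inputs, so the resulting thresholds differ from the published 17% and 33% figures.
```python
# Threshold sketch with placeholder inputs (not the study's actual estimates):
# smallest relative mortality reduction for helicopter vs ground EMS such that
# the incremental cost per QALY gained stays under a willingness-to-pay ceiling.
def min_relative_risk_reduction(incremental_cost, baseline_mortality,
                                qalys_per_survivor, wtp_per_qaly):
    # QALYs gained per patient = baseline_mortality * RRR * qalys_per_survivor,
    # so the ICER falls below WTP once RRR exceeds the ratio below.
    return incremental_cost / (wtp_per_qaly * baseline_mortality * qalys_per_survivor)

for wtp in (100_000, 50_000):
    rrr = min_relative_risk_reduction(incremental_cost=6_000,   # assumed extra cost per transport
                                      baseline_mortality=0.09,  # assumed scene-transport mortality
                                      qalys_per_survivor=12.0,  # assumed discounted QALYs per life saved
                                      wtp_per_qaly=wtp)
    print(f"WTP ${wtp:,}/QALY -> minimum relative mortality reduction of about {rrr:.0%}")
```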
-
A new cost-effectiveness microsimulation model for glatiramer acetate and dimethyl fumarate
SAGE PUBLICATIONS LTD. 2013: 361–62
View details for Web of Science ID 000328751402247
-
Cost-effectiveness of preoperative imaging for appendicitis after indeterminate ultrasonography in the second or third trimester of pregnancy.
Obstetrics and gynecology
2013; 122 (4): 821-829
Abstract
To assess the cost-effectiveness of diagnostic laparoscopy, computed tomography (CT), and magnetic resonance imaging (MRI) after indeterminate ultrasonography in pregnant women with suspected appendicitis.A decision-analytic model was developed to simulate appendicitis during pregnancy taking into consideration the health outcomes for both the pregnant women and developing fetuses. Strategies included diagnostic laparoscopy, CT, and MRI. Outcomes included positive appendectomy, negative appendectomy, maternal perioperative complications, preterm delivery, fetal loss, childhood cancer, lifetime costs, discounted life expectancy, and incremental cost-effectiveness ratios.Magnetic resonance imaging is the most cost-effective strategy, costing $6,767 per quality-adjusted life-year gained relative to CT, well below the generally accepted $50,000 per quality-adjusted life-year threshold. In a setting where MRI is unavailable, CT is cost-effective even when considering the increased risk of radiation-associated childhood cancer ($560 per quality-adjusted life-year gained relative to diagnostic laparoscopy). Unless the negative appendectomy rate is less than 1%, imaging of any type is more cost-effective than proceeding directly to diagnostic laparoscopy.Depending on imaging costs and resource availability, both CT and MRI are potentially cost-effective. The risk of radiation-associated childhood cancer from CT has little effect on population-level outcomes or cost-effectiveness but is a concern for individual patients. For pregnant women with suspected appendicitis, an extremely high level of clinical diagnostic certainty must be reached before proceeding to operation without preoperative imaging.
View details for DOI 10.1097/AOG.0b013e3182a4a085
View details for PubMedID 24084540
-
Cost-effectiveness of tolvaptan in autosomal dominant polycystic kidney disease.
Annals of internal medicine
2013; 159 (6): 382-389
Abstract
In the TEMPO (Tolvaptan Efficacy and Safety in Management of Autosomal Dominant Polycystic Kidney Disease and Its Outcomes) trial, tolvaptan significantly reduced expansion of kidney volume and loss of kidney function.To determine how the benefits of tolvaptan seen in TEMPO may relate to longer-term health outcomes, such as progression to end-stage renal disease (ESRD) and death, and cost-effectiveness.A decision-analytic model.Published literature from 1993 to 2012.Persons with early autosomal dominant polycystic kidney disease.Lifetime.Societal.Patients received tolvaptan therapy until death, development of ESRD, or liver complications or no tolvaptan therapy.Median age at ESRD onset, life expectancy, discounted quality-adjusted life-years and lifetime costs (in 2010 U.S. dollars), and incremental cost-effectiveness ratios.Tolvaptan prolonged the median age at ESRD onset by 6.5 years and increased life expectancy by 2.6 years. At $5760 per month, tolvaptan cost $744 100 per quality-adjusted life-year gained compared with standard care.For patients with autosomal dominant polycystic kidney disease that progressed more slowly, the cost per quality-adjusted life-year gained was even greater for tolvaptan.Although TEMPO followed patients for 3 years, the main analysis assumed that clinical benefits persisted over patients' lifetimes.Assuming that the benefits of tolvaptan persist in the longer term, the drug may slow progression to ESRD and reduce mortality rates. However, barring an approximately 95% reduction in price, cost-effectiveness does not compare favorably with many other commonly accepted medical interventions.National Institutes of Health and Agency for Healthcare Research and Quality.
View details for DOI 10.7326/0003-4819-159-6-201309170-00004
View details for PubMedID 24042366
-
Prioritizing guideline-recommended interventions.
Annals of internal medicine
2013; 159 (3): 223-224
View details for DOI 10.7326/0003-4819-159-3-201308060-00014
View details for PubMedID 23922066
-
COMPARATIVE EFFECTIVENESS AND HEALTH ECONOMIC EVALUATION OF SYSTEMIC ANTI-INFLAMMATORY THERAPIES FOR ACUTE GOUT FLARES
BMJ PUBLISHING GROUP. 2013: 459–60
View details for DOI 10.1136/annrheumdis-2012-eular.2887
View details for Web of Science ID 000208898502321
-
THE IMPACT OF GENERIC DRUG POLICY ON DRUG PRICING
ELSEVIER SCIENCE INC. 2013: A11–A11
View details for Web of Science ID 000318916400053
-
Analyzing Screening Policies for Childhood Obesity
MANAGEMENT SCIENCE
2013; 59 (4): 782-795
Abstract
Due to the health and economic costs of childhood obesity, coupled with studies suggesting the benefits of comprehensive (dietary, physical activity and behavioral counseling) intervention, the United States Preventive Services Task Force recently recommended childhood screening and intervention for obesity beginning at age six. Using a longitudinal data set consisting of the body mass index of 3164 children up to age 18 and another longitudinal data set containing the body mass index at ages 18 and 40 and the presence or absence of disease (hypertension and diabetes) at age 40 for 747 people, we formulate and numerically solve - separately for boys and girls - a dynamic programming problem for the optimal biennial (i.e., at ages 2, 4, …, 16) obesity screening thresholds. Unlike most screening problem formulations, we take a societal viewpoint, where the state of the system at each age is the population-wide probability density function of the body mass index. Compared to the biennial version of the task force's recommendation, the screening thresholds derived from the dynamic program achieve a relative reduction in disease prevalence of 3% at the same screening (and treatment) cost, or - due to the flatness of the disease vs. screening tradeoff curve - achieves the same disease prevalence at a 28% relative reduction in cost. Compared to the task force's policy, which uses the 95th percentile of body mass index (from cross-sectional growth charts tabulated by the Centers for Disease Control and Prevention) as the screening threshold for each age, the dynamic programming policy treats mostly 16 year olds (including many who are not obese) and very few males under 14 years old. While our results suggest that adult hypertension and diabetes are minimized by focusing childhood obesity screening and treatment on older adolescents, the shortcomings in the available data and the narrowness of the medical outcomes considered prevent us from making a recommendation about childhood obesity screening policies.
View details for DOI 10.1287/mnsc.1120.1587
View details for Web of Science ID 000317196900002
View details for PubMedCentralID PMC3744381
View details for PubMedID 23956465
-
Cost-Effectiveness of Statins for Primary Cardiovascular Prevention in Chronic Kidney Disease
JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY
2013; 61 (12): 1250-1258
Abstract
The authors sought to evaluate the cost-effectiveness of statins for primary prevention of myocardial infarction (MI) and stroke in patients with chronic kidney disease (CKD).Patients with CKD have an elevated risk of MI and stroke. Although HMG Co-A reductase inhibitors (“statins”) may prevent cardiovascular events in patients with non–dialysis-requiring CKD, adverse drug effects and competing risks could materially influence net effects and clinical decision-making.We developed a decision-analytic model of CKD and cardiovascular disease (CVD) to determine the cost-effectiveness of low-cost generic statins for primary CVD prevention in men and women with hypertension and mild-to-moderate CKD. Outcomes included MI and stroke rates, discounted quality-adjusted life years (QALYs) and lifetime costs (2010 USD), and incremental cost-effectiveness ratios.For 65-year-old men with moderate hypertension and mild-to-moderate CKD, statins reduced the combined rate of MI and stroke, yielded 0.10 QALYs, and increased costs by $1,800 ($18,000 per QALY gained). For patients with lower baseline cardiovascular risks, health and economic benefits were smaller; for 65-year-old women, statins yielded 0.06 QALYs and increased costs by $1,900 ($33,400 per QALY gained). Results were sensitive to rates of rhabdomyolysis and drug costs. Statins are less cost-effective when obtained at average retail prices, particularly in patients at lower CVD risk.Although statins reduce absolute CVD risk in patients with CKD, the increased risk of rhabdomyolysis, and competing risks associated with progressive CKD, partly offset these gains. Low-cost generic statins appear cost-effective for primary prevention of CVD in patients with mild-to-moderate CKD and hypertension.
View details for DOI 10.1016/j.jacc.2012.12.034
View details for PubMedID 23500327
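A compact sketch of the kind of cohort-level Markov calculation behind results like these: annual cycles over three health states, discounted costs and QALYs, and an ICER for treatment versus no treatment. The transition probabilities, costs, and utilities are invented placeholders, not the published model inputs.
```python
# Minimal three-state Markov cohort sketch (invented inputs, not the published
# model): discounted costs and QALYs for statin vs no-statin, and the ICER.
import numpy as np

def run_markov(p_event, years=35, discount=0.03, drug_cost=0.0):
    # States: 0 = event-free CKD, 1 = post-MI/stroke, 2 = dead.
    P = np.array([[1 - p_event - 0.02, p_event, 0.02],   # assumed annual transitions
                  [0.00,               0.90,    0.10],
                  [0.00,               0.00,    1.00]])
    utility = np.array([0.85, 0.70, 0.0])                 # assumed state utilities
    cost = np.array([2_000.0 + drug_cost, 9_000.0, 0.0])  # assumed annual state costs
    state = np.array([1.0, 0.0, 0.0])
    total_cost = total_qaly = 0.0
    for t in range(years):
        disc = 1.0 / (1.0 + discount) ** t
        total_cost += disc * state @ cost
        total_qaly += disc * state @ utility
        state = state @ P
    return total_cost, total_qaly

cost_no, qaly_no = run_markov(p_event=0.030)
cost_tx, qaly_tx = run_markov(p_event=0.022, drug_cost=150.0)   # assumed risk reduction and generic price
icer = (cost_tx - cost_no) / (qaly_tx - qaly_no)
print(f"ICER of about ${icer:,.0f} per QALY gained")
```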
-
Cost-Effectiveness Analysis of Risk-Factor Guided and Birth-Cohort Screening for Chronic Hepatitis C Infection in the United States
PLOS ONE
2013; 8 (3)
Abstract
No consensus exists on screening to detect the estimated 2 million Americans unaware of their chronic hepatitis C infections. Advisory groups differ, recommending birth-cohort screening for baby boomers, screening only high-risk individuals, or no screening. We assessed one-time risk assessment and screening to identify previously undiagnosed 40-74 year-olds given newly available hepatitis C treatments. A Markov model evaluated alternative risk-factor-guided and birth-cohort screening and treatment strategies. Risk factors included drug use history, blood transfusion before 1992, and multiple sexual partners. Analyses of the National Health and Nutrition Examination Survey provided sex-, race-, age-, and risk-factor-specific hepatitis C prevalence and mortality rates. Nine strategies combined screening (no screening, risk-factor-guided screening, or birth-cohort screening) and treatment (standard therapy with peginterferon alfa and ribavirin; Interleukin-28B-guided [IL28B] triple therapy, i.e., standard therapy plus a protease inhibitor; or universal triple therapy). Response-guided treatment depended on HCV genotype. Outcomes included discounted lifetime costs (2010 dollars) and quality-adjusted life-years (QALYs). Compared to no screening, risk-factor-guided and birth-cohort screening for 50 year-olds gained 0.7 to 3.5 quality-adjusted life-days and cost $168 to $568 per person. Birth-cohort screening provided more benefit per dollar than risk-factor-guided screening and cost $65,749 per QALY if followed by universal triple therapy compared to screening followed by IL28B-guided triple therapy. If only 10% of screen-detected, eligible patients initiate treatment at each opportunity, birth-cohort screening with universal triple therapy costs $241,100 per QALY. Assuming treatment with triple therapy, screening all individuals aged 40-64 years costs less than $100,000 per QALY. The cost-effectiveness of one-time birth-cohort hepatitis C screening for 40-64 year-olds is comparable to other screening programs, provided that the healthcare system has sufficient capacity to deliver prompt treatment and appropriate follow-on care to many newly screen-detected individuals.
View details for DOI 10.1371/journal.pone.0058975
View details for Web of Science ID 000316549400032
View details for PubMedID 23533595
View details for PubMedCentralID PMC3606430
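When several screening-and-treatment strategies are compared, as above, ICERs are reported along the efficient frontier after removing strongly and extended-dominated options. The sketch below implements that bookkeeping on made-up (cost, QALY) pairs; the strategy names and numbers are illustrative, not the study's results.
```python
# Efficient-frontier sketch on made-up (cost, QALY) pairs (not the study's
# results): drop strongly and extended-dominated strategies, then report ICERs.
def efficient_frontier(strategies):
    """strategies: list of (name, cost, qalys). Returns frontier entries with ICERs."""
    ordered = sorted(strategies, key=lambda s: (s[1], -s[2]))   # by cost, then QALYs
    frontier = []
    for s in ordered:
        # Strong dominance: costlier and no more effective than a kept strategy.
        if frontier and s[2] <= frontier[-1][2]:
            continue
        frontier.append(s)
        # Extended dominance: drop interior points whose ICER exceeds the next one.
        while len(frontier) >= 3:
            (n0, c0, q0), (n1, c1, q1), (n2, c2, q2) = frontier[-3:]
            if (c1 - c0) / (q1 - q0) > (c2 - c1) / (q2 - q1):
                del frontier[-2]
            else:
                break
    results = [(frontier[0][0], None)]
    for (na, ca, qa), (nb, cb, qb) in zip(frontier, frontier[1:]):
        results.append((nb, (cb - ca) / (qb - qa)))
    return results

strategies = [("No screening",          1_000, 10.00),
              ("Risk-factor + IL28B",   1_400, 10.01),
              ("Birth-cohort + IL28B",  1_600, 10.02),
              ("Birth-cohort + triple", 2_300, 10.03)]
for name, icer in efficient_frontier(strategies):
    print(name, "-" if icer is None else f"${icer:,.0f}/QALY")
```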
-
The Utility of Childhood and Adolescent Obesity Assessment in Relation to Adult Health
MEDICAL DECISION MAKING
2013; 33 (2): 163-175
Abstract
High childhood obesity prevalence has raised concerns about future adult health, generating calls for obesity screening of young children.To estimate how well childhood obesity predicts adult obesity and to forecast obesity-related health of future US adults.Longitudinal statistical analyses; microsimulations combining multiple data sets.National Longitudinal Survey of Youth, Population Study of Income Dynamics, and National Health and Nutrition Evaluation Surveys.The authors estimated test characteristics and predictive values of childhood body mass index to identify 2-, 5-, 10-, and 15 year-olds who will become obese adults. The authors constructed models relating childhood body mass index to obesity-related diseases through middle age stratified by sex and race.Twelve percent of 18-year-olds were obese. While screening at age 5 would miss 50% of those who become obese adults, screening at age 15 would miss 9%. The predictive value of obesity screening below age 10 was low even when maternal obesity was included as a predictor. Obesity at age 5 was a substantially worse predictor of health in middle age than was obesity at age 15. For example, the relative risk of developing diabetes as adults for obese white male 15-year-olds was 4.5 versus otherwise similar nonobese 15-year-olds. For obese 5-year-olds, the relative risk was 1.6.Main results do not include Hispanics due to sample size. Past relationships between childhood and adult obesity and health may change in the future.Early childhood obesity assessment adds limited information to later childhood assessment. Targeted later childhood approaches or universal strategies to prevent unhealthy weight gain should be considered.
View details for DOI 10.1177/0272989X12447240
View details for Web of Science ID 000316684200006
View details for PubMedID 22647830
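The predictive-value point in the abstract above reduces to 2x2 screening arithmetic: with modest sensitivity and roughly 12% adult obesity prevalence, early-childhood screening misses many future obese adults and flags many who will not become obese. The counts below are invented for illustration.
```python
# 2x2 screening arithmetic with invented counts (for illustration only):
# how sensitivity, specificity, and prevalence drive predictive values.
def screening_characteristics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical cohort of 1,000 children screened at age 5 for future adult
# obesity (12% become obese adults), with screening that catches half of them.
sens, spec, ppv, npv = screening_characteristics(tp=60, fp=70, fn=60, tn=810)
print(f"sensitivity={sens:.0%}  specificity={spec:.0%}  PPV={ppv:.0%}  NPV={npv:.0%}")
```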
-
Performance of serum biomarkers for the early detection of invasive aspergillosis in febrile, neutropenic patients: a multi-state model.
PloS one
2013; 8 (6)
Abstract
The performance of serum biomarkers for the early detection of invasive aspergillosis expectedly depends on the timing of test results relative to the empirical administration of antifungal therapy during neutropenia, although a dynamic evaluation framework is lacking.We developed a multi-state model describing simultaneously the likelihood of empirical antifungal therapy and the risk of invasive aspergillosis during neutropenia. We evaluated whether the first positive test result with a biomarker is an independent predictor of invasive aspergillosis when both diagnostic information used to treat and risk factors of developing invasive aspergillosis are taken into account over time. We applied the multi-state model to a homogeneous cohort of 185 high-risk patients with acute myeloid leukemia. Patients were prospectively screened for galactomannan antigenemia twice a week for immediate treatment decision; 2,214 serum samples were collected on the same days and blindly assessed for (1→3)-β-D-glucan antigenemia and a quantitative PCR assay targeting a mitochondrial locus.The usual evaluation framework of biomarker performance was unable to distinguish clinical benefits of β-glucan or PCR assays. The multi-state model evidenced that the risk of invasive aspergillosis is a complex time function of neutropenia duration and risk management. The quantitative PCR assay accelerated the early detection of invasive aspergillosis (P = .010), independently of other diagnostic information used to treat, while the β-glucan assay did not (P = .53).The performance of serum biomarkers for the early detection of invasive aspergillosis is better apprehended by the evaluation of time-varying predictors in a multi-state model. Our results provide strong rationale for prospective studies testing a preemptive antifungal therapy, guided by clinical, radiological, and bi-weekly blood screening with galactomannan antigenemia and a standardized quantitative PCR assay.
View details for DOI 10.1371/journal.pone.0065776
View details for PubMedID 23799048
View details for PubMedCentralID PMC3683047
-
Palm oil taxes and cardiovascular disease mortality in India: economic-epidemiologic model.
BMJ (Clinical research ed.)
2013; 347: f6048-?
Abstract
To examine the potential effect of a tax on palm oil on hyperlipidemia and on mortality due to cardiovascular disease in India.Economic-epidemiologic model.A microsimulation model of mortality due to myocardial infarction and stroke among Indian populations was constructed, incorporating nationally representative data on systolic blood pressure, total cholesterol, tobacco smoking, diabetes, and cardiovascular event history, and stratified by age, sex, and urban/rural residence. Household expenditure data were used to estimate the change in consumption of palm oil following changes in oil price and the potential substitution of alternative oils that might occur after imposition of a tax. A 20% excise tax on palm oil purchases was simulated over the period 2014-23.The model was used to project future mortality due to myocardial infarction and stroke, as well as the potential effect of a tax on food insecurity, accounting for the effect of increased food prices.A 20% tax on palm oil purchases would be expected to avert approximately 363 000 (95% confidence interval 247 000 to 479 000) deaths from myocardial infarctions and strokes over the period 2014-23 in India (1.3% reduction in cardiovascular deaths) if people do not substitute other oils for reduced palm oil consumption. Given estimates of substitution of palm oil with other oils following a 20% price increase for palm oil, the beneficial effects of increased polyunsaturated fat consumption would be expected to enhance the projected reduction in deaths to as much as 421 000 (256 000 to 586 000). The tax would be expected to benefit men more than women and urban populations more than rural populations, given differential consumption and cardiovascular risk. In a scenario incorporating the effect of taxation on overall food expenditures, the tax may increase food insecurity by <1%, resulting in 16 000 (95% confidence interval 12 000 to 22 000) deaths.Curtailing palm oil intake through taxation may modestly reduce hyperlipidemia and cardiovascular mortality, but with potential distributional consequences differentially benefiting male and urban populations, as well as affecting food security.
View details for DOI 10.1136/bmj.f6048
View details for PubMedID 24149818
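The core arithmetic linking a price change to consumption and then to averted deaths can be sketched with an own-price elasticity. The elasticity, baseline deaths, and deaths-averted gradient below are placeholders chosen only to show the calculation, not the model's estimates.
```python
# Back-of-the-envelope sketch of the tax -> consumption -> deaths chain,
# using placeholder inputs (not the model's estimates).
def consumption_change(price_increase, own_price_elasticity):
    """Proportional change in consumption for a proportional price increase."""
    return own_price_elasticity * price_increase

price_increase = 0.20            # 20% excise tax assumed fully passed through to prices
elasticity = -0.6                # assumed own-price elasticity of palm oil demand
delta = consumption_change(price_increase, elasticity)   # proportional change in consumption

baseline_cvd_deaths = 28_000_000          # assumed 10-year CVD deaths (placeholder)
deaths_averted_per_pct_drop = 30_000      # assumed model-derived gradient (placeholder)
deaths_averted = -delta * 100 * deaths_averted_per_pct_drop
print(f"consumption change: {delta:.0%}; deaths averted over 10 y: {deaths_averted:,.0f} "
      f"({deaths_averted / baseline_cvd_deaths:.1%} of CVD deaths)")
```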
-
Analyzing Screening Policies for Childhood Obesity
Management Science
2013; 59 (April): 782-795
-
Mental Health Services Use by Children Investigated by Child Welfare Agencies
PEDIATRICS
2012; 130 (5): 861-869
Abstract
To examine the rates and predictors of mental health services use for a nationally representative cohort of youths who had been investigated for alleged maltreatment. Data came from caregiver and caseworker baseline and 18-month interviews in the second National Survey of Child and Adolescent Well-being. These interviews took place from March 2008 to September 2008 and September 2010 to March 2011. Data on family and child characteristics and service use were gathered and examined by using weighted univariate and multivariate analyses. Children had numerous challenges: 61.8% had a previous report of maltreatment, 46.3% had poor socialization skills, and 23.9% had a mental health problem measured by the Child Behavior Checklist (CBCL). At baseline, 33.3% received some mental health service and this varied by age, with younger children receiving fewer services. This percentage decreased to 30.9% at the 18-month follow-up, although the youngest children had increases in services use. For younger children, race/ethnicity, out-of-home placement, chronic physical health problems, low adaptive behaviors, and CBCL scores in the clinical range were related to use. For children ≥ 11, out-of-home placement, high CBCL scores, and family risk factors predicted services use at 18 months. Mental health services utilization increases as young children come into contact with schools and medical providers or have more intensive involvement with child welfare. Minority children receive fewer services after adjusting for need. Over the 18-month follow-up, there was a decrease in service use that may be a result of the tremendous financial challenges taking place in the United States.
View details for DOI 10.1542/peds.2012-1330
View details for Web of Science ID 000310505900055
View details for PubMedID 23045565
View details for PubMedCentralID PMC3483894
-
Screening and Rapid Molecular Diagnosis of Tuberculosis in Prisons in Russia and Eastern Europe: A Cost-Effectiveness Analysis
PLOS MEDICINE
2012; 9 (11)
Abstract
Prisons of the former Soviet Union (FSU) have high rates of multidrug-resistant tuberculosis (MDR-TB) and are thought to drive general population tuberculosis (TB) epidemics. Effective prison case detection, though employing more expensive technologies, may reduce long-term treatment costs and slow MDR-TB transmission. We developed a dynamic transmission model of TB and drug resistance matched to the epidemiology and costs in FSU prisons. We evaluated eight strategies for TB screening and diagnosis involving, alone or in combination, self-referral, symptom screening, mass miniature radiography (MMR), and sputum PCR with probes for rifampin resistance (Xpert MTB/RIF). Over a 10-y horizon, we projected costs, quality-adjusted life years (QALYs), and TB and MDR-TB prevalence. Using sputum PCR as an annual primary screening tool among the general prison population most effectively reduced overall TB prevalence (from 2.78% to 2.31%) and MDR-TB prevalence (from 0.74% to 0.63%), and cost US$543/QALY for additional QALYs gained compared to MMR screening with sputum PCR reserved for rapid detection of MDR-TB. Adding sputum PCR to the currently used strategy of annual MMR screening was cost-saving over 10 y compared to MMR screening alone, but produced only a modest reduction in MDR-TB prevalence (from 0.74% to 0.69%) and had minimal effect on overall TB prevalence (from 2.78% to 2.74%). Strategies based on symptom screening alone were less effective and more expensive than MMR-based strategies. Study limitations included scarce primary TB time-series data in FSU prisons and uncertainties regarding screening test characteristics. In prisons of the FSU, annual screening of the general inmate population with sputum PCR most effectively reduces TB and MDR-TB prevalence, doing so cost-effectively. If this approach is not feasible, the current strategy of annual MMR is both more effective and less expensive than strategies using self-referral or symptom screening alone, and the addition of sputum PCR for rapid MDR-TB detection may be cost-saving over time.
View details for DOI 10.1371/journal.pmed.1001348
View details for PubMedID 23209384
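As a rough illustration of how competing screening strategies of this kind are compared, the sketch below ranks hypothetical (cost, QALY) pairs, drops dominated options, and reports incremental cost-effectiveness ratios along the efficient frontier; the numbers are placeholders and are not taken from the study.

```python
# Hypothetical (total cost, total QALYs) per strategy; illustrative only.
strategies = {
    "self-referral": (1_000_000, 10_000),
    "symptom screening": (1_400_000, 10_050),
    "annual MMR": (1_200_000, 10_300),
    "annual sputum PCR": (1_500_000, 10_400),
}

def icers(strats):
    """Sort by cost, remove dominated strategies, and compute incremental ratios."""
    frontier = sorted(strats.items(), key=lambda kv: kv[1][0])
    # Strong dominance: drop any strategy that costs more but yields no more QALYs.
    frontier = [s for i, s in enumerate(frontier)
                if all(s[1][1] > other[1][1] for other in frontier[:i])]
    changed = True
    while changed:  # Extended dominance: drop options with a higher ICER than the next step up.
        changed = False
        for i in range(1, len(frontier) - 1):
            (c0, q0), (c1, q1), (c2, q2) = (frontier[i - 1][1], frontier[i][1], frontier[i + 1][1])
            if (c1 - c0) / (q1 - q0) > (c2 - c1) / (q2 - q1):
                del frontier[i]
                changed = True
                break
    return [(name_b, (c_b - c_a) / (q_b - q_a))
            for (name_a, (c_a, q_a)), (name_b, (c_b, q_b)) in zip(frontier, frontier[1:])]

for name, icer in icers(strategies):
    print(f"{name}: ${icer:,.0f} per QALY gained versus the next non-dominated strategy")
```

Reporting ratios only along the non-dominated frontier is the standard convention, which is why dominated strategies are filtered out before any ICER is computed.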
-
Evaluating Child Welfare Policies with Decision-Analytic Simulation Models
ADMINISTRATION AND POLICY IN MENTAL HEALTH AND MENTAL HEALTH SERVICES RESEARCH
2012; 39 (6): 466-477
Abstract
The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP (a foster parenting intervention) and of NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations.
View details for DOI 10.1007/s10488-011-0370-z
View details for Web of Science ID 000309862900006
View details for PubMedID 21861204
View details for PubMedCentralID PMC3589566
-
Cost-Effectiveness of Systemic Therapies for Acute Gouty Arthritis.
WILEY-BLACKWELL. 2012: S1036–S1037
View details for Web of Science ID 000309748305381
-
Assessing Screening Policies for Childhood Obesity
OBESITY
2012; 20 (7): 1437-1443
Abstract
To address growing concerns over childhood obesity, the United States Preventive Services Task Force (USPSTF) recently recommended that children undergo obesity screening beginning at age 6. An Expert Committee recommends starting at age 2. Analysis is needed to assess these recommendations and investigate whether there are better alternatives. We model the age- and sex-specific population-wide distribution of BMI through age 18 using National Longitudinal Survey of Youth (NLSY) data. The impact of treatment on BMI is estimated using the targeted systematic review performed to aid the USPSTF. The prevalence of hypertension and diabetes at age 40 is estimated from the Panel Study of Income Dynamics (PSID). We fix the screening interval at 2 years, and derive the age- and sex-dependent BMI thresholds that minimize adult disease prevalence, subject to referring a specified percentage of children for treatment yearly. We compare this optimal biennial policy to biennial versions of the USPSTF and Expert Committee recommendations. Compared to the USPSTF recommendation, the optimal policy reduces adult disease prevalence by 3% in relative terms (the absolute reductions are <1%) at the same treatment referral rate, or achieves the same disease prevalence at a 28% reduction in treatment referral rate. Compared to the Expert Committee recommendation, the reductions are 6% and 40%, respectively. The optimal policy treats mostly 16-year-olds and few children under age 14. Our results suggest that adult disease is minimized by focusing childhood obesity screening and treatment on older adolescents.
View details for DOI 10.1038/oby.2011.373
View details for Web of Science ID 000305840200015
View details for PubMedID 22240724
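The constrained optimization described above (referral thresholds chosen to minimize predicted adult disease prevalence subject to a cap on the yearly referral rate) can be illustrated with a simple grid search over a synthetic cohort. The risk curve, treatment effect, and referral cap below are made-up assumptions, not estimates from NLSY or PSID.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic cohort: BMI percentile at the screening age and a toy adult-disease risk curve.
n = 50_000
bmi_pct = rng.uniform(0, 100, size=n)
p_disease = 0.02 + 0.10 * (bmi_pct / 100) ** 3   # hypothetical risk of adult disease
treatment_effect = 0.25                          # assumed relative risk reduction if referred

def outcomes(threshold):
    """Return (referral rate, predicted adult disease prevalence) for a BMI-percentile cutoff."""
    referred = bmi_pct >= threshold
    prevalence = np.where(referred, p_disease * (1 - treatment_effect), p_disease).mean()
    return referred.mean(), prevalence

# Grid search: lowest predicted prevalence while referring at most 10% of children.
feasible = [outcomes(t) + (t,) for t in np.arange(50, 100, 0.5) if outcomes(t)[0] <= 0.10]
rate, prevalence, threshold = min(feasible, key=lambda x: x[1])
print(f"threshold = {threshold:.1f}th percentile, referral rate = {rate:.1%}, "
      f"predicted adult prevalence = {prevalence:.3%}")
```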
-
Accounting for Biases When Linking Empirical Studies and Simulation Models
MEDICAL DECISION MAKING
2012; 32 (3): 397-399
View details for DOI 10.1177/0272989X12441398
View details for Web of Science ID 000304704300004
View details for PubMedID 22593033
-
New Protease Inhibitors for the Treatment of Chronic Hepatitis C: A Cost-Effectiveness Analysis
ANNALS OF INTERNAL MEDICINE
2012; 156 (4): 279-290
Abstract
Chronic hepatitis C virus is difficult to treat and affects approximately 3 million Americans. Protease inhibitors increase the effectiveness of standard therapy, but they are costly. A genetic assay may identify patients most likely to benefit from this treatment advance. To assess the cost-effectiveness of new protease inhibitors and an interleukin (IL)-28B genotyping assay for treating chronic hepatitis C virus. Decision-analytic Markov model. Published literature and expert opinion. Treatment-naive patients with chronic, genotype 1 hepatitis C virus monoinfection. Lifetime. Societal. Strategies are defined by the use of IL-28B genotyping and type of treatment (standard therapy [pegylated interferon with ribavirin]; triple therapy [standard therapy and a protease inhibitor]). Interleukin-28B-guided triple therapy stratifies patients with CC genotypes to standard therapy and those with non-CC types to triple therapy. Discounted costs (in 2010 U.S. dollars) and quality-adjusted life-years (QALYs); incremental cost-effectiveness ratios. For patients with mild and advanced fibrosis, universal triple therapy reduced the lifetime risk for hepatocellular carcinoma by 38% and 28%, respectively, and increased quality-adjusted life expectancy by 3% and 8%, respectively, compared with standard therapy. Gains from IL-28B-guided triple therapy were smaller. If the protease inhibitor costs $1100 per week, universal triple therapy costs $102,600 per QALY (mild fibrosis) or $51,500 per QALY (advanced fibrosis) compared with IL-28B-guided triple therapy and $70,100 per QALY (mild fibrosis) and $36,300 per QALY (advanced fibrosis) compared with standard therapy. Results were sensitive to the cost of protease inhibitors and treatment adherence rates. Data on the long-term comparative effectiveness of the new protease inhibitors are lacking. Both universal triple therapy and IL-28B-guided triple therapy are cost-effective when the least expensive protease inhibitor is used for patients with advanced fibrosis. Stanford University.
View details for PubMedID 22351713
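A decision-analytic Markov model of this kind advances a cohort through health states in fixed cycles while accumulating discounted costs and QALYs. The sketch below is a generic three-state example with invented transition probabilities, costs, and utilities; it is not the published model or its calibrated inputs.

```python
import numpy as np

# Hypothetical annual transition matrix over [mild fibrosis, advanced fibrosis, dead].
P = np.array([
    [0.95, 0.04, 0.01],
    [0.00, 0.93, 0.07],
    [0.00, 0.00, 1.00],
])
state_costs = np.array([2_000.0, 9_000.0, 0.0])   # annual cost per state (placeholder)
state_utils = np.array([0.85, 0.70, 0.0])         # annual utility per state (placeholder)

def run_markov(P, costs, utils, cycles=50, disc=0.03):
    """Accumulate discounted costs and QALYs for a cohort starting in mild fibrosis."""
    dist = np.array([1.0, 0.0, 0.0])
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        w = 1.0 / (1.0 + disc) ** t      # discount factor for cycle t
        total_cost += w * (dist @ costs)
        total_qaly += w * (dist @ utils)
        dist = dist @ P                  # advance the cohort one annual cycle
    return total_cost, total_qaly

cost, qaly = run_markov(P, state_costs, state_utils)
print(f"Discounted lifetime cost ${cost:,.0f} and {qaly:.2f} QALYs per patient")
```

Running the same cohort through variants of the matrix that represent different treatment strategies, and differencing the discounted totals, yields the incremental ratios reported in analyses like the one above.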
-
Diabetes, Its Treatment, and Catastrophic Medical Spending in 35 Developing Countries
DIABETES CARE
2012; 35 (2): 319-326
Abstract
To assess the individual financial impact of having diabetes in developing countries, whether diabetic individuals possess appropriate medications, and the extent to which health insurance may protect diabetic individuals by increasing medication possession or decreasing the risk of catastrophic spending. Using 2002-2003 World Health Survey data (n = 121,051 individuals; 35 low- and middle-income countries), we examined possession of medications to treat diabetes and estimated the relationship between out-of-pocket medical spending (2005 international dollars), catastrophic medical spending, and diabetes. We assessed whether health insurance modified these relationships. Diabetic individuals experience differentially higher out-of-pocket medical spending, particularly among individuals with high levels of spending (excess spending of $157 per year [95% CI 130-184] at the 95th percentile), and a greater chance of incurring catastrophic medical spending (17.8 vs. 13.9%; difference 3.9% [95% CI 0.2-7.7]) compared with otherwise similar individuals without diabetes. Diabetic individuals with insurance do not have significantly lower risks of catastrophic medical spending (18.6 vs. 17.7%; difference not significant), nor were they significantly more likely to possess diabetes medications (22.8 vs. 20.6%; difference not significant) than those who were otherwise similar but without insurance. These effects were more pronounced and significant in lower-income countries. In low-income countries, despite insurance, diabetic individuals are more likely to experience catastrophic medical spending and often do not possess appropriate medications to treat diabetes. Research into why policies in these countries may not adequately protect people from catastrophic spending or enhance possession of critical medications is urgently needed.
View details for DOI 10.2337/dc11-1770
View details for Web of Science ID 000299856000024
View details for PubMedID 22238276
View details for PubMedCentralID PMC3263916
-
Multi-Country Analysis of Palm Oil Consumption and Cardiovascular Disease Mortality for Countries at Different Stages of Economic Development: 1980-1997
GLOBALIZATION AND HEALTH
2011; 7
Abstract
Cardiovascular diseases represent an increasing share of the global disease burden. There is concern that increased consumption of palm oil could exacerbate mortality from ischemic heart disease (IHD) and stroke, particularly in developing countries where it represents a major nutritional source of saturated fat. The study analyzed country-level data from 1980-1997 derived from the World Health Organization's Mortality Database, U.S. Department of Agriculture international estimates, and the World Bank (234 annual observations; 23 countries). Outcomes included mortality from IHD and stroke for adults aged 50 and older. Predictors included per-capita consumption of palm oil and cigarettes and per-capita Gross Domestic Product as well as time trends and an interaction between palm oil consumption and country economic development level. Analyses examined changes in country-level outcomes over time employing linear panel regressions with country-level fixed effects, population weighting, and robust standard errors clustered by country. Sensitivity analyses included further adjustment for other major dietary sources of saturated fat. In developing countries, for every additional kilogram of palm oil consumed per-capita annually, IHD mortality rates increased by 68 deaths per 100,000 (95% CI [21 to 115]), whereas, in similar settings, stroke mortality rates increased by 19 deaths per 100,000 (95% CI [-12 to 49]) but were not significant. For historically high-income countries, changes in IHD and stroke mortality rates from palm oil consumption were smaller (IHD: 17 deaths per 100,000 (95% CI [5.3 to 29]); stroke: 5.1 deaths per 100,000 (95% CI [-1.2 to 11.0])). Inclusion of other major saturated fat sources including beef, pork, chicken, coconut oil, milk, cheese, and butter did not substantially change the differentially higher relationship between palm oil and IHD mortality in developing countries. Increased palm oil consumption is related to higher IHD mortality rates in developing countries. Palm oil consumption represents a saturated fat source relevant for policies aimed at reducing cardiovascular disease burdens.
View details for DOI 10.1186/1744-8603-7-45
View details for Web of Science ID 000300306700001
View details for PubMedID 22177258
View details for PubMedCentralID PMC3271960
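The estimation strategy described above (a linear panel regression with country fixed effects, population weighting, and standard errors clustered by country) might be written in Python roughly as follows; the file name and column names are hypothetical stand-ins for the assembled panel.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed country-year panel with hypothetical columns: ihd_deaths_per_100k,
# palm_oil_kg_pc, cigarettes_pc, gdp_pc, developing (0/1), year, country, population.
df = pd.read_csv("palm_oil_panel.csv")  # placeholder path

model = smf.wls(
    "ihd_deaths_per_100k ~ palm_oil_kg_pc * developing + cigarettes_pc"
    " + np.log(gdp_pc) + year + C(country)",   # C(country) adds country fixed effects
    data=df,
    weights=df["population"],                  # population weighting
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})  # cluster-robust SEs by country

print(model.params.filter(like="palm_oil"))    # main effect and developing-country interaction
```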
-
Cost Effectiveness of Fibrosis Assessment Prior to Treatment for Chronic Hepatitis C Patients
PLOS ONE
2011; 6 (12)
Abstract
Chronic hepatitis C (HCV) is a liver disease affecting over 3 million Americans. Liver biopsy is the gold standard for assessing liver fibrosis and is used as a benchmark for initiating treatment, though it is expensive and carries risks of complications. FibroTest is a non-invasive biomarker assay for fibrosis, proposed as a screening alternative to biopsy. We assessed the cost-effectiveness of FibroTest and liver biopsy used alone or sequentially for six strategies followed by treatment of eligible U.S. patients: FibroTest only; FibroTest with liver biopsy for ambiguous results; FibroTest followed by biopsy to rule in significant fibrosis; FibroTest followed by biopsy to rule out significant fibrosis; biopsy only (recommended practice); and treatment without screening. We developed a Markov model of chronic HCV that tracks fibrosis progression. Outcomes were expressed as expected lifetime costs (2009 USD), quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICER). Treatment of chronic HCV without fibrosis screening is preferred for both men and women. For genotype 1 patients treated with pegylated interferon and ribavirin, the ICERs are $5,400/QALY (men) and $6,300/QALY (women) compared to FibroTest only; the ICERs increase to $27,200/QALY (men) and $30,000/QALY (women) with the addition of telaprevir. For genotypes 2 and 3, treatment is more effective and less costly than all alternatives. In clinical settings where testing is required prior to treatment, FibroTest only is more effective and less costly than liver biopsy. These results are robust to multi-way and probabilistic sensitivity analyses. Early treatment of chronic HCV is superior to the other fibrosis screening strategies. In clinical settings where testing is required, FibroTest screening is a cost-effective alternative to liver biopsy.
View details for DOI 10.1371/journal.pone.0026783
View details for Web of Science ID 000298171400004
View details for PubMedID 22164204
View details for PubMedCentralID PMC3229483
-
Economic evaluation research in the context of Child Welfare policy: A structured literature review and recommendations
CHILD ABUSE & NEGLECT
2011; 35 (9): 722-740
Abstract
With over 1 million children served by the US Child Welfare system at a cost of $20 billion annually, this study examines the economic evaluation literature on interventions to improve outcomes for children at risk for and currently involved with the system, identifies areas where additional research is needed, and discusses the use of decision-analytic modeling to advance Child Welfare policy and practice. The review included 19 repositories of peer-reviewed and non-peer-reviewed "gray" literatures, including items in English published before November, 2009. Original research articles were included if they evaluated interventions based on costs and outcomes. Review articles were included to assess the relevance of these techniques over time and to highlight the increasing discussion of methods needed to undertake such research. Items were categorized by their focus on: interventions for the US Child Welfare system; primary prevention of entry into the system; and use of models to make long-term projections of costs and outcomes. Searches identified 2,640 articles, with 49 ultimately included (19 reviews and 30 original research articles). Between 1988 and 2009, reviews consistently advocated economic evaluation and increasingly provided methodological guidance. 21 of the original research articles focused on Child Welfare, while 9 focused on child mental health. Of the 21 Child Welfare articles, 81% (17) focused on the US system. 47% (8/17) focused exclusively on primary prevention, though 83% (5/6) of the peer-reviewed US-system articles focused exclusively on prevention. 9 of the 17 articles included empirical follow-up (mean sample size: 264 individuals; mean follow-up: 3.8 years). 10 of the 17 articles used modeling to project longer-term outcomes, but 80% of the articles using modeling were not peer-reviewed. Although 60% of modeling studies included interventions for children in the system, all peer-reviewed modeling articles focused on prevention. Methodological guidance for economic evaluations in Child Welfare is increasingly available. Such analyses are feasible given the availability of nationally representative data on children involved with Child Welfare and evidence-based interventions. Policy analyses considering the long-term costs and effects of interventions to improve Child Welfare outcomes are scarce, feasible, and urgently needed.
View details for DOI 10.1016/j.chiabu.2011.05.012
View details for Web of Science ID 000295946000007
View details for PubMedID 21944552
View details for PubMedCentralID PMC3230248
-
The Business Case for Quality Improvement: Oral Anticoagulation for Atrial Fibrillation
CIRCULATION-CARDIOVASCULAR QUALITY AND OUTCOMES
2011; 4 (4): 416-424
Abstract
The potential to save money within a short time frame provides a more compelling "business case" for quality improvement than merely demonstrating cost-effectiveness. Our objective was to demonstrate the potential for cost savings from improved control in patients anticoagulated for atrial fibrillation. Our population consisted of 67 077 Veterans Health Administration patients anticoagulated for atrial fibrillation between October 1, 2006, and September 30, 2008. We simulated the number of adverse events and their associated costs and utilities, both before and after various degrees of improvement in percent time in therapeutic range (TTR). The simulation had a 2-year time horizon, and costs were calculated from the perspective of the payer. In the base-case analysis, improving TTR by 5% prevented 1114 adverse events, including 662 deaths; it gained 863 quality-adjusted life-years and saved $15.9 million compared with the status quo, not accounting for the cost of the quality improvement program. Improving TTR by 10% prevented 2087 events, gained 1606 quality-adjusted life-years, and saved $29.7 million. In sensitivity analyses, costs were most sensitive to the estimated risk of stroke and the expected stroke reduction from improved TTR. Utilities were most sensitive to the estimated risk of death and the expected mortality benefit from improved TTR. A quality improvement program to improve anticoagulation control probably would be cost-saving for the payer, even if it were only modestly effective in improving control and even without considering the value of improved health. This study demonstrates how to make a business case for a quality improvement initiative.
View details for DOI 10.1161/CIRCOUTCOMES.111.960591
View details for Web of Science ID 000292872500008
View details for PubMedID 21712521
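The structure of such a business-case calculation is simple arithmetic once event rates, effect sizes, and unit costs are in hand. In the sketch below only the cohort size comes from the abstract; the stroke rate, relative risk, and cost per event are assumed placeholders.

```python
# Illustrative placeholders except where noted; not the study's inputs.
patients = 67_077            # anticoagulated cohort size (from the abstract)
years = 2.0                  # time horizon
stroke_rate = 0.02           # assumed annual stroke rate at current TTR
rr_after_improvement = 0.92  # assumed relative risk after a 5% TTR improvement
cost_per_stroke = 45_000     # assumed payer cost per stroke

baseline_strokes = patients * years * stroke_rate
prevented = baseline_strokes * (1 - rr_after_improvement)
savings = prevented * cost_per_stroke

print(f"Strokes prevented over {years:.0f} years: {prevented:,.0f}")
print(f"Gross savings before program costs: ${savings:,.0f}")
```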
-
Diabetes mellitus and tuberculosis in countries with high tuberculosis burdens: individual risks and social determinants
INTERNATIONAL JOURNAL OF EPIDEMIOLOGY
2011; 40 (2): 417-428
Abstract
A growing body of evidence supports the role of type 2 diabetes as an individual-level risk factor for tuberculosis (TB), though evidence from developing countries with the highest TB burdens is lacking. In developing countries, TB is most common among the poor, in whom diabetes may be less common. We assessed the relationship between individual-level risk, social determinants and population health in these settings. We performed individual-level analyses using the World Health Survey (n = 124,607; 46 countries). We estimated the relationship between TB and diabetes, adjusting for gender, age, body mass index, education, housing quality, crowding and health insurance. We also performed a longitudinal country-level analysis using data on per-capita gross domestic product and TB prevalence and incidence and diabetes prevalence for 1990-95 and 2003-04 (163 countries) to estimate the relationship between increasing diabetes prevalence and TB, identifying countries at risk for disease interactions. In lower-income countries, individuals with diabetes are more likely than non-diabetics to have TB [univariable odds ratio (OR): 2.39; 95% confidence interval (CI): 1.84-3.10; multivariable OR: 1.81; 95% CI: 1.37-2.39]. Increases in TB prevalence and incidence over time were more likely to occur when diabetes prevalence also increased (OR: 4.7; 95% CI: 1.0-22.5; OR: 8.6; 95% CI: 1.9-40.4). Large populations, prevalent TB and projected increases in diabetes make countries like India, Peru and the Russian Federation areas of particular concern. Given the association between diabetes and TB and projected increases in diabetes worldwide, multi-disease health policies should be considered.
View details for DOI 10.1093/ije/dyq238
View details for Web of Science ID 000289165800020
View details for PubMedID 21252210
View details for PubMedCentralID PMC3621385
-
Screening with OGTT alone or in combination with the Indian diabetes risk score or genotyping of TCF7L2 to detect undiagnosed type 2 diabetes in Asian Indians
INDIAN JOURNAL OF MEDICAL RESEARCH
2011; 133 (3): 294-299
Abstract
With increasing numbers of people with diabetes worldwide, particularly in India, it is necessary to search for low-cost screening methods. We compared the effectiveness and costs of screening for undiagnosed type 2 diabetes mellitus (T2DM), using oral glucose tolerance testing (OGTT) alone, or following a positive result from the Indian Diabetes Risk Score (IDRS) or following a positive result from genotyping of the TCF7L2 polymorphisms in Asian Indians. In subjects without known diabetes (n=961) recruited from the Chennai Urban Rural Epidemiology Study (CURES), OGTT, IDRS, and genotyping of rs12255372 (G/T) and rs7903146 (C/T) of TCF7L2 polymorphisms were done. IDRS includes four parameters: age, abdominal obesity, family history of T2DM and physical activity. OGTT identified 72 subjects with newly diagnosed diabetes (NDD), according to the World Health Organization criteria of fasting plasma glucose ≥ 126 mg/dl or a plasma glucose ≥ 200 mg/dl, 2 h after 75 g oral glucose load. IDRS screening (cut-off ≥ 60) yielded 413 positive subjects, which included 54 (75%) of the 72 NDD subjects identified by OGTT. Genotyping yielded 493 positive subjects, which only included 36 (50%) of the 72 NDD subjects, showing less discriminatory power. Screening with both SNPs missed 27 (37.5%) NDD subjects identified by IDRS. In contrast, IDRS missed only 9 (12.5%) of the NDD subjects identified by genotyping. Total screening costs for OGTT alone, or with IDRS, were Rs 384,400 and Rs 182,810 respectively. Comparing OGTT alone to IDRS followed by OGTT, the incremental cost per additional NDD subject detected by doing OGTT on everyone was Rs 11,199 (Rs 201,590 for detecting 18 additional NDD subjects). For screening a population of subjects without diagnosed diabetes in India, a simple diabetes risk score is more effective and less expensive than genotyping or doing OGTT on the whole population.
View details for Web of Science ID 000289322400010
View details for PubMedID 21441683
View details for PubMedCentralID PMC3103154
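The incremental cost per additional case detected quoted above follows directly from the reported totals, as the short calculation below reproduces.

```python
# Figures taken from the abstract above (costs in Indian rupees).
cost_ogtt_all = 384_400        # OGTT for everyone
cost_idrs_then_ogtt = 182_810  # OGTT only for IDRS-positive subjects
cases_ogtt_all = 72            # newly diagnosed diabetes found by universal OGTT
cases_idrs_then_ogtt = 54      # cases captured when OGTT is limited to IDRS-positives

extra_cost = cost_ogtt_all - cost_idrs_then_ogtt
extra_cases = cases_ogtt_all - cases_idrs_then_ogtt
print(f"Rs {extra_cost:,} more to detect {extra_cases} additional cases, "
      f"i.e. Rs {extra_cost / extra_cases:,.0f} per additional case detected")
```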
-
Quantifying Child Mortality Reductions Related to Measles Vaccination
PLOS ONE
2010; 5 (11)
Abstract
This study characterizes the historical relationship between coverage of measles containing vaccines (MCV) and mortality in children under 5 years, with a view toward ongoing global efforts to reduce child mortality. Using country-level, longitudinal panel data from 44 countries over the period 1960-2005, we analyzed the relationship between MCV coverage and measles mortality with (1) logistic regressions for no measles deaths in a country-year, and (2) linear regressions for the logarithm of the measles death rate. All regressions allowed a flexible, non-linear relationship between coverage and mortality. Covariates included birth rate, death rates from other causes, percent living in urban areas, population density, per-capita GDP, use of the two-dose MCV, year, and mortality coding system. Regressions used lagged covariates, country fixed effects, and robust standard errors clustered by country. The likelihood of no measles deaths increased nonlinearly with higher MCV coverage (ORs: 13.8 [1.6-122.7] for 80-89% to 40.7 [3.2-517.6] for ≥95%), compared to pre-vaccination risk levels. Measles death rates declined nonlinearly with higher MCV coverage, with benefits accruing more slowly above 90% coverage. Compared to no coverage, predicted average reductions in death rates were -79% at 70% coverage, -93% at 90%, and -95% at 95%. Forty years of experience with MCV vaccination suggests that extremely high levels of vaccination coverage are needed to produce sharp reductions in measles deaths. Achieving sustainable benefits likely requires a combination of extended vaccine programs and supplementary vaccine efforts.
View details for DOI 10.1371/journal.pone.0013842
View details for Web of Science ID 000283838600016
View details for PubMedID 21079809
View details for PubMedCentralID PMC2973966
-
Empirically Evaluating Decision-Analytic Models
VALUE IN HEALTH
2010; 13 (5): 667-674
Abstract
Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5- to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
View details for DOI 10.1111/j.1524-4733.2010.00698.x
View details for Web of Science ID 000280674200021
View details for PubMedID 20230547
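The consistency metric described above (whether the model's uncertainty range overlaps the study's confidence interval) reduces to a simple interval-overlap check, shown here with the 30-year figures quoted in the abstract.

```python
def ranges_overlap(model_range, study_ci):
    """True if the model's uncertainty range overlaps the study's confidence interval."""
    (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
    return m_lo <= s_hi and s_lo <= m_hi

# 30-year cumulative invasive cancer risk (%) from the abstract above: (model range, study CI).
checks = {
    "inadequately treated CIN": ((30.9, 49.7), (28.4, 48.3)),
    "appropriately treated CIN": ((0.7, 1.3), (0.4, 3.3)),
}
for name, (model_range, study_ci) in checks.items():
    verdict = "consistent" if ranges_overlap(model_range, study_ci) else "inconsistent"
    print(f"{name}: {verdict}")
```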
-
Inpatient treatment of diabetic patients in Asia: evidence from India, China, Thailand and Malaysia
DIABETIC MEDICINE
2010; 27 (1): 101-108
Abstract
The prevalence of Type 2 diabetes mellitus (DM) has grown rapidly, but little is known about the drivers of inpatient spending in low- and middle-income countries. This study aims to compare the clinical presentation and expenditure on hospital admission for inpatients with a primary diagnosis of Type 2 DM in India, China, Thailand and Malaysia. We analysed data on adult, Type 2 DM patients admitted between 2005 and 2008 to five tertiary hospitals in the four countries, reporting expenditures relative to income per capita in 2007. Hospital admission spending for diabetic inpatients with no complications ranged from 11 to 75% of per-capita income. Spending for patients with complications ranged from 6% to over 300% more than spending for patients without complications treated at the same hospital. Glycated haemoglobin was significantly higher for the uninsured patients, compared with insured patients, in India (8.6 vs. 8.1%), Hangzhou, China (9.0 vs. 8.1%), and Shandong, China (10.9 vs. 9.9%). When the hospital admission expenditures of the insured and uninsured patients were statistically different in India and China, the uninsured always spent less than the insured patients. With the rising prevalence of DM, households and health systems in these countries will face greater economic burdens. The returns to investment in preventing diabetic complications appear substantial. Countries with large out-of-pocket financing burdens such as India and China are associated with the widest gaps in resource use between insured and uninsured patients. This probably reflects both overuse by the insured and underuse by the uninsured.
View details for DOI 10.1111/j.1464-5491.2009.02874.x
View details for Web of Science ID 000273451900015
View details for PubMedID 20121896
-
Program Spending to Increase Adherence: South African Cervical Cancer Screening
PLOS ONE
2009; 4 (5)
Abstract
Adherence is crucial for public health program effectiveness, though the benefits of increasing adherence must ultimately be weighed against the associated costs. We sought to determine the relationship between investment in community health worker (CHW) home visits and increased attendance at cervical cancer screening appointments in Cape Town, South Africa. We conducted an observational study of 5,258 CHW home visits made in 2003-4 as part of a community-based screening program. We estimated the functional relationship between spending on these visits and increased appointment attendance (adherence). Increased adherence was noted after each subsequent CHW visit. The cost of making the CHW visits was based on resource use, including both personnel time and vehicle-related expenses, valued in 2004 Rand. The CHW program cost R194,018, with 1,576 additional appointments attended. Adherence increased from 74% to 90%; 55% to 87%; 48% to 77%; and 56% to 80% for 6-, 12-, 24-, and 36-month appointments. Average per-woman costs increased by R14-R47. The majority of this increase occurred with the first 2 CHW visits (90%, 83%, 74%, and 77%; additional cost: R12-R26). We found that study data can be used for program planning, identifying spending levels that achieve adherence targets given budgetary constraints. The results, derived from a single disease program, are retrospective, and should be prospectively replicated.
View details for DOI 10.1371/journal.pone.0005691
View details for Web of Science ID 000266415100005
View details for PubMedID 19492097
View details for PubMedCentralID PMC2683936
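Dividing the reported program cost by the number of additional appointments attended gives the average cost per additional attended appointment, which the two-line calculation below makes explicit.

```python
# Figures from the abstract above (2004 South African Rand).
program_cost = 194_018      # total cost of the CHW home-visit program
extra_appointments = 1_576  # additional screening appointments attended

print(f"Average cost per additional attended appointment: R{program_cost / extra_appointments:,.0f}")
```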
-
Knowledge-based errors in anesthesia: a paired, controlled trial of learning and retention
CANADIAN JOURNAL OF ANAESTHESIA - JOURNAL CANADIEN D'ANESTHESIE
2009; 56 (1): 35-45
Abstract
Optimizing patient safety by improving the training of physicians is a major challenge of medical education. In this pilot study, we hypothesized that a brief lecture, targeted to rare but potentially dangerous situations, could improve anesthesia practitioners' knowledge levels with significant retention of learning at six months. In this paired controlled trial, anesthesia residents and attending physicians at Massachusetts General Hospital took the same 14-question multiple choice examination three times: at baseline, immediately after a brief lecture, and six months later. The lecture covered material on seven "intervention" questions; the remaining seven were "control" questions. The authors measured immediate knowledge acquisition, defined as the change in percentage of correct answers on intervention questions between baseline and post-lecture, and measured learning retention as the difference between baseline and six months. Both measurements were corrected for change in performance on control questions. Fifty of the 89 subjects completed all three examinations. The post-lecture increase in percentage of questions answered correctly, adjusted for control, was 22.2% [95% confidence interval (CI) 16.0-28.4%; P < 0.01], while the adjusted increase at six months was 7.9% (95% CI 1.1-14.7%; P = 0.024). A brief lecture improved knowledge, and the subjects retained a significant amount of this learning at six months. Exposing residents or other practitioners to this type of inexpensive teaching intervention may help them to avoid preventable uncommon errors that are rooted in unfamiliarity with the situation or the equipment. The methods used for this study may also be applied to compare the effect of various other teaching modalities while, at the same time, preserving participant anonymity and making adjustments for ongoing learning.
View details for DOI 10.1007/s12630-008-9002-9
View details for Web of Science ID 000263012800006
View details for PubMedID 19247776
-
Re: Cost-Effectiveness of Cervical Cancer Screening With Human Papillomavirus DNA Testing and HPV-16,18 Vaccination Response
JNCI-JOURNAL OF THE NATIONAL CANCER INSTITUTE
2008; 100 (22): 1654–55
View details for DOI 10.1093/jnci/djn369
View details for Web of Science ID 000261170000014
-
Trade-offs in cervical cancer prevention: Balancing benefits and risks
ARCHIVES OF INTERNAL MEDICINE
2008; 168 (17): 1881-1889
Abstract
New screening and vaccination technologies will provide women with more options for cervical cancer prevention. Because the risk of cervical cancer diminishes with effective routine screening, women may wish to consider additional attributes, such as the likelihood of false-positive results and diagnostic procedures for mild abnormalities likely to resolve without intervention, in their screening choices. We used an empirically calibrated simulation model of cervical cancer in the United States to assess the benefits and potential risks associated with prevention strategies differing by primary screening test, triage test for abnormal results (cytologic testing, human papillomavirus [HPV] DNA test), and screening frequency. Outcomes included colposcopy referrals, cervical intraepithelial neoplasia (CIN) types 1 and 2 or 3, lifetime cancer risk, and quality-adjusted life expectancy. Across strategies, colposcopy referrals and diagnostic workups varied 3-fold, although diagnostic rates of CIN 2 or 3 were similar and 95% of positive screening test results were for mild abnormalities likely to resolve on their own. For a representative group of a thousand 20-year-old women undergoing triennial screening for 10 years, we expect 1038 colposcopy referrals (7 CIN 2 or 3 diagnoses) from combined cytologic and HPV DNA testing and fewer than 200 referrals (6-7 CIN 2 or 3 diagnoses) for strategies that use triage testing. Similarly, for a thousand 40-year-old women, combined cytologic and HPV DNA testing led to 489 referrals (9 CIN 2 or 3), whereas alternative strategies resulted in fewer than 150 referrals (7-8 CIN 2 or 3). Using cytologic testing followed by triage testing in younger women minimizes both diagnostic workups and positive HPV test results, whereas in older women diagnostic workups are minimized with HPV DNA testing followed by cytologic triage testing. Clinically relevant information highlighting trade-offs among cervical cancer prevention strategies allows for inclusion of personal preferences into women's decision making about screening and provides additional dimensions to the construction of clinical guidelines.
View details for Web of Science ID 000259393000007
View details for PubMedID 18809815
View details for PubMedCentralID PMC2746633
-
Cost-effectiveness of cervical cancer screening with human papillomavirus DNA testing and HPV-16,18 vaccination
JOURNAL OF THE NATIONAL CANCER INSTITUTE
2008; 100 (5): 308-320
Abstract
The availability of human papillomavirus (HPV) DNA testing and vaccination against HPV types 16 and 18 (HPV-16,18) motivates questions about the cost-effectiveness of cervical cancer prevention in the United States for unvaccinated older women and for girls eligible for vaccination. An empirically calibrated model was used to assess the quality-adjusted life years (QALYs), lifetime costs, and incremental cost-effectiveness ratios (2004 US dollars per QALY) of screening, vaccination of preadolescent girls, and vaccination combined with screening. Screening varied by initiation age (18, 21, or 25 years), interval (every 1, 2, 3, or 5 years), and test (HPV DNA testing of cervical specimens or cytologic evaluation of cervical cells with a Pap test). Testing strategies included: 1) cytology followed by HPV DNA testing for equivocal cytologic results (cytology with HPV test triage); 2) HPV DNA testing followed by cytology for positive HPV DNA results (HPV test with cytology triage); and 3) combined HPV DNA testing and cytology. Strategies were permitted to switch once at age 25, 30, or 35 years. For unvaccinated women, triennial cytology with HPV test triage, beginning by age 21 years and switching to HPV testing with cytology triage at age 30 years, cost $78,000 per QALY compared with the next best strategy. For girls vaccinated before age 12 years, this same strategy, beginning at age 25 years and switching at age 35 years, cost $41,000 per QALY with screening every 5 years and $188,000 per QALY screening triennially, each compared with the next best strategy. These strategies were more effective and cost-effective than screening women of all ages with cytology alone or cytology with HPV triage annually or biennially. For both vaccinated and unvaccinated women, age-based screening by use of HPV DNA testing as a triage test for equivocal results in younger women and as a primary screening test in older women is expected to be more cost-effective than current screening recommendations.
View details for DOI 10.1093/jnci/djn019
View details for Web of Science ID 000253796800008
View details for PubMedID 18314477
View details for PubMedCentralID PMC3099548
-
Cost-effectiveness of HPV 16, 18 vaccination in Brazil
VACCINE
2007; 25 (33): 6257-6270
Abstract
We use an empirically calibrated model to estimate the cost-effectiveness of cervical cancer prevention in Brazil, a country with a high cervical cancer burden. Assuming 70% coverage, HPV 16, 18 vaccination of adolescent girls is expected to reduce the lifetime risk of cancer by approximately 42.7% (range, 33.2-53.5%); screening three times per lifetime is expected to reduce risk by 21.9-30.7% depending on the screening test, and a combined approach of vaccination and screening is expected to reduce cancer risk by a mean of 60.8% (range, 52.8-70.1%). In Brazil, provided the cost per vaccinated woman is less than I$ 25 (implying a per-dose cost of approximately I$ 5), vaccination before age 12, followed by screening three times per lifetime between ages 35 and 45, would be considered very cost-effective using the country's per capita gross domestic product as a cost-effectiveness threshold. Assuming a coverage rate of 70%, this strategy would be expected to prevent approximately 100,000 cases of invasive cervical cancer over a 5-year period. Vaccination strategies identified as cost-effective may be unaffordable without assistance in countries with socioeconomic profiles similar to Brazil's; these results can provide guidance to the global community by identifying health investments of highest priority and with the greatest promise.
View details for DOI 10.1016/j.vaccine.2007.05.058
View details for PubMedID 17606315
-
Modeling human papillomavirus and cervical cancer in the United States for analyses of screening and vaccination.
Population health metrics
2007; 5: 11
Abstract
To provide quantitative insight into current U.S. policy choices for cervical cancer prevention, we developed a model of human papillomavirus (HPV) and cervical cancer, explicitly incorporating uncertainty about the natural history of disease. We developed a stochastic microsimulation of cervical cancer that distinguishes different HPV types by their incidence, clearance, persistence, and progression. Input parameter sets were sampled randomly from uniform distributions, and simulations undertaken with each set. Through systematic reviews and formal data synthesis, we established multiple epidemiologic targets for model calibration, including age-specific prevalence of HPV by type, age-specific prevalence of cervical intraepithelial neoplasia (CIN), HPV type distribution within CIN and cancer, and age-specific cancer incidence. For each set of sampled input parameters, likelihood-based goodness-of-fit (GOF) scores were computed based on comparisons between model-predicted outcomes and calibration targets. Using 50 randomly resampled, good-fitting parameter sets, we assessed the external consistency and face validity of the model, comparing predicted screening outcomes to independent data. To illustrate the advantage of this approach in reflecting parameter uncertainty, we used the 50 sets to project the distribution of health outcomes in U.S. women under different cervical cancer prevention strategies. Approximately 200 good-fitting parameter sets were identified from 1,000,000 simulated sets. Modeled screening outcomes were externally consistent with results from multiple independent data sources. Based on 50 good-fitting parameter sets, the expected reductions in lifetime risk of cancer with annual or biennial screening were 76% (range across 50 sets: 69-82%) and 69% (60-77%), respectively. The reduction from vaccination alone was 75%, although it ranged from 60% to 88%, reflecting considerable parameter uncertainty about the natural history of type-specific HPV infection. The uncertainty surrounding the model-predicted reduction in cervical cancer incidence narrowed substantially when vaccination was combined with every-5-year screening, with a mean reduction of 89% and range of 83% to 95%. We demonstrate an approach to parameterization, calibration and performance evaluation for a U.S. cervical cancer microsimulation model intended to provide qualitative and quantitative inputs into decisions that must be taken before long-term data on vaccination outcomes become available. This approach allows for a rigorous and comprehensive description of policy-relevant uncertainty about health outcomes under alternative cancer prevention strategies. The model provides a tool that can accommodate new information, and can be modified as needed, to iteratively assess the expected benefits, costs, and cost-effectiveness of different policies in the U.S.
View details for PubMedID 17967185
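The calibration procedure described above (sampling natural-history parameters from uniform priors, scoring each set against epidemiologic targets with a likelihood-based goodness-of-fit measure, and retaining the best-fitting sets) can be sketched with a stand-in model. The targets, priors, and two-parameter toy model below are invented for illustration and bear no relation to the published calibration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibration targets: age-group HPV prevalence with standard errors.
targets = np.array([0.25, 0.15, 0.08])
target_se = np.array([0.02, 0.015, 0.01])

def toy_model(clearance, progression):
    """Stand-in for the microsimulation: maps natural-history inputs to predicted prevalence."""
    base = np.array([0.35, 0.22, 0.12])
    return base * (1 - clearance) * (1 + progression)

def gof(pred):
    """Likelihood-based score: Gaussian log-likelihood of the targets given the predictions."""
    return -0.5 * np.sum(((pred - targets) / target_se) ** 2)

# Sample parameter sets from uniform priors, score each, and keep the 50 best-fitting sets.
samples = rng.uniform([0.1, 0.0], [0.6, 0.5], size=(20_000, 2))
scores = np.array([gof(toy_model(c, p)) for c, p in samples])
good_fits = samples[np.argsort(scores)[-50:]]
print(f"Retained {len(good_fits)} parameter sets; clearance spans "
      f"{good_fits[:, 0].min():.2f} to {good_fits[:, 0].max():.2f}")
```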
-
Chapter 18: Public health policy for cervical cancer prevention: the role of decision science, economic evaluation, and mathematical modeling.
Vaccine
2006; 24 (Suppl 3): S155-63
Abstract
Several factors are changing the landscape of cervical cancer control, including a better understanding of the natural history of human papillomavirus (HPV), reliable assays for detecting high-risk HPV infections, and a soon to be available HPV-16/18 vaccine. There are important differences in the relevant policy questions for different settings. By synthesizing and integrating the best available data, the use of modeling in a decision analytic framework can identify those factors most likely to influence outcomes, can guide the design of future clinical studies and operational research, can provide insight into the cost-effectiveness of different strategies, and can assist in early decision-making when considered with criteria such as equity, public preferences, and political and cultural constraints.
View details for PubMedID 16950003
View details for DOI 10.1016/j.vaccine.2006.05.112
View details for Web of Science ID 000240470000020
-
Estimating the cost of cervical cancer screening in five developing countries.
Cost effectiveness and resource allocation : C/E
2006; 4: 13
Abstract
Cost-effectiveness analyses (CEAs) can provide useful information to policymakers concerned with the broad allocation of resources as well as to local decision makers choosing between different options for reducing the burden from a single disease. For the latter, it is important to use country-specific data when possible and to represent cost differences between countries that might make one strategy more or less attractive than another strategy locally. As part of a CEA of cervical cancer screening in five developing countries, we supplemented limited primary cost data by developing other estimation techniques for direct medical and non-medical costs associated with alternative screening approaches using one of three initial screening tests: simple visual screening, HPV DNA testing, and cervical cytology. Here, we report estimation methods and results for three cost areas in which data were lacking. To supplement direct medical costs, including staff, supplies, and equipment depreciation using country-specific data, we used alternative techniques to quantify cervical cytology and HPV DNA laboratory sample processing costs. We used a detailed quantity and price approach whose face validity was compared to an adaptation of a US laboratory estimation methodology. This methodology was also used to project annual sample processing capacities for each laboratory type. The cost of sample transport from the clinic to the laboratory was estimated using spatial models. A plausible range of the cost of patient time spent seeking and receiving screening was estimated using only formal sector employment and wages as well as using both formal and informal sector participation and country-specific minimum wages. Data sources included primary data from country-specific studies, international databases, international prices, and expert opinion. Costs were standardized to year 2000 international dollars using inflation adjustment and purchasing power parity. Cervical cytology laboratory processing costs were I$1.57-3.37 using the quantity and price method compared to I$1.58-3.02 from the face validation method. HPV DNA processing costs were I$6.07-6.59. Rural laboratory transport costs for cytology were I$0.12-0.64 and I$0.14-0.74 for HPV DNA laboratories. Under assumptions of lower resource efficiency, these estimates increased to I$0.42-0.83 and I$0.54-1.06. Estimates of the value of an hour of patient time using only formal sector participation were I$0.07-4.16, increasing to I$0.30-4.80 when informal and unpaid labor was also included. The value of patient time for traveling, waiting, and attending a screening visit was I$0.68-17.74. With the total cost of screening for cytology and HPV DNA testing ranging from I$4.85-40.54 and I$11.30-48.77 respectively, the cost of the laboratory transport, processing, and patient time accounted for 26-66% and 33-65% of the total costs. From a payer perspective, laboratory transport and processing accounted for 18-48% and 25-60% of total direct medical costs of I$4.11-19.96 and I$10.57-28.18 respectively. Cost estimates of laboratory processing, sample transport, and patient time account for a significant proportion of total cervical cancer screening costs in five developing countries and provide important inputs for CEAs of alternative screening modalities.
View details for PubMedID 16887041
View details for PubMedCentralID PMC1570139
-
Cost-effectiveness of cervical-cancer screening in five developing countries
NEW ENGLAND JOURNAL OF MEDICINE
2005; 353 (20): 2158-2168
Abstract
Cervical-cancer screening strategies that involve the use of conventional cytology and require multiple visits have been impractical in developing countries. We used computer-based models to assess the cost-effectiveness of a variety of cervical-cancer screening strategies in India, Kenya, Peru, South Africa, and Thailand. Primary data were combined with data from the literature to estimate age-specific incidence and mortality rates for cancer and the effectiveness of screening for and treatment of precancerous lesions. We assessed the direct medical, time, and program-related costs of strategies that differed according to screening test, targeted age and frequency, and number of clinic visits required. Single-visit strategies involved the assumption that screening and treatment could be provided in the same day. Outcomes included the lifetime risk of cancer, years of life saved, lifetime costs, and cost-effectiveness ratios (cost per year of life saved). The most cost-effective strategies were those that required the fewest visits, resulting in improved follow-up testing and treatment. Screening women once in their lifetime, at the age of 35 years, with a one-visit or two-visit screening strategy involving visual inspection of the cervix with acetic acid or DNA testing for human papillomavirus (HPV) in cervical cell samples, reduced the lifetime risk of cancer by approximately 25 to 36 percent, and cost less than 500 dollars per year of life saved. Relative cancer risk declined by an additional 40 percent with two screenings (at 35 and 40 years of age), resulting in a cost per year of life saved that was less than each country's per capita gross domestic product, a very cost-effective result according to the Commission on Macroeconomics and Health. Cervical-cancer screening strategies incorporating visual inspection of the cervix with acetic acid or DNA testing for HPV in one or two clinical visits are cost-effective alternatives to conventional three-visit cytology-based screening programs in resource-poor settings.
View details for Web of Science ID 000233288600008
View details for PubMedID 16291985
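The decision rule applied above (a strategy is very cost-effective when its incremental cost per year of life saved falls below the country's per-capita GDP) amounts to a one-line comparison, sketched here with hypothetical inputs.

```python
# Hypothetical inputs for a single country comparison; illustrative only.
gdp_per_capita = 1_800                          # I$ per person
strategy_cost, comparator_cost = 42.0, 30.0     # lifetime cost per woman (I$)
strategy_yls, comparator_yls = 0.020, 0.012     # discounted years of life saved per woman

icer = (strategy_cost - comparator_cost) / (strategy_yls - comparator_yls)
verdict = "very cost-effective" if icer < gdp_per_capita else "above the threshold"
print(f"ICER = I${icer:,.0f} per year of life saved -> {verdict} "
      f"against a threshold of I${gdp_per_capita:,}")
```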
-
The costs of reducing loss to follow-up in South African cervical cancer screening.
Cost effectiveness and resource allocation : C/E
2005; 3: 11
Abstract
This study was designed to quantify the resources used in reestablishing contact with women who missed their scheduled cervical cancer screening visits and to assess the success of this effort in reducing loss to follow-up in a developing country setting. Women were enrolled in this Cape Town, South Africa-based screening study between 2000 and 2003, and all had scheduled follow-up visits in 2003. Community health worker (CHW) time, vehicle use, maintenance, and depreciation were estimated from weekly logs and cost accounting systems. The percentage of women who attended their scheduled visit, those who attended after CHW contact(s), and those who never returned despite attempted contact(s) were determined. The number of CHW visits per woman was also estimated. 3,711 visits were scheduled in 2003. Of these, 2,321 (62.5%) occurred without CHW contact, 918 (24.8%) occurred after contact(s), and 472 (12.7%) did not occur despite contact(s). Loss to follow-up was reduced from 21% to 6%, 39% to 10%, and 50% to 24% for 6-, 12-, and 24-month visits. CHWs attempted 3,200 contacts in 530 trips. On average, 3 CHWs attempted to contact 6 participants over each 111-minute trip. The per-person cost (2003 Rand) for these activities was R12.75, R24.92, and R40.50 for 6-, 12-, and 24-month visits. CHW contact with women who missed scheduled visits increased their return rate. Cost-effectiveness analyses aimed at policy decisions about cervical cancer screening in developing countries should incorporate these findings.
View details for PubMedID 16288646
View details for PubMedCentralID PMC1308836
-
Papanicolaou screening in developing countries
AMERICAN JOURNAL OF CLINICAL PATHOLOGY
2005; 124 (2): 314-315
View details for Web of Science ID 000230770200021
View details for PubMedID 16116686
-
Male involvement in cardiovascular preventive healthcare in two rural Costa Rican communities
PREVENTIVE MEDICINE
2005; 40 (6): 690-695
Abstract
Gender differences in health system usage can lead to differences in the incidence of morbidity and mortality. We conducted a pilot screening targeted towards men to evaluate gender differences in cardiovascular disease risk factor detection and time since last clinic visit. Three evening sessions in two communities screened 148 people, mean age 47.7 years. Height, weight, body mass index, blood pressure, blood glucose, and total cholesterol were measured. A questionnaire on past medical history was administered. Participants with elevated measurements were referred to appropriate care. Men accounted for 60.1% of those screened; 65.5% of the group was overweight and 22.3% was obese, with 42.6% having hypertension, 39.2% hypercholesterolemia, and 2.7% high blood glucose. Among men aged 35 to 65, 65.2% were overweight, 20.3% obese, 46.4% hypertensive, 42.0% hypercholesterolemic, and 1.5% had high blood glucose. Within the last 2 years, 53.3% of men and 9.1% of women aged 35 to 65 had not visited a doctor (P = 0.004). A significant portion of those screened had elevated cardiovascular disease risk factors. Given that men visited doctors significantly less frequently, efforts to involve men in prevention of cardiovascular disease within these communities are warranted.
View details for DOI 10.1016/j.ypmed.2004.09.009
View details for Web of Science ID 000229006700011
View details for PubMedID 15850866
-
Randomized controlled community-based nutrition and exercise intervention improves glycemia and cardiovascular risk factors in type 2 diabetic patients in rural Costa Rica
DIABETES CARE
2003; 26 (1): 24-29
Abstract
The prevalence of type 2 diabetes, especially in developing countries, has grown over the past decades. We performed a controlled clinical study to determine whether a community-based, group-centered public health intervention addressing nutrition and exercise can ameliorate glycemic control and associated cardiovascular risk factors in type 2 diabetic patients in rural Costa Rica. A total of 75 adults with type 2 diabetes, mean age 59 years, were randomly assigned to the intervention group or the control group. All participants received basic diabetes education. The subjects in the intervention group participated in 11 weekly nutrition classes (90 min each session). Subjects for whom exercise was deemed safe also participated in triweekly walking groups (60 min each session). Glycosylated hemoglobin, fasting plasma glucose, total cholesterol, triglycerides, HDL and LDL cholesterol, height, weight, BMI, and blood pressure were measured at baseline and the end of the study (after 12 weeks). The intervention group lost 1.0 ± 2.2 kg compared with a weight gain in the control group of 0.4 ± 2.3 kg (P = 0.028). Fasting plasma glucose decreased 19 ± 55 mg/dl in the intervention group and increased 16 ± 78 mg/dl in the control group (P = 0.048). Glycosylated hemoglobin decreased 1.8 ± 2.3% in the intervention group and 0.4 ± 2.3% in the control group (P = 0.028). Glycemic control of type 2 diabetic patients can be improved through community-based, group-centered public health interventions addressing nutrition and exercise. This pilot study provides an economically feasible model for programs that aim to improve the health status of people with type 2 diabetes.
View details for Web of Science ID 000185504900004
View details for PubMedID 12502654