Kristin Sainani (née Cobb)
Professor (Teaching) of Epidemiology and Population Health
Honors & Awards
-
Biosciences Award for Excellence in Graduate Teaching, Stanford University (2018)
-
Teaching Award, Div of Epidemiology (HRP) (2005, 2007, 2009)
-
Rennie Taylor/Alton Blakeslee Fellowship for Science Writing, Council for the Advancement of Science Writing (2001-2002)
-
Howard Hughes Predoctoral Fellow in the Biological Sciences, Howard Hughes Medical Institute (1997-2002)
Professional Education
-
Certificate, UC-Santa Cruz, Science Writing (2002)
-
PhD, Stanford, Epidemiology (2002)
-
MS, Stanford, Statistics (1999)
Current Research and Scholarly Interests
Science writing, science communication, biostatistics. Research areas: osteoporosis, stress fractures, sports injuries, female athlete triad.
2024-25 Courses
- Introduction to Health Sciences Statistics (HUMBIO 89; Aut)
- Introduction to Probability and Statistics for Epidemiology (EPI 159, EPI 259; Aut, Sum)
- Quality & Safety in U.S. Healthcare (CIM 203; Sum)
- Independent Studies (5)
  - Curricular Practical Training (EPI 291; Aut, Win, Spr, Sum)
  - Directed Reading in Epidemiology (EPI 299; Aut, Win, Spr, Sum)
  - Graduate Research (EPI 399; Aut, Win, Spr, Sum)
  - Medical Scholars Research (HRP 370; Aut, Win, Spr, Sum)
  - Undergraduate Research (EPI 199; Aut, Win, Spr, Sum)
Prior Year Courses
2023-24 Courses
- Intermediate Biostatistics: Analysis of Discrete Data (BIOMEDIN 233, EPI 261, STATS 261; Win)
- Intermediate Biostatistics: Regression, Prediction, Survival Analysis (EPI 262, STATS 262; Spr)
- Introduction to Probability and Statistics for Clinical Research (EPI 258; Spr)
- Introduction to Probability and Statistics for Epidemiology (EPI 259, HUMBIO 89X; Aut, Sum)
- Quality & Safety in U.S. Healthcare (CIM 203; Sum)
- Scientific Writing (EPI 214; Win)
2022-23 Courses
- Intermediate Biostatistics: Analysis of Discrete Data (BIOMEDIN 233, EPI 261, STATS 261; Win)
- Intermediate Biostatistics: Regression, Prediction, Survival Analysis (EPI 262, STATS 262; Spr)
- Introduction to Health Sciences Statistics (HUMBIO 89; Aut)
- Introduction to Probability and Statistics for Clinical Research (EPI 258; Spr)
- Introduction to Probability and Statistics for Epidemiology (EPI 259, HUMBIO 89X; Aut, Sum)
- Preparation and Practice: Scientific Communication and Media I (EPI 271; Sum)
- Quality & Safety in U.S. Healthcare (BIOMEDIN 254; Spr, Sum)
- Scientific Writing (EPI 214; Win)
2021-22 Courses
- Intermediate Biostatistics: Analysis of Discrete Data (BIOMEDIN 233, EPI 261, STATS 261; Win)
- Intermediate Biostatistics: Regression, Prediction, Survival Analysis (EPI 262, STATS 262; Spr)
- Introduction to Health Sciences Statistics (HUMBIO 89; Aut)
- Introduction to Probability and Statistics for Clinical Research (EPI 258; Spr)
- Introduction to Probability and Statistics for Epidemiology (EPI 259, HUMBIO 89X; Aut, Sum)
- Preparation & Practice: Science Communication & Media (BIOS 292; Sum)
- Quality & Safety in U.S. Healthcare (BIOMEDIN 254, HRP 254; Spr)
- Scientific Writing (EPI 214; Win)
Stanford Advisees
- Doctoral Dissertation Reader (AC): Jonathan Altamirano, Yan Min, Matthew Sigurdson, Annabel Tan
- Doctoral Dissertation Advisor (AC): Aubrey Roberts, Axel Wolff
- Master's Program Advisor: Ellie Diamond
- Doctoral (Program): Jonathan Altamirano, Cesar Baeta, Jasmyn Burdsall, Sylvie Dobrota Lai, Michael Hittle, Sam Jaros, Richard Liang, Yan Min, Andrew Nepomuceno, Anna Nguyen, Rishi Parikh, Amadeia Rector, Aubrey Roberts, Hanyang Shen, Matthew Sigurdson, Simon John Christoph Soerensen, Shamsi Soltani, Annabel Tan
All Publications
-
Response to "homeopathy: a null field or effective psychotherapy?".
Journal of clinical epidemiology
2024: 111266
View details for DOI 10.1016/j.jclinepi.2024.111266
View details for PubMedID 38266741
-
Mental Health Matters: A Cross-Sectional Survey on Depression and Anxiety Symptoms and the Female and Male Athlete Triad
CLINICAL JOURNAL OF SPORT MEDICINE
2023; 33 (4): 368-375
View details for DOI 10.1097/JSM.0000000000001150
View details for Web of Science ID 001021401900006
-
Homeopathy can offer empirical insights on treatment effects in a null field.
Journal of clinical epidemiology
2023
Abstract
A "null field" is a scientific field where there is nothing to discover and where observed associations are thus expected to simply reflect the magnitude of bias. We aimed to characterize a null field using a known example, homeopathy (a pseudoscientific medical approach based on using highly diluted substances), as a prototype.We identified 50 randomized placebo-controlled trials of homeopathy interventions from highly-cited meta-analyses. The primary outcome variable was the observed effect size in the studies. Variables related to study quality or impact were also extracted.The mean effect size for homeopathy was 0.36 standard deviations (Hedges' g; 95% CI: 0.21, 0.51) better than placebo, which corresponds to an odds ratio of 1.94 (95% CI: 1.69, 2.23) in favor of homeopathy. 80% of studies had positive effect sizes (favoring homeopathy). Effect size was significantly correlated with citation counts from journals in the Directory of Open Access Journals and CiteWatch. We identified common statistical errors in 25 studies.A null field like homeopathy can exhibit large effect sizes, high rates of favorable results, and high citation impact in the published scientific literature. Null fields may represent a useful negative control for the scientific process.
View details for DOI 10.1016/j.jclinepi.2023.01.010
View details for PubMedID 36736709
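The odds ratio quoted in the abstract above can be reproduced, to a close approximation, from the pooled Hedges' g using the standard logit-based conversion ln(OR) = g * pi / sqrt(3). The snippet below is a minimal editorial sketch of that conversion, not code from the paper; only the g of 0.36 is taken from the abstract, and the paper's confidence interval may come from a different calculation.

import math

g = 0.36                                       # pooled Hedges' g reported above
odds_ratio = math.exp(g * math.pi / math.sqrt(3))
print(round(odds_ratio, 2))                    # ~1.92, close to the reported 1.94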
-
With Great Power Comes Great Responsibility: Common Errors in Meta-Analyses and Meta-Regressions in Strength & Conditioning Research.
Sports medicine (Auckland, N.Z.)
2022
Abstract
Meta-analysis and meta-regression are often highly cited and may influence practice. Unfortunately, statistical errors in meta-analyses are widespread and can lead to flawed conclusions. The purpose of this article was to review common statistical errors in meta-analyses and to document their frequency in highly cited meta-analyses from strength and conditioning research. We identified five errors in one highly cited meta-regression from strength and conditioning research: implausible outliers; overestimated effect sizes that arise from confusing standard deviation with standard error; failure to account for correlated observations; failure to account for within-study variance; and a focus on within-group rather than between-group results. We then quantified the frequency of these errors in 20 of the most highly cited meta-analyses in the field of strength and conditioning research from the past 20 years. We found that 85% of the 20 most highly cited meta-analyses in strength and conditioning research contained statistical errors. Almost half (45%) contained at least one effect size that was mistakenly calculated using standard error rather than standard deviation. In several cases, this resulted in obviously wrong effect sizes, for example, effect sizes of 11 or 14 standard deviations. Additionally, 45% failed to account for correlated observations despite including numerous effect sizes from the same study and often from the same group within the same study. Statistical errors in meta-analysis and meta-regression are common in strength and conditioning research. We highlight five errors that authors, editors, and readers should check for when preparing or critically reviewing meta-analyses.
View details for DOI 10.1007/s40279-022-01766-0
View details for PubMedID 36208412
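One of the errors described above, dividing a mean difference by the standard error instead of the standard deviation, inflates a standardized effect size by a factor of sqrt(n). The sketch below uses made-up numbers (not values from any reviewed paper) simply to illustrate the size of that inflation.

import math

mean_diff = 5.0                    # hypothetical between-group difference
sd = 10.0                          # true pooled standard deviation
n = 30                             # per-group sample size
se = sd / math.sqrt(n)             # standard error of the mean

d_correct = mean_diff / sd         # 0.5, a moderate effect
d_wrong = mean_diff / se           # ~2.7, implausibly large
print(round(d_correct, 2), round(d_wrong, 2))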
-
Calculating Sample Size for Reliability Studies.
PM & R : the journal of injury, function, and rehabilitation
2022
View details for DOI 10.1002/pmrj.12850
View details for PubMedID 35596122
-
Wish List for Improving the Quality of Statistics in Sport Science.
International journal of sports physiology and performance
2022: 1-2
View details for DOI 10.1123/ijspp.2022-0023
View details for PubMedID 35276666
-
Prevalence of Female and Male Athlete Triad Risk Factors in Ultramarathon Runners.
Clinical journal of sport medicine : official journal of the Canadian Academy of Sport Medicine
2021
Abstract
OBJECTIVE: To identify the prevalence of male and female athlete triad risk factors in ultramarathon runners and explore associations between sex hormones and bone mineral density (BMD). DESIGN: Multiyear cross-sectional study. SETTING: One hundred-mile ultramarathon. PARTICIPANTS: Competing runners were recruited in 2018 and 2019. ASSESSMENT OF RISK FACTORS: Participants completed a survey assessing eating behaviors, menstrual history, and injury history; dual-energy x-ray absorptiometry for BMD; and laboratory evaluation of sex hormones, vitamin D, and ferritin (2019 cohort only). MAIN OUTCOME MEASURE: A Triad Cumulative Risk Assessment Score was calculated for each participant. RESULTS: One hundred twenty-three runners participated (83 males and 40 females, mean age 46.2 and 41.8 years, respectively). 44.5% of men and 62.5% of women had elevated risk for disordered eating. 37.5% of women reported a history of bone stress injury (BSI) and 16.7% had BMD Z scores <-1.0. 20.5% of men had a history of BSI and 30.1% had Z-scores <-1.0. Low body mass index (BMI) (<18.5 kg/m2) was seen in 15% of women and no men. The Triad Cumulative Risk Assessment classified 61.1% of women and 29.2% of men as moderate risk and 5.6% of both men and women as high risk. CONCLUSIONS: Our study is the first to measure BMD in both male and female ultramarathon runners. Our male population had a higher prevalence of low BMD than the general population; females were more likely to report history of BSI. Risk of disordered eating was elevated among our participants but was not associated with either low BMD or low BMI.
View details for DOI 10.1097/JSM.0000000000000956
View details for PubMedID 34232162
-
Multinomial and Ordinal Logistic Regression.
PM & R : the journal of injury, function, and rehabilitation
2021
View details for DOI 10.1002/pmrj.12622
View details for PubMedID 33905601
-
Ten common statistical errors from all phases of research, and their fixes.
PM & R : the journal of injury, function, and rehabilitation
2020
View details for DOI 10.1002/pmrj.12395
View details for PubMedID 32358859
-
Call to increase statistical collaboration in sports science, sport and exercise medicine and sports physiotherapy.
British journal of sports medicine
2020
View details for DOI 10.1136/bjsports-2020-102607
View details for PubMedID 32816788
-
How to Be a Statistical Detective.
PM & R : the journal of injury, function, and rehabilitation
2019
View details for DOI 10.1002/pmrj.12305
View details for PubMedID 31850680
-
The 2016 California policy to eliminate nonmedical vaccine exemptions and changes in vaccine coverage: An empirical policy analysis
PLOS MEDICINE
2019; 16 (12)
View details for DOI 10.1371/journal.pmed.1002994.r005
View details for Web of Science ID 000507280500007
-
An Update on Triad Prevalence and Exploratory Hormonal Biomarker Analyses in Ultramarathon Runners
CLINICAL JOURNAL OF SPORT MEDICINE
2024; 34 (5): 469-473
View details for DOI 10.1097/JSM.0000000000001222
View details for Web of Science ID 001301020300003
-
The association between overuse and musculoskeletal injuries and the female athlete triad in Division I collegiate athletes.
PM & R : the journal of injury, function, and rehabilitation
2024
Abstract
Although the female athlete triad (Triad) has been associated with increased risk of bone-stress injuries (BSIs), limited research among collegiate athletes has addressed the associations between the Triad and non-BSI injuries. To elucidate the relationship between Triad and both BSI and non-BSI in female athletes. Retrospective cohort study. Primary and tertiary care student athlete clinic. National Collegiate Athletic Association Division I female athletes at a single institution. Participants completed a pre-participation questionnaire and dual-energy x-ray absorptiometry, which was used to generate a Triad cumulative risk assessment score (Triad score). The number of overuse musculoskeletal injuries that occurred while the athletes were still competing collegiately were identified through chart review. BSI and non-BSI were treated as count variables. The association between BSI, non-BSI, and Triad score was measured using Poisson regression to calculate rate ratios. Of 239 athletes, 43% of athletes (n = 103) sustained at least one injury. Of those, 40% (n = 95) sustained at least one non-BSI and 10% (n = 24) sustained at least one BSI over an average follow-up 2.5 years. After accounting for sport type (non-lean, runner, other endurance sport, or other lean advantage sport) and baseline age, we found that every additional Triad score risk point was associated with a significant 17% increase in the rate of BSI (rate ratio [RR] 1.17, 95% confidence interval [CI] 1.03-1.33; p = .016). However, Triad score was unrelated to non-BSI (1.00, 95% CI 0.91-1.11; p = .99). Compared with athletes in non-lean sports (n = 108), athletes in other lean advantage sports (n = 30) had an increased rate of non-BSI (RR: 2.09, p = .004) whereas distance runners (n = 46) had increased rates of BSI (RR: 7.65, p < .001) and non-BSI (RR: 2.25, p < .001). Higher Triad score is associated with an increased risk of BSI but not non-BSI in collegiate athletes.
View details for DOI 10.1002/pmrj.13201
View details for PubMedID 38837318
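The study above reports rate ratios per Triad risk point from a Poisson model of injury counts. As a rough illustration of the general form of such a model (injury counts regressed on a risk score with an offset for follow-up time, so exponentiated coefficients are rate ratios), here is a minimal sketch on simulated data with hypothetical column names; it is not the study's analysis code.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 239
df = pd.DataFrame({
    "triad_score": rng.integers(0, 9, n).astype(float),   # hypothetical risk points
    "years": rng.uniform(1.0, 4.0, n),                     # follow-up per athlete
})
# simulate injury counts whose rate rises with the risk score
df["bsi_count"] = rng.poisson(np.exp(-2.0 + 0.15 * df["triad_score"]) * df["years"])

fit = smf.glm("bsi_count ~ triad_score", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["years"])).fit()
print(np.exp(fit.params["triad_score"]))                   # rate ratio per additional point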
-
Healthy runner project: a 7-year, multisite nutrition education intervention to reduce bone stress injury incidence in collegiate distance runners
BMJ OPEN SPORT & EXERCISE MEDICINE
2024; 10 (1)
View details for DOI 10.1136/bmjsem-2023-001545corr1
View details for Web of Science ID 001181297000002
-
Higher Triad Risk Scores Are Associated With Increased Risk for Trabecular-Rich Bone Stress Injuries in Female Runners.
Clinical journal of sport medicine : official journal of the Canadian Academy of Sport Medicine
2023
Abstract
OBJECTIVE: Bone stress injuries (BSIs) in trabecular-rich bone are associated with greater biological risk factors compared with cortical-rich bone. We hypothesized that female runners with high Female Athlete Triad (Triad)-related risk would be at greater risk for trabecular-rich BSIs than runners with low Triad-related risk. DESIGN: Prospective cohort study. SETTING: Two NCAA institutions. PARTICIPANTS: Female runners were followed prospectively for up to 5 years. INTERVENTION: The intervention consisted of team nutrition presentations focused on optimizing energy availability plus individualized nutrition sessions. Triad Cumulative Risk Assessment (CRA) categories were assigned yearly based on low-energy availability, menstrual status, age of menarche, low body mass index, low bone mineral density, and prior BSI. MAIN OUTCOME MEASURES: The outcome was the annual incidence of trabecular- and cortical-rich BSI. Generalized Estimating Equations (GEE, to account for the correlated nature of the observations) with a Poisson distribution and log link were used for statistical modeling. RESULTS: Cortical-rich BSI rates were higher than trabecular-rich BSI rates (0.32 vs 0.13 events per person-year). Female runners with high Triad-related risk had a significantly higher incidence rate ratio of trabecular-rich BSI (RR: 4.40, P = 0.025) and cortical-rich BSI (RR: 2.87, P = 0.025) than women with low Triad-related risk. Each 1-point increase in Triad CRA score was associated with a significant 26% increased risk of trabecular-rich BSI (P = 0.0007) and a nonsignificant 14% increased risk of cortical-rich BSI (P = 0.054). CONCLUSIONS: Increased Triad CRA scores were strongly associated with increased risk for trabecular-rich BSI. Incorporating Triad CRA scores in clinical care could guide BSI prevention.
View details for DOI 10.1097/JSM.0000000000001180
View details for PubMedID 37655940
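The abstract above names a GEE Poisson model with a log link to handle repeated annual observations on the same runner. The sketch below shows the general shape of such a model in statsmodels on simulated data with hypothetical column names; it is not the study's code or data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_runners, n_years = 60, 3
df = pd.DataFrame({
    "runner_id": np.repeat(np.arange(n_runners), n_years),
    "high_triad_risk": np.repeat(rng.integers(0, 2, n_runners), n_years),
})
# simulated annual BSI counts, higher on average for high-risk runners
df["bsi_count"] = rng.poisson(np.exp(-1.8 + 1.0 * df["high_triad_risk"]))

fit = smf.gee("bsi_count ~ high_triad_risk", groups="runner_id", data=df,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params["high_triad_risk"]))   # incidence rate ratio, high vs low risk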
-
Dual-Energy X-ray Absorptiometry Percent Fat Z-score As A Predictor Of Female Athlete Menstrual Status
LIPPINCOTT WILLIAMS & WILKINS. 2023: 335
View details for Web of Science ID 001158156601096
-
Perception Of Thinness Promoting Faster Running Is Associated With Lower Energy Availability In Collegiate Runners
LIPPINCOTT WILLIAMS & WILKINS. 2023: 776
View details for Web of Science ID 001158156602432
-
Exploratory Analyses: How to Meaningfully Interpret and Report Them.
PM & R : the journal of injury, function, and rehabilitation
2023
View details for DOI 10.1002/pmrj.12980
View details for PubMedID 37029465
-
Healthy Runner Project: a 7-year, multisite nutrition education intervention to reduce bone stress injury incidence in collegiate distance runners.
BMJ open sport & exercise medicine
2023; 9 (2): e001545
Abstract
Objectives: We evaluated the effect of a nutrition education intervention on bone stress injury (BSI) incidence among female distance runners at two NCAA Division I institutions. Methods: Historical BSI rates were measured retrospectively (2010-2013); runners were then followed prospectively in pilot (2013-2016) and intervention (2016-2020) phases. The primary aim was to compare BSI rates in the historical and intervention phases. Pilot phase data are included only for descriptive purposes. The intervention comprised team nutrition presentations focused on optimising energy availability plus individualised nutrition sessions for runners with elevated Female Athlete Triad risk. Annual BSI rates were calculated using a generalised estimating equation Poisson regression model adjusted for age and institution. Post hoc analyses were stratified by institution and BSI type (trabecular-rich or cortical-rich). Results: The historical phase included 56 runners and 90.2 person-years; the intervention phase included 78 runners and 137.3 person-years. Overall BSI rates were not reduced from the historical (0.52 events per person-year) to the intervention (0.43 events per person-year) phase. Post hoc analyses demonstrated trabecular-rich BSI rates dropped significantly from 0.18 to 0.10 events per person-year from the historical to intervention phase (p=0.047). There was a significant interaction between phase and institution (p=0.009). At Institution 1, the overall BSI rate dropped from 0.63 to 0.27 events per person-year from the historical to intervention phase (p=0.041), whereas no decline was observed at Institution 2. Conclusion: Our findings suggest that a nutrition intervention emphasising energy availability may preferentially impact trabecular-rich BSI and depend on team environment, culture and resources.
View details for DOI 10.1136/bmjsem-2023-001545
View details for PubMedID 37180969
-
Perceptions Of Weight And Nutrition On Performance Among Division 1 Distance Runners, A Pilot Study
LIPPINCOTT WILLIAMS & WILKINS. 2022: 326
View details for Web of Science ID 000888056601327
-
Female Athlete Triad Risk Factors Are More Strongly Associated With Trabecular-Rich Versus Cortical-Rich Bone Stress Injuries in Collegiate Athletes.
Orthopaedic journal of sports medicine
2022; 10 (9): 23259671221123588
Abstract
Background: Bone stress injuries (BSIs) are common in athletes. Risk factors for BSI may differ by skeletal anatomy and relative contribution of trabecular-rich and cortical-rich bone. Hypothesis: We hypothesized that Female Athlete Triad (Triad) risk factors would be more strongly associated with BSIs sustained at trabecular-rich versus cortical-rich skeletal sites. Study Design: Cohort study; Level of evidence, 2. Methods: The study population comprised 321 female National Collegiate Athletic Association Division I athletes participating in 16 sports from 2008 to 2014. Triad risk factors and a Triad cumulative risk score were assessed using responses to preparticipation examination and dual energy x-ray absorptiometry to measure lumbar spine and whole-body bone mineral density (BMD). Sports-related BSIs were diagnosed by a physician and confirmed radiologically. Athletes were grouped into those sustaining a subsequent trabecular-rich BSI, a subsequent cortical-rich BSI, and those without a BSI. Data were analyzed with multinomial logistic regression adjusted for participation in cross-country running versus other sports. Results: A total of 19 participants sustained a cortical-rich BSI (6%) and 10 sustained a trabecular-rich BSI (3%) over the course of collegiate sports participation. The Triad cumulative risk score was significantly related to both trabecular-rich and cortical-rich BSI. However, lower BMD and weight were associated with significantly greater risk for trabecular-rich than cortical-rich BSIs. For every value lower than 1 SD, the odds ratios (95% CIs) for trabecular-rich versus cortical-rich BSI were 3.08 (1.25-7.56) for spine BMD; 2.38 (1.22-4.64) for whole-body BMD; and 5.26 (1.48-18.70) for weight. Taller height was a significantly better predictor of cortical-rich than trabecular-rich BSI. Conclusion: The Triad cumulative risk score was significantly associated with both trabecular-rich and cortical-rich BSI, but Triad-related risk factors appeared more strongly related to trabecular-rich BSI. In particular, low BMD and low weight were associated with significantly higher increases in the risk of trabecular-rich BSI than cortical-rich BSI. These findings suggest Triad risk factors are more common in athletes sustaining BSI in trabecular-rich than cortical-rich locations.
View details for DOI 10.1177/23259671221123588
View details for PubMedID 36157087
-
Evaluating Genetic Predictors Of Bone Health In Ultramarathon Runners: Are Females Overriding Their Genetic Predisposition?
LIPPINCOTT WILLIAMS & WILKINS. 2022: 526-527
View details for Web of Science ID 000888056602275
-
Adherence to contemporary antiretroviral treatment regimens and impact on immunological and virologic outcomes in a US healthcare system.
PloS one
2022; 17 (2): e0263742
Abstract
BACKGROUND: Only a few recent reports have examined longitudinal adherence patterns in US clinics and its impact on immunological and virological outcomes among large cohorts initiating contemporary antiretroviral therapy (ART) in US clinics. METHODS: We followed all persons with HIV (PLWH) in a California clinic population initiating ART between 2010 and 2017. We estimated longitudinal adherence for each PLWH by calculating the medication possession ratio within multiple 6-month intervals using pharmacy refill records. RESULTS: During the study, 2315 PWLH were followed for a median time of 210.8 weeks and only 179 (7.7%) were lost-to-follow-up. The mean adherence was 84.9%. Age (Hazard Ratio (HR): (95% confidence interval): 1.25 (1.20-1.31) per 10-year increase) and Black race (HR: 0.62 (0.53-0.73) vs. White) were associated with adherence in the cohort. A 10% increase in adherence increased the odds of being virally suppressed by 37% (OR and 95% CI: 1.37 [1.33-1.41]) and was associated with an increase in mean CD4 count by 8.54 cells/ul in the next 6-month interval (p-value <0.0001). CONCLUSIONS: Our study shows that despite large improvements in retention in care, demographic disparities in adherence to ART persist. Adherence was lower among younger patients and black patients. Our study confirmed the strong association between adherence to ART and viral suppression but could only establish a weak association between adherence and CD4 count. These findings reaffirm the importance of adherence and retention in care and further highlight the need for tailored patient-centered HIV Care Models as a strategy to improve PLWH's outcomes.
View details for DOI 10.1371/journal.pone.0263742
View details for PubMedID 35157724
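The adherence measure named above, the medication possession ratio (MPR) within a 6-month interval, is the share of days in the window covered by dispensed supply. Below is a minimal editorial sketch of that calculation for one hypothetical patient; it ignores overlapping supplies and other refinements the study may have used, and the refill records shown are invented.

from datetime import date, timedelta

window_start, window_end = date(2015, 1, 1), date(2015, 7, 1)
window_days = (window_end - window_start).days

# (fill date, days supplied) from hypothetical pharmacy refill records
refills = [(date(2015, 1, 1), 30), (date(2015, 2, 3), 30),
           (date(2015, 3, 10), 90), (date(2015, 6, 20), 30)]

covered_days = 0
for fill_date, days_supplied in refills:
    supply_end = fill_date + timedelta(days=days_supplied)
    overlap_start = max(fill_date, window_start)
    overlap_end = min(supply_end, window_end)
    covered_days += max(0, (overlap_end - overlap_start).days)

mpr = min(covered_days / window_days, 1.0)   # cap at 1.0 (100% adherence)
print(round(mpr, 2))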
-
Impacts of COVID-19 on Mental Health and Training in US Professional Endurance Athletes.
Clinical journal of sport medicine : official journal of the Canadian Academy of Sport Medicine
2021
Abstract
OBJECTIVE: We examined how professional athletes are affected by COVID-19. Our primary aim was to assess changes in mental health that occurred after COVID-19 restrictions, and our secondary aim was to assess changes in exercise volume and intensity. DESIGN: Cross-sectional study. SETTING: United States. PARTICIPANTS: Strava professional endurance athletes. ASSESSMENT OF RISK FACTORS: Participants completed a survey, and a subset of participants consented to have their activity data analyzed. The survey included questions on COVID-19 symptoms, exercise, and mental health, as measured by a modified Patient Health Questionnaire. MAIN OUTCOME MEASURES: Participants were asked about 2 periods in 2020: before COVID-19 (January 1-March 14) and during COVID-19 (March 15-August 25), and activity data from both periods were downloaded. Activity data consisted of Global Positioning System and self-reported uploads. RESULTS: One hundred thirty-one male and female Strava athletes were enrolled, and a subset of athletes (n = 114) consented to have their activity data analyzed. During COVID-19 restrictions, 22.2% of participants reported feeling down or depressed and 27.4% of participants reported feeling nervous or anxious at least half the days in a week compared with 3.8% and 4.6% before COVID-19 restrictions, respectively (P < 0.0001). Activity data revealed a significant increase (P < 0.0001) in exercise minutes per day during COVID-19 (mean = 103.00, SD = 42.1) compared with before COVID-19 restrictions (mean = 92.4, SD = 41.3), with no significant changes in intensity. CONCLUSIONS: Athletes reported significant increases in feeling down or depressed and nervous or anxious despite an increase in exercise duration during COVID-19. Future research should assess how to support athletes with mental health resources.
View details for DOI 10.1097/JSM.0000000000000983
View details for PubMedID 34711711
-
Sun Protective Behaviors and Attitudes of Runners.
Sports (Basel, Switzerland)
2021; 10 (1)
Abstract
Sun exposure is a risk factor for skin cancer. Knowledge and behaviors around sun exposure protective measures are poorly described in athletes including runners. Our primary objective was to describe sun exposure behaviors and knowledge in a population of runners. A cross-sectional online survey was administered to 697 runners to measure the frequency of seven sun protective behaviors: sunscreen use on the face or body; wearing a hat, sunglasses, or long sleeves; running in shade; and avoidance of midday running. Between 54% and 84% of runners reported that they engaged in these behaviors at least sometimes, but only 7% to 45% reported frequent use. Of 525 runners who gave a primary reason for not using sunscreen regularly, 49.0% cited forgetfulness; 17.3% cited discomfort; and only a small percentage cited maintaining a tan (6.1%) or optimizing vitamin D (5.1%). Of 689 runners who responded to a question about what factor most influences their overall sun exposure habits, 39.2% cited fear of skin cancer, 28.7% cited comfort level, and 15.8% cited fear of skin aging. In addition to the seven individual behaviors, we also asked runners how frequently they took precautions to protect against the sun overall. We explored associations between participant characteristics and the overall use of sun protection using ordinal logistic regression. Overall, sun protection was used more frequently in runners who were female, older, or had a history of skin cancer. Runners appear to recognize the importance of sun protection and the potential consequences of not using it, but report forgetfulness and discomfort as the biggest barriers to consistent use. Interventions using habit-formation strategies and self-regulation training may prove to be most useful in closing this gap between knowledge and practice.
View details for DOI 10.3390/sports10010001
View details for PubMedID 35050966
-
Ordinal Prediction Model of 90-Day Modified Rankin Scale in Ischemic Stroke.
Frontiers in neurology
2021; 12: 727171
Abstract
Background and Purpose: Prediction models for functional outcomes after ischemic stroke are useful for statistical analyses in clinical trials and guiding patient expectations. While there are models predicting dichotomous functional outcomes after ischemic stroke, there are no models that predict ordinal mRS outcomes. We aimed to create a model that predicts, at the time of hospital discharge, a patient's modified Rankin Scale (mRS) score on day 90 after ischemic stroke. Methods: We used data from three multi-center prospective studies: CRISP, DEFUSE 2, and DEFUSE 3 to derive and validate an ordinal logistic regression model that predicts the 90-day mRS score based on variables available during the stroke hospitalization. Forward selection was used to retain independent significant variables in the multivariable model. Results: The prediction model was derived using data on 297 stroke patients from the CRISP and DEFUSE 2 studies. National Institutes of Health Stroke Scale (NIHSS) at discharge and age were retained as significant (p < 0.001) independent predictors of the 90-day mRS score. When applied to the external validation set (DEFUSE 3, n = 160), the model accurately predicted the 90-day mRS score within one point for 78% of the patients in the validation cohort. Conclusions: A simple model using age and NIHSS score at time of discharge can predict 90-day mRS scores in patients with ischemic stroke. This model can be useful for prognostication in routine clinical care and to impute missing data in clinical trials.
View details for DOI 10.3389/fneur.2021.727171
View details for PubMedID 34744968
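The model described above is an ordinal (proportional-odds) logistic regression predicting the 0-6 mRS score from age and discharge NIHSS, evaluated by how often the prediction falls within one point of the observed score. The sketch below shows that general approach on simulated data using statsmodels' OrderedModel; it is not the published model, and the coefficients and data are invented.

import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(2)
n = 300
age = rng.normal(70, 10, n)
nihss = rng.integers(0, 25, n).astype(float)
latent = 0.05 * age + 0.15 * nihss + rng.logistic(size=n)
mrs = pd.qcut(latent, 7, labels=False)               # pseudo mRS categories 0-6

X = pd.DataFrame({"age": age, "nihss": nihss})
fit = OrderedModel(mrs, X, distr="logit").fit(method="bfgs", disp=False)

probs = np.asarray(fit.predict(X))                    # one probability column per category
pred = probs.argmax(axis=1)                           # most probable category
print(round(np.mean(np.abs(pred - mrs) <= 1), 2))     # share predicted within one point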
-
Lower Trabecular Bone Score and Spine Bone Mineral Density Are Associated with Bone Stress Injuries and Triad Risk Factors in Collegiate Athletes.
PM & R : the journal of injury, function, and rehabilitation
2020
Abstract
INTRODUCTION: Determinants of bone health and injury are important to identify in athletes. Bone mineral density (BMD) is commonly measured in athletes with Female Athlete Triad (Triad) risk factors; Trabecular Bone Score (TBS) has been proposed to predict fracture risk independent of BMD. Evaluation of TBS and spine BMD to bone stress injury (BSI) risk has not been studied in female collegiate athletes. OBJECTIVE: We hypothesized that spine BMD and TBS would each independently predict BSI and the combined measures would improve injury prediction in female collegiate athletes. We also hypothesized each measure would be correlated with Triad risk factors. DESIGN: Retrospective cohort. SETTING: Academic institution. METHODS: Dual energy x-ray absorptiometry (DXA) of lumbar spine was used to calculate BMD and TBS values. Chart review was used to identify BSI that occurred after the DXA measurement and to obtain Triad risk factors. We used logistic regression to examine the ability of TBS and BMD alone or in combination to predict prospective BSI. RESULTS: Within 321 athletes, 29 (9.0%) sustained a BSI after DXA. BMD and TBS were highly correlated (Pearson's correlation r=0.62, P<0.0001). Spine BMD and TBS had similar ability to predict BSI; the C-statistic and 95% confidence intervals were: 0.69 (0.58, 0.81) for spine BMD versus 0.68 (0.57, 0.79) for TBS. No improvement in discrimination was observed with combined BMD+TBS (C-statistic 0.70 [0.59, 0.81]). Both TBS and BMD predicted trabecular-rich BSI (defined as pelvis, femoral neck and calcaneus) better than cortical-rich BSI. Both measures had similar correlations with Triad risk factors. CONCLUSION: Lower BMD and TBS values are associated with elevated risk for BSI and similar correlation to Triad risk factors. TBS does not improve prediction of BSI. Collectively, our findings suggest BMD may be a sufficient measure of skeletal integrity from DXA in female collegiate athletes. This article is protected by copyright. All rights reserved.
View details for DOI 10.1002/pmrj.12510
View details for PubMedID 33037847
-
Identifying Triad Risk Factors In Ultramarathon Runners
LIPPINCOTT WILLIAMS & WILKINS. 2020: 68
View details for Web of Science ID 000590026300193
-
Genetic Predictions Of Bone Mineral Density In Ultramarathon Runners: For Men, But Not For Women
LIPPINCOTT WILLIAMS & WILKINS. 2020: 783–84
View details for Web of Science ID 000590026303200
-
Predictors And Prevalence Of Low Bone Mineral Density And Bone Stress Injuries In Ultramarathon Runners
LIPPINCOTT WILLIAMS & WILKINS. 2020: 492
View details for Web of Science ID 000590026301707
-
Systematic review of the use of "magnitude-based inference" in sports science and medicine.
PloS one
2020; 15 (6): e0235318
Abstract
Magnitude-based inference (MBI) is a controversial statistical method that has been used in hundreds of papers in sports science despite criticism from statisticians. To better understand how this method has been applied in practice, we systematically reviewed 232 papers that used MBI. We extracted data on study design, sample size, and choice of MBI settings and parameters. Median sample size was 10 per group (interquartile range, IQR: 8-15) for multi-group studies and 14 (IQR: 10-24) for single-group studies; few studies reported a priori sample size calculations (15%). Authors predominantly applied MBI's default settings and chose "mechanistic/non-clinical" rather than "clinical" MBI even when testing clinical interventions (only 16 studies out of 232 used clinical MBI). Using these data, we can estimate the Type I error rates for the typical MBI study. Authors frequently made dichotomous claims about effects based on the MBI criterion of a "likely" effect and sometimes based on the MBI criterion of a "possible" effect. When the sample size is n = 8 to 15 per group, these inferences have Type I error rates of 12%-22% and 22%-45%, respectively. High Type I error rates were compounded by multiple testing: Authors reported results from a median of 30 tests related to outcomes; and few studies specified a primary outcome (14%). We conclude that MBI has promoted small studies, promulgated a "black box" approach to statistics, and led to numerous papers where the conclusions are not supported by the data. Amidst debates over the role of p-values and significance testing in science, MBI also provides an important natural experiment: we find no evidence that moving researchers away from p-values or null hypothesis significance testing makes them less prone to dichotomization or over-interpretation of findings.
View details for DOI 10.1371/journal.pone.0235318
View details for PubMedID 32589653
-
Comment on: 'Moving Sport and Exercise Science Forward: A Call for the Adoption of More Transparent Research Practices'.
Sports medicine (Auckland, N.Z.)
2020
View details for DOI 10.1007/s40279-020-01298-5
View details for PubMedID 32447716
-
Virological Failure and Acquired Genotypic Resistance Associated With Contemporary Antiretroviral Treatment Regimens.
Open forum infectious diseases
2020; 7 (9): ofaa316
Abstract
There are few descriptions of virologic failure (VF) and acquired drug resistance (HIVDR) in large cohorts initiating contemporary antiretroviral therapy (ART). We studied all persons with HIV (PWH) in a California clinic population initiating ART between 2010 and 2017. VF was defined as not attaining virologic suppression, discontinuing ART, or virologic rebound prompting change in ART. During the study, 2315 PWH began ART. Six companion drugs were used in 93.3% of regimens: efavirenz, elvitegravir/c, dolutegravir, b-darunavir, rilpivirine, and raltegravir. During a median follow-up of 36 months, 214 (9.2%) PWH experienced VF (2.8 per 100 person-years) and 62 (2.7%) experienced HIVDR (0.8 per 100 person-years). In multivariable analyses, younger age, lower CD4 count, higher virus load, and b-atazanavir were associated with increased VF risk; lower CD4 count, higher virus load, and nevirapine were associated with increased HIVDR risk. Compared with efavirenz, dolutegravir, raltegravir, and b-darunavir were associated with reduced HIVDR risk. Risks of VF and HIVDR were not significantly associated with ART initiation year. Of the 62 PWH with HIVDR, 42 received a non-nucleoside RT inhibitor (NNRTI), 15 an integrase-strand transfer inhibitor (INSTI), and 5 a protease inhibitor (PI). Among those with HIVDR on an NNRTI or first-generation INSTI, 59% acquired dual class resistance and 29% developed tenofovir resistance; those receiving a PI or dolutegravir developed just M184V. Despite the frequent use of contemporary ART regimens, VF and HIVDR continue to occur. Further efforts are required to improve long-term ART virological responses to prevent the consequences of ongoing HIV-1 replication including virus transmission and HIVDR.
View details for DOI 10.1093/ofid/ofaa316
View details for PubMedID 32904894
View details for PubMedCentralID PMC7462367
-
Assessing diagnostic and severity grading accuracy of ultrasound measurements for carpal tunnel syndrome compared to electrodiagnostics.
PM & R : the journal of injury, function, and rehabilitation
2020
Abstract
The combined sensory index (CSI) is the most sensitive electrodiagnostic criteria for carpal tunnel syndrome (CTS), and the CSI and Bland criteria have been shown to predict surgical treatment outcomes. The proposed ultrasound measurements have not been assessed against the CSI for diagnostic accuracy and grading of CTS severity. The primary objective of this paper was to investigate the use of ultrasound evaluations for both diagnosis and assessment of severity grading of CTS in comparison to electrodiagnostic assessment. All patients underwent an electrodiagnostic evaluation using the CSI and Bland severity grading. Each patient underwent an ultrasound evaluation including cross sectional area (CSA), the change in CSA from the forearm to the tunnel (∆CSA), and the wrist-forearm ratio (WFR). These measurements were assessed for diagnostic and severity grading accuracy using the CSI as the gold standard. SETTING: Tertiary academic center. PARTICIPANTS: All patients referred for electrodiagnostic evaluation for CTS were eligible for the study. Only those with idiopathic CTS were included and those with prior CTS treatment were also excluded. Ninety-five patients were included in the study. Not applicable. The primary study outcome measure was concordance between CSI diagnosis and severity categories and the ultrasound measurements. Both outcomes were also assessed using Bland criteria. Optimal cut-points for diagnosis of CTS were found to be CSA ≥ 12 mm2, ∆CSA ≥ 4 mm2, WFR ≥ 1.4. Using these cut-points, C-statistics comparing diagnosis of CTS using ultrasound measurements versus using the CSI ranged from 0.893-0.966. When looking at CSI severity grading compared to ∆CSA, however, the C-statistics were 0.640-0.661 with substantial overlap between severity groups. While ultrasound measurements had high diagnostic accuracy for CTS based on the CSI criteria, ultrasound measurements were unable to adequately distinguish between CSI severity groups among patients with CTS. This article is protected by copyright. All rights reserved.
View details for DOI 10.1002/pmrj.12533
View details for PubMedID 33306874
-
Increase in Blood Pressure Associated With Tyrosine Kinase Inhibitors Targeting Vascular Endothelial Growth Factor.
JACC. CardioOncology
2019; 1 (1): 24-36
Abstract
This study quantified the change in blood pressure (BP) during antivascular endothelial growth factor (VEGF) tyrosine kinase inhibitor (TKI) therapy, compared BPs between TKIs, and analyzed change in BP during antihypertensive therapy. TKIs targeting VEGF are associated with hypertension. The absolute change in BP during anti-VEGF TKI treatment is not well characterized outside clinical trials. A retrospective single-center study included patients with metastatic renal cell carcinoma who received anti-VEGF TKIs between 2007 and 2018. Mixed models analyzed 3,088 BPs measured at oncology clinics. In 228 patients (baseline systolic blood pressure [SBP] 130.2 ± 16.3 mm Hg, diastolic blood pressure [DBP] 76.8 ± 9.3 mm Hg), anti-VEGF TKIs were associated with mean increases in SBP of 8.5 mm Hg (p < 0.0001) and DBP of 6.7 mm Hg (p < 0.0001). Of the anti-VEGF TKIs evaluated, axitinib was associated with the greatest BP increase, with an increase in SBP of 12.6 mm Hg (p < 0.0001) and in DBP of 10.3 mm Hg (p < 0.0001) relative to baseline. In pairwise comparisons between agents, axitinib was associated with greater SBPs than cabozantinib by 8.4 mm Hg (p = 0.004) and pazopanib by 5.1 mm Hg (p = 0.01). Subsequent anti-VEGF TKI courses were associated with small increases in DBP, but not SBP, relative to the first course. During anti-VEGF TKI therapy, calcium-channel blockers and potassium-sparing diuretic agents were associated with the largest BP reductions, with decreases in SBP of 5.6 mm Hg (p < 0.0001) and 9.9 mm Hg (p = 0.007), respectively. Anti-VEGF TKIs are associated with increased BP; greatest increases are observed with axitinib. Calcium-channel blockers and potassium-sparing diuretic agents were associated with the largest reductions in BP.
View details for DOI 10.1016/j.jaccao.2019.08.012
View details for PubMedID 34396159
View details for PubMedCentralID PMC8352203
-
Case Studies in Statistics
PM&R
2019; 11 (6): 654–56
View details for DOI 10.1002/pmrj.12178
View details for Web of Science ID 000470071000009
-
Case Studies in Statistics.
PM & R : the journal of injury, function, and rehabilitation
2019
Abstract
The following hypothetical example is based on a real paper that I recently reviewed, but with specific details changed. See if you can spot the statistical error. This article is protected by copyright. All rights reserved.
View details for PubMedID 31033199
-
The 2016 California policy to eliminate nonmedical vaccine exemptions and changes in vaccine coverage: An empirical policy analysis.
PLoS medicine
2019; 16 (12): e1002994
Abstract
Vaccine hesitancy, the reluctance or refusal to receive vaccination, is a growing public health problem in the United States and globally. State policies that eliminate nonmedical ("personal belief") exemptions to childhood vaccination requirements are controversial, and their effectiveness to improve vaccination coverage remains unclear given limited rigorous policy analysis. In 2016, a California policy (Senate Bill 277) eliminated nonmedical exemptions from school entry requirements. The objective of this study was to estimate the association between California's 2016 policy and changes in vaccine coverage. We used a quasi-experimental state-level synthetic control analysis and a county-level difference-in-differences analysis to estimate the impact of the 2016 California policy on vaccination coverage and prevalence of exemptions to vaccine requirements (nonmedical and medical). We used publicly available state-level data from the US Centers for Disease Control and Prevention on coverage of measles, mumps, and rubella (MMR) vaccination, nonmedical exemption, and medical exemption in children entering kindergarten. We used county-level data individually requested from state departments of public health on overall vaccine coverage and exemptions. Based on data availability, we included state-level data for 45 states, including California, from 2011 to 2017 and county-level data for 17 states from 2010 to 2017. The prespecified primary study outcome was MMR vaccination in the state analysis and overall vaccine coverage in the county analysis. In the state-level synthetic control analysis, MMR coverage in California increased by 3.3% relative to its synthetic control in the postpolicy period (top 2 of 43 states evaluated in the placebo tests, top 5%), nonmedical exemptions decreased by 2.4% (top 2 of 43 states evaluated in the placebo tests, top 5%), and medical exemptions increased by 0.4% (top 1 of 44 states evaluated in the placebo tests, top 2%). In the county-level analysis, overall vaccination coverage increased by 4.3% (95% confidence interval [CI] 2.9%-5.8%, p < 0.001), nonmedical exemptions decreased by 3.9% (95% CI 2.4%-5.4%, p < 0.001), and medical exemptions increased by 2.4% (95% CI 2.0%-2.9%, p < 0.001). Changes in vaccination coverage across counties after the policy implementation from 2015 to 2017 ranged from -6% to 26%, with larger increases in coverage in counties with lower prepolicy vaccine coverage. Results were robust to alternative model specifications. The limitations of the study were the exclusion of a subset of US states from the analysis and the use of only 2 years of postpolicy data based on data availability. In this study, implementation of the California policy that eliminated nonmedical childhood vaccine exemptions was associated with an estimated increase in vaccination coverage and a reduction in nonmedical exemptions at state and county levels. The observed increase in medical exemptions was offset by the larger reduction in nonmedical exemptions. The largest increases in vaccine coverage were observed in the most "high-risk" counties, meaning those with the lowest prepolicy vaccine coverage. Our findings suggest that government policies removing nonmedical exemptions can be effective at increasing vaccination coverage.
View details for DOI 10.1371/journal.pmed.1002994
View details for PubMedID 31869328
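The county-level analysis above is a difference-in-differences design: compare the change in coverage before versus after the policy in California counties with the change in non-California counties, which in regression form is the coefficient on a California-by-post-period interaction. The sketch below illustrates that design on simulated data with an invented policy effect; it is not the study's data or code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for county in range(200):
    california = int(county < 58)                    # 58 hypothetical CA counties
    base = rng.normal(92, 3)                         # county baseline coverage (%)
    for year in [2014, 2015, 2016, 2017]:
        post = int(year >= 2016)                     # policy takes effect in 2016
        coverage = (base + 0.3 * (year - 2014)       # shared secular trend
                    + 4.0 * california * post        # simulated policy effect
                    + rng.normal(0, 1))
        rows.append({"county": county, "california": california,
                     "post": post, "coverage": coverage})
df = pd.DataFrame(rows)

did = smf.ols("coverage ~ california * post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["county"]})
print(did.params["california:post"])                 # difference-in-differences estimate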
-
Magnitude-Based Inference is Not Bayesian and is Not a Valid Method of Inference.
Scandinavian journal of medicine & science in sports
2019
Abstract
Recently, Diong [2] wrote a commentary for the Scandinavian Journal of Medicine and Science in Sports critiquing a study by Pamboris and colleagues [3] for not correctly interpreting confidence intervals that contained zero. Pamboris and colleagues [4] wrote a response letter in return, seeking to rebut Diong's critique and arguing that they were appropriately implementing Magnitude-Based Inference (MBI), which they argue is "based on Bayesian inference and [the] conclusions are robust" [4]. They neglect, however, that multiple statisticians [5-7] including Bayesian statisticians [8,9] have strongly critiqued and even called for rejection of MBI as a method of inference. In this commentary, we hope to resolve some of that ambiguity and concisely explain why MBI is not a robust method of statistical inference. This article is protected by copyright. All rights reserved.
View details for DOI 10.1111/sms.13491
View details for PubMedID 31149752
-
Lack of Diagnostic Utility of "Amino Acid Dysregulation Metabotypes".
Biological psychiatry
2018
View details for PubMedID 30595232
-
Bone stress injuries in male distance runners: higher modified Female Athlete Triad Cumulative Risk Assessment scores predict increased rates of injury.
British journal of sports medicine
2018
Abstract
OBJECTIVES: Bone stress injuries (BSI) are common in runners of both sexes. The purpose of this study was to determine if a modified Female Athlete Triad Cumulative Risk Assessment tool would predict BSI in male distance runners. METHODS: 156 male runners at two collegiate programmes were studied using mixed retrospective and prospective design for a total of 7 years. Point values were assigned using risk assessment categories including low energy availability, low body mass index (BMI), low bone mineral density (BMD) and prior BSI. The outcome was subsequent development of BSI. Statistical models used a mixed effects Poisson regression model with p<0.05 as threshold for significance. Two regression analyses were performed: (1) baseline risk factors as the independent variable; and (2) annual change in risk factors (longitudinal data) as the independent variable. RESULTS: 42/156 runners (27%) sustained 61 BSIs over an average 1.9 years of follow-up. In the baseline risk factor model, each 1 point increase in prior BSI score was associated with a 57% increased risk for prospective BSI (p=0.0042) and each 1 point increase in cumulative risk score was associated with a 37% increase in prospective BSI risk (p=0.0079). In the longitudinal model, each 1 point increase in cumulative risk score was associated with a 27% increase in prospective BSI risk (p=0.05). BMI (rate ratio (RR)=1.91, p=0.11) and BMD (RR=1.58, p=0.19) risk scores were not associated with BSI. CONCLUSION: A modified cumulative risk assessment tool may help identify male runners at elevated risk for BSI. Identifying risk factors may guide treatment and prevention strategies.
View details for PubMedID 30580252
-
The Burden of Caring for a Child or Adolescent With Pediatric Acute-Onset Neuropsychiatric Syndrome (PANS): An Observational Longitudinal Study.
The Journal of clinical psychiatry
2018; 80 (1)
Abstract
OBJECTIVE: To describe the longitudinal association between disease severity, time established in clinical treatment, and caregiver burden in a community-based patient population diagnosed with pediatric acute-onset neuropsychiatric syndrome (PANS). METHODS: The study included an observational longitudinal cohort design, with Caregiver Burden Inventories (CBIs) collected between April 2013 and November 2016 at the Stanford PANS multidisciplinary clinic. Inclusion criteria for this study were as follows: pediatric patients meeting strict PANS/pediatric autoimmune neuropsychiatric disorders associated with streptococcal infections (PANDAS) diagnostic criteria (n = 187), having a caregiver fill out at least 1 complete CBI during a disease flare (n = 114); and having family who lives locally (n = 97). For longitudinal analyses, only patients whose caregiver had filled out 2 or more CBIs (n = 94 with 892 CBIs) were included. In the study sample, most primary caregivers were mothers (69 [71.1%] of 97), the majority of PANS patients were male (58 [59.8%] of 97), and mean age at PANS onset was 8.8 years. RESULTS: In a patient's first flare tracked by the clinic, 50% of caregivers exceeded the caregiver burden score threshold used to determine respite need in care receiver adult populations. Longitudinally, flares, compared with quiescence, predicted increases in mean CBI score (6.6 points; 95% CI, 5.1 to 8.0). Each year established in clinic predicted decreased CBI score (-3.5 points per year; 95% CI, -2.3 to -4.6). Also, shorter time between PANS onset and entry into the multidisciplinary clinic predicted greater improvement in mean CBI score over time (0.7 points per year squared; 95% CI, 0.1 to 1.3). Time between PANS onset and treatment with antibiotics or immunomodulation did not moderate the relationship between CBI score and time in clinic. CONCLUSIONS: PANS caregivers suffer high caregiver burden. Neuropsychiatric disease severity predicts increased caregiver burden. Caregiver burden tends to decrease over time in a group of patients undergoing clinical treatment at a specialty PANS clinic. This decrease could be independent of clinical treatment.
View details for PubMedID 30549499
-
Response.
Medicine and science in sports and exercise
2018; 50 (12): 2611
View details for PubMedID 30431544
-
Dealing With Binary Repeated Measures Data
PM&R
2018; 10 (12): 1412–16
View details for PubMedID 30472244
-
Response.
Medicine and science in sports and exercise
2018
View details for PubMedID 30365418
-
A Checklist for Analyzing Data.
PM & R : the journal of injury, function, and rehabilitation
2018; 10 (9): 963–65
View details for PubMedID 30227966
-
Breastfeeding mitigates the effects of maternal HIV on infant infectious morbidity in the Option B+ era: A multicenter prospective cohort study.
AIDS (London, England)
2018
Abstract
OBJECTIVE: The effects of in-utero HIV-exposure on infectious morbidity and mortality in settings with universal maternal treatment and high breastfeeding rates are unclear. Further, the benefits of exclusive feeding options have not been assessed in the Option B+ era. We investigated these in two African settings with high breastfeeding uptake and good HIV treatment infrastructure during the first year of life. METHODS: Cox regression with time-changing variables in a birth cohort of 749 HIV-exposed uninfected and HIV-unexposed uninfected infants from Cape Town, South Africa and Jos, Nigeria. RESULTS: There was no difference in infectious morbidity incidence between HIV-exposed uninfected and HIV-unexposed uninfected infants (hazard ratio [HR], 1.01; 95% CI, 0.78-1.32) after adjusting for confounding variables. Formula-fed infants had significantly higher infectious morbidity incidence when compared with exclusively-breastfed infants ([HR], 1.64; 95% CI, 1.03-2.63) and mixed-breastfed infants ([HR], 1.42; 95% CI, 1.00-2.02) after adjusting for potential confounding variables. There was no significant difference in mortality among HIV-exposed infants and HIV-unexposed infants during the first year of life in this cohort (2.04% versus 0.94%, p-value = 0.38). Notably, exclusive breastfeeding for only 4 months had protective effects on morbidity up to 1 year. CONCLUSION: In settings with universal antiretroviral coverage and high breastfeeding rates, breastfeeding mitigates the effects of in-utero HIV exposure among infants during the first year of life. These findings support previous recommendations for exclusive breastfeeding among HIV-infected women and highlight the role that breastfeeding plays on the health of infants in settings where exclusive breastfeeding is not always feasible or where replacement feeding is recommended.
View details for DOI 10.1097/QAD.0000000000001974
View details for PubMedID 30134300
-
Sport and Triad Risk Factors Influence Bone Mineral Density in Collegiate Athletes.
Medicine and science in sports and exercise
2018
Abstract
PURPOSE: Athletes in weight bearing sports may benefit from higher bone mineral density (BMD). However, some athletes are at risk for impaired BMD with Female Athlete Triad (Triad). The purpose of this study is to understand the influence of sports participation and Triad on BMD. We hypothesize that athletes in high-impact and multi-directional loading sports will have highest BMD, whereas non-impact and low-impact sports will have lowest BMD. Triad risk factors are expected to reduce BMD values independent of sports participation. METHODS: 239 female athletes participating in 16 collegiate sports completed dual energy x-ray absorptiometry (DXA) scans to measure BMD Z-scores of the lumbar spine (LS) and total body (TB). Height and weight were measured to calculate body mass index (BMI). Triad risk assessment variables were obtained from preparticipation examination. Mean BMD Z-scores were compared between sports and by sport category (high-impact, multi-directional, low-impact, and non-impact). Multivariable regression analyses were performed to identify differences of BMD Z-scores accounting for Triad and body size/composition. RESULTS: Athlete populations with lowest average BMD Z-scores included synchronized swimming (LS:-0.34,TB:0.21), swimming/diving (LS:0.34,TB:-0.06), crew/rowing (LS:0.27,TB:0.62), and cross-country (LS:0.29,TB:0.91). Highest values were in gymnastics (LS:1.96,TB:1.37), volleyball (LS:1.90,TB:1.74), basketball (LS:1.73,TB:1.99), and softball (LS:1.68,TB:1.78). All Triad risk factors were associated with lower BMD Z-scores in univariable analyses; only low BMI and oligomenorrhea/amenorrhea were associated in multivariable analyses (all P<0.05). Accounting for Triad risk factors and body size/composition, high-impact sports were associated with higher LS and TB BMD Z-scores and non-impact sports with lower LS and TB BMD Z-scores compared to low-impact sport (all P<0.05). CONCLUSION: Both sport type and Triad risk factors influence BMD. Athletes in low-impact and non-impact sports and athletes with low BMI and oligomenorrhea/amenorrhea are at highest risk for reduced BMD.
View details for PubMedID 29975299
-
Re: Sainani K. Interpreting "null" results Reply
PM&R
2018; 10 (5): 563
View details for PubMedID 29776490
-
Youth Multi-sport Participation Is Associated With Higher Bone Mineral Density In Female Collegiate Distance Runners
LIPPINCOTT WILLIAMS & WILKINS. 2018: 490
View details for Web of Science ID 000456870502137
-
The Problem with "Magnitude-Based Inference".
Medicine and science in sports and exercise
2018
Abstract
PURPOSE: A statistical method called "Magnitude-Based Inference" (MBI) has gained a following in the sports science literature, despite concerns voiced by statisticians. Its proponents have claimed that MBI exhibits superior Type I and Type II error rates compared with standard null hypothesis testing for most cases. I have performed a re-analysis to evaluate this claim. METHODS: Using simulation code provided by MBI's proponents, I estimated Type I and Type II error rates for clinical and non-clinical MBI for a range of effect sizes, sample sizes, and smallest important effects. I plotted these results in a way that makes transparent the empirical behavior of MBI. I also re-ran the simulations after correcting mistakes in the definitions of Type I and Type II error provided by MBI's proponents. Finally, I confirmed the findings mathematically; and I provide general equations for calculating MBI's error rates without the need for simulation. RESULTS: Contrary to what MBI's proponents have claimed, MBI does not exhibit "superior" Type I and Type II error rates to standard null hypothesis testing. As expected, there is a tradeoff between Type I and Type II error. At precisely the small-to-moderate sample sizes that MBI's proponents deem "optimal," MBI reduces the Type II error rate at the cost of greatly inflating the Type I error rate, to two to six times that of standard hypothesis testing. CONCLUSIONS: MBI exhibits worrisome empirical behavior. In contrast to standard null hypothesis testing, which has predictable Type I error rates, the Type I error rates for MBI vary widely depending on the sample size and choice of smallest important effect, and are often unacceptably high. MBI should not be used. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CCBY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.
View details for PubMedID 29683920
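For readers unfamiliar with the simulation approach described above, the sketch below estimates the Type I error rate of a standard two-sample t-test by repeatedly simulating data under the null hypothesis and counting false positives. It is a generic illustration of the method, not the simulation code referenced in the paper, and it does not reproduce MBI's own decision rules; sample sizes and the number of simulations are arbitrary.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_per_group, n_sims, alpha = 10, 20_000, 0.05

    # Both groups are drawn from the same distribution (true effect = 0),
    # so every rejection is a false positive (Type I error).
    rejections = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(0.0, 1.0, n_per_group)
        _, p = stats.ttest_ind(a, b)
        rejections += p < alpha

    print(f"Estimated Type I error rate: {rejections / n_sims:.3f}")  # close to 0.05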
-
Instrumental Variables: Uses and Limitations
PM&R
2018; 10 (3): 303–8
View details for PubMedID 29551169
-
Type of Sports Participation Modulates Risk For Low BMD in Athletes With Female Athlete Triad
WILEY. 2017: S102
View details for Web of Science ID 000418869201110
-
Getting the Right Answer: Four Statistical Principles.
PM & R : the journal of injury, function, and rehabilitation
2017; 9 (9): 933-937
View details for DOI 10.1016/j.pmrj.2017.06.015
View details for PubMedID 28895858
-
Semantic Memory in the Clinical Progression of Alzheimer Disease
COGNITIVE AND BEHAVIORAL NEUROLOGY
2017; 30 (3): 81–89
Abstract
Semantic memory measures may be useful in tracking and predicting progression of Alzheimer disease. We investigated relationships among semantic memory tasks and their 1-year predictive value in women with Alzheimer disease. We conducted secondary analyses of a randomized clinical trial of raloxifene in 42 women with late-onset mild-to-moderate Alzheimer disease. We assessed semantic memory with tests of oral confrontation naming, category fluency, semantic recognition and semantic naming, and semantic density in written narrative discourse. We measured global cognition (Alzheimer Disease Assessment Scale, cognitive subscale), dementia severity (Clinical Dementia Rating sum of boxes), and daily function (Activities of Daily Living Inventory) at baseline and 1 year. At baseline and 1 year, most semantic memory scores correlated highly or moderately with each other and with global cognition, dementia severity, and daily function. Semantic memory task performance at 1 year had worsened one-third to one-half standard deviation. Factor analysis of baseline test scores distinguished processes in semantic and lexical retrieval (semantic recognition, semantic naming, confrontation naming) from processes in lexical search (semantic density, category fluency). The semantic-lexical retrieval factor predicted global cognition at 1 year. Considered separately, baseline confrontation naming and category fluency predicted dementia severity, while semantic recognition and a composite of semantic recognition and semantic naming predicted global cognition. No individual semantic memory test predicted daily function. Semantic-lexical retrieval and lexical search may represent distinct aspects of semantic memory. Semantic memory processes are sensitive to cognitive decline and dementia severity in Alzheimer disease.
View details for PubMedID 28926415
View details for PubMedCentralID PMC5617354
-
Reliability Statistics.
PM & R : the journal of injury, function, and rehabilitation
2017; 9 (6): 622-628
View details for DOI 10.1016/j.pmrj.2017.05.001
View details for PubMedID 28602174
-
Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.
JAMA internal medicine
2017
Abstract
Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
View details for DOI 10.1001/jamainternmed.2016.9125
View details for PubMedID 28192563
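The corroboration analysis described above re-tests heterogeneity by pooling the subgroup-specific estimates from all available trials with Cochran's Q. The sketch below shows that calculation from effect sizes and standard errors; the numbers are invented for illustration and are not data from the survey.

    import numpy as np
    from scipy import stats

    # Hypothetical log risk ratios and standard errors for one subgroup contrast
    # estimated in several trials.
    effects = np.array([0.40, 0.10, -0.05, 0.22])
    ses = np.array([0.20, 0.15, 0.18, 0.25])

    weights = 1.0 / ses**2                                  # inverse-variance weights
    pooled = np.sum(weights * effects) / np.sum(weights)    # fixed-effect pooled estimate
    q = np.sum(weights * (effects - pooled) ** 2)           # Cochran's Q statistic
    p_value = stats.chi2.sf(q, df=len(effects) - 1)         # heterogeneity P value

    print(f"Pooled effect {pooled:.3f}, Q = {q:.2f}, P = {p_value:.3f}")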
-
Association of the Female Athlete Triad Risk Assessment Stratification to the Development of Bone Stress Injuries in Collegiate Athletes.
American journal of sports medicine
2017; 45 (2): 302-310
Abstract
The female athlete triad (referred to as the triad) contributes to adverse health outcomes, including bone stress injuries (BSIs), in female athletes. Guidelines were published in 2014 for clinical management of athletes affected by the triad. This study aimed to (1) classify athletes from a collegiate population of 16 sports into low-, moderate-, and high-risk categories using the Female Athlete Triad Cumulative Risk Assessment score and (2) evaluate the predictive value of the risk categories for subsequent BSIs. Cohort study; Level of evidence, 3. A total of 323 athletes completed both electronic preparticipation physical examination and dual-energy x-ray absorptiometry scans. Of these, 239 athletes with known oligomenorrhea/amenorrhea status were assigned to a low-, moderate-, or high-risk category. Chart review was used to identify athletes who sustained a subsequent BSI during collegiate sports participation; the injury required a physician diagnosis and imaging confirmation. Of 239 athletes, 61 (25.5%) were classified into moderate-risk and 9 (3.8%) into high-risk categories. Sports with the highest proportion of athletes assigned to the moderate- and high-risk categories included gymnastics (56.3%), lacrosse (50%), cross-country (48.9%), swimming/diving (42.9%), sailing (33%), and volleyball (33%). Twenty-five athletes (10.5%) assigned to risk categories sustained ≥1 BSI. Cross-country runners contributed the majority of BSIs (16; 64%). After adjusting for age and participation in cross-country, we found that moderate-risk athletes were twice as likely as low-risk athletes to sustain a BSI (risk ratio [RR], 2.6; 95% confidence interval [95% CI], 1.3-5.5) and high-risk athletes were nearly 4 times as likely (RR, 3.8; 95% CI, 1.8-8.0). When examining the 6 individual components of the triad risk assessment score, both the oligomenorrhea/amenorrhea score (P = .0069) and the prior stress fracture/reaction score (P = .0315) were identified as independent predictors for subsequent BSIs (after adjusting for cross-country participation and age). Using published guidelines, 29% of female collegiate athletes in this study were classified into moderate- or high-risk categories using the Female Athlete Triad Cumulative Risk Assessment Score. Moderate- and high-risk athletes were more likely to subsequently sustain a BSI; most BSIs were sustained by cross-country runners.
View details for DOI 10.1177/0363546516676262
View details for PubMedID 28038316
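The risk ratios quoted above are adjusted estimates; as a point of reference, the sketch below shows the unadjusted calculation of a risk ratio and its large-sample 95% confidence interval from a 2x2 table. The counts are hypothetical, not the study's data.

    import numpy as np

    # Hypothetical counts: bone stress injuries among moderate-risk vs. low-risk athletes.
    events_exposed, n_exposed = 10, 60     # moderate-risk group
    events_ref, n_ref = 12, 170            # low-risk (reference) group

    rr = (events_exposed / n_exposed) / (events_ref / n_ref)

    # Standard error of log(RR) (Katz method), then a 95% CI on the ratio scale.
    se_log_rr = np.sqrt(1/events_exposed - 1/n_exposed + 1/events_ref - 1/n_ref)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)

    print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")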
-
The Value of Scatter Plots
PM&R
2016; 8 (12): 1213-1217
View details for DOI 10.1016/j.pmrj.2016.10.018
View details for Web of Science ID 000391085900010
View details for PubMedID 27989418
-
Introduction to Survival Analysis.
PM & R : the journal of injury, function, and rehabilitation
2016; 8 (6): 580-5
View details for DOI 10.1016/j.pmrj.2016.04.003
View details for PubMedID 27297490
-
Supply and Perceived Demand for Teleophthalmology in Triage and Consultations in California Emergency Departments
JAMA OPHTHALMOLOGY
2016; 134 (5): 537-543
Abstract
Determining the perceived supply and potential demand for teleophthalmology in emergency departments could help mitigate coverage gaps in emergency ophthalmic care. To evaluate the perceived current need for and availability of ophthalmologist coverage in California emergency departments and the potential effect of telemedicine for ophthalmology triage and consultation. Surveys were remotely administered to 187 of the 254 emergency departments throughout California via the telephone and Internet from June 30 to September 23, 2014. Emergency department nurse managers and physicians from all emergency departments listed in the California Office of Statewide Health Planning and Development database were individually surveyed to assess facility characteristics and resources as well as the perceived usefulness of teleophthalmology consultation. Data analysis was conducted from June 30, 2014, to March 11, 2015. Perceived availability of ophthalmology consultation coverage and perceived effect of telemedicine ophthalmology consultation at each facility. Of the 187 emergency departments surveyed, 18 of 37 rural facilities (48.6%) reported availability of emergency ophthalmology coverage, compared with 112 of 150 nonrural facilities (74.7%). Rural facilities reported a mean (SD) of 23.72 (14.15) miles between the facility and referral location, while nonrural facilities reported a mean of 4.41 (10.23) miles (19.3% difference). On a scale of 1 to 5 (where 1 signifies very low value and 5 signifies very high value), 124 of 187 nurse managers (66.3%) and 80 of 121 physicians (66.1%) rated teleophthalmology as having high or very high value for triage purposes. The most frequently cited potential advantage of emergency teleophthalmology was assistance in patient triage and immediate real-time electronic communication, and the most frequently cited potential disadvantages were unknown cost of contracting and maintenance and concern that eye trauma might make photographs or videos less conclusive. Availability of ophthalmology coverage for emergency eye care is limited, particularly among rural emergency departments in California. Surveyed emergency department nurse managers and physicians indicated moderately high interest and perceived value for a teleophthalmology solution for remote triage and consultation. Overall, the study suggests that teleophthalmology could play a role in mitigating coverage gaps in emergency ophthalmic care and could be further investigated through similar studies in other regions.
View details for DOI 10.1001/jamaophthalmol.2016.0316
View details for Web of Science ID 000375796100016
View details for PubMedID 27010537
-
Raloxifene for women with Alzheimer disease: A randomized controlled pilot trial.
Neurology
2015; 85 (22): 1937-1944
Abstract
To determine whether raloxifene, a selective estrogen receptor modulator, improves cognitive function compared with placebo in women with Alzheimer disease (AD) and to provide an estimate of cognitive effect. This pilot study was conducted as a randomized, double-blind, placebo-controlled trial, with a planned treatment of 12 months. Women with late-onset AD of mild to moderate severity were randomly allocated to high-dose (120 mg) oral raloxifene or identical placebo provided once daily. The primary outcome compared between treatment groups at 12 months was change in the Alzheimer's Disease Assessment Scale, cognitive subscale (ADAS-cog). Forty-two women randomized to raloxifene or placebo were included in intent-to-treat analyses (mean age 76 years, range 68-84), and 39 women contributed 12-month outcomes. ADAS-cog change scores at 12 months did not differ significantly between treatment groups (standardized difference 0.03, 95% confidence interval -0.39 to 0.44, 2-tailed p = 0.89). Raloxifene and placebo groups did not differ significantly on secondary analyses of dementia rating, activities of daily living, behavior, or a global cognition composite score. Caregiver burden and caregiver distress were similar in both groups. Results on the primary outcome showed no cognitive benefits in the raloxifene-treated group. This study provides Class I evidence that for women with AD, raloxifene does not have a significant cognitive effect. The study lacked the precision to exclude a small effect.
View details for DOI 10.1212/WNL.0000000000002171
View details for PubMedID 26537053
View details for PubMedCentralID PMC4664126
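The primary outcome above is reported as a standardized difference (a between-group difference in change scores divided by the pooled standard deviation) with a 95% confidence interval. A minimal sketch of that computation, using invented change scores rather than trial data:

    import numpy as np

    # Hypothetical 12-month ADAS-cog change scores per treatment group.
    raloxifene = np.array([2.1, -1.0, 3.5, 0.0, 4.2, -2.3, 1.8, 2.9])
    placebo = np.array([1.9, 0.5, 2.8, -1.5, 3.7, 0.2, 2.4, 1.1])

    n1, n2 = len(raloxifene), len(placebo)
    pooled_sd = np.sqrt(((n1 - 1) * raloxifene.var(ddof=1) +
                         (n2 - 1) * placebo.var(ddof=1)) / (n1 + n2 - 2))
    d = (raloxifene.mean() - placebo.mean()) / pooled_sd   # standardized difference

    # Approximate large-sample standard error of d and a 95% CI.
    se_d = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    print(f"Standardized difference {d:.2f} "
          f"(95% CI {d - 1.96 * se_d:.2f} to {d + 1.96 * se_d:.2f})")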
-
What is Computer Simulation?
PM & R : the journal of injury, function, and rehabilitation
2015; 7 (12): 1290-1293
View details for DOI 10.1016/j.pmrj.2015.10.010
View details for PubMedID 26597107
-
Obstructive Sleep Apnea Is an Independent Predictor of Postoperative Atrial Fibrillation in Cardiac Surgery
JOURNAL OF CARDIOTHORACIC AND VASCULAR ANESTHESIA
2015; 29 (5): 1140-1147
Abstract
To test the hypothesis that obstructive sleep apnea (OSA) is a risk factor for development of postoperative atrial fibrillation (POAF) after cardiac surgery. Retrospective analysis. Single-center university hospital. Five hundred forty-five patients in sinus rhythm preoperatively undergoing coronary artery bypass grafting (CABG), aortic valve replacement, mitral valve replacement/repair, or combined valve/CABG surgery from January 2008 to April 2011. Retrospective review of medical records. Postoperative atrial fibrillation was defined as atrial fibrillation requiring therapeutic intervention. Of 545 cardiac surgical patients, 226 (41%) patients developed POAF. The risk was higher in 72 OSA patients than 473 patients without OSA (67% v 38%, adjusted hazard ratio 1.83 [95% CI: 1.30-2.58], p<0.001). Of the 32 OSA patients who used home positive airway pressure (PAP) therapy, 18 (56%) developed POAF compared with 29 of 38 (76%) patients who did not use PAP at home (unadjusted hazard ratio 0.63 [95% CI: 0.35-1.15], p = 0.13). OSA is significantly associated with POAF in cardiac surgery patients. Further investigation is needed to determine whether or not use of positive airway pressure in OSA patients reduces the risk of POAF.
View details for DOI 10.1053/j.jvca.2015.03.024
View details for PubMedID 26154572
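The adjusted hazard ratio above comes from a time-to-event model; the sketch below shows the general shape of a Cox proportional hazards fit with the lifelines package. The data frame and covariates are hypothetical placeholders, not the study's variables or adjustment set.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical data: days from surgery to POAF (or censoring), event indicator,
    # OSA status, and one adjustment covariate.
    df = pd.DataFrame({
        "days_to_event": [2, 5, 30, 3, 30, 7, 30, 4, 30, 6],
        "poaf": [1, 1, 0, 1, 0, 1, 0, 1, 0, 1],
        "osa": [1, 1, 0, 1, 0, 1, 0, 0, 0, 1],
        "age": [68, 72, 55, 70, 60, 66, 58, 74, 62, 69],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="days_to_event", event_col="poaf")
    print(cph.hazard_ratios_)   # exp(coef) for osa and age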
-
Dealing With Missing Data.
PM & R : the journal of injury, function, and rehabilitation
2015; 7 (9): 990-994
View details for DOI 10.1016/j.pmrj.2015.07.011
View details for PubMedID 26388026
-
Identifying Sex-Specific Risk Factors for Low Bone Mineral Density in Adolescent Runners
AMERICAN JOURNAL OF SPORTS MEDICINE
2015; 43 (6): 1494-1504
Abstract
Adolescent runners may be at risk for low bone mineral density (BMD) associated with sports participation. Few prior investigations have evaluated bone health in young runners, particularly males. To characterize sex-specific risk factors for low BMD in adolescent runners. Cross-sectional study; Level of evidence, 3. Training characteristics, fracture history, eating behaviors and attitudes, and menstrual history were measured using online questionnaires. A food frequency questionnaire was used to identify dietary patterns and measure calcium intake. Runners (female: n = 94, male: n = 42) completed dual-energy x-ray absorptiometry (DXA) to measure lumbar spine (LS) and total body less head (TBLH) BMD and body composition values, including android-to-gynoid (A:G) fat mass ratio. The BMD was standardized to Z-scores using age, sex, and race/ethnicity reference values. Questionnaire values were combined with DXA values to determine risk factors associated with differences in BMD Z-scores in LS and TBLH and low bone mass (defined as BMD Z-score ≤-1). In multivariable analyses, risk factors for lower LS BMD Z-scores in girls included lower A:G ratio, being shorter, and the combination of (interaction between) current menstrual irregularity and a history of fracture (all P < .01). Later age of menarche, lower A:G ratio, lower lean mass, and drinking less milk were associated with lower TBLH BMD Z-scores (P < .01). In boys, lower body mass index (BMI) Z-scores and the belief that being thinner improves performance were associated with lower LS and TBLH BMD Z-scores (all P < .05); lower A:G ratio was additionally associated with lower TBLH Z-scores (P < .01). Thirteen girls (14%) and 9 boys (21%) had low bone mass. Girls with a BMI ≤17.5 kg/m² or both menstrual irregularity and a history of fracture were significantly more likely to have low bone mass. Boys with a BMI ≤17.5 kg/m² and belief that thinness improves performance were significantly more likely to have low bone mass. This study identified sex-specific risk factors for impaired bone mass in adolescent runners. These risk factors can be helpful to guide sports medicine professionals in evaluation and management of young runners at risk for impaired bone health.
View details for DOI 10.1177/0363546515572142
View details for Web of Science ID 000355379200027
View details for PubMedID 25748470
-
Dealing With Longitudinal Data.
PM & R : the journal of injury, function, and rehabilitation
2015; 7 (6): 649-53
View details for DOI 10.1016/j.pmrj.2015.04.009
View details for PubMedID 25892355
-
Running habits of competitive runners during pregnancy and breastfeeding.
Sports health
2015; 7 (2): 172-176
Abstract
Running is a popular sport that may be performed safely during pregnancy. Few studies have characterized running behavior of competitive female runners during pregnancy and breastfeeding. Women modify their running behavior during pregnancy and breastfeeding. Observational, cross-sectional study. Level 2. One hundred ten female long-distance runners who ran competitively prior to pregnancy completed an online survey characterizing training attitudes and behaviors during pregnancy and postpartum. Seventy percent of runners ran some time during their pregnancy (or pregnancies), but only 31% ran during their third trimester. On average, women reduced training during pregnancy, including cutting their intensity to about half of their nonpregnant running effort. Only 3.9% reported sustaining a running injury while pregnant. Fewer than one third (29.9%) selected fetal health as a reason to continue running during pregnancy. Of the women who breastfed, 84.1% reported running during breastfeeding. Most felt that running had no effect on their ability to breastfeed. Women who ran during breastfeeding were less likely to report postpartum depression than those who did not run (6.7% vs 23.5%, P = 0.051), but we did not detect the same association of running during pregnancy (6.5% vs 15.2%, P = 0.16). Women runners reported a reduction in total training while pregnant, and few sustained running injuries during pregnancy. The effect of running on postpartum depression was not clear from our findings. We characterized running behaviors during pregnancy and breastfeeding in competitive runners. Most continue to run during pregnancy but reduce total training effort. Top reasons for running during pregnancy were fitness, health, and maintaining routine; the most common reason for not running was not feeling well. Most competitive runners run during breastfeeding with little perceived impact.
View details for DOI 10.1177/1941738114549542
View details for PubMedID 25984264
View details for PubMedCentralID PMC4332642
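The postpartum-depression comparison above (6.7% vs 23.5%, P = 0.051) is a comparison of two proportions in fairly small groups, for which an exact test is one common choice. The sketch below shows the call with a made-up 2x2 table; it is not a reconstruction of the study's analysis.

    from scipy.stats import fisher_exact

    # Hypothetical counts: [reported depression, did not] for women who ran
    # while breastfeeding vs. those who did not.
    table = [[3, 40],
             [5, 15]]

    odds_ratio, p_value = fisher_exact(table)
    print(f"Fisher's exact test: OR = {odds_ratio:.2f}, P = {p_value:.3f}")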
-
Participation in Ball Sports May Represent a Prehabilitation Strategy to Prevent Future Stress Fractures and Promote Bone Health in Young Athletes
PM&R
2015; 7 (2): 222-225
Abstract
Sports participation has many benefits for the young athlete, including improved bone health. However, a subset of athletes may attain suboptimal bone health and be at increased risk for stress fractures. This risk is greater for female than for male athletes. In healthy children, high-impact physical activity has been shown to improve bone health during growth and development. We offer our perspective on the importance of promoting high-impact, multidirectional loading activities, including ball sports, as a method of enhancing bone quality and fracture prevention based on collective research. Ball sports have been associated with greater bone mineral density and enhanced bone geometric properties compared with participation in repetitive, low-impact sports such as distance running or nonimpact sports such as swimming. Runners and infantry who participated in ball sports during childhood were at decreased risk of future stress fractures. Gender-specific differences, including the coexistence of female athlete triad, may negate the benefits of previous ball sports on fracture prevention. Ball sports involve multidirectional loading with high ground reaction forces that may result in stiffer and more fracture-resistant bones. Encouraging young athletes to participate in ball sports may optimize bone health in the setting of adequate nutrition and in female athletes, eumenorrhea. Future research to determine timing, frequency, and type of loading activity could result in a primary prevention program for stress fracture injuries and improved life-long bone health.
View details for DOI 10.1016/j.pmrj.2014.09.017
View details for Web of Science ID 000349995500019
View details for PubMedID 25499072
-
Logistic Regression
PM&R
2014; 6 (12): 1157-1162
View details for DOI 10.1016/j.pmrj.2014.10.006
View details for Web of Science ID 000346402700012
View details for PubMedID 25463689
-
Explanatory Versus Predictive Modeling
PM&R
2014; 6 (9): 841-844
View details for DOI 10.1016/j.pmrj.2014.08.941
View details for Web of Science ID 000342883000012
View details for PubMedID 25150778
-
Bonferroni, Holm, and Hochberg Corrections: Fun Names, Serious Changes to P Values
PM&R
2014; 6 (6): 544-546
View details for DOI 10.1016/j.pmrj.2014.04.006
View details for Web of Science ID 000337995200010
View details for PubMedID 24769263
-
Introduction to Principal Components Analysis
PM&R
2014; 6 (3): 275-278
View details for DOI 10.1016/j.pmrj.2014.02.001
View details for Web of Science ID 000333548000009
View details for PubMedID 24565515
-
Eczema and sensitization to common allergens in the United States: a multiethnic, population-based study.
Pediatric dermatology
2014; 31 (1): 21-26
Abstract
The relationship between food and environmental allergens in contributing to eczema risk is unclear on a multiethnic population level. Our purpose was to determine whether sensitization to specific dietary and environmental allergens as measured according to higher specific immunoglobulin E (IgE) levels is associated with eczema risk in children. National Health and Nutrition Examination Survey participants ages 1 to 17 years were asked whether they had ever received a diagnosis of eczema from a physician (n = 538). Total and specific serum IgE levels for four dietary allergens (egg, cow's milk, peanut, and shrimp) and five environmental allergens (dust mite, cat, dog, Aspergillus, and Alternaria) were measured. Logistic regression was used to examine the association between eczema and IgE levels. In the United States, 10.4 million children (15.6%) have a history of eczema. Eczema was more common in black children (p < 0.001) and in children from families with higher income and education (p = 0.01). The median total IgE levels were higher in children with a history of eczema than in those without (66.4 vs 50.6 kU/L, p = 0.004). In multivariate analysis adjusted for age, race, sex, family income, household education, and physician-diagnosed asthma, eczema was significantly associated with sensitization to cat dander (odds ratio [OR] = 1.2, 95% confidence interval [CI] 1.05, 1.4, p = 0.009) and dog dander (OR = 1.5, 95% CI, 1.2, 1.7, p < 0.001). After correction for multiple comparisons, only sensitization to dog dander remained significant. U.S. children with eczema are most likely to be sensitized to dog dander. Future prospective studies should further explore this relationship.
View details for DOI 10.1111/pde.12237
View details for PubMedID 24283549
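A minimal sketch of the kind of logistic regression described above, relating an eczema indicator to a sensitization measure with adjustment covariates, and reporting exponentiated coefficients as odds ratios. Column names and data are simulated placeholders, and the NHANES survey weights used in the published analysis are omitted here.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated analysis data set (placeholder values only).
    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "eczema": rng.integers(0, 2, n),
        "log_ige_dog": rng.normal(0.0, 1.0, n),  # e.g., log-transformed specific IgE
        "age": rng.integers(1, 18, n),
        "male": rng.integers(0, 2, n),
    })

    model = smf.logit("eczema ~ log_ige_dog + age + male", data=df).fit(disp=False)
    odds_ratios = np.exp(model.params)       # exponentiated coefficients = odds ratios
    ci = np.exp(model.conf_int())            # 95% CIs on the odds-ratio scale
    print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))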
-
Understanding Linear Regression
PM&R
2013; 5 (12): 1063-1068
View details for DOI 10.1016/j.pmrj.2013.10.002
View details for Web of Science ID 000328795700009
View details for PubMedID 24140739
-
Higher Caloric Intake in Hospitalized Adolescents With Anorexia Nervosa Is Associated With Reduced Length of Stay and No Increased Rate of Refeeding Syndrome
JOURNAL OF ADOLESCENT HEALTH
2013; 53 (5): 573-578
Abstract
To determine the effect of higher caloric intake on weight gain, length of stay (LOS), and incidence of hypophosphatemia, hypomagnesemia, and hypokalemia in adolescents hospitalized with anorexia nervosa. Electronic medical records of all subjects 10-21 years of age with anorexia nervosa, first admitted to a tertiary children's hospital from Jan 2007 to Dec 2011, were retrospectively reviewed. Demographic factors, anthropometric measures, incidence of hypophosphatemia (≤3.0 mg/dL), hypomagnesemia (≤1.7 mg/dL), and hypokalemia (≤3.5 mEq/L), and daily change in percent median body mass index (BMI) (%mBMI) from baseline were recorded. Subjects started on higher-calorie diets (≥1,400 kcal/d) were compared with those started on lower-calorie diets (<1,400 kcal/d). A total of 310 subjects met eligibility criteria (age, 16.1 ± 2.3 years; 88.4% female, 78.5 ± 8.3 %mBMI), including 88 in the lower-calorie group (1,163 ± 107 kcal/d; range, 720-1,320 kcal/d) and 222 in the higher-calorie group (1,557 ± 265 kcal/d; range, 1,400-2,800 kcal/d). Neither group had initial weight loss. The %mBMI increased significantly (p < .001) from baseline by day 1 in the higher-calorie group and day 2 in the lower-calorie group. Compared with the lower-calorie group, the higher-calorie group had reduced LOS (13.0 ± 7.3 days versus 16.6 ± 9.0 days; p < .0001), but the groups did not differ in rate of change in %mBMI (p = .50) or rates of hypophosphatemia (p = .49), hypomagnesemia (p = 1.0), or hypokalemia (p = .35). Hypophosphatemia was associated with %mBMI on admission (p = .004) but not caloric intake (p = .14). A higher caloric diet on admission is associated with reduced LOS, but not increased rate of weight gain or rates of hypophosphatemia, hypomagnesemia, or hypokalemia. Refeeding hypophosphatemia depends on the degree of malnutrition but not prescribed caloric intake, within the range studied.
View details for DOI 10.1016/j.jadohealth.2013.05.014
View details for PubMedID 23830088
-
Identifying sex-specific risk factors for stress fractures in adolescent runners.
Medicine and science in sports and exercise
2013; 45 (10): 1843-1851
Abstract
PURPOSE: Adolescent females and males participating in running represent a population at high risk of stress fracture. Few investigators have evaluated risk factors for prospective stress fracture in this population. METHODS: To better characterize risk factors for and incidence of stress fractures in this population, we collected baseline risk factor data on 748 competitive high school runners (442 girls and 306 boys) using an online survey. We then followed them prospectively for the development of stress fractures for an average of 2.3±1.2 total seasons of cross-country and track and field; follow-up data were available for 428 girls and 273 boys. RESULTS: We identified prospective stress fractures in 5.4% of girls (N=23) and 4.0% of boys (N=11). Tibial stress fractures were most common in girls, and the metatarsus was most frequently fractured in boys. Multivariate regression identified four independent risk factors for stress fractures in girls: prior fracture, BMI <19, late menarche (age menarche ≥15 years), and previous participation in gymnastics or dance. For boys, prior fracture and increased number of seasons were associated with an increased rate of stress fractures, whereas prior participation in basketball was associated with a decreased risk of stress fractures. CONCLUSION: Prior fracture represents the most robust predictor of stress fractures in both sexes. Low BMI, late menarche, and prior participation in gymnastics and dance are identifiable risk factors for stress fractures in girls. Participation in basketball appears protective in boys and may represent a modifiable risk factor for stress fractures. These findings may help guide future translational research and clinical care in the management and prevention of stress fractures in young runners.
View details for DOI 10.1249/MSS.0b013e3182963d75
View details for PubMedID 23584402
-
Multivariate Regression: The Pitfalls of Automated Variable Selection
PM&R
2013; 5 (9): 791-794
View details for DOI 10.1016/j.pmrj.2013.07.007
View details for Web of Science ID 000325041600008
View details for PubMedID 24054854
-
Interpreting "Null" Results
PM&R
2013; 5 (6): 520-523
View details for DOI 10.1016/j.pmrj.2013.05.003
View details for Web of Science ID 000321156400011
View details for PubMedID 23790820
-
Avoiding Careless Errors: Know Your Data
PM&R
2013; 5 (3): 228-229
View details for DOI 10.1016/j.pmrj.2013.01.012
View details for Web of Science ID 000316433300010
View details for PubMedID 23481330
-
Dealing With Non-normal Data
PM&R
2012; 4 (12): 1001-1005
View details for DOI 10.1016/j.pmrj.2012.10.013
View details for Web of Science ID 000313092100009
View details for PubMedID 23245662
-
Propensity Scores: Uses and Limitations
PM&R
2012; 4 (9): 693-697
View details for DOI 10.1016/j.pmrj.2012.07.002
View details for Web of Science ID 000309436500008
View details for PubMedID 22980422
-
Obesity and the relationship between pre-hypertension and chronic kidney disease: can we really isolate the effect of pre-hypertension?
KIDNEY INTERNATIONAL
2012; 82 (4): 489-489
View details for DOI 10.1038/ki.2012.144
View details for Web of Science ID 000307078000017
View details for PubMedID 22846814
-
How statistics can mislead.
American journal of public health
2012; 102 (8): e3-4
View details for DOI 10.2105/AJPH.2012.300697
View details for PubMedID 22698033
-
Clinical Versus Statistical Significance
PM&R
2012; 4 (6): 442-445
View details for DOI 10.1016/j.pmrj.2012.04.014
View details for Web of Science ID 000306038700008
View details for PubMedID 22732155
-
Communicating Risks Clearly: Absolute Risk and Number Needed to Treat
PM&R
2012; 4 (3): 220-222
View details for DOI 10.1016/j.pmrj.2012.01.001
View details for Web of Science ID 000305438500008
View details for PubMedID 22443959
-
Sun protective behaviors and vitamin D levels in the US population: NHANES 2003-2006
CANCER CAUSES & CONTROL
2012; 23 (1): 133-140
Abstract
Sun protection is recommended for skin cancer prevention, yet little is known about the role of sun protection on vitamin D levels. Our aim was to investigate the relationship between different types of sun protective behaviors and serum 25(OH)D levels in the general US population. Cross-sectional, nationally representative survey of 5,920 adults aged 18-60 years in the US National Health and Nutrition Examination Survey 2003-2006. We analyzed questionnaire responses on sun protective behaviors: staying in the shade, wearing long sleeves, wearing a hat, using sunscreen and SPF level. Analyses were adjusted for multiple confounders of 25(OH)D levels and stratified by race. Our primary outcome measures were serum 25(OH)D levels (ng/ml) measured by radioimmunoassay and vitamin D deficiency, defined as 25(OH)D levels <20 ng/ml. Staying in the shade and wearing long sleeves were significantly associated with lower 25(OH)D levels. Subjects who reported frequent use of shade on a sunny day had -3.5 ng/ml (p (trend) < 0.001) lower 25(OH)D levels compared to subjects who reported rare use. Subjects who reported frequent use of long sleeves had -2.2 ng/ml (p (trend) = 0.001) lower 25(OH)D levels. These associations were strongest for whites, and did not reach statistical significance among Hispanics or blacks. White participants who reported frequently staying in the shade or wearing long sleeves had double the odds of vitamin D deficiency compared with those who rarely did so. Neither wearing a hat nor using sunscreen was associated with low 25(OH)D levels or vitamin D deficiency. White individuals who protect themselves from the sun by seeking shade or wearing long sleeves may have lower 25(OH)D levels and be at risk for vitamin D deficiency. Frequent sunscreen use does not appear to be linked to vitamin D deficiency in this population.
View details for DOI 10.1007/s10552-011-9862-0
View details for PubMedID 22045154
-
Reliability and prevalence of digital image skin types in the United States: Results from National Health and Nutrition Examination Survey 2003-2004
JOURNAL OF THE AMERICAN ACADEMY OF DERMATOLOGY
2012; 66 (1): 163-165
View details for DOI 10.1016/j.jaad.2011.02.044
View details for Web of Science ID 000298712100031
View details for PubMedID 22177642
-
A Closer Look at Confidence Intervals
PM&R
2011; 3 (12): 1134-1141
View details for DOI 10.1016/j.pmrj.2011.10.005
View details for Web of Science ID 000305872700010
View details for PubMedID 22192323
-
The Limitations of Statistical Adjustment
PM&R
2011; 3 (9): 868-872
View details for DOI 10.1016/j.pmrj.2011.06.006
View details for Web of Science ID 000305438100011
View details for PubMedID 21944304
-
Understanding Study Design
PM&R
2011; 3 (6): 573-577
View details for DOI 10.1016/j.pmrj.2011.04.001
View details for Web of Science ID 000305437700010
View details for PubMedID 21665169
-
Understanding Odds Ratios
PM&R
2011; 3 (3): 263-267
View details for DOI 10.1016/j.pmrj.2011.01.009
View details for Web of Science ID 000305437400009
View details for PubMedID 21402371
-
Comparative Profiling of Primary Colorectal Carcinomas and Liver Metastases Identifies LEF1 as a Prognostic Biomarker
PLOS ONE
2011; 6 (2)
Abstract
We sought to identify genes of clinical significance to predict survival and the risk for colorectal liver metastasis (CLM), the most common site of metastasis from colorectal cancer (CRC). We profiled gene expression in 31 specimens from primary CRC and 32 unmatched specimens of CLM, and performed Significance Analysis of Microarrays (SAM) to identify genes differentially expressed between these two groups. To characterize the clinical relevance of two highly-ranked differentially-expressed genes, we analyzed the expression of secreted phosphoprotein 1 (SPP1 or osteopontin) and lymphoid enhancer factor-1 (LEF1) by immunohistochemistry using a tissue microarray (TMA) representing an independent set of 154 patients with primary CRC. Supervised analysis using SAM identified 963 genes with significantly higher expression in CLM compared to primary CRC, with a false discovery rate of <0.5%. TMA analysis showed SPP1 and LEF1 protein overexpression in 60% and 44% of CRC cases, respectively. Subsequent occurrence of CLM was significantly correlated with the overexpression of LEF1 (chi-square p = 0.042), but not SPP1 (p = 0.14). Kaplan Meier analysis revealed significantly worse survival in patients with overexpression of LEF1 (p<0.01), but not SPP1 (p = 0.11). Both univariate and multivariate analyses identified stage (p<0.0001) and LEF1 overexpression (p<0.05) as important prognostic markers, but not tumor grade or SPP1. Among genes differentially expressed between CLM and primary CRC, we demonstrate overexpression of LEF1 in primary CRC to be a prognostic factor for poor survival and increased risk for liver metastasis.
View details for DOI 10.1371/journal.pone.0016636
View details for Web of Science ID 000287761700013
View details for PubMedID 21383983
View details for PubMedCentralID PMC3044708
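The survival comparison above uses Kaplan-Meier curves and a log-rank test stratified by LEF1 status. The sketch below shows the general pattern with the lifelines package; the data frame and values are hypothetical stand-ins, not the study's data.

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical follow-up times (months), death indicator, and marker status.
    df = pd.DataFrame({
        "months": [12, 30, 45, 7, 60, 22, 15, 50, 9, 36],
        "died": [1, 0, 0, 1, 0, 1, 1, 0, 1, 0],
        "lef1_hi": [1, 0, 0, 1, 0, 1, 1, 0, 1, 0],
    })
    hi, lo = df[df.lef1_hi == 1], df[df.lef1_hi == 0]

    # Kaplan-Meier curve for the marker-positive group.
    kmf = KaplanMeierFitter()
    kmf.fit(hi["months"], event_observed=hi["died"], label="LEF1 overexpressed")
    print(kmf.survival_function_)

    # Log-rank test comparing the two groups.
    result = logrank_test(hi["months"], lo["months"],
                          event_observed_A=hi["died"], event_observed_B=lo["died"])
    print(f"Log-rank P = {result.p_value:.3f}")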
-
Overuse Injuries in High School Runners: Lifetime Prevalence and Prevention Strategies
PM&R
2011; 3 (2): 125-131
Abstract
To evaluate lifetime prevalence and risk factors for overuse injuries in high school athletes currently participating in long-distance running and provide recommendations for injury prevention strategies. Retrospective study design. Twenty-eight high schools in the San Francisco Bay Area. A total of 442 female and 306 male athletes, ages 13-18 years, who are on cross-country and track and field teams. Online survey with questions that detailed previous injuries sustained and risk factors for injury. Previous overuse injuries and association of risk factors to injury (including training variables, dietary patterns, and, in girls, menstrual irregularities). Previous injuries were reported by 68% of female subjects and 59% of male subjects. More injury types were seen in girls (1.2 ± 1.1 versus 1.0 ± 1.0, P < .01). Both genders had similar participation in running (2.5 ± 2.2 versus 2.3 ± 2.1 years), and previous injury prevalence followed a similar pattern: tibial stress injury (girls, 41%; boys, 34%), ankle sprain (girls, 32%; boys, 28%), patellofemoral pain (girls, 21%; boys, 16%), Achilles tendonitis (girls, 9%; boys, 6%), iliotibial band syndrome (girls, 7%; boys, 5%), and plantar fasciitis (girls, 5%; boys, 3%). Higher weekly mileage was associated with previous injuries in boys (17.1 ± 11.9 versus 14.1 ± 11.5, P < .05) but not in girls (14.4 ± 10.2 versus 12.6 ± 11.8, not significant). A strong association between higher mileage and faster performances was seen in both groups. No association between previous injury and current dietary patterns (including disordered eating and calcium intake) or menstrual irregularities was seen. The majority of athletes currently participating in high school cross-country and track and field have a history of sustaining an overuse injury, with girls having a higher prevalence of injury. A modest mileage reduction may represent a modifiable risk factor for injury reduction. Future research is needed to evaluate the effects of incorporating a comprehensive strength training program on the prospective development of overuse injury and performance in this population.
View details for DOI 10.1016/j.pmrj.2010.09.009
View details for Web of Science ID 000305437300006
View details for PubMedID 21333951
-
Evaluating the relationship of calcium and vitamin D in the prevention of stress fracture injuries in the young athlete: a review of the literature.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (10): 945-949
Abstract
Calcium and vitamin D are recognized as 2 components of nutrition needed to achieve and maintain bone health. Calcium and vitamin D have been clearly shown to improve bone density and prevent fractures at all ages. However, the literature is conflicting as to the role of these nutrients in young athletes ages 18 to 35 years, both for bone development and for the prevention of bone overuse injuries. Differences in findings may relate to study design. Although retrospective and cross-sectional studies have had mixed results, the authors of prospective studies have consistently demonstrated a relationship of increased calcium intake with an improvement in bone density and a decrease in fracture risk. A randomized trial in female military recruits demonstrated that calcium/vitamin D supplementation reduced the incidence of stress fractures. A prospective study in young female runners demonstrated reduced incidence of stress fractures and increased bone mineral density with increased dietary calcium intake. Findings from both studies suggest female athletes and military recruits who consumed greater than 1500 mg of calcium daily exhibited the largest reduction in stress fracture injuries. To date, no prospective studies have been conducted in male athletes or in adolescent athletes. In most studies, males and nonwhite participants were poorly represented. Evidence regarding the relationship of vitamin D intake with the prevention of fractures in athletes is also limited. More prospective studies are needed to evaluate the role of calcium and vitamin D intake in prevention of stress fracture injuries in both male and female adolescent athletes, particularly those participating in sports with greater incidences of stress fracture injury.
View details for DOI 10.1016/j.pmrj.2010.05.006
View details for PubMedID 20970764
View details for Web of Science ID 000208361700010
-
The importance of accounting for correlated observations.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (9): 858-861
View details for DOI 10.1016/j.pmrj.2010.07.482
View details for PubMedID 20869686
View details for Web of Science ID 000208412400011
-
Nutritional factors that influence change in bone density and stress fracture risk among young female cross-country runners.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (8): 740-750
Abstract
To identify nutrients, foods, and dietary patterns associated with stress fracture risk and changes in bone density among young female distance runners. Two-year, prospective cohort study. Observational data were collected in the course of a multicenter randomized trial of the effect of oral contraceptives on bone health. One hundred and twenty-five female competitive distance runners ages 18-26 years. Dietary variables were assessed with a food frequency questionnaire. Bone mineral density and content (BMD/BMC) of the spine, hip, and total body were measured annually by dual x-ray absorptiometry (DEXA). Stress fractures were recorded on monthly calendars, and had to be confirmed by radiograph, bone scan, or magnetic resonance imaging. Seventeen participants had at least one stress fracture during follow-up. Higher intakes of calcium, skim milk, and dairy products were associated with lower rates of stress fracture. Each additional cup of skim milk consumed per day was associated with a 62% reduction in stress fracture incidence (P < .05); and a dietary pattern of high dairy and low fat intake was associated with a 68% reduction (P < .05). Higher intakes of skim milk, dairy foods, calcium, animal protein, and potassium were associated with significant (P < .05) gains in whole-body BMD and BMC. Higher intakes of calcium, vitamin D, skim milk, dairy foods, potassium, and a dietary pattern of high dairy and low fat were associated with significant gains in hip BMD. In young female runners, low-fat dairy products and the major nutrients in milk (calcium, vitamin D, and protein) were associated with greater bone gains and a lower stress fracture rate. Potassium intake was also associated with greater gains in hip and whole-body BMD.
View details for DOI 10.1016/j.pmrj.2010.04.020
View details for PubMedID 20709302
View details for Web of Science ID 000208361600007
-
Misleading comparisons: the fallacy of comparing statistical significance.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (6): 559-562
View details for DOI 10.1016/j.pmrj.2010.04.016
View details for PubMedID 20630442
View details for Web of Science ID 000208412300011
-
Electronic web-based surveys: an effective and emerging tool in research.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (4): 307-309
View details for DOI 10.1016/j.pmrj.2010.02.004
View details for PubMedID 20430335
-
Making sense of intention-to-treat.
PM & R : the journal of injury, function, and rehabilitation
2010; 2 (3): 209-213
View details for DOI 10.1016/j.pmrj.2010.01.004
View details for PubMedID 20359686
View details for Web of Science ID 000208361200007
-
The problem of multiple testing.
PM & R : the journal of injury, function, and rehabilitation
2009; 1 (12): 1098-1103
View details for DOI 10.1016/j.pmrj.2009.10.004
View details for PubMedID 20006317
View details for Web of Science ID 000208412200007
-
Medulloblastoma Incidence Has Not Changed Over Time: A CBTRUS Study
JOURNAL OF PEDIATRIC HEMATOLOGY ONCOLOGY
2009; 31 (12): 970-971
Abstract
Earlier studies have reported changes in the incidence of medulloblastoma (MB) but have conflicted, likely because of small sample size or misclassification of MB with primitive neuroectodermal tumor (PNET). The incidence of MB and PNET from 1985 to 2002 was determined from the Central Brain Tumor Registry of the United States, a large population-based cancer registry, using strict histologic and site codes. No statistically significant change in MB incidence was observed over the last 2 decades, but there was an increase in MB and PNET combined.
View details for Web of Science ID 000272658700019
View details for PubMedID 19887963
-
Do children and adults differ in survival from medulloblastoma? A study from the SEER registry
JOURNAL OF NEURO-ONCOLOGY
2009; 95 (1): 81-85
Abstract
Studies investigating whether adults have diminished survival from medulloblastoma (MB) compared with children have yielded conflicting results. We sought to determine in a population-based registry whether adults and children with MB differ in survival, and to examine whether dissimilar use of chemotherapy might contribute to any disparity. 1,226 MB subjects were identified using the Surveillance Epidemiology and End Results (SEER-9) registry (1973-2002) and survival analysis performed. MB was defined strictly to exclude non-cerebellar primitive neuro-ectodermal tumors. Patients were stratified by age at diagnosis: <3 years (infants), 3-17 years (children) and ≥18 years (adults). Because the SEER-9 registry lacks treatment data, a subset of 142 patients were identified using the San Francisco-Oakland SEER registry (1988-2003) and additional analyses performed. There was no significant difference in survival between children and adults with MB in either the SEER-9 (P = 0.17) or SFO (P = 0.89) cohorts but infants fared worse compared to both children (P < 0.01) and adults (P < 0.01). In the SFO sample, children and adults who received chemotherapy plus radiation therapy (XRT) did not differ in survival. Among patients treated with XRT alone, children showed increased survival (P = 0.04) compared with adults. Children and adults with MB do not differ with respect to overall survival, yet infants fare significantly worse. For children and adults with MB treated with both XRT and chemotherapy, we could not demonstrate a survival difference. Similar outcomes between adult and childhood MB may justify inclusion of adults in pediatric cooperative trials for MB.
View details for DOI 10.1007/s11060-009-9894-4
View details for Web of Science ID 000269884600010
View details for PubMedID 19396401
-
Putting P values in perspective.
PM & R : the journal of injury, function, and rehabilitation
2009; 1 (9): 873-877
View details for DOI 10.1016/j.pmrj.2009.07.003
View details for PubMedID 19769922
View details for Web of Science ID 000208411900013
-
A Caution on Interpreting Odds Ratios
SLEEP
2009; 32 (8): 976-976
View details for Web of Science ID 000268557600002
View details for PubMedID 19725246
View details for PubMedCentralID PMC2717202
-
Incidence Patterns of Central Nervous System Germ Cell Tumors: A SEER Study
JOURNAL OF PEDIATRIC HEMATOLOGY ONCOLOGY
2009; 31 (8): 541-544
Abstract
Incidence patterns of central nervous system (CNS) germ cell tumors (GCTs) have been reported, but the influence of underlying host risk factors has not been rigorously explored. We aimed to determine in a large, population-based cancer registry how age, sex, and race influence the occurrence of CNS GCTs in the pediatric population. Using the Surveillance, Epidemiology, and End Results registry, we identified cases of histologically confirmed GCTs in children, adolescents, and young adults (age 0 to 29 y), diagnosed between 1973 and 2004. The cases were limited to only those with the International Classification of Childhood Cancer Xa: intracranial and intraspinal germ-cell tumors. Incidence rates (per 10,000) for each sex and race were plotted for single-age groups, and then stratified by tumor location and pathology subtype. The sample included a total of 638 cases (490 males). Males had significantly higher rates of CNS GCTs than females. Male and female rates diverged significantly starting at the age of 11 years and remained widely discrepant until the age of 30 years. There were more germinomas than nongerminomas in both sexes. Germinomas peaked in incidence during adolescence, whereas nongerminoma incidence remained relatively constant in children and young adults. Tumor location differed strikingly by sex (P<0.0001) with pineal location more common in males (61.0% vs. 15.5%). Asian race was associated with a higher rate of CNS GCTs than other races. Males have higher incidence of CNS GCTs, primarily germinomas, than females, starting in the second decade. Pineal location is strongly associated with male sex, with pineal germinomas representing over half of all CNS GCTs in males. Asian-Americans have higher rates than other races. These findings suggest a robust but poorly understood influence of sex, either genetic or hormonal, and race on the occurrence of CNS GCTs.
View details for Web of Science ID 000268815000003
View details for PubMedID 19636276
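The rates described above are age- and sex-specific incidence rates built from case counts and population denominators. A rough sketch of that tabulation is below; in practice SEER*Stat supplies the denominators, so the file names, columns, and the per-100,000 multiplier here are illustrative assumptions only.
```python
# Sketch: crude incidence rates by single year of age and sex. The input
# files and columns are hypothetical; the 1e5 multiplier is a convention
# and can be changed to match whatever scale a report uses.
import pandas as pd

cases = pd.read_csv("gct_cases.csv")   # columns: age, sex (one row per case)
pop = pd.read_csv("population.csv")    # columns: age, sex, person_years

counts = cases.groupby(["age", "sex"]).size().rename("n").reset_index()
rates = pop.merge(counts, on=["age", "sex"], how="left").fillna({"n": 0})
rates["rate"] = 1e5 * rates["n"] / rates["person_years"]

print(rates.pivot(index="age", columns="sex", values="rate"))
```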
-
Incidence patterns for ependymoma: a Surveillance, Epidemiology, and End Results study. Clinical article
JOURNAL OF NEUROSURGERY
2009; 110 (4): 725-729
Abstract
Previous small studies disagree about which clinical risk factors influence ependymoma incidence. The authors analyzed a large, population-based cancer registry to examine the relationship of incidence to patient age, sex, race, and tumor location, and to determine incidence trends over the past 3 decades. Data were obtained from the Surveillance, Epidemiology, and End Results (SEER-9) study, which was conducted from 1973 to 2003. Histological codes were used to define ependymomas. Age-adjusted incidence rates were compared by confidence intervals in the SEER*Stat 6.2 program. Multiplicative Poisson regression and Joinpoint analysis were used to determine annual percentage change and to look for sharp changes in incidence, respectively. From the SEER database, 1402 patients were identified. The incidence rate per 100,000 person-years was significantly higher in male than in female patients (males 0.227 +/- 0.029, females 0.166 +/- 0.03). For children, the age at diagnosis differed significantly by tumor location, with the mean age for patients with infratentorial tumors calculated as 5 +/- 0.4 years; for supratentorial tumors it was 7.77 +/- 0.6 years, and for spinal lesions it was 12.16 +/- 0.8 years. (Values are expressed as the mean +/- standard error [SE].) Adults showed no difference in the mean age of incidence by location, although most tumors in this age group were spinal. Between 1973 and 2003, the incidence increased significantly among adults but not among children, and there were no sharp changes at any single year, both before and after age adjustment. Males have a higher incidence of ependymoma than do females. A biological explanation remains elusive. Ependymoma occurs within the CNS at distinct locations at different ages, consistent with hypotheses postulating distinct populations of radial glial stem cells within the CNS. Ependymoma incidence appears to have increased over the past 3 decades, but only in adults.
View details for DOI 10.3171/2008.9.JNS08117
View details for Web of Science ID 000264594300017
View details for PubMedID 19061350
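The trend estimate above comes from multiplicative Poisson regression; under a log-linear model the annual percent change is 100 × (e^β - 1), where β is the coefficient on calendar year. A sketch with statsmodels follows, assuming hypothetical yearly case counts and person-year denominators; Joinpoint, used in the study to look for sharp changes, is separate software not reproduced here.
```python
# Sketch: annual percent change (APC) from Poisson regression with a
# person-years offset. The CSV and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("ependymoma_counts.csv")   # columns: year, cases, person_years

fit = smf.glm(
    "cases ~ year",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

beta = fit.params["year"]
lo, hi = fit.conf_int().loc["year"]
print(f"APC = {100 * (np.exp(beta) - 1):.2f}% "
      f"(95% CI {100 * (np.exp(lo) - 1):.2f} to {100 * (np.exp(hi) - 1):.2f})")
```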
-
Both Location and Age Predict Survival in Ependymoma: A SEER Study
PEDIATRIC BLOOD & CANCER
2009; 52 (1): 65-69
Abstract
Studies have suggested that supratentorial ependymomas have better survival than infratentorial tumors, with spinal tumors having the best prognosis, but these data have been based on small samples. Using a population-based registry of ependymomas, we analyzed how age, gender, location, race and radiotherapy influence survival in children. We queried the Surveillance Epidemiology End Results database (SEER-17) from 1973 to 2003, strictly defining ependymomas by histology. Site codes were used to distinguish between supratentorial, infratentorial, and spinal tumors when available. Outcomes were compared by location, age, gender, race and radiotherapy, using Kaplan-Meier analysis and logrank tests. Cox regression was completed, incorporating all significant covariates from univariate analysis. Six hundred thirty-five children were identified with an overall 5-year survival of 57.1 +/- standard error (SE) 2.3%. Increasing age was associated with improved survival (P < 0.0001). Five-year survival by location was 59.5 +/- SE 5.5% supratentorial, 57.1 +/- SE 4.1% infratentorial and 86.7 +/- SE 5.2% spinal. Radiotherapy of the infratentorial tumors resulted in significantly improved survival in both univariate analysis (logrank P < 0.018) and multivariate analysis restricted to this tumor location (P = 0.033). Using multivariate analysis that incorporated all tumor locations, age (P < 0.001) and location (P = 0.020) were significant predictors for survival. Age and location independently influence survival in ependymoma. Spinal tumors are associated with a significantly better prognosis than both supratentorial and infratentorial tumors, and may represent a distinct biological entity. Radiotherapy appears beneficial for survival in patients with infratentorial ependymoma.
View details for DOI 10.1002/pbc.21806
View details for Web of Science ID 000261300000016
View details for PubMedID 19006249
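The multivariate step above corresponds to a Cox proportional hazards model containing the covariates that were significant in univariate analysis. A minimal sketch with lifelines is below; the file and column names are hypothetical, and the reference category for location is simply whichever dummy is dropped.
```python
# Sketch: Cox proportional hazards with age and tumor location, mirroring
# the multivariate analysis described above. Columns are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ependymoma.csv")  # survival_months, died, age, location
df = pd.get_dummies(df, columns=["location"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="died")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```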
-
Gender Affects Survival for Medulloblastoma Only in Older Children and Adults: A Study From the Surveillance Epidemiology and End Results Registry
PEDIATRIC BLOOD & CANCER
2009; 52 (1): 60-64
Abstract
Males have a higher incidence of medulloblastoma (MB) than females, but the effect of gender on survival is unclear. Studies have yielded conflicting results, possibly due to small sample sizes or differences in how researchers defined MB. We aimed to determine the effect of gender on survival in MB using a large data set and strict criteria for defining MB. A sample of 1,226 subjects (763 males and 463 females) was identified from 1973 to 2002, using the Surveillance Epidemiology and End Results (SEER-9) registry. MB was strictly defined to exclude non-cerebellar embryonal tumors (primitive neuro-ectodermal tumors). Because children <3 years of age are known to have worse survival, patients were stratified by age <3 years at diagnosis (95 males, 82 females) and >3 years (668 males, 381 females). Overall, there was no significant difference in survival between males and females (log rank P = 0.22). However, among subjects >3 years, females had significantly greater survival than males (log rank P = 0.02). In children <3 years, there was a non-significant trend toward poorer survival in females (median survival: males 27 months, females 13 months; log rank P = 0.24). This interaction between age group and gender was statistically significant (P = 0.03). Females with MB have a survival advantage only in subjects >3 years. In children <3 years, females may even have poorer outcome. The effect of gender on survival and incidence in MB warrants additional biologic investigation, and may differ in very young children with MB.
View details for DOI 10.1002/pbc.21832
View details for Web of Science ID 000261300000015
View details for PubMedID 19006250
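The key result above is a statistically significant interaction between age group and sex in a survival model. One way to test such an interaction is a Cox model with a product term; a sketch with hypothetical file and column names:
```python
# Sketch: testing a sex-by-age-group interaction in a Cox model, analogous
# to the interaction reported above. File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("mb_seer.csv")
df["female"] = (df["sex"] == "F").astype(float)
df["age_ge3"] = (df["age_years"] >= 3).astype(float)
df["female_x_age_ge3"] = df["female"] * df["age_ge3"]

cols = ["survival_months", "died", "female", "age_ge3", "female_x_age_ge3"]
cph = CoxPHFitter().fit(df[cols], duration_col="survival_months", event_col="died")
print(cph.summary.loc["female_x_age_ge3", ["exp(coef)", "p"]])
```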
-
Effect of oral contraceptives on weight and body composition in young female runners
MEDICINE AND SCIENCE IN SPORTS AND EXERCISE
2008; 40 (7): 1205-1212
Abstract
To examine the effect of oral contraceptives (OC) on body weight, fat mass, percent body fat, and lean mass in young female distance runners. The study population consisted of 150 female competitive distance runners aged 18-26 yr who had participated in a 2-yr randomized trial of the effect of the OC Lo/Ovral (30 microg of ethinyl estradiol and 0.3 mg of norgestrel) on bone health. Weight and body composition were measured approximately yearly by balance beam scales and dual-energy x-ray absorptiometry, respectively. Women randomized to the OC group tended to gain slightly less weight (adjusted mean difference (AMD) = -0.54 +/- 0.31 kg/yr, P = 0.09) and less fat (AMD = -0.35 +/- 0.25 kg/yr, P = 0.16) than those randomized to the control group. OC assignment was associated with a significant gain in lean mass relative to controls among eumenorrheic women (those who had 10 or more menstrual cycles in the year before baseline; AMD = 0.77 +/- 0.17 kg/yr, P < 0.0001) but not among women with fewer than 10 menstrual cycles in that year (AMD = 0.02 +/- 0.35 kg/yr, P = 0.96). Treatment-received analyses yielded similar results. This randomized trial confirms previous findings that OC use does not cause weight or fat mass gain, at least among young female runners. Our finding that this OC is associated with lean mass gain in eumenorrheic runners, but not in those with irregular menses, warrants examination in other studies.
View details for DOI 10.1249/MSS.0b013e31816a0df6
View details for Web of Science ID 000256981700002
View details for PubMedID 18580398
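The adjusted mean differences above are treatment-group coefficients from covariate-adjusted regression of annualized change. A sketch of that kind of model with statsmodels is below; the outcome, covariates, and file name are hypothetical illustrations rather than the study's exact specification.
```python
# Sketch: adjusted mean difference (AMD) in annualized weight change between
# randomized groups via OLS with baseline covariates. Variable names and the
# covariate set are hypothetical, not the study's exact model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("runners_rct.csv")  # weight_change_per_yr, oc_group (0/1), age, baseline_weight

fit = smf.ols("weight_change_per_yr ~ oc_group + age + baseline_weight", data=df).fit()
amd = fit.params["oc_group"]
lo, hi = fit.conf_int().loc["oc_group"]
print(f"AMD = {amd:.2f} kg/yr (95% CI {lo:.2f} to {hi:.2f})")
```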
-
Cancer risk reduction and reproductive concerns in female BRCA1/2 mutation carriers
FAMILIAL CANCER
2008; 7 (2): 179-186
Abstract
Women with mutations in the BRCA1 or BRCA2 cancer susceptibility genes face unique choices regarding management of their high risk for breast and ovarian cancer that impact their reproductive options. In order to explore women's preferences for management of elevated cancer risk, we evaluated the decisions of BRCA1/2 mutation carriers about contraception, prophylactic surgery, and family planning. An internet-based questionnaire assessing high-risk women's preferences about cancer risk management and reproductive options was designed, pilot-tested and administered electronically to 284 participants of an internet-based advocacy group for women with BRCA1/2 mutations. Two hundred and thirteen eligible participants completed the majority of the survey. Mean age was 34 years; 66% were BRCA1 mutation carriers and 34% were BRCA2 mutation carriers. Most women (92%) had used oral contraceptive pills. About 88% of responders reported frequent or extreme worry about transmitting the mutation to their children. Despite their high level of worry, few responders said they would likely consider using assisted reproduction technologies such as a pregnancy surrogate (3%), cryopreservation of oocytes or embryos (8%), or pre-implantation genetic diagnosis (PGD) to select embryos without BRCA1/2 mutations (13%). Although they expressed substantial concern about transmitting BRCA1/2 mutations to their children, only a minority of the high-risk women surveyed were likely to consider currently available assisted reproductive strategies. Further research is necessary to explore the risk management preferences of patients with inherited cancer predisposition, and to incorporate these preferences into clinical care.
View details for DOI 10.1007/s10689-007-9171-7
View details for PubMedID 18026853
-
Regional bone mineral density in male athletes: a comparison of soccer players, runners and controls
BRITISH JOURNAL OF SPORTS MEDICINE
2007; 41 (10): 664-668
Abstract
To investigate the association of soccer playing and long-distance running with total and regional bone mineral density (BMD). Cross-sectional study. Academic medical centre. Elite male soccer players (n = 15), elite male long-distance runners (n = 15) and sedentary male controls (n = 15) aged 20-30 years. BMD (g/cm2) of the lumbar spine (L1-L4), right hip, right leg and total body were assessed by dual-energy x-ray absorptiometry, and a scan of the right calcaneus was performed with a peripheral instantaneous x-ray imaging bone densitometer. After adjustment for age, weight and percentage body fat, soccer players had significantly higher whole body, spine, right hip, right leg and calcaneal BMD than controls (p = 0.008, p = 0.041, p<0.001, p = 0.019, p<0.001, respectively) and significantly higher right hip and spine BMD than runners (p = 0.012 and p = 0.009, respectively). Runners had higher calcaneal BMD than controls (p = 0.002). Forty percent of the runners had T-scores of the lumbar spine between -1 and -2.5. Controls were similar: 34% had T-scores below -1 (including 7% with T-scores lower than -2.5). Playing soccer is associated with higher BMD of the skeleton at all sites measured. Running is associated with higher BMD at directly loaded sites (the calcaneus) but not at relatively unloaded sites (the spine). Specific loading conditions, seen in ball sports or in running, play a pivotal role in skeletal adaptation. The importance of including an appropriate control group in clinical studies is underlined.
View details for DOI 10.1136/bjsm.2006.030783
View details for Web of Science ID 000249621100015
View details for PubMedID 17473003
View details for PubMedCentralID PMC2465163
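The T-scores quoted above express BMD relative to a young-adult reference: T = (measured BMD - young-adult mean) / young-adult SD, with values between -1 and -2.5 conventionally read as low bone density and below -2.5 as osteoporosis-range. A tiny illustration follows; the reference mean and SD are placeholders, since real values are site- and densitometer-specific.
```python
# Sketch: computing a lumbar-spine T-score. The reference mean and SD are
# placeholder values, not actual densitometer reference data.
def t_score(bmd_g_cm2: float, ref_mean: float = 1.05, ref_sd: float = 0.11) -> float:
    """T-score = (measured BMD - young-adult mean) / young-adult SD."""
    return (bmd_g_cm2 - ref_mean) / ref_sd

print(round(t_score(0.92), 2))  # about -1.18: in the -1 to -2.5 range
```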
-
The effect of oral contraceptives on bone mass and stress fractures in female runners
MEDICINE AND SCIENCE IN SPORTS AND EXERCISE
2007; 39 (9): 1464-1473
Abstract
To determine the effect of oral contraceptives (OC) on bone mass and stress fracture incidence in young female distance runners. One hundred fifty competitive female runners ages 18-26 yr were randomly assigned to OC (30 microg of ethinyl estradiol and 0.3 mg of norgestrel) or control (no intervention) for 2 yr. Bone mineral density (BMD) and content (BMC) were measured yearly by dual x-ray absorptiometry. Stress fractures were confirmed by x-ray, magnetic resonance imaging, or bone scan. Randomization to OC was unrelated to changes in BMD or BMC in oligo/amenorrheic (N=50) or eumenorrheic runners (N=100). However, treatment-received analyses (which considered actual OC use) showed that oligo/amenorrheic runners who used OC gained about 1% per year in spine BMD (P<0.005) and whole-body BMC (P<0.005), amounts similar to those for runners who regained periods spontaneously and significantly greater than those for runners who remained oligo/amenorrheic (P<0.05). Dietary calcium intake and weight gain independently predicted bone mass gains in oligo/amenorrheic runners. Randomization to OC was not significantly related to stress fracture incidence, but the direction of the effect was protective in both menstrual groups (hazard ratio [95% CI]: 0.57 [0.18, 1.83]), and the effect became stronger in treatment-received analyses. The trial's statistical power was reduced by higher-than-anticipated noncompliance. OC may reduce the risk for stress fractures in female runners, but our data are inconclusive. Oligo/amenorrheic athletes with low bone mass should be advised to increase dietary calcium and take steps to resume normal menses, including weight gain; they may benefit from OC, but the evidence is inconclusive.
View details for DOI 10.1249/mss.0b013e318074e352
View details for Web of Science ID 000249445700004
View details for PubMedID 17805075
-
Risk factors for stress fracture among young female cross-country runners
MEDICINE AND SCIENCE IN SPORTS AND EXERCISE
2007; 39 (9): 1457-1463
Abstract
To identify risk factors for stress fracture among young female distance runners. Participants were 127 competitive female distance runners, aged 18-26, who provided at least some follow-up data in a randomized trial among 150 runners of the effects of oral contraceptives on bone health. After completing a baseline questionnaire and undergoing bone densitometry, they were followed an average of 1.85 yr. Eighteen participants had at least one stress fracture during follow-up. Baseline characteristics associated (P<0.10) in multivariate analysis with stress fracture occurrence were one or more previous stress fractures (rate ratio [RR] [95% confidence interval] = 6.42 [1.80-22.87]), lower whole-body bone mineral content (RR=2.70 [1.26-5.88] per 1-SD [293.2 g] decrease), younger chronologic age (RR=1.42 [1.05-1.92] per 1-yr decrease), lower dietary calcium intake (RR=1.11 [0.98-1.25] per 100-mg decrease), and younger age at menarche (RR=1.92 [1.15-3.23] per 1-yr decrease). Although not statistically significant, a history of irregular menstrual periods was also associated with increased risk (RR=3.41 [0.69-16.91]). Training-related factors did not affect risk. The results of this and other studies indicate that risk factors for stress fracture among young female runners include previous stress fractures, lower bone mass, and, although not statistically significant in this study, menstrual irregularity. More study is needed of the associations between stress fracture and age, calcium intake, and age at menarche. Given the importance of stress fractures to runners, identifying preventive measures is of high priority.
View details for DOI 10.1249/mss.0b013e318074e54b
View details for Web of Science ID 000249445700003
View details for PubMedID 17805074
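Several of the rate ratios above are expressed per 1-SD or per-unit decrease in a covariate; in a proportional hazards model that is just a rescaling of the covariate before fitting. A sketch with hypothetical columns, using whole-body bone mineral content as the example:
```python
# Sketch: expressing a hazard (rate) ratio per 1-SD *decrease* in a covariate,
# as with whole-body bone mineral content above. Columns are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("runner_followup.csv")  # followup_yrs, stress_fx (0/1), bmc_g

# Standardize, then flip the sign so the coefficient refers to a decrease
df["bmc_per_sd_decrease"] = -(df["bmc_g"] - df["bmc_g"].mean()) / df["bmc_g"].std()

cols = ["followup_yrs", "stress_fx", "bmc_per_sd_decrease"]
cph = CoxPHFitter().fit(df[cols], duration_col="followup_yrs", event_col="stress_fx")
print(cph.summary.loc["bmc_per_sd_decrease", "exp(coef)"])  # RR per 1-SD decrease
```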
-
Effects of ball sports on future risk of stress fracture in runners
CLINICAL JOURNAL OF SPORT MEDICINE
2005; 15 (3): 136-141
Abstract
To evaluate whether playing ball sports during childhood and adolescence is associated with the risk of stress fractures in runners later in life. Retrospective cohort study. National track and field championships, held at Stanford University. One hundred fifty-six elite female and 118 elite male distance runners, age 18 to 44 years. A 1-page questionnaire was used to collect data regarding ages during which athletes played basketball and soccer, as well as other important covariates and outcomes. Athletes reported the ages when stress fractures occurred. Time to event was defined as the number of years from beginning competitive running to the first stress fracture or to current age, if no fracture had occurred. In both men and women, playing ball sports in youth correlated with reduced stress fracture incidence later in life by almost half, controlling for possible confounders. In men, each additional year of playing ball sports conferred a 13% decreased incidence of stress fracture (adjusted hazard ratio [HR] and 95% confidence interval, 0.87 [0.79-0.95]). Among women with regular menses, the HR for each additional year of playing ball sports was similar: 0.87 [0.75-1.00]; however, there was no effect of length of time played among women with irregular menses (HR, 1.03 [0.92-1.16]). In men, younger ages of playing ball sports conferred more protection against stress fractures (HR for each 1-year-older age at first exposure, 1.29 [1.14, 1.45]). Runners who participate during childhood and adolescence in ball sports may develop bone with greater and more symmetrically distributed bone mass, and with enhanced protection from future stress fractures.
View details for Web of Science ID 000230329000004
View details for PubMedID 15867555
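The time-to-event definition above (years from starting competitive running until the first stress fracture, censored at current age if none occurred) maps directly onto the duration/event pair that survival methods expect. A sketch of constructing those two columns from hypothetical questionnaire fields:
```python
# Sketch: building the duration and event columns implied by the
# time-to-event definition above. Field names are hypothetical.
import pandas as pd

df = pd.read_csv("ball_sports_survey.csv")
# expected columns: age_started_running, age_first_fracture (NaN if none), current_age

df["event"] = df["age_first_fracture"].notna().astype(int)
df["duration_yrs"] = (
    df["age_first_fracture"].fillna(df["current_age"]) - df["age_started_running"]
)
# duration_yrs and event can then feed a Cox model with years of youth
# ball-sport participation as the exposure.
```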
-
Disordered eating, menstrual irregularity, and bone mineral density in female runners
MEDICINE AND SCIENCE IN SPORTS AND EXERCISE
2003; 35 (5): 711-719
Abstract
To examine the relationships between disordered eating, menstrual irregularity, and low bone mineral density (BMD) in young female runners. Subjects were 91 competitive female distance runners aged 18-26 yr. Disordered eating was measured by the Eating Disorder Inventory (EDI). Menstrual irregularity was defined as oligo/amenorrhea (0-9 menses per year). BMD was measured by dual x-ray absorptiometry. An elevated score on the EDI (highest quartile) was associated with oligo/amenorrhea, after adjusting for percent body fat, age, miles run per week, age at menarche, and dietary fat (OR [95% CI]: 4.6 [1.1-18.6]). Oligo/amenorrheic runners had lower BMD than eumenorrheic runners at the spine (-5%), hip (-6%), and whole body (-3%), even after accounting for weight, percent body fat, EDI score, and age at menarche. Eumenorrheic runners with elevated EDI scores had lower BMD than eumenorrheic runners with normal EDI scores at the spine (-11%), with trends at the hip (-5%), and whole body (-5%), after adjusting for differences in weight and percent body fat. Runners with both an elevated EDI score and oligo/amenorrhea had no further reduction in BMD than runners with only one of these risk factors. In young competitive female distance runners, (i) disordered eating is strongly related to menstrual irregularity, (ii) menstrual irregularity is associated with low BMD, and (iii) disordered eating is associated with low BMD in the absence of menstrual irregularity.
View details for DOI 10.1249/01.MSS.0000064935.68277.E7
View details for PubMedID 12750578
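The adjusted odds ratio above is the exponentiated coefficient from a multivariable logistic regression of menstrual status on an indicator for a top-quartile EDI score. A sketch with statsmodels follows; the file name, variable names, and covariate list are hypothetical.
```python
# Sketch: adjusted odds ratio from logistic regression, as for the elevated
# EDI score above. Variable names and covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("runner_eating.csv")
df["high_edi"] = (df["edi_score"] >= df["edi_score"].quantile(0.75)).astype(int)

fit = smf.logit(
    "oligo_amenorrhea ~ high_edi + pct_body_fat + age + miles_per_week"
    " + age_menarche + dietary_fat",
    data=df,
).fit()

or_est = np.exp(fit.params["high_edi"])
lo, hi = np.exp(fit.conf_int().loc["high_edi"])
print(f"Adjusted OR = {or_est:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```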
-
Oral contraceptives and bone mineral density in white and black women in CARDIA
OSTEOPOROSIS INTERNATIONAL
2002; 13 (11): 893-900
Abstract
To examine whether exposure to oral contraceptives (OCs) is associated with bone mineral density (BMD) in young women, we studied, cross-sectionally and longitudinally, 216 white and 260 black women enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Spine, hip and whole body BMDs were measured by dual-energy X-ray absorptiometry (DXA) when the women were aged 25-37 years, and whole body BMD was remeasured in 369 of the women 3 years later. A comprehensive history of OC use, including dose of ethinyl estradiol (estrogen) and duration of use, was determined from an interviewer-administered questionnaire. After adjustment for other relevant variables, we found that cumulative estrogen from OCs (mg) explained 4.0% of the variation in spine BMD (p = 0.024) among white women, but did not explain any of the variance in BMD in black women. Cumulative OC estrogen was associated with a decreased risk for low bone density (lowest quartile) at the spine, hip and whole body in white women. The odds ratios (95% CIs) comparing women in the highest quartile of cumulative OC estrogen with those in the lowest quartile were, at the spine: 0.08 (0.02, 0.46); at the hip: 0.23 (0.06, 0.87); and at the whole body: 0.37 (0.11, 1.26). OC exposure was not related to low bone density in black women. OCs did not predict longitudinal changes in whole body BMD in either race. These results suggest that exposure to the estrogen from OCs during the premenopausal years may have a small beneficial effect on the skeleton in white women. Benefit is proportional to the cumulative estrogen exposure, suggesting that previous cross-sectional studies that considered OC use as a dichotomous variable may have lacked the power to detect an association.
View details for Web of Science ID 000179554500007
View details for PubMedID 12415437