All Publications


  • Superiority of compensatory reserve measurement compared to the shock index for early and accurate detection of reduced central blood volume status. The Journal of Trauma and Acute Care Surgery Convertino, V. A., Thompson, P., Koons, N. J., Le, T. D., Lanier, J. B., Cardin, S. 2023

    Abstract

    Shock index (SI) equals the ratio of heart rate (HR) to systolic blood pressure (SBP), with clinical evidence that it is more sensitive for trauma patient status assessment and prediction of outcome than either HR or SBP alone. We used lower body negative pressure (LBNP) as a human model of central hypovolemia, together with the compensatory reserve measurement (CRM) validated for accurate tracking of reduced central blood volume, to test the hypotheses that SI: 1) presents a late signal of central blood volume status; 2) displays poor sensitivity and specificity for predicting the onset of hemodynamic decompensation; and 3) cannot identify individuals at greatest risk for the onset of circulatory shock. We measured HR, SBP, and CRM in 172 human subjects (19 to 55 years) during progressive LBNP designed to determine tolerance to central hypovolemia as a model of hemorrhage. Subjects were subsequently divided into those with high (HT; n = 118) and low (LT; n = 54) tolerance based on completion of 60 mmHg LBNP. The time course relationship between SI and CRM was determined, and Receiver Operating Characteristic (ROC) Area Under the Curve (AUC) was calculated for sensitivity and specificity of CRM and SI to predict hemodynamic decompensation using clinically defined thresholds of 40% for CRM and 0.9 for SI. The time and level of LBNP required to reach an SI of 0.9 (~60 mmHg LBNP) were significantly greater (P < 0.001) than for CRM, which reached 40% at ~40 mmHg LBNP. SI did not differ between HT and LT subjects at 45 mmHg LBNP. ROC AUC for CRM was 0.95 (95% CI = 0.94-0.97) compared with 0.91 (0.89-0.94) for SI (P = 0.0002). Despite high sensitivity and specificity, SI delays the time to detect reductions in central blood volume and fails to distinguish individuals with varying tolerances to central hypovolemia. Level IV, Diagnostic Test or Criteria.
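
    A hedged illustration of the two alarm criteria compared above: the sketch computes the shock index from HR and SBP and applies the clinically defined thresholds from the study (SI ≥ 0.9, CRM ≤ 40%). The function names and example vital-sign values are illustrative assumptions, not values from the paper.

    ```python
    def shock_index(heart_rate_bpm: float, systolic_bp_mmhg: float) -> float:
        """Shock index (SI) = heart rate / systolic blood pressure."""
        return heart_rate_bpm / systolic_bp_mmhg


    def flag_decompensation_risk(heart_rate_bpm: float, systolic_bp_mmhg: float, crm_percent: float) -> dict:
        """Apply the two clinically defined thresholds compared in the study:
        SI >= 0.9 (legacy vital-sign criterion) and CRM <= 40% (compensatory reserve criterion)."""
        si = shock_index(heart_rate_bpm, systolic_bp_mmhg)
        return {
            "shock_index": round(si, 2),
            "si_alarm": si >= 0.9,
            "crm_alarm": crm_percent <= 40.0,
        }


    # Example: a subject whose compensatory reserve has fallen to 38% while SI still looks "normal"
    print(flag_decompensation_risk(heart_rate_bpm=95, systolic_bp_mmhg=118, crm_percent=38))
    # -> {'shock_index': 0.81, 'si_alarm': False, 'crm_alarm': True}
    ```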

    View details for DOI 10.1097/TA.0000000000004029

    View details for PubMedID 37199525

  • Virtual Anesthesiology Medical Student Learning Program Pilot Designed in Response to COVID-19. The Journal of Education in Perioperative Medicine: JEPM Xi, A. S., Koons, N. J., Schirmer, A., Shanker, A., Goiffon, R. J. 2023; 25 (2): E706

    Abstract

    This learning opportunity was designed to provide an interactive, virtual, educational anesthesiology program for interested medical students and to offer an opportunity to learn more about an institutional culture through a question and answer (Q&A) session with program faculty preceptors for the 2020-2021 anesthesiology residency application cycle. We sought to identify, through a survey, whether this virtual learning program was a valuable educational tool. A short Likert-scale survey was sent to medical students before and after participation in a session using the REDCap electronic data capture tool. We designed the survey to assess the program's self-reported effect on participants' anesthesiology knowledge and whether the program design was successful in creating a collaborative experience while also providing a forum to explore residency programs. All respondents found the call useful in building anesthesiology knowledge and networking, and 42 (86%) found the call helpful in deciding where to apply for residency. Overall, 100% of respondents found the call useful, collaborative, engaging, and important to define critical thinking skills. The framework used for this program, virtual asynchronous and synchronous problem-based learning, can be applied broadly with potential benefit to medical student participants challenged by the cancellation of clinical rotations.

    View details for DOI 10.46374/volxxv_issue2_Xi

    View details for PubMedID 37377504

    View details for PubMedCentralID PMC10291957

  • Intralipid® improves left ventricular function in rats with lipopolysaccharide-induced endotoxaemia by a Src-STAT3-mediated mechanism BRITISH JOURNAL OF ANAESTHESIA Banerjee, S., Zargari, M., Medzikovic, L., Russino, H., Mikhael, M., Koons, N., Grogan, T., Rahman, S., Eghbali, M., Umar, S. 2023; 130 (2): E183-E187

    View details for DOI 10.1016/j.bja.2022.10.019

    View details for Web of Science ID 000960858500001

    View details for PubMedID 36462942

    View details for PubMedCentralID PMC10170391

  • Intralipid fails to rescue bupivacaine-induced cardiotoxicity in late-pregnant rats FRONTIERS IN MEDICINE Sherman, C., Koons, N., Zargari, M., Cha, C., Hirsch, J., Hong, R., Eghbali, M., Umar, S. 2022; 9: 899036

    Abstract

    Females routinely receive bupivacaine for obstetric and regional anesthesia. An accidental overdose of bupivacaine can result in cardiotoxicity and cardiac arrest. Intralipid (ILP) rescues bupivacaine-induced cardiotoxicity in male rats. However, bupivacaine cardiotoxicity and ILP rescue have not been studied in non-pregnant and late-pregnant female rats. Here, we tested the hypothesis that an appropriate dose of ILP would rescue non-pregnant and late-pregnant rats from bupivacaine-induced cardiotoxicity. Non-pregnant (n = 6) and late-pregnant (n = 7) female rats received intravenous bupivacaine (10 mg/kg bolus) to induce asystole. Resuscitation with 20% ILP (5 ml/kg actual body weight single bolus and 0.5 ml/kg/min maintenance) and chest compressions were continued for 10 min. Serial heart rate (HR), left ventricular ejection fraction (LVEF%), and LV fractional shortening (LVFS%) were recorded at baseline and 10 min after bupivacaine-induced cardiac arrest. Data are mean ± SD followed by 95% CI. P-values < 0.05 were considered statistically significant. All rats developed cardiac arrest within a few seconds after bupivacaine. All non-pregnant rats were successfully rescued by ILP, with an HR of 280 ± 32 bpm at baseline vs. 212 ± 18 bpm at 10 min post ILP (p < 0.01), LVEF of 70 ± 6% vs. 68 ± 5% (p = ns), and LVFS of 41 ± 5% vs. 39 ± 4% (p = ns). Interestingly, 6 out of 7 late-pregnant rats did not recover with ILP. Baseline HR, LVEF, and LVFS for late-pregnant rats were 330 ± 40 bpm, 66 ± 5%, and 38 ± 4%, respectively. At 10 min post ILP, the HR, LVEF, and LVFS were 39 ± 102 bpm (p < 0.0001), 8 ± 22% (p < 0.0001), and 5 ± 12% (p < 0.001), respectively. ILP successfully rescued bupivacaine-induced cardiac arrest in non-pregnant rats but failed to rescue late-pregnant rats.
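
    The ILP protocol above is a simple weight-based dose, so a short arithmetic sketch may help; the helper function and the example body weight are hypothetical, not taken from the study.

    ```python
    def intralipid_20_dose(body_weight_kg: float) -> tuple[float, float]:
        """Weight-based dose arithmetic for the 20% Intralipid (ILP) protocol described above:
        a 5 ml/kg single bolus followed by a 0.5 ml/kg/min maintenance infusion."""
        bolus_ml = 5.0 * body_weight_kg
        maintenance_ml_per_min = 0.5 * body_weight_kg
        return bolus_ml, maintenance_ml_per_min


    # Example: a 0.35 kg rat (illustrative body weight, not a value reported in the study)
    bolus, maintenance = intralipid_20_dose(0.35)
    print(f"bolus: {bolus:.2f} ml, maintenance: {maintenance:.3f} ml/min")
    # bolus: 1.75 ml, maintenance: 0.175 ml/min
    ```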

    View details for DOI 10.3389/fmed.2022.899036

    View details for Web of Science ID 000860719800001

    View details for PubMedID 36035396

    View details for PubMedCentralID PMC9411664

  • Identifying critical DO2 with compensatory reserve during simulated hemorrhage in humans TRANSFUSION Koons, N. J., Moses, C. D., Thompson, P., Strandenes, G., Convertino, V. A. 2022; 62: S122-S129

    Abstract

    Based on previous experiments in nonhuman primates, we hypothesized that DO2crit in humans is 5-6 ml O2·kg-1·min-1. We measured the compensatory reserve (CRM) and calculated oxygen delivery (DO2) in 166 healthy, normotensive, nonsmoking subjects (97 males, 69 females) during progressive central hypovolemia induced by lower body negative pressure as a model of ongoing hemorrhage. Subjects were classified as having either high tolerance (HT; N = 111) or low tolerance (LT; N = 55) to central hypovolemia. HT and LT groups were matched for age, weight, BMI, vital signs, DO2, and CRM at baseline. The CRM-DO2 relationship was best fitted by a logarithmic model in HT subjects (amalgamated R2 = 0.971) and a second-order polynomial model in the LT group (amalgamated R2 = 0.991). Average DO2crit for the entire subject cohort was estimated at 5.3 ml O2·kg-1·min-1, but was ~14% lower in HT compared with LT subjects. The reduction in DO2 from 40% CRM to 20% CRM was 2-fold greater in the LT compared with the HT group. Average DO2crit in humans is 5.3 ml O2·kg-1·min-1, but is ~14% lower in HT compared with LT subjects. The CRM-DO2 relationship is curvilinear in humans and differs between HT and LT individuals. The threshold for an emergent monitoring signal should be recalibrated from 30% to 40% CRM, given that the decline in DO2 from 40% CRM to 20% CRM for LT subjects lies on the steepest part of the CRM-DO2 relationship.
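
    A minimal curve-fitting sketch of the two model forms reported for the CRM-DO2 relationship (logarithmic for the high-tolerance group, second-order polynomial for the low-tolerance group). The CRM and DO2 arrays are placeholder values for illustration; the study's data are not reproduced here.

    ```python
    import numpy as np

    # Placeholder CRM (%) and DO2 (ml O2·kg-1·min-1) pairs; not the study's data.
    crm = np.array([100.0, 80.0, 60.0, 40.0, 20.0, 10.0])
    do2 = np.array([11.5, 10.2, 9.0, 7.6, 6.1, 5.5])

    # Logarithmic form reported for high-tolerance subjects: DO2 = a*ln(CRM) + b
    a, b = np.polyfit(np.log(crm), do2, 1)

    # Second-order polynomial form reported for low-tolerance subjects: DO2 = c2*CRM^2 + c1*CRM + c0
    c2, c1, c0 = np.polyfit(crm, do2, 2)

    def do2_log(crm_pct):
        return a * np.log(crm_pct) + b

    def do2_poly(crm_pct):
        return c2 * crm_pct**2 + c1 * crm_pct + c0

    # Drop in DO2 between the 40% and 20% CRM monitoring thresholds under each model form
    print(f"log model: {do2_log(40) - do2_log(20):.2f}  poly model: {do2_poly(40) - do2_poly(20):.2f}")
    ```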

    View details for DOI 10.1111/trf.16958

    View details for Web of Science ID 000814407600001

    View details for PubMedID 35733031

  • Needlestick injuries among anesthesia providers from a large US academic center: A 10-year retrospective analysis JOURNAL OF CLINICAL ANESTHESIA Borna, R., Rahimian, R., Koons, B., Grogan, T. R., Umar, S., Turner, J. 2022; 80: 110885

    Abstract

    Anesthesiologists are at high risk for needlestick injury. Such injuries pose a serious health threat from exposure to bloodborne pathogens. This retrospective analysis aimed to examine the needlestick injury rate among anesthesia providers between 2010 and 2020 at the University of California Los Angeles, Department of Anesthesiology and Perioperative Medicine, and to determine specialty-specific factors associated with these injuries. Design: retrospective analysis. Setting: academic anesthesiology department. Interventions: none. All reported incidents of needlestick injuries to employees are sent to the Injury and Illness Prevention Committee. We included all anesthesia residents, fellows, nurse anesthetists, solo anesthesiologists, and supervising anesthesiologists. The overall rate of reported needlestick injuries was 5.3%. The rates for anesthesia residents were 2.1%, 13.5%, 7.9%, and 6.7% for postgraduate years 1 through 4 (PGY1-4), respectively. The rates were 14.3%, 4.7%, 2.1%, and 6.9% for fellows, nurse anesthetists, supervising anesthesiologists, and solo anesthesiologists, respectively. We found that PGY2 residents had a higher injury rate than PGY1 residents (p < 0.001). When grouped together, PGY2, PGY3, and PGY4 residents had a collective rate of 9.4%. Furthermore, residents had a higher needlestick injury rate than supervising anesthesiologists (p < 0.001). PGY2 residents and fellows had the highest rates of needlestick injury. Our study highlights the trend of increasing sharps injuries after PGY1, while supervising anesthesiologists had the lowest rate. Proposed mechanisms for the increase in sharps injuries include residents' transition from a medicine-based internship to the operating room environment with increased exposure to potentially injurious equipment, overnight call, and increased work-related and cognitive stress. Improving understanding of institution-specific prevention programs, raising awareness during the initial high-intensity training period with one-to-one supervision when habits are formed, and reducing exposure to sharps by using needleless systems are steps toward reducing the incidence of sharps injuries in a field where the risk remains high.

    View details for DOI 10.1016/j.jclinane.2022.110885

    View details for Web of Science ID 000806683300002

    View details for PubMedID 35644082

  • Physiology of Human Hemorrhage and Compensation COMPREHENSIVE PHYSIOLOGY Convertino, V. A., Koons, N. J., Suresh, M. R. 2021; 11 (2): 1531-1574

    Abstract

    Hemorrhage is a leading cause of death following traumatic injuries in the United States. Much of the previous work in assessing the physiology and pathophysiology underlying blood loss has focused on descriptive measures of hemodynamic responses such as blood pressure, cardiac output, stroke volume, heart rate, and vascular resistance as indicators of changes in organ perfusion. More recent work has shifted the focus toward understanding mechanisms of compensation for reduced systemic delivery and cellular utilization of oxygen as a more comprehensive approach to understanding the complex physiologic changes that occur following and during blood loss. In this article, we begin with applying dimensional analysis for comparison of animal models, and progress to descriptions of various physiological consequences of hemorrhage. We then introduce the complementary side of compensation by detailing the complexity and integration of various compensatory mechanisms that are activated from the initiation of hemorrhage and serve to maintain adequate vital organ perfusion and hemodynamic stability in the scenario of reduced systemic delivery of oxygen until the onset of hemodynamic decompensation. New data are introduced that challenge legacy concepts related to mechanisms that underlie baroreflex functions and provide novel insights into the measurement of the integrated response of compensation to central hypovolemia known as the compensatory reserve. The impact of demographic and environmental factors on tolerance to hemorrhage is also reviewed. Finally, we describe how understanding the physiology of compensation can be translated to applications for early assessment of the clinical status and accurate triage of hypovolemic and hypotensive patients. © 2021 American Physiological Society. Compr Physiol 11:1531-1574, 2021.

    View details for DOI 10.1002/cphy.c200016

    View details for Web of Science ID 000618794800006

    View details for PubMedID 33577122

  • An Analysis of Outcomes and Interventions for Female Pediatric Casualties in Iraq and Afghanistan MILITARY MEDICINE Gale, H. L., Koons, N. J., Borgman, M. A., April, M. D., Schauer, S. G. 2022; 187 (9-10): E1037-E1042

    Abstract

    Traumatic injuries were the most common reason for admission of pediatric patients to military hospitals during the recent wars in Iraq and Afghanistan. We compare survival and interventions between female and male pediatric casualties. This is a secondary analysis of a previously described dataset from the Department of Defense Trauma Registry. We requested pediatric encounters from January 2007 to January 2016 within Iraq and Afghanistan. We separated casualties by sex to compare injury and mortality patterns. Our initial dataset included 3439 pediatric encounters: 784 (22.8%) females and 2655 (77.2%) males. Females were less likely to sustain injuries by explosive (38.0% versus 44.5%) but more likely to sustain injuries via alternative mechanisms of injury (28.9% versus 21.5%). Both sexes had similar injury severity scores (females median 10 [5-17], males 10 [4-17]). Fewer females underwent tourniquet application (4.2% versus 7.2%); all findings were significant. In unadjusted and adjusted regression analyses, females under age 8 had lower odds of survival to hospital discharge (OR 0.67, 95% CI 0.51-0.89) compared to males. Among pediatric patients treated by U.S. medical personnel in Iraq and Afghanistan, females had lower survival to hospital discharge despite similar severity of injury. Further studies are necessary to elucidate causes for this finding.
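
    The adjusted odds ratio quoted above comes from logistic regression; a small sketch of that style of analysis on simulated encounter-level data follows. The variable names, simulated dataset, and statsmodels workflow are illustrative assumptions, not the registry data or the authors' code.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated encounter-level data (illustrative only; not the DoD Trauma Registry)
    rng = np.random.default_rng(0)
    n = 3000
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),
        "iss": rng.integers(1, 40, n),        # injury severity score
        "explosive": rng.integers(0, 2, n),   # mechanism-of-injury indicator
    })
    # Simulated probability of survival to discharge, with lower odds for females
    logit_p = 3.0 - 0.4 * df["female"] - 0.08 * df["iss"] - 0.2 * df["explosive"]
    df["survived"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

    # Adjusted model: odds of survival by sex, adjusting for injury severity and mechanism
    X = sm.add_constant(df[["female", "iss", "explosive"]])
    fit = sm.Logit(df["survived"], X).fit(disp=0)

    or_female = np.exp(fit.params["female"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["female"])
    print(f"adjusted OR for survival (female vs male) = {or_female:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
    ```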

    View details for DOI 10.1093/milmed/usab024

    View details for Web of Science ID 000764373300001

    View details for PubMedID 33547789

  • Predictors of hemodynamic decompensation in progressive hypovolemia: Compensatory reserve versus heart rate variability JOURNAL OF TRAUMA AND ACUTE CARE SURGERY Schlotman, T. E., Suresh, M. R., Koons, N. J., Howard, J. T., Schiller, A. M., Cardin, S., Convertino, V. A. 2020; 89 (2S): S161-S168

    Abstract

    Hemorrhage remains the leading cause of death following traumatic injury in both civilian and military settings. Heart rate variability (HRV) and heart rate complexity (HRC) have been proposed as potential "new vital signs" for monitoring trauma patients; however, the added benefit of HRV or HRC for decision support remains unclear. Another new paradigm, the compensatory reserve measurement (CRM), represents the integration of all cardiopulmonary mechanisms responsible for compensation during relative blood loss and was developed to identify current physiologic status by estimating the progression toward hemodynamic decompensation. In the present study, we hypothesized that CRM would provide greater sensitivity and specificity for detecting progressive reductions in central circulating blood volume and the onset of decompensation compared with measurements of HRV and HRC. Continuous, noninvasive measurements of compensatory reserve and electrocardiogram signals were made on 101 healthy volunteers during lower-body negative pressure (LBNP) to the point of decompensation. Measures of HRV and HRC were taken from electrocardiogram signal data. Compensatory reserve measurement demonstrated superior sensitivity and specificity (receiver operating characteristic area under the curve [ROC AUC] = 0.93) compared with all HRV measures (ROC AUC ≤ 0.84) and all HRC measures (ROC AUC ≤ 0.86). Sensitivity and specificity values at the ROC optimal thresholds were greater for CRM (sensitivity = 0.84; specificity = 0.84) than for HRV (sensitivity ≤ 0.78; specificity ≤ 0.77) and HRC (sensitivity ≤ 0.79; specificity ≤ 0.77). With standardized values across all levels of LBNP, CRM had a steeper decline, less variability, and explained a greater proportion of the variation in the data than both HRV and HRC during progressive hypovolemia. These findings add to the growing body of literature describing the advantages of CRM for detecting reductions in central blood volume. Most importantly, these results provide further support for the potential use of CRM in the triage and monitoring of patients at highest risk for the onset of shock following blood loss.
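
    A brief sketch of how an ROC AUC and the sensitivity/specificity at an ROC-optimal threshold can be computed, using Youden's J as the threshold rule (a common choice; the study does not state its exact rule). The data below are synthetic.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Synthetic stand-in data: 1 = reached hemodynamic decompensation during LBNP, 0 = did not.
    rng = np.random.default_rng(1)
    decompensated = rng.integers(0, 2, 200)
    crm = np.clip(rng.normal(60 - 25 * decompensated, 15), 0, 100)  # lower CRM in decompensators
    risk_score = 100 - crm  # higher score = higher predicted risk

    auc = roc_auc_score(decompensated, risk_score)

    # Youden's J = sensitivity + specificity - 1; its maximum is one definition of the "ROC optimal" threshold
    fpr, tpr, thresholds = roc_curve(decompensated, risk_score)
    best = np.argmax(tpr - fpr)
    print(f"AUC = {auc:.2f}; optimal threshold ≈ CRM {100 - thresholds[best]:.0f}% "
          f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
    ```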

    View details for DOI 10.1097/TA.0000000000002605

    View details for Web of Science ID 000614154300024

    View details for PubMedID 32044875

  • Combat medic testing of a novel monitoring capability for early detection of hemorrhage Koons, N. J., Owens, G. A., Parsons, D. L., Schauer, S. G., Buller, J. L., Convertino, V. A. LIPPINCOTT WILLIAMS & WILKINS. 2020: S146-S152

    Abstract

    Current out-of-hospital protocols to determine hemorrhagic shock in civilian trauma systems rely on standard vital signs, with military guidelines relying on heart rate and strength of the radial pulse on palpation, all of which have proven to provide little forewarning of the need to implement early intervention prior to decompensation. We tested the hypothesis that the addition of a real-time decision-assist machine-learning algorithm, the compensatory reserve measurement (CRM), used by combat medics could shorten the time required to identify the need for intervention in an unstable patient during a hemorrhage profile compared with vital signs alone. We randomized combat medics from the Army Medical Department Center and School Health Readiness Center of Excellence into three groups: group 1 viewed a display of no simulated hemorrhage and unchanging vital signs as a control (n = 24), group 2 viewed a display of simulated hemorrhage and changing vital signs alone (hemorrhage; n = 31), and group 3 viewed a display of changing vital signs with the addition of the CRM (hemorrhage + CRM; n = 22). Participants were asked to push a computer key when they believed the patient was becoming unstable and needed medical intervention. The average time of 11.0 minutes (95% confidence interval, 8.7-13.3 minutes) required by the hemorrhage + CRM group to identify an unstable patient (i.e., stop the video sequence) was more than 40% shorter (p < 0.01) than the 18.9 minutes (95% confidence interval, 17.2-20.5 minutes) in the hemorrhage group. The use of a machine-learning monitoring technology designed to measure the capacity to compensate for central blood volume loss reduced the time required by combat medics to identify impending hemodynamic instability. Diagnostic, level IV.

    View details for DOI 10.1097/TA.0000000000002649

    View details for Web of Science ID 000614154300022

    View details for PubMedID 32118826

  • The compensatory reserve: potential for accurate individualized goal-directed whole blood resuscitation TRANSFUSION Convertino, V. A., Koons, N. J. 2020; 60: S150-S157

    Abstract

    Hemorrhagic shock can be mitigated by timely and accurate resuscitation designed to restore adequate delivery of oxygen (DO2). Current doctrine of using systolic blood pressure (SBP) as a guide for resuscitation can be associated with increased morbidity. The compensatory reserve measurement (CRM) is a novel vital sign based on the recognition that the sum of all mechanisms that contribute to the compensatory response to hemorrhage reside in features of the arterial pulse waveform. CRM can be assessed continuously and non-invasively in real time. Compared to standard vital signs, CRM provides an early, as well as more sensitive and specific, indicator of patient hemorrhagic status since the activation of compensatory mechanisms occurs immediately at the onset of blood loss. Recent data obtained from our laboratory experiments on non-human primates have demonstrated that CRM is linearly related to DO2 during controlled progressive hemorrhage and subsequent whole blood resuscitation. We used this relationship to determine that the time of hemodynamic decompensation (i.e., CRM = 0%) is defined by a critical DO2 at approximately 5.3 mL O2·kg-1·min-1. We also demonstrated that a target CRM of 35% during whole blood resuscitation only required replacement of 40% of the total blood volume loss to adequately sustain a DO2 more than 50% (i.e., 8.1 mL O2·kg-1·min-1) above critical DO2 (i.e., the threshold for decompensated shock) while maintaining hypotensive resuscitation (i.e., SBP at ~90 mmHg). Consistent with our hypothesis, specific values of CRM can be used to accurately maintain DO2 thresholds above critical DO2, avoiding the onset of hemorrhagic shock with whole blood resuscitation.

    View details for DOI 10.1111/trf.15632

    View details for Web of Science ID 000536670100001

    View details for PubMedID 32478902

  • Tracking DO2 with Compensatory Reserve During Whole Blood Resuscitation in Baboons SHOCK Koons, N. J., Nguyen, B., Suresh, M. R., Hinojosa-Laborde, C., Convertino, V. A. 2020; 53 (3): 327-334

    Abstract

    Hemorrhagic shock can be mitigated by timely and accurate resuscitation designed to restore adequate delivery of oxygen (DO2) by increasing cardiac output (CO). However, standard care of using systolic blood pressure (SBP) as a guide for resuscitation may be ineffective and can potentially be associated with increased morbidity. We have developed a novel vital sign called the compensatory reserve measurement (CRM), generated from analysis of arterial pulse waveform feature changes, that has been validated in experimental and clinical models of hemorrhage. We tested the hypothesis that thresholds of DO2 could be accurately defined by CRM, a noninvasive clinical tool, while avoiding over-resuscitation during whole blood resuscitation following a 25% hemorrhage in nonhuman primates. To accomplish this, adult male baboons (n = 12) were exposed to a progressive controlled hemorrhage while sedated that resulted in an average (± SEM) maximal reduction of 508 ± 18 mL of their estimated circulating blood volume of 2,130 ± 60 mL based on body weight. CRM increased from 6 ± 0.01% at the end of hemorrhage to 70 ± 0.02% at the end of resuscitation. By linear regression, CRM values of 6% (end of hemorrhage), 30%, 60%, and 70% (end of resuscitation) corresponded to calculated DO2 values of 5.9 ± 0.34, 7.5 ± 0.87, 9.3 ± 0.76, and 11.6 ± 1.3 mL O2·kg-1·min-1 during resuscitation. As such, return of CRM to ∼65% during resuscitation required only ∼400 mL to restore SBP to 128 ± 6 mmHg, whereas total blood volume replacement resulted in over-resuscitation as indicated by an SBP of 140 ± 7 mmHg compared with an average baseline value of 125 ± 5 mmHg. Consistent with our hypothesis, thresholds of calculated DO2 were associated with specific CRM values. A target resuscitation CRM value of ∼65% minimized the requirement for whole blood while avoiding over-resuscitation. Furthermore, 0% CRM provided a noninvasive metric for determining critical DO2 at approximately 5.3 mL O2·kg-1·min-1.
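
    As a worked example of the linear CRM-DO2 relationship described above, a least-squares line through the four group-mean points quoted in the abstract can be extrapolated to CRM = 0% to approximate critical DO2; the paper's ~5.3 value was derived from the full dataset, so the intercept here is only a rough check.

    ```python
    import numpy as np

    # Group-mean values quoted in the abstract: CRM (%) and calculated DO2 (mL O2·kg-1·min-1)
    crm = np.array([6.0, 30.0, 60.0, 70.0])
    do2 = np.array([5.9, 7.5, 9.3, 11.6])

    # Least-squares linear fit: DO2 = slope*CRM + intercept
    slope, intercept = np.polyfit(crm, do2, 1)

    # Extrapolating to CRM = 0% estimates critical DO2, the threshold for hemodynamic decompensation
    print(f"DO2 ≈ {slope:.3f}*CRM + {intercept:.2f}; critical DO2 at CRM = 0% ≈ {intercept:.1f} mL O2·kg-1·min-1")
    ```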

    View details for DOI 10.1097/SHK.0000000000001367

    View details for Web of Science ID 000561890200010

    View details for PubMedID 32045396

  • Physiological comparison of hemorrhagic shock and VO2max: A conceptual framework for defining the limitation of oxygen delivery EXPERIMENTAL BIOLOGY AND MEDICINE Convertino, V. A., Lye, K. R., Koons, N. J., Joyner, M. J. 2019; 244 (8): 690-701

    Abstract

    Disturbance of normal homeostasis occurs when oxygen delivery and energy stores to the body's tissues fail to meet the energy requirement of cells. The work submitted in this review is important because it advances the understanding of inadequate oxygen delivery as it relates to early diagnosis and treatment of circulatory shock and its relationship to disturbance of normal functioning of cellular metabolism in life-threatening conditions of hemorrhage. We explored data from the clinical and exercise literature to construct for the first time a conceptual framework for defining the limitation of inadequate delivery of oxygen by comparing the physiology of hemorrhagic shock caused by severe blood loss to maximal oxygen uptake induced by intense physical exercise. We also provide a translational framework in which understanding the fundamental relationship between the body's reserve to compensate for conditions of inadequate oxygen delivery as a limiting factor to V̇O2max helps to re-evaluate paradigms of triage for improved monitoring of accurate resuscitation in patients suffering from hemorrhagic shock.

    View details for DOI 10.1177/1535370219846425

    View details for Web of Science ID 000470757400006

    View details for PubMedID 31042073

    View details for PubMedCentralID PMC6552402

  • Interrelationship Between Sex, Age, Blood Volume, and V̇O2max AEROSPACE MEDICINE AND HUMAN PERFORMANCE Koons, N. J., Suresh, M. R., Schlotman, T. E., Convertino, V. A. 2019; 90 (4): 362-368

    Abstract

    BACKGROUND: Circulating blood volume (BV) and maximal oxygen uptake (Vo2max) are physiological characteristics important for optimal human performance in aerospace and military operational environments. We tested the hypothesis that BV and Vo2max are lower in older people independent of sex. METHODS: To accomplish this, a "data mining" effort of an historic database generated from NASA and U.S. Air Force experiments was conducted. BV, red cell volume, plasma volume, hematocrit, and Vo2max were measured in 84 healthy individuals (24 women, 60 men) across an age range of 23 to 65 yr to assess the interrelationship between sex, age, BV, and Vo2max. Subjects were classified in age groups by < 40 yr and ≥ 40 yr; these groups identified women as pre- vs. postmenopausal. RESULTS: Consistent with our hypothesis, comparisons revealed that men had higher BV, red cell volume, hematocrit, and Vo2max than women when standardized for body mass. Against expectations, BV was not different in older compared with younger men and women. Vo2max was not different in older compared with younger women, while Vo2max was lower in older men. CONCLUSION: We conclude that physiological mechanisms other than BV associated with aging appear to be responsible for a decline in Vo2max of our older men. Furthermore, factors other than menopause may also influence the control of BV in the women. Our results provide evidence that aging may not compromise men or women in scenarios where BV can affect performance in aerospace and military environments. Koons NJ, Suresh MR, Schlotman TE, Convertino VA. Interrelationship between sex, age, blood volume, and Vo2max. Aerosp Med Hum Perform. 2019; 90(4):362-368.

    View details for DOI 10.3357/AMHP.5255.2019

    View details for Web of Science ID 000462785900002

    View details for PubMedID 30922423

  • Hemostatic responses to exercise, dehydration, and simulated bleeding in heat-stressed humans AMERICAN JOURNAL OF PHYSIOLOGY-REGULATORY INTEGRATIVE AND COMPARATIVE PHYSIOLOGY Borgman, M. A., Zaar, M., Aden, J. K., Schlader, Z. J., Gagnon, D., Rivas, E., Kern, J., Koons, N. J., Convertino, V. A., Cap, A. P., Crandall, C. 2019; 316 (2): R145-R156

    Abstract

    Heat stress followed by an accompanying hemorrhagic challenge may influence hemostasis. We tested the hypothesis that hemostatic responses would be increased by passive heat stress, as well as exercise-induced heat stress, each with accompanying central hypovolemia to simulate a hemorrhagic insult. In aim 1, subjects were exposed to passive heating or normothermic time control, each followed by progressive lower-body negative pressure (LBNP) to presyncope. In aim 2, subjects exercised in hyperthermic environmental conditions, with and without accompanying dehydration, each also followed by progressive LBNP to presyncope. At baseline, pre-LBNP, and post-LBNP (<1, 30, and 60 min), hemostatic activity of venous blood was evaluated by plasma markers of hemostasis and thrombelastography. For aim 1, both hyperthermic and normothermic LBNP (H-LBNP and N-LBNP, respectively) resulted in higher levels of factor V, factor VIII, and von Willebrand factor antigen compared with the time control trial (all P < 0.05), but these responses were temperature independent. Hyperthermia increased fibrinolysis [clot lysis 30 min after the maximal amplitude reflecting clot strength (LY30)] to 5.1% post-LBNP compared with 1.5% (time control) and 2.7% in N-LBNP (P = 0.05 for main effect). Hyperthermia also potentiated increased platelet counts post-LBNP as follows: 274 K/µl for H-LBNP, 246 K/µl for N-LBNP, and 196 K/µl for time control (P < 0.05 for the interaction). For aim 2, hydration status associated with exercise in the heat did not affect the hemostatic activity, but fibrinolysis (LY30) was increased to 6-10% when subjects were dehydrated compared with an increase to 2-4% when hydrated (P = 0.05 for treatment). Central hypovolemia via LBNP is a primary driver of hemostasis compared with hyperthermia and dehydration effects. However, hyperthermia does induce significant thrombocytosis and by itself causes an increase in clot lysis. Dehydration associated with exercise-induced heat stress increases clot lysis but does not affect exercise-activated or subsequent hypovolemia-activated hemostasis in hyperthermic humans. Clinical implications of these findings are that quickly restoring a hemorrhaging hypovolemic trauma patient with cold noncoagulant fluids (crystalloids) can have serious deleterious effects on the body's innate ability to form essential clots, and several factors can increase clot lysis, which should therefore be closely monitored.

    View details for DOI 10.1152/ajpregu.00223.2018

    View details for Web of Science ID 000459252200008

    View details for PubMedID 30231210

    View details for PubMedCentralID PMC6397353

  • Proceedings of the 5th Annual United States Army Institute of Surgical Research Summer Undergraduate Research Internship Program 2017 Leon, R., Jensen, K., Abijay, C., Dolmetsch, T., Koons, N., Darlington, D. N., Cap, A., Wu, X., Delavan, C. P., Herzig, M. C., Christy, B. A., Kempski, K. M., Blackburn, A. N., De Lorenzo, R. A., Blackburn, M. B., Donald, M. C., Klemcke, H. G., Harrington, B. K., He, C. J., Gomez, B. I., Chao, T., Little, J. S., Heard, T. C., Dubick, M. A., Burmeister, D. M., Xu, A., Walker, K., Mohammadipoor, A., Rodriguez, L., Roberts, T., Batchinsky, A., Cancio, L., Antebi, B., Walford, R. A., McIntosh, C. S., Peltier, G. C., Sharma, U., Montgomery, R. K., Meledeo, M. A., Bynum, J. A., Lovelace, S., Estlack, L., Kazen, L., Mangum, L. C., Garcia, G. R., Akers, K. S., Greene, W., Cornell, L. BIOMED CENTRAL LTD. 2017