- Kidney and Pancreas Transplantation
Professor - Med Center Line, Medicine - Nephrology
Program Director, Stanford Transplant Nephrology Fellowship Training Program (2006 - Present)
Program Director, Stanford Nephrology Fellowship Training Program (2005 - Present)
Medical Director, Stanford Adult Kidney and Pancreas Transplantation (1992 - Present)
Residency: West Virginia University Hospital (1981) WV
Fellowship: University of Rochester (1983) NY
Board Certification: Nephrology, American Board of Internal Medicine (1984)
Board Certification: Internal Medicine, American Board of Internal Medicine (1981)
Internship: West Virginia University Hospital (1981) WV
Medical Education: Medical College of Virginia (1978) VA
M.D., Medical College of Virginia (1978)
A.B., Davidson College, Psychology (1974)
Current Research and Scholarly Interests
Tolerance induction in clinical kidney transplantation
Combined Blood Stem Cell and Kidney Transplant of One Haplotype Match Living Donor Pairs.
The Stanford Medical Center Program in Multi-Organ Transplantation and the Division of Bone Marrow Transplantation are enrolling patients in a research study to determine whether donor stem cells given after a living related one-haplotype-match kidney transplant will change the immune system such that immunosuppressive drugs can be completely withdrawn.
Graduate and Fellowship Programs
Correlates and outcomes of warfarin initiation in kidney transplant recipients newly diagnosed with atrial fibrillation.
Nephrology, Dialysis, Transplantation: Official Publication of the European Dialysis and Transplant Association - European Renal Association
2015; 30 (2): 321-329
In the kidney transplant population with atrial fibrillation (AF), evidence regarding the effectiveness and safety of warfarin treatment is lacking. We used fee-for-service Medicare claims to identify kidney transplant recipients with newly diagnosed AF from the United States Renal Data System. Warfarin use within 30 days of AF diagnosis was ascertained from Medicare Part D prescription claims (2007-2011) or using a validated algorithm (1997-2011). The study end points were (i) the composite of death, stroke or gastrointestinal bleed, (ii) death and (iii) death-censored graft failure. Warfarin user and non-user groups were balanced using inverse probability of treatment weighting, and hazard ratios (HRs) were estimated using Cox regression. Among 718 subjects with an indication for anticoagulation, 24% initiated warfarin treatment within 30 days of AF diagnosis. Age was the only independent correlate of warfarin use [odds ratio = 1.02 per year; 95% confidence interval (95% CI) 1.01-1.04]. In the larger cohort of 6492 patients with AF, warfarin use (23.5%) versus non-use (76.5%) was associated with small and non-significant reductions in the composite of death, stroke or gastrointestinal bleed (HR = 0.92; 95% CI 0.83-1.02), death (HR = 0.92; 95% CI 0.82-1.02) and death-censored graft failure (HR = 0.90; 95% CI 0.76-1.08). Our study suggests the need for clinical trials of warfarin use in the kidney transplant population with AF.
View details for DOI 10.1093/ndt/gfu323
View details for PubMedID 25335507
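The inverse probability of treatment weighting used in the study above can be illustrated with a minimal sketch: each subject is weighted by the inverse of the probability of the treatment they actually received. The treatment indicators and propensity scores below are invented for illustration; they are not data from the study.

```python
# Sketch of inverse probability of treatment weighting (IPTW), the balancing
# method described in the abstract above. Propensity values are hypothetical.

def iptw_weights(treated, propensity):
    """Weight = 1/e(x) for treated subjects, 1/(1 - e(x)) for untreated,
    where e(x) is the estimated probability of receiving treatment."""
    return [
        1.0 / e if t else 1.0 / (1.0 - e)
        for t, e in zip(treated, propensity)
    ]

# Example: two warfarin users and two non-users with assumed propensities.
treated = [1, 1, 0, 0]
propensity = [0.25, 0.50, 0.25, 0.50]
weights = iptw_weights(treated, propensity)
print(weights)  # treated with low propensity get the largest weight
```

The weighted sample behaves as if treatment were unrelated to the measured covariates, which is what allows the HRs to be estimated from the weighted Cox regression.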
Outcomes After Kidney Transplantation of Patients Previously Diagnosed With Atrial Fibrillation
AMERICAN JOURNAL OF TRANSPLANTATION
2013; 13 (6): 1566-1575
Little is known about the prevalence and outcomes of patients with atrial fibrillation/flutter (AF) who receive a kidney transplant. We identified all patients who had >1 year of uninterrupted Medicare A+B coverage before receiving their first kidney transplant (1997-2009). The presence of pretransplant AF was ascertained from diagnosis codes in Medicare physician claims. We studied the posttransplant outcomes of death, all-cause graft failure, death-censored graft failure and stroke using multivariable Cox regression. Of 62,706 eligible first kidney transplant recipients studied, 3794 (6.4%) were diagnosed with AF prior to kidney transplant. Over a mean follow-up of 4.9 years, 40.6% of AF patients and 24.9% without AF died. All-cause and death-censored graft failure were 46.8% and 16.5%, respectively, in the AF group and 36.4% and 19.5%, respectively, in those without AF. Ischemic stroke occurred in 2.8% of patients with and 1.6% of patients without AF. In patients with AF, multivariable-adjusted hazard ratios (95% confidence intervals) for death, graft failure, death-censored graft failure and ischemic stroke were 1.46 (1.38-1.54), 1.41 (1.34-1.48), 1.26 (1.15-1.37) and 1.36 (1.10-1.68), respectively. Pre-existing AF is associated with poor posttransplant outcomes. Special attention should be paid to AF in pretransplant evaluation, counseling and risk stratification of kidney transplant candidates.
View details for DOI 10.1111/ajt.12197
View details for Web of Science ID 000319706900024
Tolerance and Withdrawal of Immunosuppressive Drugs in Patients Given Kidney and Hematopoietic Cell Transplants
AMERICAN JOURNAL OF TRANSPLANTATION
2012; 12 (5): 1133-1145
Sixteen patients conditioned with total lymphoid irradiation (TLI) and antithymocyte globulin (ATG) were given kidney transplants and an injection of CD34+ hematopoietic progenitor cells and T cells from HLA-matched donors in a tolerance induction protocol. Blood cell monitoring included changes in chimerism, balance of T-cell subsets and responses to donor alloantigens. Fifteen patients developed multilineage chimerism without graft-versus-host disease (GVHD), and eight with chimerism for at least 6 months were withdrawn from antirejection medications for 1-3 years (mean, 28 months) without subsequent rejection episodes. Four chimeric patients have just completed or are in the midst of drug withdrawal, and four patients were not withdrawn due to return of underlying disease or rejection episodes. Blood cells from all patients showed early high ratios of CD4+CD25+ regulatory T cells and NKT cells versus conventional naive CD4+ T cells, and those off drugs showed specific unresponsiveness to donor alloantigens. In conclusion, TLI and ATG promoted the development of persistent chimerism and tolerance in a cohort of patients given kidney transplants and hematopoietic donor cell infusions. All 16 patients had excellent graft function at the last observation point with or without maintenance drugs.
View details for DOI 10.1111/j.1600-6143.2012.03992.x
View details for Web of Science ID 000303235100012
View details for PubMedID 22405058
Living donor evaluation and exclusion: the Stanford experience
2011; 25 (5): 697-704
The proportion of prospective living donors disqualified for medical reasons is unknown. The objective of this study is to delineate and quantify specific reasons for exclusion of prospective living donors from kidney donation. All adult prospective kidney donors who contacted our transplant program between October 1, 2007 and April 1, 2009 were included in our analysis (n = 484). Data were collected by review of an electronic transplant database. Of the 484 prospective donors, 39 (8%) successfully donated, 229 (47%) were excluded, 104 (22%) were actively undergoing evaluation, and 112 (23%) were withdrawn before evaluation was complete. Criteria for exclusion were medical (n = 150), psychosocial (n = 22), or histocompatibility (n = 57) reasons. Of the 150 prospective donors excluded for medical reasons, 79% were excluded because of obesity, hypertension, nephrolithiasis, and/or abnormal glucose tolerance. One hundred and forty-seven (61%) intended recipients had only one prospective living donor, of whom 63 (42%) were excluded. A significant proportion of prospective living kidney donors were excluded for medical reasons such as obesity (body mass index >30), hypertension, nephrolithiasis, and abnormal glucose tolerance. Longer-term studies are needed to characterize the risks to medically complex kidney donors and the potential risks and benefits afforded to recipients.
View details for DOI 10.1111/j.1399-0012.2010.01336.x
View details for Web of Science ID 000296262300018
View details for PubMedID 21044160
- United Network for Organ Sharing (UNOS) Organ Allocation Policy and Kidney Utilization AMERICAN JOURNAL OF KIDNEY DISEASES 2010; 56 (1): 7-9
Brief report: Tolerance and chimerism after renal and hematopoietic-cell transplantation
NEW ENGLAND JOURNAL OF MEDICINE
2008; 358 (4): 362-368
We describe a recipient of combined kidney and hematopoietic-cell transplants from an HLA-matched donor. A post-transplantation conditioning regimen of total lymphoid irradiation and antithymocyte globulin allowed engraftment of the donor's hematopoietic cells. The patient had persistent mixed chimerism, and the function of the kidney allograft has been normal for more than 28 months since discontinuation of all immunosuppressive drugs. Adverse events requiring hospitalization were limited to a 2-day episode of fever with neutropenia. The patient has had neither rejection episodes nor clinical manifestations of graft-versus-host disease.
View details for Web of Science ID 000252507900006
Rituximab failed to improve nephrotic syndrome in renal transplant patients with recurrent focal segmental glomerulosclerosis
AMERICAN JOURNAL OF TRANSPLANTATION
2008; 8 (1): 222-227
Focal segmental glomerulosclerosis (FSGS) recurs in 30% of patients with FSGS receiving a first renal transplant and in over 80% of patients receiving a second transplant after a recurrence. Recurrence often leads to graft failure. The pathogenesis remains unknown and may involve a circulating permeability factor that initiates injury to the glomerular capillary. There are anecdotal reports of pediatric patients with posttransplant lymphoproliferative disorder (PTLD) and recurrent FSGS who have had remission of proteinuria after treatment with rituximab. These observations have prompted speculation that B cells may play a role in the pathogenesis of recurrent FSGS. We report four consecutive adult patients with early recurrent FSGS refractory to or dependent on plasmapheresis who received rituximab (total dose 2000-4200 mg). None of the patients treated with rituximab achieved remission of proteinuria, and one patient experienced early graft loss. In these four adult renal transplant patients with recurrent FSGS, rituximab failed to diminish proteinuria.
View details for DOI 10.1111/j.1600-6143.2007.02021.x
View details for Web of Science ID 000251859400033
View details for PubMedID 17979998
- High rates of coronary artery stenosis detected by angiography in diabetic renal transplant candidates NATURE CLINICAL PRACTICE NEPHROLOGY 2007; 3 (4): 194-195
The role of pre-emptive re-transplant in graft and recipient outcome
NEPHROLOGY DIALYSIS TRANSPLANTATION
2006; 21 (5): 1355-1364
The effect of the pre-emptive re-transplant, and of inter-transplant waiting time generally, on graft and recipient survival is not well established. Analysis of the United States Renal Data System (USRDS) data (1/1/90 through 12/31/00; n = 92,844) was performed. Cox regression was used to analyse time to event, with an additional analysis to stratify by transplant era. Having a prior transplant, as well as the total number of transplants, was related to an increased risk of graft failure [hazard ratio (HR) 1.24, P<0.001 for history of prior transplant; HR 1.35 per transplant, P<0.001], but not to recipient death. The time waiting for re-transplant slightly worsened the risk for recipient mortality in the entire patient population and in the recipients of a single re-transplant (HR 1.003 and 1.004 per month respectively, P<0.001), and for graft failure only in recipients of a single re-transplant (HR 1.001 per month, P<0.05). Pre-emptive re-transplant (dialysis-free re-transplant or transplant within 6 days of last graft failure) increased the risk of graft failure (HR 1.36, P<0.001) and did not have any statistically significant effect on recipient survival. The longer duration of prior graft survival, but not the type of the graft (living vs deceased), had a protective effect on subsequent graft and recipient survival. With the potential caveats associated with retrospective data analysis, these results suggest that pre-emptive re-transplantation is associated with increased risk of graft failure, while longer time on dialysis in between transplants is associated with a negative effect on graft and recipient survival in most patient subgroups. The optimal time in between graft failure and re-transplant was not evaluated in this study. Further prospective studies might be needed to confirm the observed effects.
View details for DOI 10.1093/ndt/gfk061
View details for Web of Science ID 000237004900034
View details for PubMedID 16476722
Validation of a screening protocol for identifying low-risk candidates with type 1 diabetes mellitus for kidney with or without pancreas transplantation
2006; 20 (2): 139-146
Certain clinical risk factors are associated with significant coronary artery disease in kidney transplant candidates with diabetes mellitus. We sought to validate the use of a clinical algorithm in predicting post-transplantation mortality in patients with type 1 diabetes. We also examined the prevalence of significant coronary lesions in high-risk transplant candidates. All patients with type 1 diabetes evaluated between 1991 and 2001 for kidney with/without pancreas transplantation were classified as high-risk based on the presence of any of the following risk factors: age ≥45 yr, smoking history ≥5 pack years, diabetes duration ≥25 yr or any ST-T segment abnormalities on electrocardiogram. Remaining patients were considered low risk. All high-risk candidates were advised to undergo coronary angiography. The primary outcome of interest was all-cause mortality post-transplantation. Eighty-four high-risk and 42 low-risk patients were identified. Significant coronary artery stenosis was detected in 31 high-risk candidates. Mean arterial pressure was a significant predictor of coronary stenosis (odds ratio 1.68; 95% confidence interval 1.14-2.46), adjusted for age, sex and duration of diabetes. In 75 candidates who underwent transplantation with median follow-up of 47 months, the use of clinical risk factors predicted all eight deaths. No deaths occurred in low-risk patients. A significant mortality difference was noted between the two risk groups (p = 0.03). This clinical algorithm can identify patients with type 1 diabetes at risk for mortality after kidney with/without pancreas transplant. Patients without clinical risk factors can safely undergo transplantation without further cardiac evaluation.
View details for DOI 10.1111/j.1399-0012.2005.00461.x
View details for Web of Science ID 000237095200001
View details for PubMedID 16640517
Kidney transplant candidate evaluation
SEMINARS IN DIALYSIS
2005; 18 (6): 487-494
The practicing nephrologist is an indispensable component in the evaluation of the candidate for kidney transplantation, from referral to the transplant center to eventual transplantation, which now may be years later. Early referral may lead to preemptive transplantation, the ideal that has been achieved in 25% of living donor transplant cases. Annually approximately 30% of U.S. deceased donor kidneys are now transplanted under the allocation policies for zero human leukocyte antigen (HLA) mismatch kidneys and expanded criteria donor kidneys. Under either of these programs, candidates may receive a kidney offer soon after entering the wait-list, so prompt and complete evaluation and preparation by the practicing nephrologist are necessary for successful early transplantation. The remaining candidates require periodic review while ascending the wait-list and thorough repeat evaluation when nearing the top, as years may have passed since initial evaluation. Wait-list management is a major challenge faced by transplant centers, aggravated by the inexorable growth of the list. Active communication between the practicing nephrologist and the transplant center is essential to maintain the candidate's preparation for transplantation.
View details for Web of Science ID 000233515800009
View details for PubMedID 16398711
- Renal juxtaglomerular apparatus hyperplasia NEPHROLOGY DIALYSIS TRANSPLANTATION 2005; 20 (10): 2282-2283
The role of pretransplantation renal replacement therapy modality in kidney allograft and recipient survival
AMERICAN JOURNAL OF KIDNEY DISEASES
2005; 46 (3): 537-549
The effect of pretransplantation renal replacement therapy (RRT) modality on allograft and recipient survival outcome is not well understood. We studied allograft and recipient survival by using US Renal Data System records from January 1, 1990, to December 31, 1999, with a follow-up period through December 31, 2000 (n = 92,844; 60% males; 70% white; 23% black). Pretransplantation and predominant RRT modality during the end-stage renal disease (ESRD) period and number and specific combinations of RRT modalities were evaluated. Compared with hemodialysis (HD), a Cox model showed that peritoneal dialysis (PD) immediately before transplantation predicts a 3% lower risk for graft failure (P < 0.05) and 6% lower risk for recipient death (P < 0.001). When predominant RRT modality was analyzed (modality used for > 50% of the ESRD time), PD (hazard ratio [HR], 0.97; P < 0.05) had a protective effect for graft survival compared with HD. Better recipient survival also was associated with PD (HR, 0.96; P < 0.05). Increased number of RRT modalities during the ESRD course was associated with increased risk for graft failure (HR, 1.04 per additional modality used; P < 0.005) and recipient death (HR, 1.11 per additional modality used; P < 0.001). Any combination or any single modality (except for PD + HD for graft survival and PD + HD and PD + HD + transplantation for recipient survival) had protective effects on graft and recipient survival compared with HD. Our results suggest that compared with PD, HD as an RRT modality immediately before transplantation or as a predominant RRT modality during the ESRD course, used alone or in combination with other RRT modalities, is associated with increased risks for graft failure and recipient death. Increased number of RRT modalities used during the ESRD course is associated with worsening of graft and recipient survival.
View details for DOI 10.1053/j.ajkd.2005.05.013
View details for Web of Science ID 000231847500019
View details for PubMedID 16129217
Duration of end-stage renal disease and kidney transplant outcome
NEPHROLOGY DIALYSIS TRANSPLANTATION
2005; 20 (1): 167-175
Patients nearing end-stage renal disease (ESRD) increasingly choose pre-emptive renal transplant (PRT) to avoid pre-transplant dialysis and to minimize ESRD. Compared with long-term dialysis, PRT has been shown to increase allograft survival. However, the merit of short-term dialysis is not well characterized, and it may be the better medical choice in some patients. The goal of the study was to characterize the relationship between the duration of dialysis vs allograft and patient survival. We performed a retrospective nationwide cohort study of all kidney transplants (Tx) between January 1, 1990 and December 31, 1999, with a follow-up period through December 31, 2000. Participants were identified using the United States Renal Data System (USRDS), which tracks all ESRD cases in the nation including patients on dialysis and with kidney Tx. Patients with a history of more than one kidney Tx were excluded. Allograft survival and recipient survival were the primary outcomes of this study. Duration of ESRD, as a continuous variable as well as divided into categories (0-14 days, 15-60 days, 61-180 days, 181-365 days, 1-2 years, 2-3 years, 3-5 years and >5 years), was the primary risk factor of interest. Models were adjusted for multiple donor and recipient factors, including demographics and co-morbidities, as well as for Tx procedure characteristics. A total of 81,130 patient records were used for analysis (age 44.1 ± 14.3 years, 61% males, 24% black, 29% diabetic, pre-transplant ESRD duration 27.1 ± 26.4 months, 26% living donors). ESRD duration, as a continuous variable, is associated with a modest increase in the risk of graft failure over time [hazard ratio (HR) 1.02 per year of ESRD duration, P<0.001]. When ESRD is studied as a categorical variable (duration of 0-14 days vs longer durations), the increased risk of allograft failure reached statistical significance only when the time on dialysis was ≥181 days. The duration of ESRD was a significant risk for recipient death (HR 1.04 per year, P<0.001); however, mortality risk reached statistical significance only when the patient had been on dialysis for ≥1 year. This study of USRDS records suggests that a short (<6 months) dialysis course has no detrimental effect on graft and patient survival, and should not be deferred if medically indicated.
View details for DOI 10.1093/ndt/gfh541
View details for Web of Science ID 000226702200025
View details for PubMedID 15546892
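Under the proportional hazards model used in the study above, a per-year hazard ratio compounds multiplicatively over years of exposure, so a small per-year HR implies a larger cumulative effect. A quick arithmetic check using the HRs reported in the abstract:

```python
# Per-year hazard ratios reported above; under a Cox model the HR for an
# n-year difference in ESRD duration is the per-year HR raised to the n.
hr_graft_per_year = 1.02   # graft failure, per year of ESRD duration
hr_death_per_year = 1.04   # recipient death, per year of ESRD duration
years = 5

hr_graft_5yr = hr_graft_per_year ** years
hr_death_5yr = hr_death_per_year ** years
print(round(hr_graft_5yr, 3))  # ~10% higher graft-failure hazard at 5 years
print(round(hr_death_5yr, 3))  # ~22% higher mortality hazard at 5 years
```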
Dual-kidney transplantation with organs from expanded criteria donors: A long-term follow-up
2004; 78 (5): 692-696
Since 1995, dual-kidney transplantation using organs from marginal donors has been used at our center to expand the organ donor pool and decrease the waiting time for deceased donor kidney transplantation. This approach has allowed for a shorter waiting period without compromising outcome in the early posttransplant period. We now have 8-year follow-up in the first recipients. Older individuals were offered this option preferentially, because we reasoned that they would stand to benefit most from the shorter waiting period. Patients aged 55 years or more who underwent either dual-kidney transplantation with expanded criteria donors or single-kidney transplantation with standard donors were included in this study. All expanded criteria donor organs were those that were refused by all other local transplant centers. The primary endpoints were recipient death and graft failure. Waiting time for dual-kidney transplantation was 440 ± 38 days versus 664 ± 51 days for single-kidney transplantation (P < 0.01). The 8-year actuarial patient survivals for the single- and dual-kidney transplants were 74.1% and 82.1%, respectively. The 8-year actuarial graft survivals for the single- and dual-kidney transplants were 59.4% and 69.7%, respectively. Eight-year actuarial patient and graft survivals in older individuals who underwent dual-kidney transplantation are equivalent to those who underwent standard single-kidney transplantation. With the continuing organ shortage and increasing waiting times for cadaver kidney transplantation, dual-kidney transplantation using organs that would otherwise be discarded offers a good option for older individuals who may not withstand a long waiting period.
View details for DOI 10.1097/01.tp.0000130452.01521.b1
View details for Web of Science ID 000223935400010
View details for PubMedID 15371670
Prospective, randomized trial of the effect of antibody induction in simultaneous pancreas and kidney transplantation: Three-year results
2004; 77 (8): 1269-1275
Historically, antibody induction has been used because of the higher immunologic risk of graft loss or rejection observed in simultaneous pancreas and kidney (SPK) transplantation compared with kidney transplantation alone. This trial was designed to assess the effect of antibody induction in SPK transplant recipients receiving tacrolimus, mycophenolate mofetil, and corticosteroids. Induction agents included T-cell-depleting and interleukin-2 receptor antibodies. A total of 174 SPK transplant recipients were enrolled in a prospective, open-label, multi-center study. They were randomized to induction (n = 87) or non-induction (n = 87) groups and followed for 3 years. At 3 years, actual patient (94.3% and 89.7%) and pancreas (75.9% and 75.9%) survivals were similar between the induction and non-induction groups, respectively. Actual kidney survival was similar at 1 and 2 years, but at 3 years, it was significantly better in the induction group compared with the non-induction group (92% vs. 81.6%; P = 0.04). At 3 years, median serum creatinine and hemoglobin A1C were similar between the induction and non-induction groups (1.35 mg/dL and 1.20 mg/dL, 5.4% and 5.5%, respectively). Three-year cumulative incidence of biopsy-confirmed, treated acute kidney rejection in the induction and non-induction groups was 19.5% and 27.5% (P = 0.14), respectively, with odds 4.6 times greater in African Americans regardless of treatment (P = 0.004). Significantly higher rates of cytomegalovirus (CMV) viremia and CMV syndrome occurred in those receiving T-cell-depleting antibody induction (36.1%) when compared with those receiving anti-interleukin-2 receptor antibodies (2%) and non-induction (8.1%) (P < 0.0001). Tacrolimus, mycophenolate mofetil, and corticosteroids resulted in excellent safety and efficacy in SPK transplant recipients. Actual 3-year kidney survival was significantly better in the induction group; however, CMV viremia and CMV syndrome rates were significantly higher in the T-cell-depleting antibody group. African Americans demonstrated a significantly greater risk of acute rejection despite antibody induction. Decisions regarding the use of induction therapy must weigh the risk of kidney graft loss or rejection against the risk of infection.
View details for DOI 10.1097/01.TP.0000123903.12311.36
View details for Web of Science ID 000221130900025
View details for PubMedID 15114097
Approaches to transplantation tolerance in humans
2004; 77 (6): 932-936
Although transplantation tolerance to organ allografts has been achieved using a wide variety of immunologic interventions in laboratory animals, few tolerance induction protocols with complete immunosuppressive drug withdrawal have been tested in humans. Preclinical and clinical studies of the use of total lymphoid irradiation for the induction of chimeric and nonchimeric tolerance are summarized here.
View details for DOI 10.1097/01.TP.0000117782.93598.6E
View details for Web of Science ID 000220460500027
View details for PubMedID 15077041
Prediction of 3-yr cadaveric graft survival based on pre-transplant variables in a large national dataset
2003; 17 (6): 485-497
Pre- and post-transplant predictive factors of graft survival for optimal and expanded criteria grafts have been studied in the past. The goal of our study was to evaluate the recent large set of United Network for Organ Sharing records (1990-1998) to generate a prediction algorithm of 3-yr graft survival based on pre-transplant variables alone. The dataset of patients with end-stage renal disease and cadaveric kidney or kidney-pancreas transplantation (1990-1998) used in the study consisted of 37,407 records. Logistic regression (LM) and a tree-based model (TBM) were used to identify predictors of 3-yr allograft survival and to generate the prediction algorithm. Donor and recipient demographic characteristics (age, race, and gender) and body mass index showed non-linear relationships with 3-yr graft survival, while human leukocyte antigen match showed a strong linear relationship. Prediction of the probability of graft survival from the model achieved a good match with the observed survival of the separate dataset, with a correlation of r = 0.998 for LM and r = 0.984 for TBM. The positive predictive value (PV) of allograft survival was 76.0% for both LM and TBM, and the negative PV was 63.0% and 53.8% for LM and TBM, respectively. Both LM and the TBM can potentially be used in clinical practice for long-term prediction of kidney allograft survival based on pre-transplant variables.
View details for Web of Science ID 000186367500001
View details for PubMedID 14756263
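A fitted logistic regression of the kind described above predicts the probability of 3-yr graft survival from pre-transplant covariates via the logistic function. A minimal sketch follows; the intercept, coefficients, and covariates are invented for illustration and are not the values estimated in the paper.

```python
import math

# Sketch of prediction from a fitted logistic regression model, as in the
# study above. All numeric values below are hypothetical.

def predict_survival_probability(intercept, coefficients, covariates):
    """Return P(3-yr graft survival) = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    linear = intercept + sum(b * x for b, x in zip(coefficients, covariates))
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical model: intercept plus effects of donor age (in decades) and
# number of HLA mismatches.
p = predict_survival_probability(
    intercept=2.0,
    coefficients=[-0.10, -0.15],   # per decade of donor age, per HLA mismatch
    covariates=[4.5, 2],           # 45-year-old donor, 2 mismatches
)
print(round(p, 3))  # ≈ 0.777
```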
A randomized, placebo-controlled trial of IGF-1 for delayed graft function: A human model to study postischemic ARF
2003; 64 (2): 593-602
Insulin-like growth factor (IGF-1) has been shown in animal models to accelerate recovery from acute renal failure (ARF). However, a therapeutic trial of recombinant human (rh) IGF-1 in patients with ARF in the intensive care unit (ICU) failed to demonstrate efficacy. Such patients often had multiple organ failure, recurrent renal injury, and a delay of several days before commencing treatment. To circumvent these confounding factors, we randomized recipients of cadaveric renal allografts to immediate (<5 hours) rhIGF-1 versus placebo therapy (100 mg/kg subcutaneously twice a day for 6 days). Preliminary observations 3 hours posttransplantation in an additional 44 patients revealed a creatinine clearance ≤20 mL/min to predict protracted ARF. Thus, this value was used to determine study eligibility. Creatinine clearance prior to commencing treatment was not significantly different between the two groups (8 ± 5 mL/min for IGF-1 and 7 ± 6 mL/min for placebo; P = 0.39). Inulin clearance on day 7, the primary outcome measure, was 21 ± 22 mL/min and 19 ± 19 mL/min in the IGF-1 (N = 19) and placebo (N = 24) groups, respectively (P = 0.67). Secondary outcome measures, including nadir serum creatinines after 6 weeks and need for dialysis, also did not differ between the two groups. We performed an analysis of statistical power using the placebo arm of the trial. Defining a twofold increase above placebo in day 7 glomerular filtration rate (GFR) as being of meaningful biologic significance, we determined that the modest sample size used in the present study is adequate. We thus conclude that (1) IGF-1 treatment is unlikely to benefit ARF and (2) the transplanted kidney is a good model to screen new agents for ARF that have demonstrated promise in animal trials.
View details for Web of Science ID 000183966500022
View details for PubMedID 12846755
Quantification of immunosuppression by flow cytometry in stable renal transplant recipients
THERAPEUTIC DRUG MONITORING
2003; 25 (1): 22-27
The current standard of monitoring transplant patients by drug levels is not optimal because it does not take into account the different and individual effects of immunosuppressive drugs on each patient. In this study, the authors tested immune function assays for monitoring transplant patients. Blood was collected from stable renal transplant patients treated with cyclosporin, mycophenolate mofetil, and prednisone (n = 8), and from healthy volunteers (n = 12). Lymphocyte proliferation, expression of T-cell surface activation antigens (CD25, CD71, CD11a, CD95, CD154), production of intracellular cytokines (IL-2, IFN-gamma, TNF-alpha), and lymphocyte subsets (CD4, CD8, CD16, CD20) were assessed by flow cytometry. Lymphocyte proliferation, expression of T-cell surface activation antigens, and production of intracellular cytokines were significantly decreased in transplant recipients compared with healthy control volunteers. The combined effects of several immunosuppressive drugs in renal transplant recipients can be quantitated with immune function assays in whole blood. This new method may be helpful to achieve an optimal level of immunosuppression for each patient.
View details for Web of Science ID 000180612500003
View details for PubMedID 12548140
Increased expression of cytotoxic effector molecules: Different interpretations for steroid-based and steroid-free immunosuppression
2003; 7 (1): 53-58
Cytotoxic T lymphocyte (CTL) effector molecules have been studied as markers of acute rejection in renal allograft recipients on steroid-based immunosuppression. We hypothesized that basal CTL gene expression may vary with time post-transplantation as well as with different immunosuppression protocols (steroid-based or steroid-free). Variations in CTL gene expression may thus impact on the ability to predict acute allograft rejection. We used the non-invasive method of quantitative competitive-reverse transcription-polymerase chain reaction (QC-RT-PCR) to quantify the amounts of CTL effector molecules (granulysin, GL; perforin, P; granzyme B, GB) in serial peripheral blood lymphocyte (PBL) samples from steroid-free and steroid-based adult and pediatric renal allograft recipients. Patients on both protocols were clinically monitored by protocol biopsies at 1, 3, 6, and 12 months post-transplantation and for graft function at 1 yr post-transplantation in a separate clinical study. Steroid-free patients with stable graft function showed an increase in GL, P, and GB gene expression over time post-transplantation with the increase being seen largely by the first post-transplant month. A further increase in GL expression was noted at the end of the first post-transplant year in the absence of acute rejection, whereas GB and P levels were unchanged. At comparative time-points post-transplantation, CTL genes were found to be higher in steroid-free patients with stable graft function, compared to steroid-based recipients with either clinically stable graft function or acute rejection. This study suggests that levels of CTL gene expression, although important in a steroid-based regimen to monitor the risk of acute rejection, may not be similarly applied in patients on steroid-free immunosuppression. The early increase in levels seen in steroid-free patients appears to correlate with the total absence of steroids. 
As steroid-free patients seem to have a lower incidence of acute rejection and better long-term graft function at 1 yr, the early increase in CTL genes in the absence of acute rejection may suggest an early adaptive immune activation response, promoting early graft acceptance in this protocol.
View details for Web of Science ID 000180971500011
View details for PubMedID 12581329
Conversion of stable renal allograft recipients to a bioequivalent cyclosporine formulation
2002; 74 (7): 1013-1017
Gengraf capsule, an AB-rated generic cyclosporine for Neoral, has been shown to be bioequivalent in previous studies. The purpose of this pharmacokinetic study performed in stable renal transplant recipients was to evaluate the interchangeability of Gengraf and Neoral. Using an open-label, three-period design, 50 renal transplant recipients taking stable doses of Neoral completed a multicenter study. Subjects continued their Neoral regimen during period I (days 1-14). Subjects then switched from Neoral on a milligram-for-milligram basis to Gengraf during period II (days 15-28), followed by conversion to the same milligram-for-milligram dosing regimen of Neoral during period III (days 29-35). Twelve-hour pharmacokinetic evaluations (maximum observed blood concentration [C(max)], concentration before dosing [C(trough)], time to maximum observed concentration [T(max)], and area under the blood concentration-vs.-time curve [AUC]) occurred on days 1, 14, 15, 28, and 29. Additional predose samples (C(trough)) were evaluated on days 7, 21, and 35. Laboratory and safety parameters were also evaluated. The pharmacokinetics of Gengraf (C(max), T(max), C(trough), and AUC) were indistinguishable from the Neoral values in stable renal allograft recipients. The bioequivalent capsules were interchangeable with respect to C(max), C(trough), and AUC at steady state and also on conversion from one capsule formulation to the other. The 90% confidence intervals (CI) for the Gengraf versus Neoral comparison at steady state (day 28 vs. day 14) were 0.95 to 1.03 for AUC and 0.92 to 1.04 for C(max). Trough concentrations remained consistent throughout the study, with no need for dosage adjustment in any of the subjects. Gengraf is well tolerated, with an excellent safety profile comparable to that of Neoral. In conclusion, the pharmacokinetics of Gengraf are equivalent to and indistinguishable from those of Neoral.
Gengraf is well tolerated and interchangeable with Neoral in stable renal transplant recipients.
View details for DOI 10.1097/01.TP.0000032435.67101.8A
View details for Web of Science ID 000178645000020
View details for PubMedID 12394847
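The steady-state comparison in the abstract above rests on standard non-compartmental pharmacokinetic quantities. A minimal sketch of the linear trapezoidal AUC and the test/reference ratio used in bioequivalence assessment; the concentration-time values below are hypothetical illustrations, not data from the study:

```python
def auc_trapezoid(times, concs):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    return sum(
        (t2 - t1) * (c1 + c2) / 2
        for (t1, c1), (t2, c2) in zip(zip(times, concs), zip(times[1:], concs[1:]))
    )

# Hypothetical 12-hour cyclosporine profiles (hours, ng/mL)
times = [0, 1, 2, 4, 8, 12]
neoral = [250, 900, 1100, 700, 400, 260]   # reference formulation
gengraf = [240, 880, 1120, 690, 410, 250]  # test formulation

auc_ref = auc_trapezoid(times, neoral)
auc_test = auc_trapezoid(times, gengraf)
ratio = auc_test / auc_ref
print(f"AUC ratio (test/reference): {ratio:.3f}")
# Bioequivalence conventionally requires the 90% CI of this ratio
# (on log-transformed data) to fall within 0.80-1.25.
```
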
Mixed chimerism and immunosuppressive drug withdrawal after HLA-mismatched kidney and hematopoietic progenitor transplantation
2002; 73 (9): 1386-1391
Rodents and dogs conditioned with total-lymphoid irradiation (TLI), with or without antithymocyte globulin (ATG), have been shown to develop mixed chimerism and immune tolerance without graft-versus-host disease (GVHD) after the infusion of major histocompatibility complex (MHC)-mismatched donor bone marrow cells given alone or in combination with an organ allograft. Four human leukocyte antigen (HLA)-mismatched recipients of living donor kidney transplants were conditioned with TLI and ATG posttransplantation and infused with cryopreserved donor granulocyte colony-stimulating factor (G-CSF) "mobilized" hematopoietic progenitor (CD34+) cells (3-5x10(6) cells/kg) thereafter. Maintenance prednisone and cyclosporine dosages were tapered, and recipients were monitored for chimerism, GVHD, graft function, T-cell subsets in the blood, and antidonor reactivity in the mixed leukocyte reaction (MLR). Three of the four patients achieved multilineage macrochimerism, with up to 16% of donor-type cells among blood mononuclear cells without evidence of GVHD. Prolonged depletion of CD4+ T cells was observed in all four patients. Rejection episodes were not observed in the three macrochimeric recipients, and immunosuppressive drugs were withdrawn in the first patient by 12 months. Prednisone was withdrawn from a second patient at 9 months, and cyclosporine was tapered thereafter. Multilineage macrochimerism can be achieved without GVHD in HLA-mismatched recipients of combined kidney and hematopoietic progenitor transplants. Conditioning of the host with posttransplant TLI and ATG was nonmyeloablative and was not associated with severe infections. Recipients continue to be studied for the development of immune tolerance.
View details for Web of Science ID 000175933100002
View details for PubMedID 12023614
Late post-transplant anemia in adult renal transplant recipients. An under-recognized problem?
AMERICAN JOURNAL OF TRANSPLANTATION
2002; 2 (5): 429-435
Post-transplant anemia (PTA), a frequent complication during the first 3-6 months after transplant, is thought to be uncommon during the late post-transplant period. A study population of adults (> 18 years) transplanted during 1995 at Stanford University (n = 88) and the University of North Carolina (n = 40) was selected. Data-collection points were 0, 1, 2, 3, 4 and 5 years post transplant. Anemia was defined as a hematocrit < 33 volume percent. Thirty percent of patients were anemic at some time during the post-transplant period. The prevalence of PTA increased over time; by 5 years post transplant, 26% of the patients were anemic. Anemia occurred in 62.5% of patients converted from azathioprine to mycophenolate mofetil. A multivariate logistic regression model demonstrated a correlation between anemia and serum total CO2 (p = 0.002), BUN (p = 0.04), and creatinine (p = 0.045) at 1 year post transplant. At 5 years post transplant, only serum total CO2 (p = 0.0004) correlated with anemia. Thus, diminished renal excretory function and metabolic acidosis appear to be the most important correlates of late PTA. These findings should be interpreted in view of the fact that the newer immunosuppressive agents may have an even more profound effect on anemia and its recovery after transplantation.
View details for Web of Science ID 000176143200006
View details for PubMedID 12123208
Maintenance and recovery stages of postischemic acute renal failure in humans
AMERICAN JOURNAL OF PHYSIOLOGY-RENAL PHYSIOLOGY
2002; 282 (2): F271-F280
Postischemic injury in 38 recipients of 7-day-old cadaveric renal allografts was classified into sustained (n = 15) or recovering (n = 23) acute renal failure (ARF) according to the prevailing inulin clearance. Recipients of long-standing allografts that functioned optimally (n = 16) and living transplant donors undergoing nephrectomy (n = 10) served as functional and structural controls, respectively. A combination of physiological and morphometric techniques was used to evaluate glomerular filtration rate and its determinants 1-3 h after reperfusion and again on day 7 to elucidate the mechanism for persistent hypofiltration in ARF that is sustained. Glomerular filtration rate in the sustained ARF group on day 7 was depressed by 90% (mean +/- SD); the corresponding fall in renal plasma flow was proportionately less. Neither plasma oncotic pressure nor the single-nephron ultrafiltration coefficient differed between the sustained ARF and the control group, however. A model of glomerular ultrafiltration and a sensitivity analysis were used to compute the prevailing transcapillary hydraulic pressure gradient (DeltaP), the only remaining unmeasured determinant of glomerular filtration. This revealed that DeltaP varied between 27 and 28 mmHg in sustained ARF and 32-38 mmHg in recovering ARF on day 7 vs. 47-54 mmHg in controls. Sustained ARF was associated with persistent tubular dilatation. We conclude that depression of DeltaP, perhaps due partially to elevated tubule pressure, is the predominant cause of hypofiltration in the maintenance stage of ARF that is sustained for 7 days.
View details for Web of Science ID 000173348100011
View details for PubMedID 11788441
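The back-calculation of DeltaP described in the abstract above can be illustrated with the simplest form of the ultrafiltration relation; the study used a full mathematical model with a sensitivity analysis that accounts for the axial rise of oncotic pressure along the capillary, which this sketch ignores, and the single-nephron values below are illustrative, not measured:

```python
def delta_p(sngfr_nl_min, kf_nl_min_mmhg, mean_oncotic_mmhg):
    """Solve SNGFR = Kf * (DeltaP - mean oncotic pressure) for DeltaP (mmHg)."""
    return sngfr_nl_min / kf_nl_min_mmhg + mean_oncotic_mmhg

# Illustrative values: a normal single-nephron GFR...
print(delta_p(sngfr_nl_min=50.0, kf_nl_min_mmhg=4.0, mean_oncotic_mmhg=28.0))  # 40.5
# ...versus a 90% depressed SNGFR with unchanged Kf and oncotic pressure,
# which necessarily implies a much lower DeltaP:
print(delta_p(sngfr_nl_min=5.0, kf_nl_min_mmhg=4.0, mean_oncotic_mmhg=28.0))   # 29.25
```

This is the logic of the abstract's inference: with the ultrafiltration coefficient and oncotic pressure measured and unchanged, a depressed filtration rate can only be explained by a depressed DeltaP.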
Randomized trial of tacrolimus plus mycophenolate mofetil or azathioprine versus cyclosporine oral solution (modified) plus mycophenolate mofetil after cadaveric kidney transplantation: Results at 2 years
2001; 72 (2): 245-250
A previous report described the 1-year results of a prospective, randomized trial designed to investigate the optimal combination of immunosuppressants in kidney transplantation. Recipients of first cadaveric kidney allografts were treated with tacrolimus+mycophenolate mofetil (MMF), cyclosporine oral solution (modified) (CsA)+MMF, or tacrolimus+azathioprine (AZA). Results at 1 year revealed that optimal efficacy and safety were achieved with a regimen containing tacrolimus+MMF. The present report describes results at 2 years. Two hundred twenty-three recipients of first cadaveric kidney allografts were randomized to receive tacrolimus+MMF, CsA+MMF, or tacrolimus+AZA. All regimens contained corticosteroids, and antibody induction was used only in patients who experienced delayed graft function. Patients were followed up for 2 years. The results at 2 years corroborate and extend the findings of the previous report. Patients randomized to either treatment arm containing tacrolimus experienced improved kidney function. New-onset insulin dependence remained in four, three, and four patients in the tacrolimus+MMF, CsA+MMF, and tacrolimus+AZA treatment arms, respectively. Furthermore, patients with delayed graft function/acute tubular necrosis who were treated with tacrolimus+MMF experienced a 23% increase in allograft survival compared with patients receiving CsA+MMF (P=0.06). Patients randomized to tacrolimus+MMF received significantly lower doses of MMF compared with those administered CsA+MMF. All three immunosuppressive regimens provided excellent safety and efficacy. However, the best results overall were achieved with tacrolimus+MMF. The combination may provide particular benefit to kidney allograft recipients who develop delayed graft function/acute tubular necrosis. Renal function at 2 years was better in the tacrolimus treatment groups compared with the CsA group.
View details for Web of Science ID 000170342600014
View details for PubMedID 11477347
Recommendations for the outpatient surveillance of renal transplant recipients
JOURNAL OF THE AMERICAN SOCIETY OF NEPHROLOGY
2000; 11 (10): S1-S86
View details for Web of Science ID 000089619400001
Randomized trial of tacrolimus (Prograf) in combination with azathioprine or mycophenolate mofetil versus cyclosporine (Neoral) with mycophenolate mofetil after cadaveric kidney transplantation
2000; 69 (5): 834-841
Our clinical trial was designed to investigate the optimal combination of immunosuppressants for renal transplantation. A randomized three-arm, parallel group, open label, prospective study was performed at 15 North American centers to compare three immunosuppressive regimens: tacrolimus + azathioprine (AZA) versus cyclosporine (Neoral) + mycophenolate mofetil (MMF) versus tacrolimus + MMF. All patients were first cadaveric kidney transplants receiving the same maintenance corticosteroid regimen. Only patients with delayed graft function (32%) received antilymphocyte induction. A total of 223 patients were randomized, transplanted, and followed for 1 year. There were no significant differences in baseline demography between the three treatment groups. At 1 year the results are as follows: acute rejection 17% (95% confidence interval 9%, 26%) in tacrolimus + AZA; 20% (confidence interval 11%, 29%) in cyclosporine + MMF; and 15% (confidence interval 7%, 24%) in tacrolimus + MMF. The incidence of steroid-resistant rejection requiring antilymphocyte therapy was 12% in the tacrolimus + AZA group, 11% in the cyclosporine + MMF group, and 4% in the tacrolimus + MMF group. There were no significant differences in overall patient or graft survival. Tacrolimus-treated patients had a lower incidence of hyperlipidemia through 6 months posttransplant. The incidence of posttransplant diabetes mellitus requiring insulin was 14% in the tacrolimus + AZA group, 7% in the cyclosporine + MMF group, and 7% in the tacrolimus + MMF group. All regimens yielded similar acute rejection rates and graft survival, but the tacrolimus + MMF regimen was associated with the lowest rate of steroid-resistant rejection requiring antilymphocyte therapy.
View details for Web of Science ID 000086145600027
View details for PubMedID 10755536
PAH extraction and estimation of plasma flow in human postischemic acute renal failure
AMERICAN JOURNAL OF PHYSIOLOGY-RENAL PHYSIOLOGY
1999; 277 (2): F312-F318
We determined the effect of postischemic injury to the human renal allograft on p-aminohippurate (PAH) extraction (E(PAH)) and renal blood flow. We evaluated renal function in 44 allograft recipients on two occasions: 1-3 h after reperfusion (day 0) and again on postoperative day 7. On day 0 subsets underwent intraoperative determination of renal blood flow (n = 35) by Doppler flow meter and E(PAH) (n = 25) by renal venous assay. Blood flow was also determined in another subset of 16 recipients on postoperative day 7 by phase contrast-cine-magnetic resonance imaging, and E(PAH) was computed from the simultaneous PAH clearance. Glomerular filtration rate (GFR) on day 7 was used to divide subjects into recovering (n = 23) and sustained (n = 21) acute renal failure (ARF) groups, respectively. Despite profound depression of GFR in the sustained ARF group, renal plasma flow was only slightly depressed, averaging 296 +/- 162 ml/min/1.73 m^2 on day 0 and 202 +/- 72 ml/min/1.73 m^2 on day 7, respectively. These values did not differ from corresponding values in the recovering ARF group: 252 +/- 133 and 280 +/- 109 ml/min/1.73 m^2, respectively. E(PAH) was profoundly depressed on day 0, averaging 18 +/- 14 and 10 +/- 7% in recovering and sustained ARF groups, respectively, vs. 86 +/- 6% in normal controls (P < 0.001). Corresponding values on day 7 remained significantly depressed at 65 +/- 20 and 11 +/- 22%, respectively. We conclude that postischemic injury to the renal allograft results in profound impairment of E(PAH) that persists for at least 7 days, even after the onset of recovery. An ensuing reduction in urinary PAH clearance results in a gross underestimate of renal plasma flow, which is close to the normal range in the initiation, maintenance, and recovery stages of this injury.
View details for Web of Science ID 000081923400019
View details for PubMedID 10444587
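The abstract's central point, that urinary PAH clearance grossly underestimates renal plasma flow when extraction is impaired, follows directly from the classic clearance relations. A minimal sketch with illustrative concentrations (not study data):

```python
def pah_clearance(u_pah, urine_flow, p_pah):
    """Classic renal clearance: C = (U * V) / P, in the same volume units as V."""
    return u_pah * urine_flow / p_pah

def renal_plasma_flow(c_pah, e_pah):
    """Effective RPF corrected for incomplete extraction: RPF = C_PAH / E_PAH."""
    return c_pah / e_pah

# Illustrative values: urine PAH 6.0 mg/dL, urine flow 1.0 mL/min, plasma PAH 0.2 mg/dL
c = pah_clearance(u_pah=6.0, urine_flow=1.0, p_pah=0.2)
print(c)  # clearance of about 30 mL/min
# With near-normal extraction (~0.86) this clearance would imply RPF ~ 35 mL/min,
# but with the ~0.10 extraction measured in sustained ARF it implies ~300 mL/min:
print(renal_plasma_flow(c, e_pah=0.10))
```

This is why taking urinary PAH clearance at face value (i.e., assuming normal extraction) makes a near-normal plasma flow look profoundly depressed.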
Dual kidney transplantation: Older donors for older recipients
JOURNAL OF THE AMERICAN COLLEGE OF SURGEONS
1999; 189 (1): 82-91
Dual kidney transplantation, the transplantation of both donor kidneys into a single recipient, allows increased use of expanded criteria donors (eg, older donors with a history of hypertension) to alleviate the disparity between available donors and potential recipients. We evaluated outcomes in our dual kidney transplant program that started in 1995. A retrospective comparison of donor and recipient data between recipients of dual (n = 41) versus single (n = 199) cadaveric renal transplants from February 1, 1995, to March 22, 1998, was performed. Dual kidney transplantation was selectively performed when the calculated donor admission creatinine clearance was less than 90 mL/min and the donor age was greater than 60 years, or if the donor had an elevated terminal serum creatinine. Every attempt was made to age- and size-match the donor and recipients. Recipients of dual kidneys had donors who were older than single kidney donors (59 +/- 12 versus 42 +/- 17 years respectively, p < 0.0001) and had more hypertension (51% versus 29%, p = 0.024). Average urine output was lower in the dual versus single kidney group (191 +/- 70 versus 252 +/- 157 mL/hr, p = 0.036). Donors for dual kidney recipients had a lower donor admission creatinine clearance of 82 +/- 28 mL/min versus 105 +/- 45 mL/min in the single kidney group (p = 0.005). Recipients of dual versus single kidneys were older (58 +/- 11 versus 47 +/- 12 years, p < 0.0001). Dual versus single kidney recipients had similar serum creatinines up to 2 years posttransplant (1.6 +/- 0.3 versus 1.6 +/- 0.7 mg/dL at 2 years, p = NS) and a comparable incidence of delayed graft function (24% versus 33%, p = NS) and 3-month posttransplant creatinine clearance (54 +/- 23 versus 57 +/- 25 mL/min, p = NS).
One-year patient and graft survival for single kidney transplantation was 97% and 90%, respectively, and 98% and 89% for dual kidney transplantation (p = NS). Dual kidney donors were significantly older, had more hypertension, lower urine outputs, and lower donor admission creatinine clearance. Despite these differences, dual kidney recipients had comparable postoperative function, outcomes, and survival versus single kidney recipients. We believe selective use of dual kidney transplantation can provide excellent outcomes to recipients of kidneys from older donors with reduced renal function.
View details for Web of Science ID 000081230900016
View details for PubMedID 10401744
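The dual-kidney selection criterion above turns on a calculated donor admission creatinine clearance. A common bedside estimate is the Cockcroft-Gault equation; whether this exact formula was the one used in the study is an assumption, and the donor values below are hypothetical:

```python
def cockcroft_gault(age_yr, weight_kg, serum_cr_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault equation."""
    crcl = (140 - age_yr) * weight_kg / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

# A hypothetical 62-year-old, 70-kg donor with serum creatinine 1.2 mg/dL:
print(round(cockcroft_gault(62, 70, 1.2), 1))  # well below the 90 mL/min dual-kidney threshold
```
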
Power Doppler imaging of acute renal transplant rejection
JOURNAL OF CLINICAL ULTRASOUND
1999; 27 (4): 171-175
We evaluated the usefulness of power Doppler imaging (PDI) in diagnosing acute renal-transplant rejection. Twenty-eight patients underwent 33 renal-transplant biopsies for suspected acute rejection. Patterns of renal parenchymal vascularity revealed by PDI in patients with abnormal biopsy results were compared with patterns in a group who had normal biopsy results. PDI examinations were reviewed retrospectively by 2 independent radiologists who had no knowledge of the biopsy results. A PDI diagnosis of acute rejection required marked vascular pruning in both the cortex and medulla. PDI results then were compared with transplant-biopsy results. The sensitivity and specificity of PDI for diagnosing acute renal-transplant rejection were 40% and 100%, respectively. None of the patients with negative biopsy results had PDI abnormalities. The negative predictive value of PDI was 33%, and the positive predictive value was 100%. In our study, an abnormal sonogram was highly predictive of acute transplant rejection. However, a normal sonogram did not exclude the possibility of rejection.
View details for Web of Science ID 000079955100001
View details for PubMedID 10323186
Sodium reabsorption and distribution of Na+/K+-ATPase during postischemic injury to the renal allograft
1999; 55 (3): 963-975
A loss of proximal tubule cell polarity is thought to activate tubuloglomerular feedback, thereby contributing to glomerular filtration rate depression in postischemic acute renal failure (ARF). We used immunomicroscopy to evaluate the segmental distribution of Na+/K+-ATPase in tubules of recipients of cadaveric renal allografts. Fractional excretion (FE) of sodium and lithium was determined simultaneously. Observations were made on two occasions: one to three hours after graft reperfusion (day 0) and again on post-transplant day 7. An inulin clearance below or above 25 ml/min on day 7 was used to divide subjects into groups with sustained (N = 15) or recovering (N = 16) ARF, respectively. In sustained ARF, the fractional excretion of sodium (FENa) was 40 +/- 6% and 11 +/- 5%, and the fractional excretion of lithium (FELi) was 76 +/- 5% and 70 +/- 2% on days 0 and 7, respectively. Corresponding findings in recovering ARF were 28 +/- 2% and 6 +/- 2% for the FENa and 77 +/- 4% and 55 +/- 3% (P < 0.05 vs. sustained) for FELi. Na+/K+-ATPase distribution in both groups was mainly basolateral in distal straight and convoluted tubule segments and collecting ducts. However, Na+/K+-ATPase was poorly retained in the basolateral membrane of proximal convoluted and straight tubule segments in sustained and recovering ARF on both days 0 and 7. We conclude that loss of proximal tubule cell polarity for Na+/K+-ATPase distribution is associated with enhanced delivery of filtered Na+ to the macula densa for seven days after allograft reperfusion. Whether an ensuing activation of tubuloglomerular feedback is an important cause of glomerular filtration rate depression in this form of ARF remains to be determined.
View details for Web of Science ID 000078682200019
View details for PubMedID 10027933
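The FENa and FELi values in the abstract above come from the standard fractional-excretion formula, FE_x = (U_x * P_Cr) / (P_x * U_Cr) * 100. A minimal sketch with illustrative concentrations, not values from the study:

```python
def fractional_excretion(u_x, p_x, u_cr, p_cr):
    """Fractional excretion of solute x as a percentage of the filtered load.

    u_x/p_x: urine and plasma concentrations of x; u_cr/p_cr: urine and
    plasma creatinine. Units cancel, so any consistent pair works.
    """
    return (u_x * p_cr) / (p_x * u_cr) * 100

# Illustrative values (sodium in mEq/L, creatinine in mg/dL):
fena = fractional_excretion(u_x=80, p_x=140, u_cr=20, p_cr=1.4)
print(round(fena, 1))  # 4.0
```

The same function applies unchanged to lithium: substituting urine and plasma lithium concentrations for sodium yields FELi, the proximal-reabsorption marker used in the study.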
Plasma glutathione peroxidase and its relationship to renal proximal tubule function
MOLECULAR GENETICS AND METABOLISM
1998; 65 (3): 238-245
Selenium-dependent extracellular glutathione peroxidase (E-GPx) is found in plasma and other extracellular fluids. Previous studies have indicated that patients with chronic renal failure on dialysis have low plasma GPx activity. In this study, dialysis patients had approximately 40% of control plasma GPx activity, while anephric individuals had the lowest plasma GPx activities, ranging from 2 to 22% of control. The residual plasma GPx activity in anephric individuals could be completely precipitated by anti-E-GPx antibodies, indicating that all plasma GPx activity can be attributed to E-GPx in both normal and anephric individuals. Plasma GPx activity rises rapidly following kidney transplantation, often reaching normal values within 10 days. The plasma GPx activity in some transplanted patients rises to levels higher than the normal range, followed by a return to the normal range. Since E-GPx in the kidney is primarily synthesized in the proximal tubules, we investigated whether nephrotoxic agents known to disrupt proximal tubule function also affected plasma GPx activity. The beta-lactam antibiotic cephaloglycin rapidly caused a decrease in plasma GPx activity in rabbits. In addition, the chemotherapeutic agent ifosfamide caused a decrease in plasma GPx activity in pediatric osteosarcoma patients. Fanconi syndrome associated with either ifosfamide therapy or valproic acid therapy also caused a decrease in plasma GPx activity. Thus plasma GPx activity is related to kidney function and is decreased in certain situations where nephrotoxic drugs are administered. Monitoring plasma GPx activity may have predictive value in evaluating the function of transplanted kidneys or in predicting those patients particularly at risk of nephrotoxic injury associated with certain medications.
View details for Web of Science ID 000077722800008
View details for PubMedID 9851889
What is the optimal approach for the end-stage diabetic nephropathy patient considering simultaneous pancreas-kidney transplantation?
Advances in renal replacement therapy
1998; 5 (3): 232-240
This case-based discussion concerns two very different patients with end-stage diabetic nephropathy (ESDN) who are considering transplantation. What is the best approach for each individual: pancreas-kidney transplant or kidney transplant alone? What if a live kidney donor is available? What are the risks and benefits of each approach? In the candidate evaluation process, medical issues, such as uncorrectable coronary artery disease, are investigated and may preclude transplantation altogether or dictate the optimal approach. Similarly, a careful psychosocial profile is important to tailor the approach to the patient. The multidisciplinary transplant team has an obligation to provide informed consent, foster realistic expectations, and advise the candidate based on collective expertise. Ultimately, the decision as to the best course (pancreas-kidney, kidney transplant alone, or no transplantation) is the result of a collaborative effort between the patient and the transplant team.
View details for PubMedID 9686634
Low-dose OKT3 treatment for rejection/induction in kidney and kidney-pancreas transplantation
TRANSPLANTATION PROCEEDINGS
1998; 30 (4): 1552-1554
Backleak, tight junctions, and cell-cell adhesion in postischemic injury to the renal allograft
JOURNAL OF CLINICAL INVESTIGATION
1998; 101 (10): 2054-2064
Postischemic injury in recipients of 3-7-d-old renal allografts was classified into sustained (n = 19) or recovering (n = 20) acute renal failure (ARF) according to the prevailing inulin clearance. Recipients of optimally functioning, long-standing allografts and living donors undergoing nephrectomy served as functional (n = 14) and structural controls (n = 10), respectively. Marked elevation above control of fractional clearance of dextrans of graded size was consistent with transtubular backleak of 57% of filtrate (inulin) in sustained ARF. No backleak was detected in recovering ARF. To explore a structural basis for backleak, allograft biopsies were taken intraoperatively, 1 h after reperfusion in all recipients, and again on day 7 after transplant in a subset (n = 10). Electron microscopy revealed disruption of both apical and basolateral membranes of proximal tubule cells in both sustained and recovering ARF, but cell exfoliation and tubule basement membrane denudation were negligible. Histochemical analysis of membrane-associated adhesion complexes confirmed an abnormality of proximal but not distal tubule cells, marked in sustained ARF but not in recovering ARF. Staining for the zonula occludens complex (ZO-1) and adherens complex (alpha, beta, and gamma catenins) revealed diminished intensity and redistribution of each cytoskeletal protein from the apico-lateral membrane boundary. We conclude that impaired integrity of tight junctions and cell-cell adhesion in the proximal tubule provides a paracellular pathway through which filtrate leaks back in sustained allograft ARF.
View details for Web of Science ID 000073808800004
View details for PubMedID 9593761
Outcome in cadaveric renal transplant recipients treated with cyclosporine A and mycophenolate mofetil versus cyclosporine A and azathioprine
JOURNAL OF SURGICAL RESEARCH
1998; 76 (2): 131-136
Recent multicenter reports have demonstrated improved outcome in recipients of cadaveric renal transplants treated with mycophenolate mofetil (MMF) versus azathioprine (AZA) in combination with cyclosporine A (CSA) and prednisone. We compared the outcome at our center in patients treated with MMF versus AZA, CSA, and prednisone. We retrospectively reviewed 242 adult cadaveric renal transplant recipients treated between 11/91 and 5/97. We compared 25 donor variables and 27 recipient variables and outcome parameters between patients treated with MMF versus AZA. There were 117 patients treated with CSA+AZA, 84 with CSA+MMF, and 42 who received other immunosuppressive strategies. There were no significant differences in any clinically important donor variables. Patients treated with MMF versus AZA and CSA had significantly fewer rejections and readmissions. There was no significant difference in 1- or 2-year patient survival. Recipients treated with MMF had a 5% higher graft survival at 2 years, although the difference did not reach statistical significance. Outcome is improved in adult recipients of cadaveric renal transplants treated with MMF versus AZA in combination with CSA and prednisone.
View details for Web of Science ID 000075343200005
View details for PubMedID 9698512
A review of the kidneys that nobody wanted
1998; 65 (2): 213-219
We previously reported excellent outcome at 6 months after transplantation in recipients of expanded criteria donor kidneys that other local centers had declined, kidneys that nobody wanted (KNW), versus controls. We now report follow-up after 23 months. We retrospectively reviewed 27 donor and 24 recipient characteristics in 126 adult recipients of transplants from January 1, 1995, to November 25, 1996. Donors of control kidneys versus KNW were younger and had significantly higher minimum 4-hr urine output. Recipients of control kidneys versus KNW had significantly more HLA matches and lower 3-month posttransplant serum creatinine levels. Patient and graft survival rates were similar between the control kidneys versus the KNW. We also compared the control kidneys and KNW with regard to prompt function or delayed graft function and satisfactory versus unsatisfactory function (unsatisfactory: serum creatinine > or =2.5 mg/dl or graft loss at 6 months) to identify donor and recipient characteristics associated with delayed graft function and unsatisfactory outcome. The incidence of rejection was significantly lower in control kidneys and KNW with satisfactory function versus control kidneys and KNW with unsatisfactory function. These data demonstrate: (1) similar graft survival at 12 months, (2) lower donor age, (3) higher minimum 4-hr urine output, and (4) more HLA matches in recipients of control kidneys versus KNW. Optimal outcome was achieved in recipients of control kidneys and KNW with prompt function and satisfactory function based upon serum creatinine in the first 6 months and in recipients with lower rates of rejection. Although outcome is dependent upon many donor and recipient variables, we believe that with careful donor and recipient selection, excellent outcome can be achieved using expanded criteria donor kidneys.
View details for Web of Science ID 000071688700012
View details for PubMedID 9458017
Expanded criteria for donor kidneys: An update on outcome in single versus dual kidney transplants
TRANSPLANTATION PROCEEDINGS
1997; 29 (8): 3671-3673
Bladder augmentation can be problematic with renal failure and transplantation
1997; 11 (6): 672-675
Ten consecutive patients with failure of urinary bladder augmentation (UBA) performed either prior to or after reaching end-stage renal disease (ESRD) were studied. Seven patients developed increased hydroureteronephrosis and infectious complications and progressed to ESRD after UBA. The mean time to development of ESRD in patients who had UBA performed with moderate chronic renal failure (CRF) was 1.8 years. The UBAs in all seven patients were taken down prior to transplantation. Subsequently, five of these UBA-takedown patients have received kidney grafts and all have stable, good renal function. Three patients had their UBA performed after they reached ESRD, in preparation for renal transplantation. All three of these patients experienced recurrent urosepsis following transplantation, resulting in death in one patient and loss of graft in another. The third patient will undergo takedown of the UBA. This study suggests that UBA may not be the best option for patients with moderate CRF and those awaiting transplantation.
View details for Web of Science ID 000071016000002
View details for PubMedID 9438639
Outcomes in diabetic patients after simultaneous pancreas-kidney versus kidney alone transplantation
1997; 64 (9): 1288-1294
Previous studies have identified more morbidity in simultaneous pancreas-kidney (SPK) transplant recipients compared with kidney alone (KA) recipients. With the development of novel immunosuppressive drugs, studies are needed to determine optimal treatment regimens in specific patient populations. We retrospectively compared short-term outcome in diabetic patients receiving either SPK or KA transplantation from December 10, 1991, to July 31, 1996. The SPK recipients received either cyclosporine (CsA) + azathioprine (AZA), FK506+AZA, or FK506 + mycophenolate mofetil (MM). KA group patients received either CsA+AZA or CsA+MM. Recipients of SPK instead of KA transplants were younger, had a longer mean length of stay, had a decreased incidence of delayed graft function, and had more readmissions. There were no significant differences in serum creatinine at 1, 2, and 3 years after transplantation, number of rejection episodes and infections, incidence of kidney graft loss and patient death, and 1- and 3-year actuarial patient and kidney graft survival rates between the two groups. Diabetic SPK patients receiving FK506+MM had a higher mean 3-month creatinine clearance (calculated), compared with recipients of CsA+AZA or FK506+AZA. Diabetic patients after KA transplantation who received CsA+MM demonstrated fewer rejection episodes and graft losses, although differences did not reach statistical significance. In summary: (1) diabetic SPK recipients have decreased rates of delayed graft function and more readmissions compared with diabetic KA recipients; (2) there is no difference in serum creatinine levels up to 3 years after transplantation, number of rejection episodes or infections, or 1- and 3-year patient and graft survival rates between SPK and KA recipients; and (3) short-term outcome is improved in diabetic recipients of SPK and KA transplants receiving MM instead of AZA.
View details for Web of Science ID A1997YG53700010
View details for PubMedID 9371670
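The abstract does not specify how the "creatinine clearance (calculated)" was estimated. A common bedside estimate from that era is the Cockcroft-Gault equation; this is an assumption shown only for orientation, not a statement of the authors' method:

```latex
% Cockcroft-Gault estimate of creatinine clearance (assumed formula, not stated in the abstract)
\mathrm{CrCl}\;(\mathrm{ml/min}) \;=\;
\frac{(140 - \mathrm{age}) \times \mathrm{weight}\;(\mathrm{kg})}{72 \times S_{\mathrm{Cr}}\;(\mathrm{mg/dl})}
\;\times\; 0.85 \;\text{if female}
```

For example, a 40-year-old, 72-kg male with a serum creatinine of 1.4 mg/dl would have an estimated clearance of (100 × 72)/(72 × 1.4) ≈ 71 ml/min.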
When should expanded criteria donor kidneys be used for single versus dual kidney transplants?
1997; 64 (8): 1142-1146
To increase the utilization of cadaveric donor kidneys, we have recently expanded our acceptance criteria to include aged donors (frequently with a history of hypertension), selectively transplanting both donor kidneys (dual transplant) into a single recipient. To define when these expanded criteria donor (ECD) kidneys should be used as a single versus a dual kidney transplant, we retrospectively reviewed 52 recipients of ECD kidneys that had been turned down by all other local centers between 1/1/95 and 11/15/96. Fifteen patients received dual transplants, whereas the remaining 37 received single kidneys. Of the dual kidney recipients, 14 of 15 ECD were > or = 59 years of age, 10 of 15 were hypertensive, and 9 of 15 were both. Of the single recipients, 11 of 37 ECD were > or = 59 years of age, 11 of 37 were hypertensive, and 7 of 37 were both. All patients received cyclosporine-based triple-drug therapy. We compared seven donor (D) and sixteen recipient outcome variables in single versus dual kidney transplants as subgrouped by: (1) donor admission creatinine clearance (D-AdC(Cr)) < 90 ml/min; (2) D-age > or = 59 years; and (3) cold storage (Cld Stg) < or > 24 hr. In the group with D-AdC(Cr) < 90, there was a significantly higher incidence of delayed graft function (DGF) in single versus dual recipients (9 of 20 [45%] vs. 1 of 11 [9%]; P=0.04) and worse early graft function based upon mean serum creatinine at 1 and 4 weeks (5.3+/-3.3 and 2.8+/-2.0 vs. 1.7+/-0.6 and 1.4+/-0.5 mg/dl; P<0.05). In the group with D-age > or = 59, recipients of single kidneys had significantly higher mean serum creatinine at 1, 4, and 12 weeks versus recipients of dual kidneys (5.1+/-3.3, 3.4+/-2.1, 2.8+/-1.5 versus 2.8+/-2.5, 1.5+/-0.6, 1.6+/-0.5 mg/dl; P<0.05). Cld Stg time also had an impact on DGF and early outcome. Recipients of dual kidneys stored less than 24 hr had a significantly lower incidence of DGF versus single kidneys stored more than 24 hr (10% vs. 46%; P<0.05) and better early graft function based on mean serum creatinine at 1, 4, and 12 weeks (1.9+/-0.8, 1.3+/-0.4, 1.5+/-0.2 vs. 6.6+/-3.4, 3.0+/-1.6, 2.9+/-1.9 mg/dl; P<0.05). The overall 1-year patient and graft survivals were 96% and 81% vs. 93% and 87% (P=NS) in recipients of single ECD versus dual ECD kidneys. In conclusion, we believe that kidneys from ECD with D-AdC(Cr) < 90 ml/min and D-age > or = 59 should be used as dual kidney transplants, keeping the Cld Stg time at < 24 hr to minimize the effect of Cld Stg on early graft function.
View details for Web of Science ID A1997YE23300011
View details for PubMedID 9355831
Pathophysiology of reduced glomerular filtration rate in delayed graft function
CURRENT OPINION IN NEPHROLOGY AND HYPERTENSION
1997; 6 (4): 405-409
Delayed graft function is a form of postischemic acute renal failure. It lowers glomerular filtration rate in large part by depressing the glomerular transcapillary hydraulic pressure difference, the driving force for the formation of filtrate. Loss of proximal tubule cell polarity, impaired sodium reabsorption and tubuloglomerular feedback-mediated afferent arteriolar constriction are all implicated. A link between delayed graft function and chronic allograft injury is evident. The ensuing impairment of graft survival makes urgent the need for further elucidation of delayed graft function and a search for an effective therapy.
View details for Web of Science ID A1997XK26600017
View details for PubMedID 9263693
Kidney and kidney/pancreas transplantation at Stanford University Medical Center.
The disparity between the supply of cadaveric donors and the demand for renal allografts continues to grow. We have taken a multifaceted approach to increase the allograft pool: 1. Spiral computed tomography to evaluate potential living kidney donors is safer, less invasive, less expensive, and more time efficient, and thus should encourage living organ donation. 2. Selected expanded criteria cadaveric donor kidneys (aged 60 or over, hypertensive) transplanted into size- and age-matched recipients have short-term function at 3 and 6 months comparable to standard cadaveric renal allografts. 3. Kidneys from expanded criteria donors over age 59 and with an adjusted creatinine clearance less than 90 ml/min should be used as a dual kidney transplant into an appropriately size- and age-matched recipient. 4. Kidneys from pediatric donors < 5 years of age should be utilized as en-bloc grafts when transplanted into adult recipients. Pediatric renal transplantation poses numerous challenges given the different and problematic etiologies of ESRD, the surgical considerations in small children and infants, and the enhanced immune response seen in children. Nevertheless, renal transplantation is clearly the therapy of choice for children with ESRD, and excellent results can be obtained through strict adherence to surgical detail, tight immunosuppressive management, and aggressive fluid management in infants and small children. We feel it is also critically important that transplantation and follow-up care be carried out by an integrated and experienced surgical and medical team. Managed healthcare has had profound effects on the practice and management of transplantation centers. The area of greatest impact has been the pressure upon programs to reduce their cost of transplantation. We have initiated a number of new outpatient treatment protocols as part of an effort to contain costs.
Most patients with acute rejection are evaluated (including transplant kidney biopsy) and treated in an ambulatory setting. Completion of OKT3 therapy in selected patients is also performed at home through visiting nurses or at our ambulatory care center. Additionally, treatment of CMV disease is now performed almost exclusively on an outpatient basis.
View details for PubMedID 9919398
THE UTILITY OF RETROPERITONEAL KIDNEY PLACEMENT IN SIMULTANEOUS KIDNEY-PANCREAS TRANSPLANTATION
1995; 9 (6): 457-462
Simultaneous kidney-pancreas (SPK) transplantation has become an accepted therapeutic modality for patients with Type I diabetes mellitus-mediated end-stage renal disease (ESRD). However, the intraperitoneal placement of the renal allograft may pose technical problems when attempting percutaneous biopsy or Doppler ultrasound examination. Recently, the Stanford University Transplant Center adopted the technique of retroperitoneal placement of the renal allograft with intraperitoneal placement of the pancreas allograft (RETRO). From August 1993 to August 1994, a total of 12 patients underwent SPK with this new technique. Twelve patients who had received SPK with the standard technique served as historical controls (INTRA). Demographic data, follow-up, operative time, creatinine and amylase on discharge, length of stay, intraoperative fluid requirements, rejection episodes, thrombotic complications, infections, and number of open and closed renal biopsies were compared between the two groups. Average length of follow-up was greater in the INTRA group (29.3 +/- 1.7 vs. 15.9 +/- 1.1 months). In addition, the RETRO group had significantly fewer open renal biopsies (1/15) in comparison to the INTRA group (7/12) (p < 0.001). The two groups otherwise did not differ in any of the parameters studied. We conclude that retroperitoneal kidney and intraperitoneal pancreas allograft placement is associated with a significantly decreased requirement for open renal biopsy with its associated operating room and anesthetic costs. In addition, the option of transcystoscopic or percutaneous needle biopsy of the pancreas allograft is preserved. This technique should be considered as an alternative to intraperitoneal placement of both the pancreas and renal allografts.
View details for Web of Science ID A1995TJ01700007
View details for PubMedID 8645889
POSTISCHEMIC INJURY, DELAYED FUNCTION AND NA+/K+-ATPASE DISTRIBUTION IN THE TRANSPLANTED KIDNEY
1995; 48 (4): 1308-1315
We evaluated the postischemic renal injury in 22 patients undergoing renal transplantation. Renal tissue obtained 45 to 60 minutes after reperfusion of the allograft was stained with specific antibodies against the delta subunit of Na+/K(+)-ATPase, fodrin and ankyrin. The distribution of each cytoskeletal protein was analyzed by laser confocal microscopy. Subsequent allograft function was assessed on two occasions, 1 to 3 and 36 hours post-reperfusion, respectively. Recipients were divided into two groups: those who achieved a normal GFR on post-transplant day 3 (group 1, N = 12) and those with persistent hypofiltration (group 2, N = 10). Patients of both groups exhibited impaired sodium reabsorption and isosthenuria one to three hours postoperatively, but these abnormalities persisted on day 3 only in group 2 subjects with persistent hypofiltration. Abnormalities of Na+/K(+)-ATPase, ankyrin and fodrin were confined to proximal tubule cells and were marked only in the subjects of group 2. They consisted of redistribution of each cytoskeletal protein from the basolateral membrane to the cytoplasm. We conclude that postischemic injury to a renal allograft results in a loss of polarity of proximal tubule cells. We propose that ensuing impairment of proximal sodium reabsorption could activate tubuloglomerular feedback, thereby contributing to the protracted hypofiltration that characterizes this form of postischemic, acute renal failure.
View details for Web of Science ID A1995RV92600045
View details for PubMedID 8569093
ENDOGENOUS ANP IN POSTISCHEMIC ACUTE RENAL-ALLOGRAFT FAILURE
AMERICAN JOURNAL OF PHYSIOLOGY-RENAL PHYSIOLOGY
1995; 269 (1): F125-F133
Circulating atrial natriuretic peptide (ANP) levels and glomerular binding sites for ANP were examined in 23 subjects undergoing renal transplantation. Subjects were divided into two groups, group 1 (n = 12) with prompt and group 2 (n = 11) with delayed allograft function. Sixty to 180 min after graft reperfusion, renovascular resistance was threefold higher and glomerular filtration rate (GFR) depressed by 79% in group 2 vs. group 1. Corresponding median plasma ANP (114 vs. 140 pg/ml) and guanosine 3',5'-cyclic monophosphate (cGMP) levels (22 vs. 28 pmol/ml) were similarly elevated in the two groups [P = not significant (NS)]. Autoradiographic analysis of glomeruli in an allograft biopsy revealed the median density of total receptors (24 vs. 28 fmol/mm3), A receptors (15 vs. 19 fmol/mm3), and C receptors (6 vs. 9 fmol/mm3) for ANP to also be similar in group 2 vs. group 1, respectively (P = NS). By postoperative day 3, allograft GFR averaged only 6 +/- 2 in group 2 vs. 59 +/- 4 ml/min in group 1. Median plasma ANP levels doubled in each group to 262 and 251 pg/ml, respectively (P = NS). However, median values for plasma levels (38 vs. 17 pmol/ml) and the fractional clearance of cGMP (1.9 vs. 1.2) were significantly higher in group 2 than group 1. We conclude that, despite an adequate density of glomerular ANP receptors and enhanced cGMP generation, neither renal vasoconstriction nor hypofiltration is alleviated by a progressive elevation of plasma ANP levels in renal transplant recipients with sustained postischemic injury. We infer that constricted afferent arterioles are unresponsive to the vasorelaxant action of endogenous ANP in this form of postischemic, acute renal failure.
View details for Web of Science ID A1995RK17900016
View details for PubMedID 7631826
CLINICAL OUTCOME OF INTERVAL CADAVERIC RENAL-TRANSPLANTATION IN CARDIAC ALLOGRAFT RECIPIENTS
1995; 9 (2): 92-97
The introduction of cyclosporine into widespread clinical use has resulted in improved patient survival following cardiac transplantation. As a result of increased numbers of cardiac transplants, the inherent nephrotoxicity of cyclosporine, and prolonged patient survival, cardiac transplant recipients commonly present with renal dysfunction. In the subgroup who ultimately develop end-stage renal disease (ESRD), therapeutic options include renal transplantation. However, the clinical course associated with this treatment modality is unknown. From 1980 to 1993, 430 cardiac transplants were performed with cyclosporine-based immunosuppression at Stanford University Medical Center. Fourteen (3.3%) patients developed ESRD, requiring chronic dialysis or renal transplantation. The cause of ESRD was cyclosporine nephropathy (13/14; 93%) and glomerulonephritis (1/14; 7%). The average time interval to the development of ESRD was 82 +/- 42 months. Nine patients underwent renal transplantation. During the period of follow-up (38 +/- 27 months; range 6-89 months) after renal transplantation, cardiac function remained stable. There were no episodes of primary nonfunction of the renal allograft. Patient and renal allograft survival was 89% at both 1 and 3 years after renal transplant. Average serum creatinine was 1.3 +/- 0.6 mg/dl at 1 year and 1.6 +/- 0.8 mg/dl at 3 years post-transplant. The incidence of infectious complications was not statistically different when compared to that of the heart transplant controls and that of a group of cadaveric renal transplant controls (n = 20). Surprisingly, the incidence of renal allograft rejection in the heart transplant patients was 10-fold less than that of the renal transplant controls (0.006 +/- 0.02/patient-year vs. 0.062 +/- 0.05/patient-year; p < 0.01).(ABSTRACT TRUNCATED AT 250 WORDS)
View details for Web of Science ID A1995QV67600006
View details for PubMedID 7599409
THE USE OF SPIRAL COMPUTED-TOMOGRAPHY IN THE EVALUATION OF LIVING DONORS FOR KIDNEY-TRANSPLANTATION
TRANSPLANTATION
1995; 59 (4): 643-645
MECHANISMS OF FILTRATION FAILURE DURING POSTISCHEMIC INJURY OF THE HUMAN KIDNEY - A STUDY OF THE REPERFUSED RENAL-ALLOGRAFT
JOURNAL OF CLINICAL INVESTIGATION
1995; 95 (2): 820-831
Postischemic filtration failure in experimental animals results primarily from depression of the transcapillary hydraulic pressure difference (delta P), a quantity that cannot be determined in humans. To circumvent this limitation we determined the GFR and each of its remaining determinants in transplanted kidneys. Findings in 12 allografts that exhibited subsequent normofiltration (group 1) were compared with those in 11 allografts that exhibited persistent hypofiltration (group 2). Determinations were made intraoperatively in the exposed graft after 1-3 h of reperfusion. GFR (6 +/- 2 vs 29 +/- 5 ml/min) and renal plasma flow by Doppler flow meter (140 +/- 30 vs 315 +/- 49 ml/min) were significantly lower in group 2 than group 1. Morphometric analysis of glomeruli obtained by biopsy and a structural hydrodynamic model of viscous flow revealed the glomerular ultrafiltration coefficient to be similar, averaging 3.5 +/- 0.6 and 3.1 +/- 0.2 ml/(min.mmHg) in group 2 vs 1, respectively. Corresponding values for plasma oncotic pressure were also similar, averaging 19 +/- 1 vs 21 +/- 1 mmHg. We next used a mathematical model of glomerular ultrafiltration and a sensitivity analysis to calculate the prevailing range for delta P from the foregoing measured quantities. This revealed delta P to vary from only 20-21 mmHg in group 2 vs 34-45 mmHg in group 1 (P < 0.001). Further morphometric analysis revealed the diameters of Bowman's space and tubular lumens, as well as the percentage of tubular cells that were necrotic or devoid of brush border, to be similar in the two groups. We thus conclude (a) that delta P depression is the predominant cause of hypofiltration in this form of postischemic injury; and (b) that afferent vasoconstriction rather than tubular obstruction is the proximate cause of the delta P depression.
View details for Web of Science ID A1995QG20900052
View details for PubMedID 7860766
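The back-calculation of delta P in the study above follows from the standard glomerular ultrafiltration relation. As a simplified single-point sketch (treating plasma oncotic pressure as constant along the capillary, which the authors' full mathematical model does not), the group 2 figures can be checked by arithmetic:

```latex
% Glomerular ultrafiltration relation, solved for the transcapillary pressure difference
\mathrm{GFR} = K_f\,(\Delta P - \bar{\pi})
\qquad\Longrightarrow\qquad
\Delta P = \frac{\mathrm{GFR}}{K_f} + \bar{\pi}

% Group 2 (persistent hypofiltration): GFR = 6 ml/min, K_f = 3.5 ml/(min \cdot mmHg), \bar{\pi} = 19 mmHg
\Delta P \approx \frac{6}{3.5} + 19 \approx 21\ \mathrm{mmHg}
```

The group 2 estimate matches the reported 20-21 mmHg. For group 1 the same arithmetic gives only about 30 mmHg, the low end of the reported 34-45 mmHg range, because the constant-oncotic-pressure assumption breaks down at the higher filtration fractions of well-functioning grafts; hence the authors' sensitivity analysis over a full ultrafiltration model.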
POSTTRANSPLANT LYMPHOPROLIFERATIVE DISORDERS AND EPSTEIN-BARR-VIRUS PROPHYLAXIS
TRANSPLANTATION
1995; 59 (1): 135-138
ENDOVASCULAR STENTING OF THE ABDOMINAL-AORTA FOLLOWING RENAL-TRANSPLANTATION
TRANSPLANTATION
1994; 58 (4): 522-524
SPIRAL COMPUTED-TOMOGRAPHY IN THE DIAGNOSIS OF TRANSPLANT RENAL-ARTERY STENOSIS
TRANSPLANTATION
1994; 57 (5): 746-748
GLOMERULAR SIZE-SELECTIVITY AND MICROALBUMINURIA IN EARLY DIABETIC GLOMERULAR-DISEASE
1992; 41 (4): 840-846
Sieving coefficients of uncharged dextrans of graded size (radii 30 to 60 Å) were used to characterize barrier size-selectivity in nonazotemic diabetic humans with microalbuminuria (Group 1, N = 11) or macroalbuminuria (Group 2, N = 21). Compared to a non-diabetic control group (N = 21) the low radius end of the sieving profile was depressed, whereas the high radius end was elevated in each diabetic group, more so in Group 2 than Group 1. A heteroporous membrane model revealed the major portion of the glomerular barrier to be perforated by restrictive pores of approximately 56 Å radius in all three groups. However, in keeping with a parallel trend for GFR, the relative density of restrictive pores was control greater than Group 1 greater than Group 2. The remaining minor portion of the barrier was perforated by large, shunt-like pores, the relative prominence of which ranked Group 2 greater than Group 1 greater than control. Although the hypothetical, fractional clearance of macromolecules attributable to the shunt-like pores varied directly with fractional clearances of albumin and IgG, the progressive increment in the latter fractional protein clearances in the two diabetic groups was disproportionate. This raises the possibility that factors in addition to barrier size defects contribute to the development, magnitude and composition of proteinuria early in the course of diabetic glomerular disease.(ABSTRACT TRUNCATED AT 250 WORDS)
View details for Web of Science ID A1992HM53500019
View details for PubMedID 1381005
ATRIAL-NATRIURETIC-PEPTIDE AND RESPONSE TO CHANGING PLASMA-VOLUME IN DIABETIC NEPHROPATHY
1991; 40 (7): 893-901
We evaluated the renal and hormonal responses to volume expansion induced by water immersion in subjects with diabetic nephropathy (n = 12) and in healthy control subjects (n = 9). Immersion induced similar average increments in sodium excretion (223 vs. 176 μmol/min) and comparable decrements in renovascular resistance (RVR; -15 vs. -16 U). However, whereas the control subjects responded uniformly, the response among diabetic subjects was highly variable, with a subset of patients exhibiting paradoxical antinatriuresis and vasoconstriction. Immersion was associated with marked elevation of atrial natriuretic peptide (ANP) in plasma of diabetic versus control subjects (61 +/- 9 vs. 19 +/- 2 pM, respectively; P < 0.001). Yet for each picomolar increment in plasma ANP during immersion, the corresponding increases in urinary excretion of cyclic guanosine monophosphate (26 vs. 279 pmol/min) and sodium (9 vs. 47 μmol/min) and the reciprocal lowering of RVR (0.7 vs. 1.9 U) were blunted in the diabetic versus control group. Volume contraction in the postimmersion period was associated with disproportionate antinatriuresis and renal vasoconstriction in the diabetic group, despite a persistent elevation of ANP (29 +/- 2 vs. 16 +/- 2 pM, P < 0.01). We propose that renal insensitivity to ANP in diabetic nephropathy could contribute to altered vasoreactivity and abnormal excretory responsiveness to changing plasma volume. Blunted natriuresis in response to ANP release and enhanced sodium retention during volume contraction could account for the expanded extracellular fluid volume that has consistently been reported to accompany the development of diabetic nephropathy.
View details for Web of Science ID A1991FU96200017
View details for PubMedID 1647996
MECHANISM OF POTASSIUM-DEPLETION DURING CHRONIC METABOLIC-ACIDOSIS IN THE RAT
AMERICAN JOURNAL OF PHYSIOLOGY
1987; 252 (1): F122-F130
Pair-fed rats on a normal K diet were given either 1.5% NH4Cl or water for 4 days. The acid-fed animals developed metabolic acidosis, negative K balance, and K depletion. Urinary Na excretion and urinary flow were not different between the groups beyond the first day. After the 4 days, isolated kidneys from animals in each of these groups were perfused at normal pH and bicarbonate concentrations. Urinary K excretion was similar between the groups despite the potassium depletion in the acid-fed animals. In contrast, isolated kidneys from animals with comparable K depletion induced by dietary K restriction readily conserved K (fractional excretion 0.35 +/- 0.04 vs. 0.83 +/- 0.09 by the kidneys from acid-fed animals, P < 0.01). Sodium excretion and urinary flow were similar among the three groups of isolated kidneys. Plasma aldosterone concentrations were greater in the acid-fed rats after the 4 days of NH4Cl ingestion than in the control animals (43 +/- 10 vs. 10 +/- 2 ng/dl, P < 0.01). Adrenalectomized rats were treated with either normal (4 micrograms/day) or high (22 micrograms/day) aldosterone replacement while ingesting NH4Cl for 4 days. Only in the presence of high aldosterone replacement did the acid-fed adrenalectomized animals develop K depletion. We conclude that chronic metabolic acidosis stimulates aldosterone secretion, and that aldosterone maintains the inappropriately high urinary potassium excretion and K depletion seen in this acid-base disorder.
View details for Web of Science ID A1987G403300074
View details for PubMedID 3812697
Uniform Long-Term Graft Survival in a Clinical Trial of the Induction of Tolerance to Kidney Transplants.
WILEY-BLACKWELL. 2013: 200-200
View details for Web of Science ID 000318240300549
Revisiting the use of hepatitis B core antibody-positive donor kidneys
ELSEVIER SCIENCE INC. 2001: 1535-1536
Eradication of cytomegalovirus reactivation disease using high-dose acyclovir and targeted intravenous ganciclovir in kidney and kidney/pancreas transplantation
LIPPINCOTT WILLIAMS & WILKINS. 1997: 931-933
The attack rate of cytomegalovirus (CMV) is over 50% in solid organ transplant recipients at risk for primary CMV infection and in those receiving antilymphocyte antibody therapy. Various CMV prophylaxis regimens over the last few years have reduced the attack rate to around 20% overall. We report our results using high-dose acyclovir for 3 months after transplant, with targeted intravenous ganciclovir for the duration of any antilymphocyte antibody therapy, in our kidney and simultaneous pancreas/kidney transplant recipients. Records of 109 consecutive patients over a 2-year period were reviewed. Six cases of CMV disease were identified. Five cases occurred in 21 patients at risk for primary CMV disease (24%), whereas only one case occurred in 73 patients at risk for CMV reactivation (1.4%). We conclude that high-dose acyclovir and targeted ganciclovir is excellent prophylaxis against CMV reactivation in kidney and simultaneous pancreas/kidney transplantation.
View details for Web of Science ID A1997XZ17300026
View details for PubMedID 9326425
Use of an augmented urinary bladder can be catastrophic in renal transplantation
ELSEVIER SCIENCE INC. 1997: 154-155
The kidneys that nobody wanted - Support for the utilization of expanded criteria donors
WILLIAMS & WILKINS. 1996: 1832-1841
The continuing shortage of cadaveric donors necessitates constant reappraisal of donor refusal criteria. From 1/1/95 to 3/20/96, 180 renal transplants were performed at our center. Of these, 26 were kidney/pancreas, 30 pediatric, 37 live donor adult, and 87 adult cadaveric renal transplants (CRT). In the CRT group there were 31 recipients of kidneys that all other local transplant centers declined. We retrospectively compared this group of kidneys that nobody wanted (KNW) to the remaining 56 CRTs (controls) performed at our center during the same period. Of the 31 recipients of KNW, 18 received kidneys declined for reasons of advanced age, defined as > or =60 years (including 8 who also had a history of hypertension, 4 who also had >10% sclerosed glomeruli on biopsy, and 3 also declined based upon donor quality because of acute injury), 8 for donor quality alone (e.g., prolonged hypotension), 3 on the basis of biopsy results alone, and 2 for anatomic abnormalities. Twelve recipients of KNW were "dual transplanted" with both donor kidneys. Of 27 donor variables compared between the KNW and control groups, only donor age (52+/-17 versus 40+/-17 years, respectively) and lowest total 4-hr urine output (327+/-208 versus 507+/-437 cc, respectively) proved to be significantly different (p< or =0.05). Of the 25 recipient variables examined, a significant difference was found only in serum creatinine at one month posttransplant (2.6+/-1.8 versus 1.8+/-1.0 mg/dl, respectively), although there was no difference in serum creatinine at three and six months. Actuarial one year patient (100 vs. 95%) and graft (97 vs. 91%) survival, KNW vs. controls respectively, are excellent to date. Further analyses showed no differences in outcome variables between recipients of KNW versus controls when the donor age was > or =60 years. 
Similar outcome was achieved by transplanting both kidneys from a KNW donor into a single recipient as compared with single-kidney transplantation from control donors. Careful donor-recipient pairing using kidneys from advanced-age donors for smaller, advanced-age recipients provided good short-term outcome. In conclusion, there was no significant difference in short-term outcome in recipients of KNW versus controls despite differences in donor age and lowest total 4-hr urine output. We believe that, with careful consideration, existing donor selection criteria can be expanded to include certain donors previously considered unusable.
View details for Web of Science ID A1996WA91600027
View details for PubMedID 8990373