Sirtuin 1: A Dilemma in Transplantation
Journal of Transplantation publishes research related to heart, lung, kidney, liver, pancreas, and stem cell transplantation, with a particular focus on histocompatibility and on the side effects and complications associated with those transplantations.
Journal of Transplantation maintains an Editorial Board of practicing researchers from around the world to ensure that manuscripts are handled by editors who are experts in the field of study.
Latest Articles
Pretransplant Donor-Specific Anti-HLA Antibodies and the Risk for Rejection-Related Graft Failure of Kidney Allografts
Background. The presence of donor-specific antibodies (DSAs) against HLA before kidney transplantation has been variably associated with decreased long-term graft survival, but data relating pretransplant DSA to rejection and to the cause of graft failure in kidney recipients are scarce. Methods. Patients transplanted between 1995 and 2005 were included and followed until 2016. Donor-specific antibodies before transplantation were determined retrospectively, and for-cause renal transplant biopsies were reviewed. Results. Pretransplant DSAs were found in 160 of 734 transplantations (21.8%). In 80.5% of graft failures, a diagnostic renal biopsy was performed. The presence of pretransplant DSA (DSApos) increased the risk of graft failure within the first 3 months after transplantation (5.2% vs. 9.4%) because of rejection with intragraft thrombosis (). From one year after transplantation onwards, DSApos recipients had an increased hazard of antibody-mediated rejection at 10 years (9% DSAneg vs. 15% DSApos, ) and significantly decreased graft survival at 10 years (79% DSAneg vs. 69% DSApos, ), largely attributable to increased graft loss from antibody-mediated rejection in the DSApos group. Neither the incidence of, nor graft loss from, T-cell-mediated rejection was affected by pretransplant DSA. Conclusions. Pretransplant DSAs are a risk factor for early graft loss and increase the incidence of humoral rejection and consequent graft loss, but do not affect the risk of T-cell-mediated rejection.
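The 10-year graft survival percentages above come from time-to-event analysis. As a generic illustration only (not the authors' code or data), the Kaplan-Meier product-limit estimator behind such survival curves can be sketched in a few lines of Python, using made-up follow-up times:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times:  follow-up time for each graft (e.g. years), hypothetical here
    events: 1 if the graft failed at that time, 0 if follow-up was censored
    Returns a list of (time, survival_probability) at each failure time.
    """
    data = sorted(zip(times, events))          # order subjects by follow-up time
    n_at_risk = len(data)                      # everyone starts at risk
    surv = 1.0
    curve = []
    for t, group in groupby(data, key=lambda x: x[0]):
        group = list(group)
        deaths = sum(e for _, e in group)      # failures observed exactly at t
        if deaths:
            surv *= 1 - deaths / n_at_risk     # product-limit update
            curve.append((t, surv))
        n_at_risk -= len(group)                # failed + censored leave the risk set
    return curve
```

Group-wise curves computed this way (DSApos vs. DSAneg) are what a log-rank test or Cox model would then compare.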
Type of Preservation Solution, UW or HTK, Has an Impact on the Incidence of Biliary Stricture following Liver Transplantation: A Retrospective Study
Organ preservation plays a crucial role in the outcome following solid organ transplantation. The aim of this study was to perform a retrospective outcome analysis following liver transplantation using histidine-tryptophan-ketoglutarate (HTK) or University of Wisconsin (UW) solution for liver graft preservation. We retrospectively reviewed data on adult patients who underwent liver transplantation at Karolinska University Hospital between 2007 and 2015, evaluating donor and recipient characteristics, pre- and posttransplant blood chemistry, biliary and vascular complications, graft dysfunction and nonfunction, and patient and graft survival. A total of 433 patients were included in the analyses: 230 received livers preserved with HTK and 203 with UW. Mean follow-up was 45 ± 29 months for the HTK group and 42.4 ± 26 months for the UW group. There was no difference between the two groups in patient and graft survival, postoperative blood chemistry, or the incidence of arterial complications, early allograft dysfunction, or primary graft nonfunction. However, the incidence of biliary stricture was higher in the UW group (22.7%) than in the HTK group (13.5%; ). The choice of UW or HTK preservation solution in liver transplantation has no impact on patient and graft survival; however, use of HTK results in a lower incidence of posttransplant biliary stricture.
Endothelial Glycocalyx Shedding Occurs during Ex Vivo Lung Perfusion: A Pilot Study
Background. Damage to the endothelium has been established as a key pathological process in lung transplantation and in ex vivo lung perfusion (EVLP), a new technology that provides a platform for the assessment of injured donor lungs. Damage to the lung endothelial glycocalyx, a structure that lines the endothelium and is integral to vascular barrier function, has been associated with lung dysfunction. We hypothesised that endothelial glycocalyx shedding occurs during EVLP and aimed to establish a porcine model to investigate the mechanism underlying glycocalyx breakdown during EVLP. Methods. Concentrations of the endothelial glycocalyx breakdown products syndecan-1, hyaluronan, heparan sulphate, and CD44 were measured by ELISA, and matrix metalloproteinase (MMP) activity by zymography, in the perfusate of both human (n = 9) and porcine (n = 4) lungs undergoing EVLP. Porcine lungs underwent prolonged EVLP (up to 12 hours) with perfusion and ventilation parameters recorded hourly. Results. During human EVLP, endothelial glycocalyx breakdown products in the perfusate increased over time. Increasing MMP-2 activity over time was positively correlated with levels of syndecan-1 (r = 0.886; ) and hyaluronan (r = 0.943; ). In the porcine EVLP model, hyaluronan was the only glycocalyx product detectable during EVLP (1 hr: 19 (13–84) vs 12 hr: 143 (109–264) ng/ml; ). Porcine hyaluronan levels were associated with MMP-9 activity (r = 0.83; ) and with dynamic compliance (r = 0.57; ). Conclusion. Endothelial glycocalyx products accumulate during both porcine and human EVLP, and this accumulation parallels an accumulation of matrix-degrading enzyme activity. Preliminary evidence in our porcine EVLP model suggests that shedding may be related to organ function, warranting additional study.
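The abstract reports correlation coefficients (e.g. r = 0.886 between MMP-2 activity and syndecan-1) without naming the method; assuming a Pearson-type product-moment coefficient (the authors may equally have used Spearman's rank correlation), the computation can be sketched on hypothetical perfusate measurements:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples.

    x, y here would be hourly perfusate measurements (hypothetical values);
    returns a coefficient in [-1, 1].
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))  # co-deviation sum
    sx = sqrt(sum((a - mx) ** 2 for a in x))              # sqrt of sum of squares
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1, as reported for MMP-2 vs. syndecan-1, indicates that the two perfusate markers rise together over the perfusion time course.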
Clinical Significance of Renal Allograft Protocol Biopsies: A Single Tertiary Center Experience in Malaysia
Background. The role of protocol renal allograft biopsy in kidney transplantation is controversial because of concern about procedure-related complications; however, its role is slowly evolving. Recent evidence suggests that protocol biopsy is useful in detecting subclinical renal pathology, and early recognition and treatment of renal pathologies can improve long-term outcomes of renal allografts. Methodology. A total of 362 renal allograft protocol biopsies were performed in adult kidney transplant recipients between 2012 and 2017. After excluding biopsies of poor quality or those performed with a baseline serum creatinine level >200 µmol/L, we analyzed 334 (92.3%) biopsies. Histology reports were reviewed and categorized into histoimmunological and nonimmunological changes. The immunological changes were subcategorized into (1) no acute rejection (NR), (2) borderline changes (BC), and (3) subclinical rejection (SCR). Nonimmunological changes were subcategorized into (1) chronicity, including interstitial fibrosis/tubular atrophy (IFTA), chronic T-cell-mediated rejection (TCMR), unspecified chronic lesions, and arterionephrosclerosis; (2) de novo glomerulopathy/recurrence of primary disease (RP); and (3) other clinically unsuspected lesions (acute pyelonephritis, calcineurin inhibitor toxicity, postinfective glomerulonephritis, and BK virus nephropathy). Risk factors associated with SCR were assessed. Results. Among the histoimmunological changes, 161 biopsies (48.2%) showed NR, 145 (43.4%) showed BC, and 28 (8.4%) showed SCR. These events were more pronounced during the first 5 years: BC accounted for 59 (36.4%), 64 (54.2%), and 22 (40.7%) biopsies at <1 year, 1-5 years, and >5 years, respectively (p=0.011), while SCR was found in 6 biopsies (3.7%) at <1 year, 18 (15.3%) at 1-5 years, and 4 (7.4%) at >5 years after transplantation (p=0.003).
Among the nonimmunological changes, chronicity, de novo glomerulopathy/RP, and other clinically unsuspected lesions were seen in 40 (12%), 10 (3%), and 12 (3.6%) biopsies, respectively. Recipients of living-related donor kidneys had a lower rate of SCR (p=0.007). Conclusions. Despite stable renal function, our transplant recipients had a substantial number of subclinical rejection episodes detected on renal allograft protocol biopsies.
A Prognostic Tool for Individualized Prediction of Graft Failure Risk within Ten Years after Kidney Transplantation
Identification of patients at risk of kidney graft loss relies on early individual prediction of graft failure. Data from 616 kidney transplant recipients with a follow-up of at least one year were retrospectively studied. A joint latent class model investigating the impact of serum creatinine (SCr) time-trajectories and of the onset of de novo donor-specific anti-HLA antibody (dnDSA) on graft survival was developed. The capacity of the model to calculate individual predicted probabilities of graft failure over time was evaluated in 80 independent patients. The model classified the patients into three latent classes with significantly different SCr time profiles and different graft survival. Donor age contributed to explaining latent class membership. In addition to the SCr classes, the other variables retained in the survival model were proteinuria measured one year after transplantation (HR=2.4, p=0.01), pretransplant non-donor-specific antibodies (HR=3.3, p<0.001), and dnDSA in patients who experienced acute rejection (HR=15.9, p=0.02). In the validation dataset, individual predictions of graft failure risk showed good predictive performance for the 60 patients who had not developed dnDSA (sensitivity, specificity, and overall accuracy of graft failure prediction at ten years were 77.7%, 95.8%, and 85%, respectively). For patients who developed dnDSA, individual graft failure risk was predicted less accurately.
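The validation metrics quoted (sensitivity 77.7%, specificity 95.8%, overall accuracy 85%) all follow directly from a 2x2 confusion matrix of predicted versus observed graft failure. A minimal sketch, using hypothetical counts rather than the study's actual data:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix.

    tp: predicted failure, graft actually failed   (true positives)
    fn: predicted survival, graft actually failed  (false negatives)
    tn: predicted survival, graft survived         (true negatives)
    fp: predicted failure, graft survived          (false positives)
    """
    sensitivity = tp / (tp + fn)              # failures correctly flagged
    specificity = tn / (tn + fp)              # survivors correctly cleared
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical cohort of 60 patients (illustrative numbers only):
sens, spec, acc = diagnostic_metrics(tp=8, fn=2, tn=45, fp=5)
```

Note that accuracy is driven by the common class when failures are rare, which is why sensitivity and specificity are reported separately.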
Renal Dysfunction after Living-Donor Liver Transplantation: Experience with 500 Cases
Introduction. The possible risk factors for chronic kidney disease after living-donor liver transplantation have not been thoroughly investigated. Material and Methods. A retrospective cohort study of consecutive adults who underwent living-donor liver transplantation between May 2004 and October 2016 in a single center was conducted. Kidney function was assessed serially for all patients throughout the study period, with 12 months being the shortest follow-up. Postoperative renal dysfunction was defined in accordance with the Chronic Kidney Disease Epidemiology Collaboration criteria. The patients’ demographic data, preoperative and intraoperative parameters, and outcomes were recorded. A calcineurin inhibitor-based immunosuppressive regimen, either tacrolimus or cyclosporine, was used in all patients. Results. Of the 413 patients included in the study who survived for ≥1 year, 33 (8%) developed chronic kidney disease 1 year after living-donor liver transplantation. Twenty-seven variables were studied to compare the patients with normal kidney function with those who developed chronic kidney disease 1 year after living-donor liver transplantation. Univariate regression analysis for predicting the likelihood of chronic kidney disease at 1 year identified 4 significant variables: operative time, P < 0.0005; intraoperative blood loss, P < 0.0005; preoperative renal impairment, P = 0.001; and graft-to-recipient weight ratio (as a negative predictor), P < 0.0005. In the multivariate regression analysis, only 2 variables remained as independent predictors of chronic kidney disease at 1 year: operative time with a cutoff value of ≥714 minutes and graft-to-recipient weight ratio as a negative predictor with a cutoff value of <0.91. Conclusion.
In this study, prolonged operative time and small graft-to-recipient weight ratio were independent predictors of chronic kidney disease at 1 year after living-donor liver transplantation.
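Cutoffs such as operative time ≥714 minutes are commonly derived from an ROC analysis of the continuous predictor. One standard approach, maximizing Youden's J statistic (sensitivity + specificity - 1) over candidate thresholds, can be sketched as follows; the data are hypothetical and the abstract does not state the authors' exact cutoff-selection method:

```python
def best_cutoff(values, outcomes):
    """Threshold on a continuous predictor maximizing Youden's J.

    values:   predictor per patient (e.g. operative time in minutes, hypothetical)
    outcomes: 1 if the patient developed CKD, 0 otherwise
    Returns (threshold, J); a prediction is positive when value >= threshold.
    """
    best = (None, -1.0)
    for thr in sorted(set(values)):            # try each observed value as a cutoff
        tp = sum(1 for v, o in zip(values, outcomes) if v >= thr and o == 1)
        fn = sum(1 for v, o in zip(values, outcomes) if v < thr and o == 1)
        tn = sum(1 for v, o in zip(values, outcomes) if v < thr and o == 0)
        fp = sum(1 for v, o in zip(values, outcomes) if v >= thr and o == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1                    # Youden's J at this threshold
        if j > best[1]:
            best = (thr, j)
    return best
```

For a negative predictor such as graft-to-recipient weight ratio, the same search is run with the inequality reversed (positive when value < threshold).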