Journal of Transplantation The latest articles from Hindawi © 2017, Hindawi Limited. All rights reserved. Analysis of Risk Factors for Kidney Retransplant Outcomes Associated with Common Induction Regimens: A Study of over Twelve-Thousand Cases in the United States Sun, 24 Sep 2017 08:38:20 +0000 We studied registry data of 12,944 adult kidney retransplant recipients categorized by induction regimen received into antithymocyte globulin (ATG) (N = 9120), alemtuzumab (N = 1687), and basiliximab (N = 2137) cohorts. We analyzed risk factors for 1-year acute rejection (AR) and 5-year death-censored graft loss (DCGL) and patient death. Compared with the reference regimen, basiliximab: (1) one-year AR risk was lower with ATG in retransplant recipients of expanded criteria deceased-donor kidneys (HR = 0.56, 95% CI = 0.35–0.91 and HR = 0.54, 95% CI = 0.27–1.08, resp.), while AR risk was lower with alemtuzumab in retransplant recipients with >3 HLA mismatches before transplant (HR = 0.63, 95% CI = 0.44–0.93 and HR = 0.81, 95% CI = 0.63–1.06, resp.); (2) five-year DCGL risk was lower with alemtuzumab, but not ATG, in retransplant recipients of African American race (HR = 0.54, 95% CI = 0.34–0.86 and HR = 0.73, 95% CI = 0.51–1.04, resp.) or with pretransplant glomerulonephritis (HR = 0.65, 95% CI = 0.43–0.98 and HR = 0.82, 95% CI = 0.60–1.12, resp.). Therefore, specific risk factor-induction regimen combinations may predict outcomes, and this information may help in individualizing induction in retransplant recipients. Alfonso H. Santos Jr., Michael J. Casey, and Karl L. Womer Copyright © 2017 Alfonso H. Santos Jr. et al. All rights reserved. Retrospective Study Looking at Cinacalcet in the Management of Hyperparathyroidism after Kidney Transplantation Mon, 13 Mar 2017 00:00:00 +0000 Objectives. The primary objective of this study is to evaluate the use of cinacalcet in the management of hyperparathyroidism in kidney transplant recipients.
The secondary objective is to identify baseline factors that predict cinacalcet use after transplantation. Methods. In this single-center retrospective study, we conducted a chart review of all patients transplanted from 2003 to 2012 who received cinacalcet up to the time of kidney transplantation and/or thereafter. Results. Twenty-seven patients were included with a mean follow-up of years. Twenty-one were already taking cinacalcet at the time of transplantation. Cinacalcet was stopped within the first month in 12 of these patients, of whom 7 had to restart therapy. The main reason for restarting cinacalcet was hypercalcemia. Length of treatment was months. There were only 3 cases of mild hypocalcemia. There was no statistically significant association between baseline factors and cinacalcet status a year later. Conclusions. Discontinuing cinacalcet within the first month of kidney transplantation often leads to hypercalcemia. Cinacalcet appears to be an effective treatment of hypercalcemic hyperparathyroidism in kidney transplant recipients. Further studies are needed to evaluate safety and long-term benefits. Habib Mawad, Hugues Bouchard, Duy Tran, Denis Ouimet, Jean-Philippe Lafrance, Robert Zoël Bell, Sarah Bezzaoucha, Anne Boucher, Suzon Collette, Vincent Pichette, Lynne Senécal, and Michel Vallée Copyright © 2017 Habib Mawad et al. All rights reserved. The CECARI Study: Everolimus (Certican®) Initiation and Calcineurin Inhibitor Withdrawal in Maintenance Heart Transplant Recipients with Renal Insufficiency: A Multicenter, Randomized Trial Mon, 20 Feb 2017 00:00:00 +0000 In this 3-year, open-label, multicenter study, 57 maintenance heart transplant recipients (>1 year after transplant) with renal insufficiency (eGFR 30–60 mL/min/1.73 m²) were randomized to start everolimus with CNI withdrawal or continue their current CNI-based immunosuppression.
The primary endpoint, change in measured glomerular filtration rate (mGFR) from baseline to year 3, did not differ significantly between the two groups (+7.0 mL/min in the everolimus group versus +1.9 mL/min in the CNI group). In the on-treatment analysis, the difference did reach statistical significance (+9.4 mL/min in the everolimus group versus +1.9 mL/min in the CNI group). The composite safety endpoint of all-cause mortality, major adverse cardiovascular events, or treated acute rejection did not differ between groups. Nonfatal adverse events occurred in 96.6% of patients in the everolimus group and 57.1% in the CNI group. Ten patients (34.5%) in the everolimus group discontinued the study drug during follow-up due to adverse events. The poor adherence to everolimus therapy might have masked a potential benefit of CNI withdrawal on renal function. Jan Van Keer, David Derthoo, Olivier Van Caenegem, Michel De Pauw, Eric Nellessen, Nathalie Duerinckx, Walter Droogne, Gábor Vörös, Bart Meyns, Ann Belmans, Stefan Janssens, Johan Van Cleemput, and Johan Vanhaecke Copyright © 2017 Jan Van Keer et al. All rights reserved. Switching Stable Kidney Transplant Recipients to a Generic Tacrolimus Is Feasible and Safe, but It Must Be Monitored Thu, 26 Jan 2017 11:07:56 +0000 Background. Tacrolimus is the primary immunosuppressive drug used in kidney transplant patients. Replacing brand name products with generics is a controversial issue that we studied after a Chilean Ministry of Health mandate to implement such a switch. Methods. Forty-one stable kidney transplant patients receiving Prograf (Astellas) were switched to a generic tacrolimus (Sandoz) at a 1:1 dose ratio and were followed up for up to 8 months. All other drugs were maintained as per normal practice. Results. Neither tacrolimus doses nor their trough blood levels changed significantly after the switch, but serum creatinine did.
At the same time, five graft biopsies were performed, and two of them showed acute cellular rejection. There were nine infectious episodes, all treated satisfactorily with appropriate therapies. No patient or graft was lost during the follow-up period. Conclusion. Switching from brand name tacrolimus to a generic tacrolimus (Sandoz) is feasible and appears to be safe, but it must be monitored carefully by treating physicians. Fernando González, René López, Elizabeth Arriagada, René Carrasco, Natalia Gallardo, and Eduardo Lorca Copyright © 2017 Fernando González et al. All rights reserved. A Single Perioperative Injection of Dexamethasone Decreases Nausea, Vomiting, and Pain after Laparoscopic Donor Nephrectomy Sun, 22 Jan 2017 00:00:00 +0000 Background. A single dose of perioperative dexamethasone (8–10 mg) reportedly decreases postoperative nausea, vomiting, and pain but has not been widely used in laparoscopic donor nephrectomy (LDN). Methods. We performed a retrospective cohort study of living donors who underwent LDN between 2013 and 2015. Donors who received a lower dose (4–6 mg) or a higher dose (8–14 mg) of dexamethasone were compared with 111 donors who did not receive dexamethasone (control). Outcomes and incidence of postoperative nausea, vomiting, and pain within 24 h after LDN were compared before and after propensity-score matching. Results. The higher dose of dexamethasone reduced the incidence of postoperative nausea and vomiting by 28% compared to control, but the lower dose did not. Total opioid use was 29% lower in donors who received the higher dose than in controls. The higher dose was identified as an independent factor for preventing postoperative nausea and vomiting. Postoperative complication rates and hospital stays did not differ between the groups. After propensity-score matching, the results were the same as for the unmatched analysis. Conclusion.
A single perioperative injection of 8–14 mg dexamethasone decreases antiemetic and narcotic requirements in the first 24 h, with no increase in surgical complications. Shigeyoshi Yamanaga, Andrew Mark Posselt, Chris Earl Freise, Takaaki Kobayashi, Mehdi Tavakol, and Sang-Mo Kang Copyright © 2017 Shigeyoshi Yamanaga et al. All rights reserved. Risk Balancing of Cold Ischemic Time against Night Shift Surgery Possibly Reduces Rates of Reoperation and Perioperative Graft Loss Thu, 19 Jan 2017 11:21:24 +0000 Background. This retrospective cohort study evaluates the advantages of risk balancing between prolonged cold ischemic time (CIT) and late night surgery. Methods. 1262 deceased donor kidney transplantations were analyzed. Multivariable regression was used to determine odds ratios (ORs) for reoperation, graft loss, delayed graft function (DGF), and discharge on dialysis. CIT was categorized according to a forward stepwise pattern: ≤1 h/>1 h, ≤2 h/>2 h, ≤3 h/>3 h, …, ≤n h/>n h. ORs for DGF were plotted against CIT, and the nonlinear regression function with the best fit was identified. The first and second derivatives were then entered into the curvature formula to determine the point of highest CIT-mediated risk acceleration. Results. Surgery between 3 AM and 6 AM is an independent risk factor for reoperation and graft loss, whereas prolonged CIT is only relevant for DGF. CIT-mediated risk for DGF follows an exponential pattern with a cut-off for the highest risk increment at 23.5 hours. Conclusions. The risk of surgery at 3 AM–6 AM outweighs that of prolonged CIT when CIT is confined within 23.5 hours, as determined by a new mathematical approach to calculating turning points of nonlinear time-related risks. CIT is only relevant for the endpoint of DGF and had no impact on discharge on dialysis, reoperation, or graft loss. Nikos Emmanouilidis, Julius Boeckler, Bastian P.
Ringe, Alexander Kaltenborn, Frank Lehner, Hans Friedrich Koch, Jürgen Klempnauer, and Harald Schrem Copyright © 2017 Nikos Emmanouilidis et al. All rights reserved. Blood Transfusions and Tumor Biopsy May Increase HCC Recurrence Rates after Liver Transplantation Thu, 05 Jan 2017 12:40:30 +0000 Introduction. Beyond tumor grading and vascular invasion, non-tumor-related risk factors for HCC recurrence after liver transplantation (LT) have been postulated. Potential factors were analyzed in a large single-center experience. Material and Methods. This retrospective analysis included 336 consecutive patients transplanted for HCC. The following factors were analyzed, stratified for vascular invasion: immunosuppression, rejection therapy, underlying liver disease, age, gender, blood transfusions, tumor biopsy, caval replacement, waiting time, Child-Pugh status, and postoperative complications. Variables with a potential prognostic impact were included in a multivariate analysis. Results. The 5- and 10-year patient survival rates were 70% and 54%. The overall 5-year recurrence rate was 48% with vascular invasion compared to 10% without. Univariate analysis stratified for vascular invasion revealed age over 60, pretransplant tumor biopsy, and the application of blood transfusions as significant risk factors for tumor recurrence. Blood transfusions remained the only significant risk factor in the multivariate analysis. Recurrence occurred earlier and more frequently in correlation with the number of transfusions applied. Conclusion. Tumor-related risk factors are most important and can be influenced by patient selection. However, it might be helpful to consider the non-tumor-related risk factors identified in the present study for further optimization of perioperative management. Daniel Seehofer, Robert Öllinger, Timm Denecke, Moritz Schmelzle, Andreas Andreou, Eckart Schott, and Johann Pratschke Copyright © 2017 Daniel Seehofer et al. All rights reserved.
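The cold-ischemia study by Emmanouilidis et al. above locates the point of highest CIT-mediated risk acceleration by feeding the first and second derivatives of the fitted risk function into the curvature formula κ(t) = f″(t) / (1 + f′(t)²)^(3/2). A minimal sketch of that approach for an assumed exponential risk curve (the coefficients below are illustrative assumptions, not the study's fitted values):

```python
import math

# Hypothetical exponential risk curve f(t) = a * exp(b * t) for the odds of
# delayed graft function versus cold ischemic time t (hours).
# a and b are illustrative assumptions, not the study's fitted coefficients.
a, b = 0.05, 0.18

def curvature(t):
    f1 = a * b * math.exp(b * t)         # first derivative f'(t)
    f2 = a * b * b * math.exp(b * t)     # second derivative f''(t)
    return f2 / (1.0 + f1 * f1) ** 1.5   # curvature kappa(t)

# Scan CIT from 0 to 40 h in 0.01 h steps for the curvature maximum,
# i.e. the turning point where risk acceleration is highest.
t_turn = max((i * 0.01 for i in range(4001)), key=curvature)
print(f"highest risk acceleration at ~{t_turn:.1f} h")
```

With the study's actual fitted function, the same scan would recover the reported 23.5-hour cut-off; for this exponential family the curvature maximum always falls where f′(t) = 1/√2, regardless of the coefficients.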
Pretransplant Factors and Associations with Postoperative Respiratory Failure, ICU Length of Stay, and Short-Term Survival after Liver Transplantation in a High MELD Population Thu, 17 Nov 2016 09:34:22 +0000 Changes in distribution policies have increased the median MELD at transplant, with recipients requiring increasing intensive care perioperatively. We aimed to evaluate the association of preoperative variables with postoperative respiratory failure (PRF), increased intensive care unit length of stay (ICU LOS), and short-term survival in a high MELD cohort undergoing liver transplant (LT). Retrospective analysis identified cases of PRF and increased ICU LOS, with recipient, donor, and surgical variables examined. Variables were entered into regression with endpoints of PRF and ICU LOS > 3 days. 164 recipients were examined: 41 (25.0%) experienced PRF and 74 (45.1%) prolonged ICU LOS. Significant predictors of PRF on univariate analysis were BMI > 30, pretransplant MELD, preoperative respiratory failure, LVEF < 50%, FVC < 80%, intraoperative transfusion > 6 units, warm ischemic time > 4 minutes, and cold ischemic time > 240 minutes. On multivariate analysis, only pretransplant MELD predicted PRF (OR 1.14). Significant predictors of prolonged ICU LOS on univariate analysis were pretransplant MELD, FVC < 80%, FEV1 < 80%, deceased donor, and cold ischemic time > 240 minutes. On multivariate analysis, only pretransplant MELD predicted prolonged ICU LOS (OR 1.28). One-year survival among cohorts with PRF and increased ICU LOS was similar to that of subjects without. Pretransplant MELD is a robust predictor of PRF and ICU LOS. Higher MELDs at LT are expected to increase the need for ICU utilization and modify expectations for recovery in the immediate postoperative period. Mark R. Pedersen, Myunghan Choi, Jeffrey A. Brink, and Anil B. Seetharam Copyright © 2016 Mark R. Pedersen et al. All rights reserved.
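In the multivariate models above, the odds ratios (OR 1.14 for PRF, OR 1.28 for prolonged ICU LOS) are per-point effects of pretransplant MELD: the odds of the outcome multiply by that factor for each additional MELD point. A small sketch of how such a per-point OR compounds under a logistic model (the baseline odds below are an illustrative assumption, not a figure from the study):

```python
import math

# Logistic-model view of an odds ratio of 1.14 per MELD point:
# coefficient beta = ln(OR), so odds multiply by 1.14 for each point.
beta = math.log(1.14)
base_odds = 0.10  # assumed odds of respiratory failure at a reference MELD

def prob_at(delta_meld):
    """Probability of the outcome delta_meld points above the reference."""
    odds = base_odds * math.exp(beta * delta_meld)
    return odds / (1.0 + odds)

for d in (0, 5, 10):
    print(f"MELD +{d:2d}: risk = {prob_at(d):.1%}")
```

Ten extra MELD points multiply the odds by 1.14¹⁰ ≈ 3.7, which under the assumed baseline moves the risk from roughly 9% to roughly 27%.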
Immunomodulatory Role of Mesenchymal Stem Cell Therapy in Vascularized Composite Allotransplantation Sun, 16 Oct 2016 11:15:46 +0000 This review aims to summarize contemporary evidence of the in vitro and in vivo immunomodulatory effects of mesenchymal stem cells (MSCs) in promoting vascularized composite allotransplant (VCA) tolerance. An extensive literature review was performed to identify pertinent articles of merit. Prospective preclinical trials in mammalian subjects receiving VCA (or skin allograft) with administration of MSCs were reviewed. Prospective clinical trials with intravascular delivery of MSCs in human populations undergoing solid organ transplant were also identified and reviewed. Sixteen preclinical studies are included. Eleven studies compared MSC monotherapy to no therapy; of these, ten reported improved graft survival, which was statistically significantly prolonged in eight. Eight studies analyzed allograft survival with MSC therapy as an adjunct to proven immunosuppressive regimens. In these studies, daily immunosuppression was transiently delivered and then stopped. In all of these studies, treatment-free graft survival was statistically significantly prolonged in animals that received MSC therapy. MSCs have been safely administered clinically, and their use in renal transplant clinical trials provides evidence that they improve allograft tolerance in clinical practice. There is potential for MSC induction therapy to overcome many of the obstacles to widespread VCA in clinical practice. Further preclinical studies are needed before MSC-induced VCA tolerance becomes a clinical reality. Richard Heyes, Andrew Iarocci, Yourka Tchoukalova, and David G. Lott Copyright © 2016 Richard Heyes et al. All rights reserved. Everolimus and Malignancy after Solid Organ Transplantation: A Clinical Update Tue, 11 Oct 2016 12:02:52 +0000 Malignancy after solid organ transplantation remains a major cause of posttransplant mortality.
The mammalian target of rapamycin (mTOR) inhibitor class of immunosuppressants exerts various antioncogenic effects, and the mTOR inhibitor everolimus is licensed for the treatment of several solid cancers. In kidney transplantation, evidence from registry studies indicates a lower rate of de novo malignancy under mTOR inhibition, with some potentially supportive data from randomized trials of everolimus. Case reports and small single-center series have suggested that switch to everolimus may be beneficial following diagnosis of posttransplant malignancy, particularly for Kaposi’s sarcoma and nonmelanoma skin cancer, but prospective studies are lacking. A systematic review has shown mTOR inhibition to be associated with a significantly lower rate of hepatocellular carcinoma (HCC) recurrence versus standard calcineurin inhibitor therapy. One meta-analysis has concluded that patients with nontransplant HCC experience a low but significant survival benefit under everolimus monotherapy, so far unconfirmed in a transplant population. Data are limited in heart transplantation, although observational data and case reports have indicated that introduction of everolimus is helpful in reducing the recurrence of skin cancers. Overall, it can be concluded that, in certain settings, everolimus appears a promising option to lessen the toll of posttransplant malignancy. Hallvard Holdaas, Paolo De Simone, and Andreas Zuckermann Copyright © 2016 Hallvard Holdaas et al. All rights reserved. Utilization of Public Health Service Increased Risk Donors Yields Equivalent Outcomes in Liver Transplantation Thu, 29 Sep 2016 09:50:44 +0000 Background. The PHS increased risk donor (IRD) is underutilized in liver transplantation. We aimed to examine the posttransplant outcomes in recipients of increased-risk organs. Methods. We analyzed 228,040 transplants in the Organ Procurement and Transplantation Network database from 2004 to 2013. Endpoints were graft failure and death. 
Results were controlled for demographics and comorbidities. Statistical analysis utilized Fisher’s test and logistic regression. Results. 58,816 patients were identified (5,534 IRD, 53,282 non-IRD). IRDs were more frequently male (69.2% versus 58.3%), younger (34 versus 39), and less likely to have comorbidities. Waitlist time was longer for IRD graft recipients (254 versus 238 days). All outcomes were better in the IRD group. Graft failure (23.6% versus 27.3%) and mortality (20.4% versus 22.3%) were decreased in IRD graft recipients. However, in multivariate analysis, IRD status was not a significant indicator of outcomes. Conclusion. This is the first study to describe IRD demographics in liver transplantation. Outcomes are improved in IRD organ recipients; however, after controlling for donor and recipient comorbidities, ischemia time, and MELD score, the differences lose significance. In multivariate analysis, use of IRD organs is noninferior, with similar graft failure and mortality despite the infectious risk. V. A. Fleetwood, J. Lusciks, J. Poirier, M. Hertl, and E. Y. Chan Copyright © 2016 V. A. Fleetwood et al. All rights reserved. Manipulation of Ovarian Function Significantly Influenced Sarcopenia in Postreproductive-Age Mice Thu, 22 Sep 2016 14:15:26 +0000 Previously, transplantation of ovaries from young cycling mice into old postreproductive-age mice increased life span. We anticipated that the same factors that increased life span could also influence health span. Female CBA/J mice received new (60 d) ovaries at 12 and 17 months of age and were evaluated at 16 and 25 months of age, respectively. There were no significant differences in body weight among any age or treatment group. The percentage of fat mass was significantly increased at 13 and 16 months of age but was reduced by ovarian transplantation in 16-month-old mice.
The percentages of lean body mass and total body water were significantly reduced in 13-month-old control mice but were restored in 16- and 25-month-old recipient mice by ovarian transplantation to the levels found in six-month-old control mice. In summary, we have shown that skeletal muscle mass, which is negatively influenced by aging, can be positively influenced or restored by reestablishment of active ovarian function in aged female mice. These findings provide strong incentive for further investigation of the positive influence of young ovaries on restoration of health in postreproductive females. Rhett L. Peterson, Kate C. Parkinson, and Jeffrey B. Mason Copyright © 2016 Rhett L. Peterson et al. All rights reserved. C1Q Assay Results in Complement-Dependent Cytotoxicity Crossmatch Negative Renal Transplant Candidates with Donor-Specific Antibodies: High Specificity but Low Sensitivity When Predicting Flow Crossmatch Sun, 04 Sep 2016 10:38:05 +0000 The aim of the present study was to describe the association of positive flow crossmatch (FXM) with C1q-SAB. Methods. In this observational, cross-sectional, and comparative study, patients included had a negative AHG-CDC-XM and donor-specific antibodies (DSA) and were tested with FXM. All pretransplant sera were tested with the C1q-SAB assay. Results. A total of 50 donor/recipient evaluations were conducted; half of them had at least one C1q+ Ab (52%). Ten patients (20.0%) had DSA C1q+ Ab. Twenty-five (50%) FXMs were positive. Factors associated with a positive FXM were the presence of C1q+ Ab (DSA C1q+ Ab: OR 27, 95% CI 2.80–259.56; non-DSA C1q+ Ab: OR 5, 95% CI 1.27–19.68) and the DSA LABScreen-SAB MFI (OR 1.26, 95% CI 1.06–1.49). The cutoff point of immunodominant LABScreen SAB DSA-MFI with the greatest sensitivity and specificity to predict FXM was 2,300 (sensitivity: 72%; specificity: 75%).
For FXM prediction, DSA C1q+ Ab was the most specific (95.8%, 85–100) and the combination of DSA-MFI > 2,300 and C1q+ Ab was the most sensitive (92.0%, 79.3–100). Conclusions. C1q+ Ab and LABScreen SAB DSA-MFI were significantly associated with FXM. DSA C1q+ Ab was highly specific but had low sensitivity. José M. Arreola-Guerra, Natalia Castelán, Adrián de Santiago, Adriana Arvizu, Norma Gonzalez-Tableros, Mayra López, Isaac Salcedo, Mario Vilatobá, Julio Granados, Luis E. Morales-Buenrostro, and Josefina Alberú Copyright © 2016 José M. Arreola-Guerra et al. All rights reserved. Impact of Recipient and Donor Obesity Match on the Outcomes of Liver Transplantation: All Matches Are Not Perfect Thu, 01 Sep 2016 13:30:08 +0000 There is a paucity of literature examining the effect of recipient-donor obesity matching on liver transplantation outcomes. The United Network for Organ Sharing database was queried for first-time liver transplant recipients aged ≥18 years between January 2003 and September 2013. Outcomes, including patient and graft survival at 30 days, 1 year, and 5 years and overall, liver retransplantation, and length of stay, were compared between nonobese recipients receiving a graft from nonobese donors and obese recipient-obese donor, obese recipient-nonobese donor, and nonobese recipient-obese donor pairs. 51,556 LT recipients were identified, including 34,217 (66%) nonobese and 17,339 (34%) obese recipients. The proportions of patients receiving an allograft from an obese donor were 24% and 29%, respectively, among nonobese and obese recipients. Graft loss (HR: 1.27; 95% CI: 1.09–1.46) and mortality (HR: 1.38; 95% CI: 1.16–1.65) at 30 days were increased in the obese recipient-obese donor pair. However, 1-year graft (HR: 0.83; 95% CI: 0.74–0.93) and patient (HR: 0.84; 95% CI: 0.74–0.95) survival and overall patient (HR: 0.93; 95% CI: 0.86–1.00) survival were favorable.
There is thus evidence of an early disadvantage when both recipient and donor are obese, but the survival curves demonstrate improved long-term outcomes. It is important to consider obesity in the donor-recipient match. Eliza W. Beal, Dmitry Tumin, Lanla F. Conteh, A. James Hanje, Anthony J. Michaels, Don Hayes Jr., Sylvester M. Black, and Khalid Mumtaz Copyright © 2016 Eliza W. Beal et al. All rights reserved. For and against Organ Donation and Transplantation: Intricate Facilitators and Barriers in Organ Donation Perceived by German Nurses and Doctors Mon, 15 Aug 2016 08:50:51 +0000 Background. Significant facilitators of and barriers to organ donation and transplantation (ODT) remain in the general public and even in health professionals (HPs). Negative attitudes of HPs have been identified as the most significant barrier to actual ODT. The purpose of this paper was hence to investigate to what extent HPs (physicians and nurses) experience such facilitators and barriers in ODT and to what extent these are intercorrelated. We thus combined single causes into circumscribed factors of the respective barriers and facilitators and analyzed them for differences regarding profession, gender, spiritual/religious self-categorization, and self-estimated knowledge of ODT, and for their mutual interaction. Methods. Using questionnaires, we investigated intricate facilitators and barriers to organ donation experienced by HPs (73% nurses, 27% physicians) in around ten wards at the University Hospital of Munich. Results. Our study confirms a generally high agreement with the importance of ODT. Nevertheless, we identified both facilitators and barriers in the following fields: (1) knowledge of ODT and willingness to donate one's own organs, (2) ethically delicate issues in ODT, (3) stressors in handling ODT in the hospital, and (4) individual beliefs and self-estimated religion/spirituality. Conclusion.
Attention to the intricacy of stressors and barriers in HPs continues to be a high-priority focus for improving the availability of donor organs. Niels Christian Hvidt, Beate Mayr, Piret Paal, Eckhard Frick, Anna Forsberg, and Arndt Büssing Copyright © 2016 Niels Christian Hvidt et al. All rights reserved. The Kidney Transplant Evaluation Process in the Elderly: Reasons for Being Turned down and Opportunities to Improve Cost-Effectiveness in a Single Center Thu, 04 Aug 2016 14:20:11 +0000 Background. The kidney transplant evaluation process for older candidates is complex due to the presence of multiple comorbid conditions. Methods. We retrospectively reviewed patients ≥60 years referred to our center for kidney transplantation over a 3-year period. Variables were collected to identify reasons for patients being turned down and to determine the number of unnecessary tests performed. Statistical analysis was performed to estimate the association between clinical predictors and listing status. Results. 345 patients were included in the statistical analysis. 31.6% of patients were turned down, 44% of these due to coronary artery disease (CAD), peripheral vascular disease (PVD), or both. After adjustment for patient demographics and comorbid conditions, history of CAD, PVD, or both (OR = 1.75, 95% CI (1.20, 2.56)), chronic obstructive pulmonary disease (OR = 8.75, 95% CI (2.81, 27.20)), and cancer (OR 2.59, 95% CI (1.18, 5.67)) were associated with a higher risk of being turned down. 14.8% of patients underwent unnecessary basic testing and 9.6% underwent unnecessary supplementary testing, with the charges over the 3-year period estimated at $304,337. Conclusion. A significant number of older candidates are deemed unacceptable for kidney transplantation, with the primary reasons cited being CAD and PVD. The overall burden of unnecessary testing is substantial and potentially avoidable. Beatrice P. Concepcion, Rachel C. Forbes, Aihua Bian, and Heidi M. Schaefer Copyright © 2016 Beatrice P.
Concepcion et al. All rights reserved. The Utility of Routine Ultrasound Imaging after Elective Transplant Ureteric Stent Removal Thu, 14 Jul 2016 06:15:15 +0000 Background. Ureteric stent insertion during kidney transplantation reduces the incidence of major urological complications (MUCs). We evaluated whether routine post-stent-removal graft ultrasonography (PSRGU) was useful in detecting MUCs before they became clinically or biochemically apparent. Methods. A retrospective analysis was undertaken of clinical outcomes following elective stent removals from adult single renal transplant recipients (sRTRs) at our centre between 1 January 2011 and 31 December 2013. Results. Elective stent removal was performed for 338 sRTRs. Of these patients, 222 had routine PSRGU (median (IQR) days after stent removal = 18 (11–31)), 79 had urgent PSRGU due to clinical or biochemical indications, 12 had CT imaging, and 25 had no further renal imaging. Of the 222 sRTRs who underwent routine PSRGU, 210 (94.6%) had no change of management, three (1.4%) required repeat imaging only, and eight patients (3.6%) had incidental (nonureteric) findings. One patient (0.5%) had nephrostomy insertion as a result of routine PSRGU findings, but no ureteric stenosis was identified. Of the 79 patients having urgent PSRGU after elective stent removal, three required transplant ureteric reimplantation. Conclusions. This analysis found no evidence that routine PSRGU at two to three weeks after elective stent removal provides any added value beyond standard clinical and biochemical monitoring. Bibek Das, Dorian Hobday, Jonathon Olsburgh, and Chris Callaghan Copyright © 2016 Bibek Das et al. All rights reserved. Clinical Course and Outcomes of Late Kidney Allograft Dysfunction Sun, 10 Jul 2016 13:07:27 +0000 Background. This study aims to improve the management of kidney transplant recipients by predicting the development of late allograft dysfunction. Methods.
330 patients who had lived more than one year with a functioning kidney allograft were evaluated. To predict the subsequent duration of allograft function, the prognostic significance of 15 baseline clinical and sociodemographic characteristics, assessed at the survey one year after transplantation, was investigated. In constructing the regression prognostication model, the result was considered positive if the recipient lived more than 3 years from the time of transplantation. Results. It was established that the later the onset of renal allograft dysfunction after transplantation, the longer the time until complete loss of allograft function. Within the resulting mathematical model, blood creatinine and hemoglobin concentrations and the level of proteinuria one year after transplantation allow prediction of the loss of kidney transplant function three years after transplantation. Patients with kidney transplant dysfunction are advised to resume program hemodialysis upon reaching a plasma creatinine concentration of 0.5–0.7 mmol/L. Conclusion. Values of creatinine, hemoglobin, and proteinuria one year after transplantation can be used for subsequent prognostication of kidney transplant function. Viktor Denisov, Vadym Zakharov, Anna Ksenofontova, Eugene Onishchenko, Tatyana Golubova, Sergey Kichatyi, and Olga Zakharova Copyright © 2016 Viktor Denisov et al. All rights reserved. Intermediate-Term Outcomes of Dual Adult versus Single-Kidney Transplantation: Evolution of a Surgical Technique Sun, 10 Jul 2016 08:18:32 +0000 Background. Acceptance of dual kidney transplantation (DKT) has proven difficult, due to surgical complexity and concerns regarding long-term outcomes. We herein present a standard technique for ipsilateral DKT and compare outcomes to single-kidney transplant (SKT) recipients. Methods. A retrospective single-center comparison of DKT and SKT performed between February 2007 and July 2013. Results.
Of 516 deceased donor kidney transplants, 29 were DKT and 487 were SKT. Mean follow-up was 43 ± 67 months. DKT recipients were older and more likely than SKT recipients to receive an extended criteria graft. For DKT versus SKT, the rates of delayed graft function (10.3 versus 9.2%) and acute rejection (20.7 versus 22.4%) were equivalent (ns). A higher than expected urologic complication rate in the DKT cohort (14% versus 2%) was reduced through modification of the ureteral anastomosis. Graft survival was equivalent between the DKT and SKT groups (ns), with actuarial 3-year DKT patient and graft survivals of 100% and 93%. At 3 years, the groups had similar renal function (ns). Conclusions. By utilizing extended criteria donor organs as DKT, the donor pool was enlarged while providing excellent patient and graft survival. The DKT urologic complication rate was reduced by modification of the ureteral anastomosis. Ana K. Islam, Richard J. Knight, Wesley A. Mayer, Adam B. Hollander, Samir Patel, Larry D. Teeter, Edward A. Graviss, Ashish Saharia, Hemangshu Podder, Emad H. Asham, and A. Osama Gaber Copyright © 2016 Ana K. Islam et al. All rights reserved. Effectively Screening for Coronary Artery Disease in Patients Undergoing Orthotopic Liver Transplant Evaluation Wed, 22 Jun 2016 09:11:41 +0000 Coronary artery disease (CAD) is prevalent in patients with end-stage liver disease and associated with poor outcomes when undergoing orthotopic liver transplantation (OLT); however, noninvasive screening for CAD in this population is less sensitive. In an attempt to identify redundancy, we reviewed our experience among patients undergoing CAD screening as part of their OLT evaluation between May 2009 and February 2014. Demographic, clinical, and procedural characteristics were analyzed. Among the screened patients, initial screening was more commonly performed via stress testing (75.8%) than coronary angiography (24.2%).
Most patients with initial stress testing underwent angiography (; 39.4%). Among those undergoing angiography, CAD was common (; 23.5%). Across the entire cohort, the number of traditional risk factors was linearly associated with CAD, and those with two or more risk factors were found to have CAD by angiography 50% of the time (OR 1.92; CI 1.07–3.44). Our data support that CAD is prevalent among pre-OLT patients, especially among those with two or more risk factors. Moreover, we identified a lack of uniformity in practice and the need for evidence-based and standardized screening protocols. Bryan C. Lee, Feng Li, Adam J. Hanje, Khalid Mumtaz, Konstantinos D. Boudoulas, and Scott M. Lilly Copyright © 2016 Bryan C. Lee et al. All rights reserved. Current Treatment Approaches to HCC with a Special Consideration to Transplantation Mon, 20 Jun 2016 07:07:50 +0000 Hepatocellular carcinoma (HCC) is the third leading cause of cancer deaths worldwide. The mainstay of treatment of HCC has been both resectional and transplantation surgery. It is well known that, in selected, optimized patients, hepatectomy for HCC may be an option, even in patients with underlying cirrhosis. Resectable patients with early HCC and underlying liver disease are, however, increasingly being considered for transplantation because of the potential for better disease-free survival and resolution of underlying liver disease, although this approach is limited by the availability of donor livers, especially in resectable patients. Outcomes following liver transplantation improved dramatically for patients with HCC following the implementation of the Milan criteria in the late 1990s. Ever since, the rather restrictive nature of the Milan criteria has been challenged with good outcomes. There has also been an increase in the donor pool, with marginal donors, including organs retrieved following cardiac death, being used. Even so, patients still continue to die while waiting for a liver transplant.
In order to reduce this attrition, bridging techniques and methods for downstaging disease have evolved. Additionally, new techniques for organ preservation have increased the prospect of this potentially curative procedure being available for a greater number of patients. N. Bhardwaj, M. T. P. R. Perera, and M. A. Silva Copyright © 2016 N. Bhardwaj et al. All rights reserved. Incidence, Characteristics, and Prognosis of Incidentally Discovered Hepatocellular Carcinoma after Liver Transplantation Wed, 15 Jun 2016 11:58:39 +0000 Background. We aimed to assess incidentally discovered hepatocellular carcinoma (iHCC) over time and to compare outcome to preoperatively diagnosed hepatocellular carcinoma (pdHCC) and nontumor liver transplants. Methods. We studied adults transplanted with a follow-up of at least one year. Patients were divided into 3 groups according to diagnosis of hepatocellular carcinoma. Results. Between 1990 and 2010, 887 adults were transplanted. Among them, 121 patients (13.6%) had pdHCC and 32 patients (3.6%) had iHCC; the frequency of iHCC decreased markedly over the years, in parallel with a significant increase in pdHCC. Between 1990 and 1995, 120 patients had liver transplants, 4 (3.3%) of them had iHCC, and only 3 (2.5%) had pdHCC, while in the last 5 years, 263 patients were transplanted, 7 (2.7%) of them had iHCC, and 66 (25.1%) had pdHCC (). There was no significant difference between groups regarding patient survival; 5-year survival was 74%, 75.5%, and 77.3% in iHCC, pdHCC, and non-HCC groups, respectively (). Patients with iHCC had no recurrences after transplant, while pdHCC patients experienced 17 recurrences (15.3%) (). Conclusions. iHCC has significantly decreased despite a steady increase in the number of transplants for hepatocellular carcinoma. Patients with iHCC had excellent outcomes with no tumor recurrence and survival comparable to pdHCC.
Walid El Moghazy, Samy Kashkoush, Glenda Meeberg, and Norman Kneteman Copyright © 2016 Walid El Moghazy et al. All rights reserved. Liver Transplantation for Hepatocellular Carcinoma: A Single Center Resume Overlooking Four Decades of Experience Sun, 10 Jan 2016 09:48:22 +0000 Background. This is a single center oncological resume overlooking four decades of experience with liver transplantation (LT) for hepatocellular carcinoma (HCC). Methods. All 319 LT for HCC that were performed between 1975 and 2011 were included. Predictors for HCC recurrence (HCCR) and survival were identified by Cox regression, Kaplan-Meier analysis, Log Rank, and -tests where appropriate. Results. HCCR was the single strongest hazard for survival (). Hazards for HCCR were tumor staging beyond the histologic MILAN (), bilateral tumor spreading (), tumor grading beyond G2 (), and vascular infiltration of small or large vessels (, , resp.). Grading beyond G2 () as well as small and large vascular infiltrations (, , resp.) was associated with higher hazard ratios for long-term survival as compared to liver transplantation beyond histological MILAN (). Tumor dedifferentiation significantly correlated with vascular infiltration () and intrahepatic tumor spreading (). Conclusion. LT enables survival from HCC. HCC dedifferentiation is associated with vascular infiltration and intrahepatic tumor spreading and is a strong hazard for HCCR and survival. Pretransplant tumor staging should include grading by biopsy, because grading is a reliable and easily accessible predictor of HCCR and survival. Detection of dedifferentiation should speed up the allocation process. Nikos Emmanouilidis, Rickmer Peters, Bastian P. Ringe, Zeynep Güner, Wolf Ramackers, Hüseyin Bektas, Frank Lehner, Michael Manns, Jürgen Klempnauer, and Harald Schrem Copyright © 2016 Nikos Emmanouilidis et al. All rights reserved. 
Lung Transplantation in Patients with High Lung Allocation Scores in the US: Evidence for the Need to Evaluate Score Specific Outcomes Mon, 21 Dec 2015 13:54:04 +0000 Objective. The lung allocation score (LAS) resulted in a lung transplantation (LT) selection process guided by clinical acuity. We sought to evaluate the relationship between LAS and outcomes. Methods. We analyzed Scientific Registry of Transplant Recipient (SRTR) data pertaining to recipients between 2005 and 2012. We stratified them into quartiles based on LAS and compared survival and predictors of mortality. Results. We identified 10,304 consecutive patients, comprising 2,576 in each LAS quartile (quartile 1 (26.3–35.5), quartile 2 (35.6–39.3), quartile 3 (39.4–48.6), and quartile 4 (48.7–95.7)). Survival after 30 days (96.9% versus 96.8% versus 96.0% versus 94.8%), 90 days (94.6% versus 93.7% versus 93.3% versus 90.9%), 1 year (87.2% versus 85.0% versus 84.8% versus 80.9%), and 5 years (55.4% versus 54.5% versus 52.5% versus 48.8%) was higher in the lower LAS quartiles. Five-year mortality was significantly higher in the upper quartiles (HR 1.13, HR 1.17, and HR 1.17 for quartiles 2, 3, and 4, respectively, versus quartile 1). Conclusion. Overall, outcomes in recipients with higher LAS are worse than those in patients with lower LAS. These data should inform more individualized evidence-based discussion during pretransplant counseling. Jeremiah A. Hayanga, Alena Lira, Tedi Vlahu, Jingyan Yang, Jonathan K. Aboagye, Heather K. Hayanga, James D. Luketich, and Jonathan D’Cunha Copyright © 2015 Jeremiah A. Hayanga et al. All rights reserved. Risk Factors Associated with Increased Morbidity in Living Liver Donation Tue, 15 Dec 2015 14:25:59 +0000 Living donor liver donation (LDLD) is an alternative to cadaveric liver donation. We aimed at identifying risk factors and developing a score for prediction of postoperative complications (POCs) after LDLD in donors.
This is a retrospective cohort study of 688 donors between June 1995 and February 2014 at Hospital Sírio-Libanês and A.C. Camargo Cancer Center, in São Paulo, Brazil. Primary outcome was POC graded ≥III according to the Clavien-Dindo classification. Left lateral segment (LLS), left lobe (LL), and right lobe resections (RL) were conducted in 492 (71.4%), 109 (15.8%), and 87 (12.6%) donors, respectively. In total, 43 (6.2%) developed POCs, which were more common after RL than LLS and LL (14/87 (16.1%) versus 23/492 (4.7%) and 6/109 (5.5%), resp.). Multivariate analysis showed that RL resection (OR: 2.81, 95% CI: 1.32 to 3.01), smoking status (OR: 3.2, 95% CI: 1.35 to 7.56), and blood transfusion (OR: 3.15, 95% CI: 1.45 to 6.84) were independently associated with POCs. RL resection, intraoperative blood transfusion, and smoking were associated with increased risk for POCs in donors. Helry L. Candido, Eduardo A. da Fonseca, Flávia H. Feier, Renata Pugliese, Marcel A. Benavides, Enis D. Silva, Karina Gordon, Marcelo Gama de Abreu, Jaume Canet, Paulo Chapchap, and Joao Seda Neto Copyright © 2015 Helry L. Candido et al. All rights reserved. Plasma Exchange for the Recurrence of Primary Focal Segmental Glomerulosclerosis in Adult Renal Transplant Recipients: A Meta-Analysis Mon, 30 Nov 2015 06:33:50 +0000 Background. Posttransplant recurrence of primary focal segmental glomerulosclerosis (rFSGS) in the form of massive proteinuria is not uncommon and has detrimental consequences on renal allograft survival. A putative circulating permeability factor has been implicated in the pathogenesis, leading to widespread use of plasma exchange (PLEX). We reviewed published studies to assess the role of PLEX on treatment of rFSGS in adults. Methods. Eligible manuscripts compared PLEX or variants with conventional care for inducing proteinuria remission (PR) in rFSGS and were identified through MEDLINE and reference lists. Data were abstracted in parallel by two reviewers.
Results. We detected 6 nonrandomized studies with 117 cases enrolled. In a random effects model, the pooled risk ratio for the composite endpoint of partial or complete PR was 0.38 in favour of PLEX (95% CI: 0.23–0.61). No statistical heterogeneity was observed among included studies (I² = 0%, P = 0.42). On average, 9–26 PLEX sessions were performed to achieve PR. Renal allograft loss due to recurrence was lower (range: 0%–67%) in patients treated with PLEX. Conclusion. Notwithstanding the inherent limitations of small, observational trials, PLEX appears to be effective for PR in rFSGS. Additional research is needed to further elucidate its optimal use and impact on long-term allograft survival. Georgios Vlachopanos, Argyrios Georgalis, and Harikleia Gakiopoulou Copyright © 2015 Georgios Vlachopanos et al. All rights reserved. Psychosocial Status of Liver Transplant Candidates in Iran and Its Correlation with Health-Related Quality of Life and Depression and Anxiety Sun, 15 Nov 2015 11:10:53 +0000 Objectives. The study was aimed at providing a psychosocial profile for Iranian liver transplant candidates referred to an established liver transplantation program. Material and Methods. Patients assessed for liver transplant candidacy in Imam Khomeini Hospital (Tehran, Iran) between March 2013 and September 2014 were included. The following battery of tests was administered: Psychosocial Assessment of Candidates for Transplant (PACT), the Short-Form health survey (SF-36), and Hospital Anxiety and Depression Scale (HADS). Results. Psychosocial assessment in 205 liver transplant candidates revealed significant impairments in several SF-36 domains; social functioning was the least impaired and physical functioning the most impaired domain. The prevalence of cases with probable anxiety and depressive disorders, according to HADS, was 13.8% and 5.6%, respectively. According to PACT, 24.3% of the assessed individuals were considered good or excellent candidates.
In 11.2%, candidacy was rated poor due to at least one major psychosocial or lifestyle risk factor. Poor candidate quality was associated with impaired health-related quality of life and higher scores on anxiety and depression scales (). Conclusions. Transplant programs could implement specific intervention programs based on normative databases to address the psychosocial issues in patients in order to improve patient care, quality of life, and transplant outcomes. Maryam Banihashemi, Mohsen Hafezi, Mohsen Nasiri-Toosi, Ali Jafarian, Mohammad Reza Abbasi, Mohammad Arbabi, Maryam Abdi, Mahzad Khavarian, and Ali-Akbar Nejatisafa Copyright © 2015 Maryam Banihashemi et al. All rights reserved. Influence of Deceased Donor and Pretransplant Recipient Parameters on Early Overall Kidney Graft-Survival in Germany Sun, 11 Oct 2015 12:54:08 +0000 Background. Scarcity of grafts for kidney transplantation (KTX) caused an increased consideration of deceased donors with substantial risk factors. There is no agreement on which ones are detrimental for overall graft-survival. Therefore, we investigated in a nationwide multicentre study the impact of donor and recipient related risks known before KTX on graft-survival, based on the original data used for allocation and graft acceptance. Methods. A nationwide deidentified multicenter study-database was created of data concerning kidneys donated and transplanted in Germany between 2006 and 2008, as provided by the national organ procurement organization (Deutsche Stiftung Organtransplantation) and BQS Institute. Multiple Cox regression (significance level 5%, hazard ratio [95% CI]) was conducted (, isolated KTX). Results.
Risk factors associated with graft-survival were donor age (1.020 [1.013–1.027] per year), donor size (0.985 [0.977–0.993] per cm), donor’s creatinine at admission (1.002 [1.001–1.004] per µmol/L), donor treatment with catecholamine (0.757 [0.635–0.901]), and reduced graft-quality at procurement (1.549 [1.217–1.973]), as well as recipient age (1.012 [1.003–1.021] per year), actual panel reactive antibodies (1.007 [1.002–1.011] per percent), retransplantation (1.850 [1.484–2.306]), recipient’s cardiovascular comorbidity (1.436 [1.212–1.701]), and use of IL2-receptor antibodies for induction (0.741 [0.619–0.887]). Conclusion. Some donor characteristics (e.g., age) continue to impact graft-survival, while the effect of others can be mitigated by careful donor-recipient matching and care. Carl-Ludwig Fischer-Fröhlich, Marcus Kutschmann, Johanna Feindt, Irene Schmidtmann, Günter Kirste, Nils R. Frühauf, Ulrike Wirges, Axel Rahmel, and Christina Schleicher Copyright © 2015 Carl-Ludwig Fischer-Fröhlich et al. All rights reserved. Delayed Graft Function in Kidney Transplants: Time Evolution, Role of Acute Rejection, Risk Factors, and Impact on Patient and Graft Outcome Thu, 10 Sep 2015 07:24:27 +0000 Background. Although numerous risk factors for delayed graft function (DGF) have been identified, the role of ischemia-reperfusion injury and acute rejection episodes (ARE) occurring during the DGF period is ill-defined, and the impact of DGF on patient and graft outcome remains controversial. Methods. From 1983 to 2014, 1784 kidney-only transplantations from deceased donors were studied. Classical risk factors for DGF, along with two novel ones, recipient’s perioperative saline loading and residual diuresis, were analyzed by logistic regression and receiver operating characteristic (ROC) curves. Results. Along with other risk factors, absence of perioperative saline loading increases acute rejection incidence (OR = 1.9 [1.2–2.9]).
Moreover, we observed two novel risk factors for DGF: patient’s residual diuresis ≤500 mL/d (OR = 2.3 [1.6–3.5]) and absence of perioperative saline loading (OR = 3.3 [2.0–5.4]). The area under the ROC curve (0.77 [0.74–0.81]) shows the excellent discriminant power of our model, irrespective of rejection. DGF does not influence patient survival. However, graft survival is decreased only when rejection was associated with DGF. Conclusions. Perioperative saline loading efficiently prevents ischemia-reperfusion injury, which is the predominant factor inducing DGF. DGF per se has no influence on patient and graft outcome. Its incidence is currently close to 5% in our centre. Martin Chaumont, Judith Racapé, Nilufer Broeders, Fadoua El Mountahi, Annick Massart, Thomas Baudoux, Jean-Michel Hougardy, Dimitri Mikhalsky, Anwar Hamade, Alain Le Moine, Daniel Abramowicz, and Pierre Vereerstraeten Copyright © 2015 Martin Chaumont et al. All rights reserved. Alternative Living Kidney Donation Programs Boost Genetically Unrelated Donation Wed, 02 Sep 2015 09:11:16 +0000 Donor-recipient ABO and/or HLA incompatibility used to lead to donor decline. Development of alternative transplantation programs enabled transplantation of incompatible couples. How did that influence couple characteristics? Between 2000 and 2014, 1232 living donor transplantations were performed. In conventional and ABO-incompatible transplantation, the willing donor becomes an actual donor for the intended recipient. In kidney-exchange and domino-donation, the donor donates indirectly to the intended recipient. The relationship between the donor and intended recipient was studied. There were 935 conventional and 297 alternative program transplantations. There were 66 ABO-incompatible, 68 domino-paired, 62 kidney-exchange, and 104 altruistic donor transplantations. Waiting list recipients () were excluded as they did not bring a living donor.
1131 couples remained, of whom 196 participated in alternative programs. Genetically unrelated donors (486) were primarily partners. Genetically related donors (645) were siblings, parents, children, and others. Compared to genetically related couples, almost three times as many genetically unrelated couples were incompatible and participated in alternative programs (). 62% of couples were genetically related in the conventional donation program versus 32% in alternative programs (). Patient and graft survival were not significantly different between recipient programs. Alternative donation programs increase the number of transplantations by enabling genetically unrelated donors to donate. Rosalie A. Poldervaart, Mirjam Laging, Tessa Royaards, Judith A. Kal-van Gestel, Madelon van Agteren, Marry de Klerk, Willij Zuidema, Michiel G. H. Betjes, and Joke I. Roodnat Copyright © 2015 Rosalie A. Poldervaart et al. All rights reserved.