Journal of Transplantation http://www.hindawi.com The latest articles from Hindawi Publishing Corporation © 2016, Hindawi Publishing Corporation. All rights reserved. Manipulation of Ovarian Function Significantly Influenced Sarcopenia in Postreproductive-Age Mice Thu, 22 Sep 2016 14:15:26 +0000 http://www.hindawi.com/journals/jtrans/2016/4570842/ Previously, transplantation of ovaries from young cycling mice into old postreproductive-age mice increased life span. We anticipated that the same factors that increased life span could also influence health span. Female CBA/J mice received new (60 d) ovaries at 12 and 17 months of age and were evaluated at 16 and 25 months of age, respectively. There were no significant differences in body weight among any age or treatment group. The percentage of fat mass was significantly increased at 13 and 16 months of age but was reduced by ovarian transplantation in 16-month-old mice. The percentages of lean body mass and total body water were significantly reduced in 13-month-old control mice but were restored in 16- and 25-month-old recipient mice by ovarian transplantation to the levels found in six-month-old control mice. In summary, we have shown that skeletal muscle mass, which is negatively influenced by aging, can be positively influenced or restored by reestablishment of active ovarian function in aged female mice. These findings provide strong incentive for further investigation of the positive influence of young ovaries on restoration of health in postreproductive females. Rhett L. Peterson, Kate C. Parkinson, and Jeffrey B. Mason Copyright © 2016 Rhett L. Peterson et al. All rights reserved. 
C1Q Assay Results in Complement-Dependent Cytotoxicity Crossmatch Negative Renal Transplant Candidates with Donor-Specific Antibodies: High Specificity but Low Sensitivity When Predicting Flow Crossmatch Sun, 04 Sep 2016 10:38:05 +0000 http://www.hindawi.com/journals/jtrans/2016/2106028/ The aim of the present study was to describe the association between a positive flow crossmatch (FXM) and C1q-SAB results. Methods. In this observational, cross-sectional, and comparative study, included patients had a negative AHG-CDC-XM and donor-specific antibodies (DSA) and were tested with FXM. All pretransplant sera were tested with the C1q-SAB assay. Results. A total of 50 donor/recipient evaluations were conducted; about half had at least one C1q+ Ab (52%). Ten patients (20.0%) had DSA C1q+ Ab. Twenty-five (50%) FXMs were positive. Factors associated with a positive FXM were the presence of C1q+ Ab (DSA C1q+ Ab: OR 27, 95% CI 2.80–259.56; non-DSA C1q+ Ab: OR 5, 95% CI 1.27–19.68) and the DSA LABScreen-SAB MFI (OR 1.26, 95% CI 1.06–1.49). The cutoff point of immunodominant LABScreen SAB DSA-MFI with the greatest sensitivity and specificity to predict FXM was 2,300 (sensitivity 72%, specificity 75%). For FXM prediction, DSA C1q+ Ab was the most specific (95.8%, 95% CI 85–100) and the combination of DSA-MFI > 2,300 and C1q+ Ab was the most sensitive (92.0%, 95% CI 79.3–100). Conclusions. C1q+ Ab and LABScreen SAB DSA-MFI were significantly associated with FXM. DSA C1q+ Ab was highly specific but had low sensitivity. José M. Arreola-Guerra, Natalia Castelán, Adrián de Santiago, Adriana Arvizu, Norma Gonzalez-Tableros, Mayra López, Isaac Salcedo, Mario Vilatobá, Julio Granados, Luis E. Morales-Buenrostro, and Josefina Alberú Copyright © 2016 José M. Arreola-Guerra et al. All rights reserved. 
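The MFI cutoff reported above (2,300, balancing 72% sensitivity against 75% specificity) is the kind of operating point typically chosen by scanning ROC thresholds for the maximum Youden index (J = sensitivity + specificity − 1). A minimal illustrative sketch, not the authors' code, using invented MFI values and FXM labels:

```python
def best_cutoff(values, labels):
    """Scan every observed value as a candidate cutoff and return the one
    maximizing Youden's J = sensitivity + specificity - 1.
    Assumes both a positive and a negative case are present."""
    best = (None, -1.0, 0.0, 0.0)
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if y and v >= c)
        fn = sum(1 for v, y in zip(values, labels) if y and v < c)
        tn = sum(1 for v, y in zip(values, labels) if not y and v < c)
        fp = sum(1 for v, y in zip(values, labels) if not y and v >= c)
        sens = tp / (tp + fn)   # fraction of FXM+ correctly flagged
        spec = tn / (tn + fp)   # fraction of FXM- correctly cleared
        j = sens + spec - 1.0
        if j > best[1]:
            best = (c, j, sens, spec)
    return best  # (cutoff, J, sensitivity, specificity)

# invented data: higher MFI tends to accompany a positive FXM
mfi = [500, 900, 1500, 2100, 2500, 3000, 4000, 8000]
fxm = [0, 0, 0, 0, 1, 1, 1, 1]
print(best_cutoff(mfi, fxm))  # → (2500, 1.0, 1.0, 1.0)
```

On real data the classes overlap, so the chosen cutoff trades sensitivity against specificity rather than achieving both at 100%, as in the 2,300 MFI result above.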
Impact of Recipient and Donor Obesity Match on the Outcomes of Liver Transplantation: All Matches Are Not Perfect Thu, 01 Sep 2016 13:30:08 +0000 http://www.hindawi.com/journals/jtrans/2016/9709430/ There is a paucity of literature examining the effect of recipient-donor obesity matching on liver transplantation outcomes. The United Network for Organ Sharing database was queried for first-time adult (age ≥18) liver transplant (LT) recipients between January 2003 and September 2013. Outcomes, including patient and graft survival (at 30 days, 1 year, 5 years, and overall), liver retransplantation, and length of stay, were compared between nonobese recipients receiving a graft from nonobese donors and obese recipient-obese donor, obese recipient-nonobese donor, and nonobese recipient-obese donor pairs. 51,556 LT recipients were identified, including 34,217 (66%) nonobese and 17,339 (34%) obese recipients. The proportions of patients receiving an allograft from an obese donor were 24% and 29%, respectively, among nonobese and obese recipients. Graft loss (HR: 1.27; 95% CI: 1.09–1.46) and mortality (HR: 1.38; 95% CI: 1.16–1.65) at 30 days were increased in the obese recipient-obese donor pairs. However, 1-year graft (HR: 0.83; 95% CI: 0.74–0.93) and patient (HR: 0.84; 95% CI: 0.74–0.95) survival and overall patient survival (HR: 0.93; 95% CI: 0.86–1.00) were favorable. There is evidence of a recipient and donor obesity disadvantage early after transplant, but survival curves demonstrate improved long-term outcomes. It is important to consider obesity in the donor-recipient match. Eliza W. Beal, Dmitry Tumin, Lanla F. Conteh, A. James Hanje, Anthony J. Michaels, Don Hayes Jr., Sylvester M. Black, and Khalid Mumtaz Copyright © 2016 Eliza W. Beal et al. All rights reserved. 
For and against Organ Donation and Transplantation: Intricate Facilitators and Barriers in Organ Donation Perceived by German Nurses and Doctors Mon, 15 Aug 2016 08:50:51 +0000 http://www.hindawi.com/journals/jtrans/2016/3454601/ Background. Significant facilitators and barriers to organ donation and transplantation (ODT) remain in the general public and even in health professionals (HPs). Negative attitudes of HPs have been identified as the most significant barrier to actual ODT. The purpose of this paper was therefore to investigate to what extent HPs (physicians and nurses) experience such facilitators and barriers in ODT and to what extent these are intercorrelated. We thus combined single causes into circumscribed factors of respective barriers and facilitators and analyzed them for differences regarding profession, gender, spiritual/religious self-categorization, and self-estimated knowledge of ODT, as well as their mutual interaction. Methods. Using questionnaires, we investigated intricate facilitators and barriers to organ donation experienced by HPs (73% nurses, 27% physicians) in around ten wards at the University Hospital of Munich. Results. Our study confirms a generally high agreement with the importance of ODT. Nevertheless, we identified both facilitators and barriers in the following fields: (1) knowledge of ODT and willingness to donate one's own organs, (2) ethical delicacies in ODT, (3) stressors in handling ODT in the hospital, and (4) individual beliefs and self-estimated religion/spirituality. Conclusion. Attention to the intricacy of stressors and barriers in HPs continues to be a high-priority focus for the availability of donor organs. Niels Christian Hvidt, Beate Mayr, Piret Paal, Eckhard Frick, Anna Forsberg, and Arndt Büssing Copyright © 2016 Niels Christian Hvidt et al. All rights reserved. 
The Kidney Transplant Evaluation Process in the Elderly: Reasons for Being Turned down and Opportunities to Improve Cost-Effectiveness in a Single Center Thu, 04 Aug 2016 14:20:11 +0000 http://www.hindawi.com/journals/jtrans/2016/7405930/ Background. The kidney transplant evaluation process for older candidates is complex due to the presence of multiple comorbid conditions. Methods. We retrospectively reviewed patients aged ≥60 years referred to our center for kidney transplantation over a 3-year period. Variables were collected to identify reasons for patients being turned down and to determine the number of unnecessary tests performed. Statistical analysis was performed to estimate the association between clinical predictors and listing status. Results. 345 patients were included in the statistical analysis. 31.6% of patients were turned down, 44% of them due to coronary artery disease (CAD), peripheral vascular disease (PVD), or both. After adjustment for patient demographics and comorbid conditions, history of CAD, PVD, or both (OR = 1.75, 95% CI (1.20, 2.56)), chronic obstructive pulmonary disease (OR = 8.75, 95% CI (2.81, 27.20)), and cancer (OR = 2.59, 95% CI (1.18, 5.67)) were associated with a higher risk of being turned down. 14.8% of patients underwent unnecessary basic testing and 9.6% underwent unnecessary supplementary testing, with the charges over a 3-year period estimated at $304,337. Conclusion. A significant number of older candidates are deemed unacceptable for kidney transplantation, with the primary reasons cited as CAD and PVD. The overall burden of unnecessary testing is substantial and potentially avoidable. Beatrice P. Concepcion, Rachel C. Forbes, Aihua Bian, and Heidi M. Schaefer Copyright © 2016 Beatrice P. Concepcion et al. All rights reserved. The Utility of Routine Ultrasound Imaging after Elective Transplant Ureteric Stent Removal Thu, 14 Jul 2016 06:15:15 +0000 http://www.hindawi.com/journals/jtrans/2016/1231567/ Background. 
Ureteric stent insertion during kidney transplantation reduces the incidence of major urological complications (MUCs). We evaluated whether routine poststent removal graft ultrasonography (PSRGU) was useful in detecting MUCs before they became clinically or biochemically apparent. Methods. A retrospective analysis was undertaken of clinical outcomes following elective stent removals from adult single renal transplant recipients (sRTRs) at our centre between 1 January 2011 and 31 December 2013. Results. Elective stent removal was performed for 338 sRTRs. Of these patients, 222 had routine PSRGU (median (IQR) days after stent removal = 18 (11–31)), 79 had urgent PSRGU due to clinical or biochemical indications, 12 had CT imaging, and 25 had no further renal imaging. Of the 222 sRTRs who underwent routine PSRGU, 210 (94.6%) had no change of management, three (1.4%) required repeat imaging only, and eight patients (3.6%) had incidental (nonureteric) findings. One patient (0.5%) had nephrostomy insertion as a result of routine PSRGU findings, but no ureteric stenosis was identified. Of the 79 patients having urgent PSRGU after elective stent removal, three required transplant ureteric reimplantation. Conclusions. This analysis found no evidence that routine PSRGU at two to three weeks after elective stent removal provides any added value beyond standard clinical and biochemical monitoring. Bibek Das, Dorian Hobday, Jonathon Olsburgh, and Chris Callaghan Copyright © 2016 Bibek Das et al. All rights reserved. Clinical Course and Outcomes of Late Kidney Allograft Dysfunction Sun, 10 Jul 2016 13:07:27 +0000 http://www.hindawi.com/journals/jtrans/2016/7401808/ Background. This study aims to improve the treatment of kidney transplant recipients by predicting the development of late allograft dysfunction. Methods. 330 patients who had lived for more than one year with a functioning kidney allograft were evaluated. 
To predict how long the allograft would continue to function well, the prognostic significance of 15 baseline clinical and sociodemographic characteristics, assessed one year after transplantation, was investigated. In constructing the regression prognostic model, the outcome was considered positive if the recipient lived more than 3 years from the time of transplantation. Results. It was established that a later onset of renal allograft dysfunction after transplantation correlates with a longer interval until complete loss of allograft function. Within the resulting mathematical model, blood creatinine and hemoglobin concentrations and the level of proteinuria one year after transplantation allow prediction of the loss of kidney transplant function three years after the transplantation. Patients with kidney transplant dysfunction are advised to resume program hemodialysis upon reaching a plasma creatinine concentration of 0.5–0.7 mmol/L. Conclusion. Values of creatinine, hemoglobin, and proteinuria one year after transplantation can be used for subsequent prognostication of kidney transplant function. Viktor Denisov, Vadym Zakharov, Anna Ksenofontova, Eugene Onishchenko, Tatyana Golubova, Sergey Kichatyi, and Olga Zakharova Copyright © 2016 Viktor Denisov et al. All rights reserved. Intermediate-Term Outcomes of Dual Adult versus Single-Kidney Transplantation: Evolution of a Surgical Technique Sun, 10 Jul 2016 08:18:32 +0000 http://www.hindawi.com/journals/jtrans/2016/2586761/ Background. Acceptance of dual kidney transplantation (DKT) has proven difficult due to surgical complexity and concerns regarding long-term outcomes. We herein present a standard technique for ipsilateral DKT and compare outcomes to single-kidney transplant (SKT) recipients. Methods. A retrospective single-center comparison of DKT and SKT performed between February 2007 and July 2013. Results. Of 516 deceased donor kidney transplants, 29 were DKT and 487 were SKT. 
Mean follow-up was 43 ± 67 months. DKT recipients were older and more likely than SKT recipients to receive an extended criteria graft. For DKT versus SKT, the rates of delayed graft function (10.3% versus 9.2%) and acute rejection (20.7% versus 22.4%) were equivalent (p = ns). A higher than expected urologic complication rate in the DKT cohort (14% versus 2%) was reduced through modification of the ureteral anastomosis. Graft survival was equivalent between DKT and SKT groups (p = ns), with actuarial 3-year DKT patient and graft survivals of 100% and 93%. At 3 years, the groups had similar renal function (p = ns). Conclusions. By utilizing extended criteria donor organs as DKT, the donor pool was enlarged while providing excellent patient and graft survival. The DKT urologic complication rate was reduced by modification of the ureteral anastomosis. Ana K. Islam, Richard J. Knight, Wesley A. Mayer, Adam B. Hollander, Samir Patel, Larry D. Teeter, Edward A. Graviss, Ashish Saharia, Hemangshu Podder, Emad H. Asham, and A. Osama Gaber Copyright © 2016 Ana K. Islam et al. All rights reserved. Effectively Screening for Coronary Artery Disease in Patients Undergoing Orthotopic Liver Transplant Evaluation Wed, 22 Jun 2016 09:11:41 +0000 http://www.hindawi.com/journals/jtrans/2016/7187206/ Coronary artery disease (CAD) is prevalent in patients with end-stage liver disease and associated with poor outcomes when undergoing orthotopic liver transplantation (OLT); however, noninvasive screening for CAD in this population is less sensitive. In an attempt to identify redundancy, we reviewed our experience among patients undergoing CAD screening as part of their OLT evaluation between May 2009 and February 2014. Demographic, clinical, and procedural characteristics were analyzed. Of the screened patients, initial screening was more commonly performed via stress testing (75.8%) than coronary angiography (24.2%). 
Of those with initial stress testing, a substantial proportion subsequently underwent angiography (39.4%). Among those undergoing angiography, CAD was common (23.5%). Across the entire cohort, the number of traditional risk factors was linearly associated with CAD, and those with two or more risk factors were found to have CAD by angiography 50% of the time (OR 1.92; 95% CI 1.07–3.44). Our data support that CAD is prevalent among pre-OLT patients, especially among those with 2 or more risk factors. Moreover, we identified a lack of uniformity in practice and the need for evidence-based and standardized screening protocols. Bryan C. Lee, Feng Li, Adam J. Hanje, Khalid Mumtaz, Konstantinos D. Boudoulas, and Scott M. Lilly Copyright © 2016 Bryan C. Lee et al. All rights reserved. Current Treatment Approaches to HCC with a Special Consideration to Transplantation Mon, 20 Jun 2016 07:07:50 +0000 http://www.hindawi.com/journals/jtrans/2016/7926264/ Hepatocellular carcinoma (HCC) is the third leading cause of cancer deaths worldwide. The mainstay of treatment of HCC has been resectional and transplantation surgery. It is well known that, in selected, optimized patients, hepatectomy for HCC may be an option, even in patients with underlying cirrhosis. Resectable patients with early HCC and underlying liver disease are, however, increasingly being considered for transplantation because of the potential for better disease-free survival and resolution of the underlying liver disease, although this approach is limited by the availability of donor livers, especially for resectable patients. Outcomes following liver transplantation improved dramatically for patients with HCC following the implementation of the Milan criteria in the late 1990s. Ever since, the rather restrictive nature of the Milan criteria has been challenged with good outcomes. There has also been an increase in the donor pool, with marginal donors including organs retrieved following cardiac death being used. 
Even so, patients still continue to die while waiting for a liver transplant. In order to reduce this attrition, bridging techniques and methods for downstaging disease have evolved. Additionally, new techniques for organ preservation have increased the prospect of this potentially curative procedure being available for a greater number of patients. N. Bhardwaj, M. T. P. R. Perera, and M. A. Silva Copyright © 2016 N. Bhardwaj et al. All rights reserved. Incidence, Characteristics, and Prognosis of Incidentally Discovered Hepatocellular Carcinoma after Liver Transplantation Wed, 15 Jun 2016 11:58:39 +0000 http://www.hindawi.com/journals/jtrans/2016/1916387/ Background. We aimed to assess incidentally discovered hepatocellular carcinoma (iHCC) over time and to compare its outcome to preoperatively diagnosed hepatocellular carcinoma (pdHCC) and nontumor liver transplants. Methods. We studied adults transplanted with a follow-up of at least one year. Patients were divided into 3 groups according to the diagnosis of hepatocellular carcinoma. Results. Between 1990 and 2010, 887 adults were transplanted. Among them, 121 patients (13.6%) had pdHCC and 32 patients (3.6%) had iHCC; the frequency of iHCC decreased markedly over the years, in parallel with a significant increase in pdHCC. Between 1990 and 1995, 120 patients had liver transplants, 4 (3.3%) of them had iHCC, and only 3 (2.5%) had pdHCC, while in the last 5 years, 263 patients were transplanted, 7 (2.7%) of them had iHCC, and 66 (25.1%) had pdHCC. There was no significant difference between groups regarding patient survival; 5-year survival was 74%, 75.5%, and 77.3% in the iHCC, pdHCC, and non-HCC groups, respectively. Patients with iHCC had no recurrences after transplant, while pdHCC patients experienced 17 recurrences (15.3%). Conclusions. iHCC has significantly decreased despite a steady increase in the number of transplants for hepatocellular carcinoma. 
Patients with iHCC had excellent outcomes, with no tumor recurrence and survival comparable to pdHCC. Walid El Moghazy, Samy Kashkoush, Glenda Meeberg, and Norman Kneteman Copyright © 2016 Walid El Moghazy et al. All rights reserved. Liver Transplantation for Hepatocellular Carcinoma: A Single Center Resume Overlooking Four Decades of Experience Sun, 10 Jan 2016 09:48:22 +0000 http://www.hindawi.com/journals/jtrans/2016/7895956/ Background. This is a single center oncological resume overlooking four decades of experience with liver transplantation (LT) for hepatocellular carcinoma (HCC). Methods. All 319 LT for HCC that were performed between 1975 and 2011 were included. Predictors for HCC recurrence (HCCR) and survival were identified by Cox regression, Kaplan-Meier analysis, Log Rank, and other tests where appropriate. Results. HCCR was the single strongest hazard for survival. Hazards for HCCR were tumor staging beyond the histologic MILAN criteria, bilateral tumor spreading, tumor grading beyond G2, and vascular infiltration of small or large vessels. Grading beyond G2 as well as small and large vascular infiltration was associated with higher hazard ratios for long-term survival as compared to liver transplantation beyond the histological MILAN criteria. Tumor dedifferentiation significantly correlated with vascular infiltration and intrahepatic tumor spreading. Conclusion. LT enables survival from HCC. HCC dedifferentiation is associated with vascular infiltration and intrahepatic tumor spreading and is a strong hazard for HCCR and survival. Pretransplant tumor staging should include grading by biopsy, because grading is a reliable and easily accessible predictor of HCCR and survival. Detection of dedifferentiation should speed up the allocation process. Nikos Emmanouilidis, Rickmer Peters, Bastian P. 
Ringe, Zeynep Güner, Wolf Ramackers, Hüseyin Bektas, Frank Lehner, Michael Manns, Jürgen Klempnauer, and Harald Schrem Copyright © 2016 Nikos Emmanouilidis et al. All rights reserved. Lung Transplantation in Patients with High Lung Allocation Scores in the US: Evidence for the Need to Evaluate Score Specific Outcomes Mon, 21 Dec 2015 13:54:04 +0000 http://www.hindawi.com/journals/jtrans/2015/836751/ Objective. The lung allocation score (LAS) resulted in a lung transplantation (LT) selection process guided by clinical acuity. We sought to evaluate the relationship between LAS and outcomes. Methods. We analyzed Scientific Registry of Transplant Recipients (SRTR) data pertaining to recipients between 2005 and 2012. We stratified them into quartiles based on LAS and compared survival and predictors of mortality. Results. We identified 10,304 consecutive patients, comprising 2,576 in each LAS quartile (quartile 1 (26.3–35.5), quartile 2 (35.6–39.3), quartile 3 (39.4–48.6), and quartile 4 (48.7–95.7)). Survival after 30 days (96.9% versus 96.8% versus 96.0% versus 94.8%), 90 days (94.6% versus 93.7% versus 93.3% versus 90.9%), 1 year (87.2% versus 85.0% versus 84.8% versus 80.9%), and 5 years (55.4% versus 54.5% versus 52.5% versus 48.8%) was higher in the lower LAS groups. Five-year mortality was significantly higher in the upper quartiles (HR 1.13, HR 1.17, and HR 1.17, comparing quartiles 2, 3, and 4, respectively, to quartile 1). Conclusion. Overall, outcomes in recipients with higher LAS are worse than those in patients with lower LAS. These data should inform more individualized evidence-based discussion during pretransplant counseling. Jeremiah A. Hayanga, Alena Lira, Tedi Vlahu, Jingyan Yang, Jonathan K. Aboagye, Heather K. Hayanga, James D. Luketich, and Jonathan D’Cunha Copyright © 2015 Jeremiah A. Hayanga et al. All rights reserved. 
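The quartile survival percentages quoted above are Kaplan-Meier estimates: the product-limit estimator multiplies, at each observed death time, the fraction of the at-risk group surviving that time, while censored patients simply leave the risk set. A minimal sketch with toy follow-up data (not SRTR data):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up durations; events: 1 = death observed, 0 = censored.
    Returns [(event_time, S(t)), ...] at each observed death time."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    n_at_risk = len(times)
    s, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            s *= 1 - d / n_at_risk  # survival drops only at death times
            curve.append((t, s))
        # both deaths and censorings at time t leave the risk set
        n_at_risk -= sum(1 for tt in times if tt == t)
    return curve

# toy cohort of 4: deaths at day 30 and day 365, censoring at days 180 and 400
print(kaplan_meier([30, 180, 365, 400], [1, 0, 1, 0]))
# → [(30, 0.75), (365, 0.375)]
```

Note how the censored patient at day 180 shrinks the denominator for the day-365 death (1/2 rather than 1/3), which is exactly why censoring-aware estimates differ from naive percentages.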
Risk Factors Associated with Increased Morbidity in Living Liver Donation Tue, 15 Dec 2015 14:25:59 +0000 http://www.hindawi.com/journals/jtrans/2015/949674/ Living donor liver donation (LDLD) is an alternative to cadaveric liver donation. We aimed to identify risk factors and develop a score for the prediction of postoperative complications (POCs) after LDLD in donors. This is a retrospective cohort study of 688 donors between June 1995 and February 2014 at Hospital Sírio-Libanês and the A.C. Camargo Cancer Center in São Paulo, Brazil. The primary outcome was POC graded ≥III according to the Clavien-Dindo classification. Left lateral segment (LLS), left lobe (LL), and right lobe (RL) resections were conducted in 492 (71.4%), 109 (15.8%), and 87 (12.6%) donors, respectively. In total, 43 (6.2%) developed POCs, which were more common after RL than LLS and LL resections (14/87 (16.1%) versus 23/492 (4.5%) and 6/109 (5.5%), resp.). Multivariate analysis showed that RL resection (OR: 2.81, 95% CI: 1.32 to 3.01), smoking status (OR: 3.2, 95% CI: 1.35 to 7.56), and blood transfusion (OR: 3.15, 95% CI: 1.45 to 6.84) were independently associated with POCs. RL resection, intraoperative blood transfusion, and smoking were associated with an increased risk for POCs in donors. Helry L. Candido, Eduardo A. da Fonseca, Flávia H. Feier, Renata Pugliese, Marcel A. Benavides, Enis D. Silva, Karina Gordon, Marcelo Gama de Abreu, Jaume Canet, Paulo Chapchap, and Joao Seda Neto Copyright © 2015 Helry L. Candido et al. All rights reserved. Plasma Exchange for the Recurrence of Primary Focal Segmental Glomerulosclerosis in Adult Renal Transplant Recipients: A Meta-Analysis Mon, 30 Nov 2015 06:33:50 +0000 http://www.hindawi.com/journals/jtrans/2015/639628/ Background. Posttransplant recurrence of primary focal segmental glomerulosclerosis (rFSGS) in the form of massive proteinuria is not uncommon and has detrimental consequences on renal allograft survival. 
A putative circulating permeability factor has been implicated in the pathogenesis, leading to widespread use of plasma exchange (PLEX). We reviewed published studies to assess the role of PLEX in the treatment of rFSGS in adults. Methods. Eligible manuscripts compared PLEX or variants with conventional care for inducing proteinuria remission (PR) in rFSGS and were identified through MEDLINE and reference lists. Data were abstracted in parallel by two reviewers. Results. We detected 6 nonrandomized studies with 117 cases enrolled. In a random effects model, the pooled risk ratio for the composite endpoint of partial or complete PR was 0.38 in favour of PLEX (95% CI: 0.23–0.61). No statistical heterogeneity was observed among the included studies (I² = 0%, p = 0.42). On average, 9–26 PLEX sessions were performed to achieve PR. Renal allograft loss due to recurrence was lower (range: 0%–67%) in patients treated with PLEX. Conclusion. Notwithstanding the inherent limitations of small, observational studies, PLEX appears to be effective for PR in rFSGS. Additional research is needed to further elucidate its optimal use and impact on long-term allograft survival. Georgios Vlachopanos, Argyrios Georgalis, and Harikleia Gakiopoulou Copyright © 2015 Georgios Vlachopanos et al. All rights reserved. Psychosocial Status of Liver Transplant Candidates in Iran and Its Correlation with Health-Related Quality of Life and Depression and Anxiety Sun, 15 Nov 2015 11:10:53 +0000 http://www.hindawi.com/journals/jtrans/2015/329615/ Objectives. The study was aimed at providing a psychosocial profile for Iranian liver transplant candidates referred to an established liver transplantation program. Material and Methods. Patients assessed for liver transplant candidacy at Imam Khomeini Hospital (Tehran, Iran) between March 2013 and September 2014 were included. 
The following battery of tests was administered: the Psychosocial Assessment of Candidates for Transplant (PACT), the Short-Form health survey (SF-36), and the Hospital Anxiety and Depression Scale (HADS). Results. Psychosocial assessment of 205 liver transplant candidates revealed significant impairments in several SF-36 domains; social functioning was the least impaired domain and physical functioning the most impaired. The prevalence of cases with probable anxiety and depressive disorders, according to HADS, was 13.8% and 5.6%, respectively. According to PACT, 24.3% of the assessed individuals were considered good or excellent candidates. In 11.2%, candidacy for transplantation was judged poor due to at least one major psychosocial or lifestyle risk factor. Poor candidate quality was associated with impaired health-related quality of life and higher scores on the anxiety and depression scales. Conclusions. Transplant programs could implement specific intervention programs based on normative databases to address psychosocial issues in patients in order to improve patient care, quality of life, and transplant outcomes. Maryam Banihashemi, Mohsen Hafezi, Mohsen Nasiri-Toosi, Ali Jafarian, Mohammad Reza Abbasi, Mohammad Arbabi, Maryam Abdi, Mahzad Khavarian, and Ali-Akbar Nejatisafa Copyright © 2015 Maryam Banihashemi et al. All rights reserved. Influence of Deceased Donor and Pretransplant Recipient Parameters on Early Overall Kidney Graft-Survival in Germany Sun, 11 Oct 2015 12:54:08 +0000 http://www.hindawi.com/journals/jtrans/2015/307230/ Background. Scarcity of grafts for kidney transplantation (KTX) has caused increased consideration of deceased donors with substantial risk factors. There is no agreement on which of these are detrimental for overall graft survival. Therefore, we investigated, in a nationwide multicentre study, the impact of donor- and recipient-related risks known before KTX on graft survival, based on the original data used for allocation and graft acceptance. 
Methods. A nationwide deidentified multicenter study database was created of data concerning kidneys donated and transplanted in Germany between 2006 and 2008, as provided by the national organ procurement organization (Deutsche Stiftung Organtransplantation) and the BQS Institute. Multiple Cox regression (significance level 5%, hazard ratio [95% CI]) was conducted (isolated KTX). Results. Risk factors associated with graft survival were donor age (1.020 [1.013–1.027] per year), donor size (0.985 [0.977–0.993] per cm), donor’s creatinine at admission (1.002 [1.001–1.004] per µmol/L), donor treatment with catecholamine (0.757 [0.635–0.901]), and reduced graft quality at procurement (1.549 [1.217–1.973]), as well as recipient age (1.012 [1.003–1.021] per year), actual panel reactive antibodies (1.007 [1.002–1.011] per percent), retransplantation (1.850 [1.484–2.306]), recipient’s cardiovascular comorbidity (1.436 [1.212–1.701]), and use of IL2-receptor antibodies for induction (0.741 [0.619–0.887]). Conclusion. Some donor characteristics continue to impact graft survival (e.g., age), while the effect of others could be mitigated by elaborate donor-recipient matching and care. Carl-Ludwig Fischer-Fröhlich, Marcus Kutschmann, Johanna Feindt, Irene Schmidtmann, Günter Kirste, Nils R. Frühauf, Ulrike Wirges, Axel Rahmel, and Christina Schleicher Copyright © 2015 Carl-Ludwig Fischer-Fröhlich et al. All rights reserved. Delayed Graft Function in Kidney Transplants: Time Evolution, Role of Acute Rejection, Risk Factors, and Impact on Patient and Graft Outcome Thu, 10 Sep 2015 07:24:27 +0000 http://www.hindawi.com/journals/jtrans/2015/163757/ Background. Although numerous risk factors for delayed graft function (DGF) have been identified, the role of ischemia-reperfusion injury and acute rejection episodes (ARE) occurring during the DGF period is ill-defined, and the impact of DGF on patient and graft outcome remains controversial. Methods. 
From 1983 to 2014, 1784 kidney-only transplantations from deceased donors were studied. Classical risk factors for DGF, along with two novel ones, the recipient’s perioperative saline loading and residual diuresis, were analyzed by logistic regression and receiver operating characteristic (ROC) curves. Results. Along with other risk factors, absence of perioperative saline loading increases acute rejection incidence (OR = 1.9 [1.2–2.9]). Moreover, we observed two novel risk factors for DGF: the patient’s residual diuresis ≤500 mL/d (OR = 2.3 [1.6–3.5]) and absence of perioperative saline loading (OR = 3.3 [2.0–5.4]). The area under the ROC curve (0.77 [0.74–0.81]) shows an excellent discriminant power of our model, irrespective of rejection. DGF does not influence patient survival. However, graft survival is decreased only when rejection is associated with DGF. Conclusions. Perioperative saline loading efficiently prevents ischemia-reperfusion injury, which is the predominant factor inducing DGF. DGF per se has no influence on patient and graft outcome. Its incidence is currently close to 5% in our centre. Martin Chaumont, Judith Racapé, Nilufer Broeders, Fadoua El Mountahi, Annick Massart, Thomas Baudoux, Jean-Michel Hougardy, Dimitri Mikhalsky, Anwar Hamade, Alain Le Moine, Daniel Abramowicz, and Pierre Vereerstraeten Copyright © 2015 Martin Chaumont et al. All rights reserved. Alternative Living Kidney Donation Programs Boost Genetically Unrelated Donation Wed, 02 Sep 2015 09:11:16 +0000 http://www.hindawi.com/journals/jtrans/2015/748102/ Donor-recipient ABO and/or HLA incompatibility used to lead to donor decline. The development of alternative transplantation programs has enabled transplantation of incompatible couples. How did that influence couple characteristics? Between 2000 and 2014, 1232 living donor transplantations were performed. In conventional and ABO-incompatible transplantation the willing donor becomes an actual donor for the intended recipient. 
In kidney exchange and domino donation, the donor donates indirectly to the intended recipient. The relationship between the donor and intended recipient was studied. There were 935 conventional and 297 alternative program transplantations. There were 66 ABO-incompatible, 68 domino-paired, 62 kidney-exchange, and 104 altruistic donor transplantations. Waiting list recipients were excluded as they did not bring a living donor. 1131 couples remained, of whom 196 participated in alternative programs. Genetically unrelated donors (486) were primarily partners. Genetically related donors (645) were siblings, parents, children, and others. Compared to genetically related couples, almost three times as many genetically unrelated couples were incompatible and participated in alternative programs. 62% of couples were genetically related in the conventional donation program versus 32% in alternative programs. Patient and graft survival were not significantly different between recipient programs. Alternative donation programs increase the number of transplantations by enabling genetically unrelated donors to donate. Rosalie A. Poldervaart, Mirjam Laging, Tessa Royaards, Judith A. Kal-van Gestel, Madelon van Agteren, Marry de Klerk, Willij Zuidema, Michiel G. H. Betjes, and Joke I. Roodnat Copyright © 2015 Rosalie A. Poldervaart et al. All rights reserved. Boceprevir-Based Triple Antiviral Therapy for Chronic Hepatitis C Virus Infection in Kidney-Transplant Candidates Thu, 16 Jul 2015 11:17:07 +0000 http://www.hindawi.com/journals/jtrans/2015/159795/ Background. There are few data on the combination of pegylated interferon-α (Peg-IFN-α), ribavirin, and first-generation direct-acting antiviral agents (DAAs). Our aim was to describe the efficacy and safety of Peg-IFN-α, ribavirin, and boceprevir in hemodialysis patients. Patients.
Six hemodialysis patients, chronically infected by genotype-1 HCV, were given Peg-IFN-α (135 µg/week), ribavirin (200 mg/d), and boceprevir (2400 mg/d) for 48 weeks. Results. At initiation of antiviral therapy, median viral concentration was 5.68 (3.78–6.55) log IU/mL. HCV RNA was undetectable in four of the six patients at week 4 and in all patients at week 24. A breakthrough was observed in two patients between weeks 24 and 48, and a third patient stopped antiviral therapy between weeks 24 and 48 because of severe peripheral neuropathy. At week 48, HCV RNA was undetectable in three patients. Of these, two patients relapsed within a month after antiviral therapy was stopped. Hence, only one patient had a sustained virological response; he was a previous partial responder. Overall, anemia was the main side effect. Conclusion. A triple antiviral therapy based on Peg-IFN-α, ribavirin, and boceprevir is suboptimal for treating hemodialysis patients with chronic HCV infection. Studies using new-generation drugs are required in this setting. Mireille Mehawej, Lionel Rostaing, Laurent Alric, Arnaud Del Bello, Jacques Izopet, and Nassim Kamar Copyright © 2015 Mireille Mehawej et al. All rights reserved. Factors Associated with Uncontrolled Hypertension among Renal Transplant Recipients Attending Nephrology Clinics in Nairobi, Kenya Tue, 14 Jul 2015 11:38:31 +0000 http://www.hindawi.com/journals/jtrans/2015/746563/ Objective. To determine the factors associated with poor blood pressure control among renal transplant recipients in a resource-limited setting. Methods. A cross-sectional study was carried out on renal transplant recipients at the Kenyatta National Hospital. Sociodemographic details, blood pressure, urine albumin:creatinine ratio, and adherence using the MMAS-8 questionnaire were noted. Independent factors associated with uncontrolled hypertension were determined using logistic regression analysis. Results. 85 subjects were evaluated.
Mean age was 42.4 (SD ± 12.2) years, with a male : female ratio of 1.9 : 1. Fifty-five patients (64.7%) had uncontrolled hypertension (BP ≥ 130/80 mmHg). On univariate analysis, male sex (OR 3.7, 95% CI 1.4–9.5), higher levels of proteinuria, and nonadherence to antihypertensives (OR 18, 95% CI 5.2–65.7) were associated with uncontrolled hypertension. On logistic regression analysis, male sex (adjusted OR 4.6, 95% CI 1.1–19.0) and nonadherence (adjusted OR 33.8, 95% CI 8.6–73.0) were independently associated with uncontrolled hypertension. Conclusion. Factors associated with poor blood pressure control in this cohort were male sex and nonadherence to antihypertensives. Adherence to antihypertensive therapy must be emphasized in this population. Mary N. Kubo, Joshua K. Kayima, Anthony J. Were, Seth O. McLigeyo, and Elijah N. Ogola Copyright © 2015 Mary N. Kubo et al. All rights reserved. Proximal Tubular Injury in Medullary Rays Is an Early Sign of Acute Tacrolimus Nephrotoxicity Wed, 24 Jun 2015 08:25:33 +0000 http://www.hindawi.com/journals/jtrans/2015/142521/ Tacrolimus (FK506) is one of the principal immunosuppressive agents used after solid organ transplantations to prevent allograft rejection. Chronic renal injury induced by tacrolimus is characterized by linear fibrosis in the medullary rays; however, the early morphologic findings of acute tacrolimus nephrotoxicity are not well characterized. Kidney injury molecule-1 (KIM-1) is a specific injury biomarker that has been proven to be useful in the diagnosis of mild to severe acute tubular injury on renal biopsies. This study was motivated by a patient with acute kidney injury associated with elevated serum tacrolimus levels in whom KIM-1 staining was present only in proximal tubules located in the medullary rays in the setting of otherwise normal light, immunofluorescent, and electron microscopy.
We subsequently evaluated KIM-1 expression in 45 protocol and 39 indicated renal transplant biopsies to determine whether higher serum levels of tacrolimus were associated with acute segment-specific injury to the proximal tubule, as reflected by KIM-1 staining in the proximal tubules of the cortical medullary rays. The data suggest that tacrolimus toxicity preferentially affects proximal tubules in medullary rays and that this targeted injury is a precursor lesion for the linear fibrosis seen in chronic tacrolimus toxicity. Diane Cosner, Xu Zeng, and Ping L. Zhang Copyright © 2015 Diane Cosner et al. All rights reserved. MicroRNAs in Kidney Transplantation: Living up to Their Expectations? Mon, 11 May 2015 13:21:41 +0000 http://www.hindawi.com/journals/jtrans/2015/354826/ Since the discovery of microRNAs, ample research has been conducted to elucidate their involvement in an array of (patho)physiological conditions. Ischemia reperfusion injury (IRI) is a major problem in kidney transplantation, and its mechanism is still not fully known, nor is there an effective therapy. Furthermore, no biomarker is available to specifically measure (ischemic) damage after kidney transplantation or predict transplantation outcome. In this review, we summarize studies conducted on microRNAs in renal ischemia reperfusion injury and kidney transplantation. Although the number of publications on miRNAs in different areas of nephrology is increasing every year, only a limited number of reports that address the role of miRNAs in relation to ischemia reperfusion injury or kidney transplantation are available. All reports up to June 2014 on microRNAs in renal IRI, kidney transplantation, and renal allograft status were included. Design of the studies was highly variable and there was limited overlap between microRNAs found in these reports.
No single microRNA expression pattern could be found, although multiple microRNAs involved in the immune response seem to be altered after ischemia reperfusion injury and kidney transplantation. Although there is a growing interest in microRNA research in kidney transplantation aiming to identify biomarkers and therapeutic targets, to date, no specific microRNA has been demonstrated to be applicable as either, largely because of a lack of specificity. More systematic research is needed to determine whether microRNAs can be applied as a biomarker, therapeutic target, or therapeutic agent in kidney transplantation. Eline K. van den Akker, Frank J. M. F. Dor, Jan N. M. IJzermans, and Ron W. F. de Bruin Copyright © 2015 Eline K. van den Akker et al. All rights reserved. Breakdown in the Organ Donation Process and Its Effect on Organ Availability Thu, 09 Apr 2015 10:54:44 +0000 http://www.hindawi.com/journals/jtrans/2015/831501/ Background. This study examines the effect of breakdown in the organ donation process on the availability of transplantable organs. A process breakdown is defined as a deviation from the organ donation protocol that may jeopardize organ recovery. Methods. A retrospective analysis of donation-eligible decedents was conducted using data from an independent organ procurement organization. Adjusted effect of process breakdown on organs transplanted from an eligible decedent was examined using multivariable zero-inflated Poisson regression. Results. An eligible decedent is four times more likely to become an organ donor when there is no process breakdown (adjusted OR: 4.01; 95% CI: 1.6838–9.6414) even after controlling for the decedent’s age, gender, race, and whether or not the decedent had joined the state donor registry. However, once the eligible decedent becomes a donor, whether or not there was a process breakdown does not affect the number of transplantable organs yielded.
Overall, for every process breakdown occurring in the care of an eligible decedent, one fewer organ is available for transplant. Decedent’s age is a strong predictor of likelihood of donation and the number of organs transplanted from a donor. Conclusion. Eliminating breakdowns in the donation process can potentially increase the number of organs available for transplant but some organs will still be lost. Manik Razdan, Howard B. Degenholtz, Jeremy M. Kahn, and Julia Driessen Copyright © 2015 Manik Razdan et al. All rights reserved. The Benefit of Sirolimus Maintenance Immunosuppression and Rabbit Antithymocyte Globulin Induction in Liver Transplant Recipients That Develop Acute Kidney Injury in the Early Postoperative Period Wed, 11 Mar 2015 09:26:42 +0000 http://www.hindawi.com/journals/jtrans/2015/926168/ Published data describing renal outcomes in orthotopic liver transplant (OLT) recipients prescribed sirolimus (SRL) maintenance immunosuppression (MIS) and rabbit antithymocyte globulin (rATG) induction are limited. We investigated whether SRL MIS and rATG induction facilitated recovery of acute kidney injury in the early postoperative period. This retrospective descriptive study screened 308 consecutive OLTs performed between 2006 and 2009. All patients received rATG induction with steroid avoidance. MIS consisted of SRL or tacrolimus (TAC) with mycophenolate mofetil. A total of 197 patients were included: 168 (85%) received TAC and 29 (15%) received SRL for a median of 365 days. Demographics were similar between groups except for a higher incidence of pretransplant renal dysfunction in the SRL recipients (SRL 59% versus TAC 21%). The eGFR was significantly higher for all time points in the TAC group with the exception of month 2. However, improvement in eGFR was significantly greater in the SRL group postoperatively.
Our study suggests that rATG induction and SRL maintenance immunosuppression facilitate renal recovery for liver transplant recipients who develop acute kidney injury in the early postoperative period. Benjamin T. Duhart Jr., Winston A. Ally, Amy G. Krauss, Joanna Q. Hudson, James D. Eason, Vinaya Rao, and Jason M. Vanatta Copyright © 2015 Benjamin T. Duhart Jr. et al. All rights reserved. A Nationwide Assessment of the Burden of Urinary Tract Infection among Renal Transplant Recipients Wed, 25 Feb 2015 12:41:45 +0000 http://www.hindawi.com/journals/jtrans/2015/854640/ Objective. To evaluate the prevalence and outcomes of urinary tract infection (UTI) among renal transplant recipients. Methods. A secondary analysis of the Nationwide Inpatient Sample 2009–2011 was conducted. Survey-weighted multivariable regression analyses were used to examine the impact of UTI on transplant complications, total charges, and length of stay. Results. A total of 1,044 renal transplant recipients, representing a population estimate of 49,862, were included in the study. UTI was most common in transplant recipients with hypertension (53%) and prevalence was noted to be 28.2 and 65.9 cases per 1,000 for men and women, respectively. UTI increased the likelihood of transplant complications (182% for men, 169% for women). Among patients with UTI, total charges were 28% higher for men and 22% higher for women. UTI also increased the length of stay by 87% among men and 74% among women. Discussion. UTI in renal transplant recipients was associated with prolonged length of stay, higher total charges, and increased odds of transplant complications. Interventions to prevent UTI among such patients should be a priority area for future research and practice. Benjamin J. Becerra, Monideepa B. Becerra, and Nasia Safdar Copyright © 2015 Benjamin J. Becerra et al. All rights reserved.
Risk-Adjusted Analysis of Relevant Outcome Drivers for Patients after More Than Two Kidney Transplants Sun, 01 Feb 2015 09:10:18 +0000 http://www.hindawi.com/journals/jtrans/2015/712049/ Renal transplantation is the treatment of choice for patients with end-stage renal disease, but as long-term renal allograft survival is limited, most transplant recipients will face graft loss and will be considered for a retransplantation. The goal of this study was to evaluate the patient and graft survival of the 61 renal transplant recipients after second or subsequent renal transplantation, transplanted at our institution between 1990 and 2010, and to identify risk factors related to inferior outcomes. Actuarial patient survival was 98.3%, 94.8%, and 88.2% after one, three, and five years, respectively. Actuarial graft survival was 86.8%, 80%, and 78.1% after one, three, and five years, respectively. Risk-adjusted analysis revealed that only age at the time of last transplantation had a significant influence on patient survival, whereas graft survival was influenced by multiple immunological and surgical factors, such as the number of HLA mismatches, the type of immunosuppression, the number of surgical complications, need for reoperation, primary graft nonfunction, and acute rejection episodes. In conclusion, third and subsequent renal transplantations constitute a valid therapeutic option, but inferior outcomes should be expected among elderly patients, hyperimmunized recipients, and recipients with multiple operations at the site of last renal transplantation. Lampros Kousoulas, Florian W. R. Vondran, Paulina Syryca, Juergen Klempnauer, Harald Schrem, and Frank Lehner Copyright © 2015 Lampros Kousoulas et al. All rights reserved.
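Actuarial survival figures like those reported above are conventionally obtained with a Kaplan-Meier-type estimator, which handles patients whose follow-up ends without an event (censoring). A minimal illustrative sketch follows; the cohort below is hypothetical, not data from the study.

```python
# Minimal Kaplan-Meier estimator (illustrative sketch; the follow-up times
# and event flags below are hypothetical, not the study's 61 recipients).

def kaplan_meier(times, events):
    """Return (time, survival) pairs for right-censored follow-up data.

    times  : follow-up time for each subject (e.g., years)
    events : 1 if the event (graft loss/death) was observed, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Since data is sorted, all subjects with time == t are contiguous.
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        total_at_t = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= total_at_t
        i += total_at_t
    return curve

# Hypothetical cohort of 10 recipients, follow-up in years:
times  = [0.5, 1.2, 2.0, 3.0, 3.5, 4.0, 4.5, 5.0, 5.0, 5.0]
events = [1,   0,   1,   0,   1,   0,   0,   0,   0,   0]
for t, s in kaplan_meier(times, events):
    print(f"S({t}) = {s:.3f}")
```

The survival probability only drops at observed event times; censored subjects simply leave the risk set, which is why actuarial rates at one, three, and five years can be quoted even when not all patients reach five years of follow-up.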
Interstitial Lung Disease Associated with mTOR Inhibitors in Solid Organ Transplant Recipients: Results from a Large Phase III Clinical Trial Program of Everolimus and Review of the Literature Thu, 18 Dec 2014 00:10:30 +0000 http://www.hindawi.com/journals/jtrans/2014/305931/ Interstitial lung disease (ILD) has been reported with the use of mammalian target of rapamycin inhibitors (mTORi). The clinical and safety databases of three Phase III trials of everolimus in de novo kidney (A2309), heart (A2310), and liver (H2304) transplant recipients (TxR) were searched using a standardized MedDRA query (SMQ) search for ILD followed by a case-by-case medical evaluation. A literature search was conducted in MEDLINE and EMBASE. Out of the 1,473 de novo TxR receiving everolimus in Phase III trials, everolimus-related ILD was confirmed in six cases (one kidney, four heart, and one liver TxR), representing an incidence of 0.4%. Everolimus was discontinued in three of the four heart TxR, resulting in ILD improvement or resolution. Outcome was fatal in the kidney TxR (in whom everolimus therapy was continued) and in the liver TxR despite everolimus discontinuation. The literature review identified 57 publications on ILD in solid organ TxR receiving everolimus or sirolimus. ILD presented months or years after mTORi initiation and symptoms were nonspecific and insidious. The event was more frequent in patients with a late switch to mTORi. In most cases, ILD was reversed after prompt mTORi discontinuation. ILD induced by mTORi is an uncommon and potentially fatal event warranting early recognition and drug discontinuation. Patricia Lopez, Sven Kohler, and Seema Dimri Copyright © 2014 Patricia Lopez et al. All rights reserved.
The Impact of the Introduction of MELD on the Dynamics of the Liver Transplantation Waiting List in São Paulo, Brazil Thu, 27 Nov 2014 11:58:49 +0000 http://www.hindawi.com/journals/jtrans/2014/219789/ Until July 15, 2006, the time on the waiting list was the main criterion for allocating deceased donor livers in the state of São Paulo, Brazil. Since that date, MELD has been the basis for the allocation of deceased donor livers for adult transplantation. Our aim was to compare the waitlist dynamics before MELD (1997–2005) and after MELD (2006–2012) in our state. A retrospective study was conducted including the data from all the liver transplant candidate waiting lists from July 1997 to December 2012. The data were related to the actual number of liver transplantations (Tr), the incidence of new patients on the list (I), and the number of patients who died while on the waitlist (D) from 1997 to 2005 (the pre-MELD era) and from 2006 to 2012 (the post-MELD era). The number of transplantations from 1997 to 2005 and from 2006 to 2012 increased nonlinearly, with a clear trend toward levelling off at equilibrium at approximately 350 and 500 cases per year, respectively. The implementation of the MELD score resulted in a shorter waiting time until liver transplantation. Additionally, there was a significant effect on the waitlist dynamics in the first 4 years; however, the curves diverge thereafter, implying that the MELD score had no long-range effect on the waitlist. Eleazar Chaib, Eduardo Massad, Bruno Butturi Varone, Andre Leopoldino Bordini, Flavio Henrique Ferreira Galvão, Alessandra Crescenzi, Arnaldo Bernal Filho, and Luiz Augusto Carneiro D’Albuquerque Copyright © 2014 Eleazar Chaib et al. All rights reserved. Outcomes of Renal Transplantation in Brunei Darussalam over a Twenty-Year Period (1993–2012) Wed, 12 Nov 2014 11:51:43 +0000 http://www.hindawi.com/journals/jtrans/2014/784805/ Objectives.
Brunei Darussalam has a high prevalence and incidence of end stage renal disease (ESRD). Until 2012, all renal transplantations were performed in overseas centres, either government-sponsored (living-related transplantation) or self-sponsored (commercialised transplantation). We hypothesize that graft and patient survival of Brunei renal transplant patients are on a par with international standards. Materials and Methods. Data on all renal transplant patients in Brunei were analysed over a twenty-year period from registry records and case notes. Comparative survival data from other countries were obtained from PubMed-listed literature. Results. A total of 49 transplantation procedures were performed in foreign centres between 1993 and 2012. 29 were government-sponsored and 20 were self-sponsored transplantations. The 5- and 10-year overall patient survival rates were 93.3% and 90.1%, respectively. The 5- and 10-year overall graft survival rates were 91.1% and 81.2%, respectively. There was no difference in the survival outcomes of government-sponsored and self-sponsored patients. Living-related (government-sponsored) and commercialised (self-sponsored) grafts had survival equivalent to that reported in the literature. Conclusion. Our survival data were on a par with those achieved in many countries. We hope to use this information to convince local stakeholders and patients to favour transplantation as the preferred modality of renal replacement therapy (RRT). Jackson Tan, Muhammad Abdul Mabood Khalil, Si Yen Tan, Muhammad Khalil, Dalinatul Ahmed, Shaukat Zinna, and William Chong Copyright © 2014 Jackson Tan et al. All rights reserved.