Journal of Transplantation http://www.hindawi.com The latest articles from Hindawi Publishing Corporation © 2014, Hindawi Publishing Corporation. All rights reserved. Donor Heart Utilization following Cardiopulmonary Arrest and Resuscitation: Influence of Donor Characteristics and Wait Times in Transplant Regions Tue, 08 Jul 2014 00:00:00 +0000 http://www.hindawi.com/journals/jtrans/2014/519401/ Background. Procurement of hearts from cardiopulmonary arrest and resuscitated (CPR) donors for transplantation is suboptimal. We studied the influences of donor factors and regional wait times on CPR donor heart utilization. Methods. From the UNOS database (1998 to 2012), we identified 44,744 heart donors, of which 4,964 (11%) received CPR. Based on procurement of the heart for transplantation, CPR donors were divided into hearts procured (HP) and hearts not procured (HNP) groups. Logistic regression analysis was used to identify predictors of heart procurement. Results. Of the 4,964 CPR donors, 1,427 (28.8%) were in the HP group. Donor characteristics that favored heart procurement included younger age (25.5 ± 15 yrs versus 39 ± 18 yrs, ), male gender (34% versus 23%, ), shorter CPR duration (<15 min versus >30 min, ), and head trauma (60% versus 15%). Among the 11 UNOS regions, the highest procurement was in Region 1 (37%) and the lowest in Region 3 (24%). Regional transplant volumes and median waiting times did not influence heart procurement rates. Conclusions. Only 28.8% of CPR donor hearts were procured for transplantation. Factors favoring heart procurement include younger age, male gender, short CPR duration, and traumatic head injury. Heart procurement varied by region but not by transplant volumes or wait times. Mohammed Quader, Luke Wolfe, Gundars Katlaps, and Vigneshwar Kasirajan Copyright © 2014 Mohammed Quader et al. All rights reserved. Risk-Stratified Cardiovascular Screening Including Angiographic and Procedural Outcomes of Percutaneous Coronary Interventions in Renal Transplant Candidates Thu, 19 Jun 2014 10:14:40 +0000 http://www.hindawi.com/journals/jtrans/2014/854397/ Background. Benefits of cardiac screening in kidney transplant candidates (KTC) will depend on the availability of effective interventions. We retrospectively evaluated the characteristics and outcomes of percutaneous coronary interventions (PCI) in KTC selected for revascularization by a cardiac screening approach. Methods. In 267 patients evaluated from 2003 to 2006, the screening tests performed were reviewed and PCI characteristics were correlated with major adverse cardiovascular events (MACE) during a follow-up of 55 months. Results. Stress tests in 154 patients showed ischemia in 28 patients (89% high risk). Of 58 patients with coronary angiography, 38 had significant stenoses and 18 underwent cardiac interventions (6.7% of all). Twenty-nine coronary lesions in 17 of 18 patients were treated by PCI. The angiographic success rate was 93.1%, but the procedural success rate was only 86.2%. Long lesions () and diffuse disease () were associated with MACE. In high-risk patients, cardiac screening did not improve outcome, as 21.7% of patients with versus 15.5% of patients without properly performed cardiac screening had MACE (). Conclusion. The moderate procedural success of PCI and the poor outcome in long and diffuse coronary lesions underscore the need to define appropriate revascularization strategies in KTC, which will be a prerequisite for cardiac screening to improve outcome in these high-risk patients.
Julian König, Martin Möckel, Eda Mueller, Wolfgang Bocksch, Seema Baid-Agrawal, Nina Babel, Ralf Schindler, Petra Reinke, and Peter Nickel Copyright © 2014 Julian König et al. All rights reserved. Peak Serum AST Is a Better Predictor of Acute Liver Graft Injury after Liver Transplantation When Adjusted for Donor/Recipient BSA Size Mismatch (ASTi) Mon, 09 Jun 2014 09:04:13 +0000 http://www.hindawi.com/journals/jtrans/2014/351984/ Background. Despite the marked advances in the perioperative management of the liver transplant recipient, an assessment of clinically significant graft injury following preservation and reperfusion remains difficult. In this study, we hypothesized that size-adjusted AST could better approximate real AST values and consequently provide a better reflection of the extent of graft damage, with better sensitivity and specificity than current criteria. Methods. We reviewed data on 930 orthotopic liver transplant recipients. Size-adjusted AST (ASTi) was calculated by dividing peak AST by our previously reported index for donor-recipient size mismatch, the BSAi. The predictive value of ASTi for primary nonfunction (PNF) and graft survival was assessed by receiver operating characteristic curve analysis, logistic regression, Kaplan-Meier survival analysis, and a Cox proportional hazards model. Results. Size-adjusted peak AST (ASTi) was significantly associated with subsequent occurrence of PNF and graft failure. In our study cohort, the prediction of PNF by the combination of ASTi and PT-INR had a higher sensitivity and specificity compared to current UNOS criteria. Conclusions. We conclude that size-adjusted AST (ASTi) is a simple, reproducible, and sensitive marker of clinically significant graft damage. Kyota Fukazawa, Seigo Nishida, and Ernesto A. Pretto Jr. Copyright © 2014 Kyota Fukazawa et al. All rights reserved. Resolution of Mild Ganciclovir-Resistant Cytomegalovirus Disease with Reduced-Dose Cidofovir and CMV-Hyperimmune Globulin Sun, 01 Jun 2014 11:19:29 +0000 http://www.hindawi.com/journals/jtrans/2014/342319/ Ganciclovir-resistant cytomegalovirus (CMV) is associated with significant morbidity in solid organ transplant recipients. Management of ganciclovir-resistant CMV may be complicated by nephrotoxicity, which is commonly observed with recommended therapies, and/or by rejection induced by “indirect” viral effects or reduction of immunosuppression. Herein, we report a series of four high serologic risk (donor CMV positive/recipient CMV negative) kidney transplant patients diagnosed with ganciclovir-resistant CMV disease. All patients initially developed “breakthrough” viremia while still receiving valganciclovir prophylaxis after transplant and were later confirmed to exhibit UL97 mutations after failing to eradicate virus on adequate dosages of valganciclovir. The patients were subsequently and successfully treated with reduced-dose (1-2 mg/kg) cidofovir and CMV-hyperimmune globulin, given at 2-week intervals. In addition, all patients exhibited stable renal function after completion of therapy, and none experienced acute rejection. The combination of reduced-dose cidofovir and CMV-hyperimmune globulin appeared to be a safe and effective regimen in patients with mild disease due to ganciclovir-resistant CMV. Samir J. Patel, Samantha A. Kuten, Richard J. Knight, Dana M. Hong, and A. Osama Gaber Copyright © 2014 Samir J. Patel et al. All rights reserved.
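The size-adjusted AST index (ASTi) described in the Fukazawa et al. abstract above is obtained by dividing peak AST by the donor/recipient body surface area index (BSAi), and its predictive value for primary nonfunction is then assessed with a receiver operating characteristic curve. The following is a minimal sketch of that calculation, assuming hypothetical variable names and synthetic data rather than the study's cohort.

```python
# Minimal sketch: size-adjusted AST (ASTi = peak AST / BSAi) and ROC-based
# discrimination for primary nonfunction (PNF). Data and variable names are
# synthetic placeholders, not the study dataset.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
peak_ast = rng.lognormal(mean=6.5, sigma=0.8, size=n)        # peak AST, IU/L
bsai = rng.normal(loc=1.0, scale=0.15, size=n)               # donor/recipient BSA index
pnf = rng.binomial(1, np.clip(peak_ast / 20000, 0.01, 0.5))  # 1 = primary nonfunction

asti = peak_ast / bsai  # size-adjusted AST, as defined in the abstract

print("AUC of raw peak AST:", round(roc_auc_score(pnf, peak_ast), 3))
print("AUC of ASTi        :", round(roc_auc_score(pnf, asti), 3))
```

Under these assumptions the index itself is a one-line transformation; the study's contribution is showing that this normalization, combined with PT-INR, improves sensitivity and specificity over unadjusted criteria.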
Adjuvant Ciprofloxacin for Persistent BK Polyomavirus Infection in Kidney Transplant Recipients Thu, 08 May 2014 13:07:00 +0000 http://www.hindawi.com/journals/jtrans/2014/107459/ Background. BK virus (BKV) infection is a common complication following kidney transplantation. Immunosuppression reduction is the cornerstone of treatment, while adjuvant drugs have been tried in small uncontrolled studies. We sought to examine our center’s experience with the use of ciprofloxacin in patients with persistent BKV infection. Methods. Retrospective evaluation of the effect of a 30-day ciprofloxacin course (250 mg twice daily) on BKV infection in kidney transplant recipients who had been diagnosed with BK viruria ≥10⁶ copies/mL and viremia ≥500 copies/mL and in whom the infection did not resolve after immunosuppression reduction and/or treatment with other adjuvant agents. BKV in plasma and urine was evaluated 3 months after treatment with ciprofloxacin. Results. Nine kidney transplant recipients received ciprofloxacin at a median of 130 days following the initial reduction in immunosuppression. Three patients showed complete viral clearance and another 3 had a ≥50% decrease in plasma viral load. No serious adverse events secondary to ciprofloxacin were reported and no grafts were lost due to BKV up to 1 year after treatment. Conclusion. Ciprofloxacin may be a useful therapy for persistent BKV infection despite conventional treatment. Randomized trials are required to evaluate the potential benefit of this adjuvant therapy. David Arroyo, Sindhu Chandran, Parsia A. Vagefi, and David Wojciechowski Copyright © 2014 David Arroyo et al. All rights reserved. Anemia Control in Kidney Transplant Recipients Using Once-Monthly Continuous Erythropoietin Receptor Activator: A Prospective, Observational Study Sun, 04 May 2014 12:52:04 +0000 http://www.hindawi.com/journals/jtrans/2014/179705/ In a multicenter, prospective, observational study of 279 kidney transplant recipients with anemia, the efficacy and safety of once-monthly continuous erythropoietin receptor activator (C.E.R.A.) were assessed over a maximum of 15 months. The main efficacy variable was the proportion of patients achieving a hemoglobin level of 11-12 g/dL at each of the visits between months 7 and 9. At study entry, 224 patients (80.3%) were receiving erythropoiesis-stimulating agent (ESA) therapy, including darbepoetin alfa (98), epoetin beta (61), and C.E.R.A. (45). The mean (SD) time between C.E.R.A. applications was 34.0 (11.9) days. Among 193 patients for whom efficacy data were available, mean (SD) hemoglobin was 11.1 (0.99) g/dL at study entry, 11.5 (1.1) g/dL at month 7, 11.6 (1.3) g/dL at month 9, and 11.4 (1.1) g/dL at month 15. During months 7–9, 20.7% of patients had all hemoglobin values within the range 11-12 g/dL and 64.8% were within 10–13 g/dL. Seven patients (2.5%) discontinued C.E.R.A. due to adverse events or serious adverse events. In this observational trial under real-life conditions, once-monthly C.E.R.A. therapy achieved stable hemoglobin levels in stable kidney transplant recipients with good tolerability, and with no requirement for any dose change in 43% of patients. Klemens Budde, Thomas Rath, and Volker Kliem Copyright © 2014 Klemens Budde et al. All rights reserved.
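The primary efficacy variable in the C.E.R.A. study above is the share of patients whose every hemoglobin reading between months 7 and 9 stays within 11-12 g/dL (with 10-13 g/dL as a wider band). A small sketch of that responder calculation follows; the per-patient readings and data structure are invented for illustration and are not trial data.

```python
# Minimal sketch: proportion of patients with ALL hemoglobin values inside a
# target band during a time window. Readings are invented, not trial data.
hb_months_7_to_9 = {
    "patient_01": [11.2, 11.6, 11.4],
    "patient_02": [10.4, 11.1, 12.6],
    "patient_03": [11.9, 11.1, 11.5],
}

def all_in_band(values, lo, hi):
    """True if every reading lies within [lo, hi] g/dL."""
    return all(lo <= v <= hi for v in values)

n = len(hb_months_7_to_9)
strict = sum(all_in_band(v, 11.0, 12.0) for v in hb_months_7_to_9.values())
wide = sum(all_in_band(v, 10.0, 13.0) for v in hb_months_7_to_9.values())

print(f"All values in 11-12 g/dL: {100 * strict / n:.1f}% of patients")
print(f"All values in 10-13 g/dL: {100 * wide / n:.1f}% of patients")
```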
An Association between BK Virus Replication in Bone Marrow and Cytopenia in Kidney-Transplant Recipients Tue, 29 Apr 2014 09:36:41 +0000 http://www.hindawi.com/journals/jtrans/2014/252914/ The human polyomavirus BK (BKV) is associated with severe complications, such as ureteric stenosis and polyomavirus-associated nephropathy (PVAN), which often occur in kidney-transplant patients. However, it is unknown whether BKV can replicate within bone marrow. The aim of this study was to search for BKV replication within the bone marrow of kidney-transplant patients presenting with a hematological disorder. Seventy-two kidney-transplant patients underwent bone-marrow aspiration for cytopenia. At least one virus was detected in the bone marrow of 25/72 patients (35%), that is, parvovirus B19 alone (n = 8), parvovirus plus Epstein-Barr virus (EBV) (n = 3), cytomegalovirus (n = 4), EBV (n = 2), BKV alone (n = 7), and BKV plus EBV (n = 1). Three of the eight patients who had BKV replication within the bone marrow had no detectable BKV replication in the blood. Neutropenia was observed in all patients with BKV replication in the bone marrow, together with blockade of granulocyte maturation. Hematological disorders disappeared in all patients after doses of immunosuppressants were reduced. In conclusion, an association between BKV replication in bone marrow and hematological disorders, especially neutropenia, was observed. Further studies are needed to confirm these findings. Emilie Pambrun, Catherine Mengelle, Geneviève Fillola, Patrick Laharrague, Laure Esposito, Isabelle Cardeau-Desangles, Arnaud Del Bello, Jacques Izopet, Lionel Rostaing, and Nassim Kamar Copyright © 2014 Emilie Pambrun et al. All rights reserved. Attitudes to Medication after Kidney Transplantation and Their Association with Medication Adherence and Graft Survival: A 2-Year Follow-Up Study Mon, 28 Apr 2014 07:38:14 +0000 http://www.hindawi.com/journals/jtrans/2014/675301/ Background. Nonadherence to medication is a common problem after kidney transplantation. The aim of this study was to explore attitudes towards medication, adherence, and the relationship with clinical outcomes. Method. Kidney recipients participated in a Q-methodological study 6 weeks after transplantation. As a measure of medication adherence, respondents completed the Basel Assessment of Adherence to Immunosuppressive Medications Scale (BAASIS©-interview). Moreover, the intrapatient variability in the pharmacokinetics of tacrolimus, a measure of the stability of drug intake, was calculated. Data on graft survival were retrieved from patient records up to 2 years after transplantation. Results. 113 renal transplant recipients (19–75 years old) participated in the study. Results revealed three attitudes towards medication adherence—attitude 1: “confident and accurate,” attitude 2: “concerned and vigilant,” and attitude 3: “appearance oriented and assertive.” We found an association of attitudes with the intrapatient variability in the pharmacokinetics of tacrolimus, but not with self-reported nonadherence or graft survival. However, self-reported nonadherence immediately after transplantation was associated with lower two-year graft survival. Conclusion. These preliminary findings suggest that nonadherence shortly after kidney transplantation may be a risk factor for lower graft survival in the years to follow. The attitudes to medication were not a risk factor. Mirjam Tielen, Job van Exel, Mirjam Laging, Denise K. Beck, Roshni Khemai, Teun van Gelder, Michiel G. H.
Betjes, Willem Weimar, and Emma K. Massey Copyright © 2014 Mirjam Tielen et al. All rights reserved. Use of Adjuvant Sorafenib in Liver Transplant Recipients with High-Risk Hepatocellular Carcinoma Thu, 10 Apr 2014 09:53:01 +0000 http://www.hindawi.com/journals/jtrans/2014/913634/ The efficacy of liver transplantation (LT) for hepatocellular carcinoma (HCC) is limited by tumor recurrence rates of 10–15%. We undertook this pilot study to examine the use of sorafenib as adjuvant therapy in high-risk LT recipients. Methods. We prospectively enrolled patients transplanted for HCC into a treatment protocol utilizing sorafenib if their explant examination showed evidence of viable tumor exceeding Milan criteria. We utilized as historical controls patients transplanted previously, whose explant tumor characteristics exceeded Milan criteria, but who were not “preemptively” treated with sorafenib. The Wilcoxon two-sample test and Fisher’s exact test were used to compare survival and recurrence rates between the two groups. Results. Seven patients were treated with sorafenib and compared to 12 historical “controls.” Two of 7 treated patients suffered from HCC recurrence. Of the comparison group, 9 experienced HCC recurrence and all succumbed to disease. Dose reduction improved tolerance of the drug. The overall rate of HCC recurrence was decreased in the adjuvant therapy group compared to historical controls (29% versus 75%, ). Disease-free 1-year survival for the treated versus untreated group was 100% versus 66%, respectively. Conclusion. Adjuvant use of sorafenib is safe and decreases the risk of HCC recurrence in high-risk LT recipients. Kirti Shetty, Chiranjeev Dash, and Jacqueline Laurin Copyright © 2014 Kirti Shetty et al. All rights reserved. The Role of Imaging in Patient Selection, Preoperative Planning, and Postoperative Monitoring in Human Upper Extremity Allotransplantation Thu, 27 Mar 2014 16:26:49 +0000 http://www.hindawi.com/journals/jtrans/2014/169546/ Objective. To describe the role of imaging in vascular composite allotransplantation based on one institution’s experience with upper extremity allotransplant patients. Methods. The institutional review board approved this review of HIPAA-compliant patient data without the need for individual consent. A retrospective review was performed of imaging from 2008 to 2011 on individuals undergoing upper extremity transplantation. This demonstrated that, of the 19 patients initially considered, 5 patients with a mean age of 37 years underwent transplantation. Reports were correlated clinically to delineate which preoperative factors led to patient selection versus disqualification and what concerns dictated postoperative imaging. Findings were subdivided into musculoskeletal and vascular imaging criteria. Results. Within the screening phase, musculoskeletal exclusion criteria included severe shoulder arthropathy, poor native bone integrity, and marked muscular atrophy. Vascular exclusion criteria included loss of sufficient arterial or venous supply and significant distortion of the native vascular architecture. Postoperative imaging was used to document healing and hardware integrity. Postsurgical angiography and ultrasound were used to monitor for endothelial proliferation or thrombosis as signs of rejection and vascular complication. Conclusion. Multimodality imaging is an integral component of vascular composite allotransplantation surgical planning and surveillance to maximize the return of form and functionality while minimizing possible complications. Eira S.
Roth, David G. Buck, Vijay S. Gorantla, Joseph E. Losee, Daniel E. Foust, and Cynthia A. Britton Copyright © 2014 Eira S. Roth et al. All rights reserved. Midterm Experience of Ipsilateral Axillary-Axillary Arteriovenous Loop Graft as Tertiary Access for Haemodialysis Sun, 23 Mar 2014 09:09:10 +0000 http://www.hindawi.com/journals/jtrans/2014/908738/ Objectives. To present a series of ipsilateral axillary artery to axillary vein loop arm grafts as an alternative vascular access procedure for haemodialysis in patients with difficult access. Design. Retrospective case series. Methods. Patients who underwent an axillary loop arteriovenous graft from September 2009 to September 2012 were included. Preoperative venous imaging to exclude central venous stenosis and to image arm/axillary veins was performed. A cuffed PTFE graft was anastomosed to the distal axillary artery and axillary vein and looped on the arm. Results. 25 procedures were performed on 22 patients. Median age was 51 years, with 9 males and 13 females. Median number of previous access procedures was 3 (range 0–7). Median follow-up was 16.4 months (range 1–35). At 3 months, the primary and secondary patency rates were 70% and 72%, respectively; at 1 year, they were 36% and 37%. There were 11 radiological interventions in 6 grafts, including 5 angioplasties and 6 thrombectomies. There were 19 surgical procedures in 10 grafts, including thrombectomy, revision, repair for bleeding, and excision. Conclusions. Our series demonstrates that the axillary loop arm graft yields acceptable early patency rates in a complex group of patients, but maintaining graft patency required high rates of surgical and radiological intervention, in particular graft thrombectomy. J. P. Hunter and M. L. Nicholson Copyright © 2014 J. P. Hunter and M. L. Nicholson. All rights reserved. Significance of Urinary Proteome Pattern in Renal Allograft Recipients Thu, 13 Mar 2014 13:37:58 +0000 http://www.hindawi.com/journals/jtrans/2014/139361/ Urinary proteomics has been developing in recent years as a platform of urinary biomarkers with immense potential. The definition of the urinary proteome in the context of the renal allograft and the characterization of different proteome patterns in various graft dysfunctions have led to the development of a distinct science around this noninvasive tool. A substantial number of studies have shown that different renal allograft disease states, both acute and chronic, can display unique urinary proteome patterns, enabling early diagnosis of graft dysfunction and proper adjustment of the immunosuppressive strategy, which could affect graft prognosis. The methodology of urinary proteomics is nonetheless no more complex than that of other sophisticated assays used in conventional urinary protein analysis. Moreover, researchers increasingly feel the need for a centralized database, as more and more studies report results from different centers and as systems for organizing these newly emerging data are developed at international and national levels. In this context, the concept of urinary proteomics in renal allograft recipients is of significant importance in clinical transplantation. Sufi M. Suhail Copyright © 2014 Sufi M. Suhail. All rights reserved.
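Patency rates at fixed time points, such as the 3-month and 1-year figures in the Hunter and Nicholson axillary loop graft series above, are conventionally read off a Kaplan-Meier curve built from time-to-event data with censoring. The sketch below shows one way such estimates might be produced with the lifelines package; the follow-up times and failure flags are fabricated for illustration only and do not reproduce the series.

```python
# Minimal sketch: Kaplan-Meier estimate of graft patency at 3 and 12 months.
# Follow-up times (months) and patency-loss flags are invented examples.
from lifelines import KaplanMeierFitter

followup_months = [1.5, 3.2, 4.0, 8.5, 12.1, 16.4, 20.0, 24.3, 30.0, 35.0]
patency_lost = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]  # 1 = patency lost, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(followup_months, event_observed=patency_lost, label="primary patency")

# Estimated probability that access is still patent at 3 and 12 months
print(kmf.predict([3, 12]))
```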
Three-Year Outcomes in Kidney Transplant Patients Randomized to Steroid-Free Immunosuppression or Steroid Withdrawal, with Enteric-Coated Mycophenolate Sodium and Cyclosporine: The Infinity Study Wed, 05 Mar 2014 00:00:00 +0000 http://www.hindawi.com/journals/jtrans/2014/171898/ In a six-month, multicenter, open-label trial, de novo kidney transplant recipients at low immunological risk were randomized to steroid avoidance or steroid withdrawal with IL-2 receptor antibody (IL-2RA) induction, enteric-coated mycophenolate sodium (EC-MPS: 2160 mg/day to week 6, 1440 mg/day thereafter), and cyclosporine. Results from a 30-month observational follow-up study are presented. Of 166 patients who completed the core study on treatment, 131 entered the follow-up study (70 steroid avoidance, 61 steroid withdrawal). The primary efficacy endpoint of treatment failure (clinical biopsy-proven acute rejection (BPAR), graft loss, death, or loss to follow-up) occurred in 21.4% (95% CI 11.8–31.0%) of steroid avoidance patients and 16.4% (95% CI 7.1–25.7%) of steroid withdrawal patients by month 36 (). BPAR had occurred in 20.0% and 11.5%, respectively (). The incidence of adverse events with a suspected relation to steroids during months 6–36 was 22.9% versus 37.1% (). By month 36, 32.4% and 51.7% of patients in the steroid avoidance and steroid withdrawal groups, respectively, were receiving oral steroids. In conclusion, IL-2RA induction with early intensified EC-MPS dosing and CNI therapy in de novo kidney transplant patients at low immunological risk may achieve similar three-year efficacy regardless of whether oral steroids are withheld for at least three months. A. Thierry, G. Mourad, M. Büchler, G. Choukroun, O. Toupance, N. Kamar, F. Villemain, Y. Le Meur, C. Legendre, P. Merville, M. Kessler, A.-E. Heng, B. Moulin, S. Queré, F. Di Giambattista, A. Lecuyer, and G. Touchard Copyright © 2014 A. Thierry et al. All rights reserved. The Role of mTOR Inhibitors in Liver Transplantation: Reviewing the Evidence Tue, 25 Feb 2014 14:19:35 +0000 http://www.hindawi.com/journals/jtrans/2014/845438/ Despite the success of liver transplantation, long-term complications remain, including de novo malignancies, metabolic syndrome, and the recurrence of hepatitis C virus (HCV) and hepatocellular carcinoma (HCC). The current mainstay of treatment, calcineurin inhibitors (CNIs), can also worsen posttransplant renal dysfunction, neurotoxicity, and diabetes. Clearly there is a need for better immunosuppressive agents that maintain similar rates of efficacy and renal function whilst minimizing adverse effects. The mammalian target of rapamycin (mTOR) inhibitors, with a mechanism of action that differs from that of other immunosuppressive agents, have the potential to address some of these issues. In this review we surveyed the literature for reports of the use of mTOR inhibitors in adult liver transplantation with respect to renal function, efficacy, safety, neurological symptoms, de novo tumors, and the recurrence of HCC and HCV. The results of our review indicate that mTOR inhibitors are associated with efficacy comparable to that of CNIs while having benefits on renal function in liver transplantation. We also consider newer dosing schedules that may limit side effects. Finally, we discuss evidence that mTOR inhibitors may have benefits in the oncology setting and in relation to HCV-related allograft fibrosis, metabolic syndrome, and neurotoxicity. Goran B. Klintmalm and Björn Nashan Copyright © 2014 Goran B. Klintmalm and Björn Nashan.
All rights reserved. Alendronate as an Effective Treatment for Bone Loss and Vascular Calcification in Kidney Transplant Recipients Wed, 19 Feb 2014 07:56:28 +0000 http://www.hindawi.com/journals/jtrans/2014/269613/ Kidney transplant recipients develop secondary osteoporosis induced by immunosuppressive medication, with a high risk of fracture, and abdominal aortic calcification (AC) is a known predictor of cardiovascular mortality. In this study of 12 stable kidney recipients, we estimated the preventive effect of bisphosphonate treatment on bone loss and progression of AC. We randomly divided the subjects into a treatment group with alendronate (group A: 5 subjects) and a control group (group C: 7 subjects). Group A patients received 35 mg/week of alendronate over 24 months, while group C patients were not administered any bisphosphonates. Two major endpoints were established: (1) the time-dependent change in bone mineral density (BMD) estimated with DEXA and (2) progression of abdominal AC, calculated twice as an index (ACI) using computed tomography data. Over the 2-year study period, group A patients showed significantly increased BMD of 1.86 ± 0.85% ( versus baseline), and almost complete inhibition of ACI progression (38.2 ± 24.2% to 39.6 ± 24.3%), but group C patients showed a decline in BMD (bone loss) and progression of ACI (32.8 ± 25.0% to 37.8 ± 29.2%, ). In conclusion, alendronate therapy was an effective treatment in kidney transplant recipients for secondary osteoporosis and for vascular calcification as an ectopic calcification. This clinical trial is registered with number JMA-IIA00155 of JMACCT CTR. Masanori Okamoto, Shintaro Yamanaka, Wataru Yoshimoto, and Takashi Shigematsu Copyright © 2014 Masanori Okamoto et al. All rights reserved. Donor-Recipient Size Mismatch in Paediatric Renal Transplantation Thu, 13 Feb 2014 12:45:05 +0000 http://www.hindawi.com/journals/jtrans/2014/317574/ Introduction. End-stage renal failure in children is a rare but devastating condition, and kidney transplantation remains the only permanent treatment option. The aim of this review was to elucidate the broad surgical issues surrounding the mismatch in size of adult kidney donors to their paediatric recipients. Methods. A comprehensive literature search was undertaken on PubMed, MEDLINE, and Google Scholar for all relevant scientific articles published to date in the English language. A manual search of the bibliographies was also performed to supplement the original search. Results. Size-matching kidneys for transplantation into children is not feasible due to limited organ availability from paediatric donors, resulting in prolonged waiting list times. Transplanting a comparatively large adult kidney into a child may lead to potential challenges related to the surgical incision and approach, vessel anastomoses, wound closure, postoperative cardiovascular stability, and age-correlated maturation of the graft. Conclusion. The transplantation of an adult kidney into a size-mismatched paediatric recipient significantly reduces waiting times for surgery; however, it presents further challenges in terms of both the surgical procedure and the post-operative management of the patient’s physiological parameters. J. Donati-Bourne, H. W. Roberts, and R. A. Coleman Copyright © 2014 J. Donati-Bourne et al. All rights reserved.
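Both endpoints in the alendronate trial by Okamoto et al. above are reported as changes from baseline over 24 months: bone mineral density as a percent change and the abdominal aortic calcification index (ACI) as movement between two index values. A small worked sketch of those calculations follows; the BMD numbers are hypothetical, while the ACI values echo those quoted for the treated group in the abstract.

```python
# Minimal sketch: change-from-baseline calculations for BMD and ACI.
# BMD values are hypothetical; ACI values are the treated-group figures
# quoted in the abstract (38.2 -> 39.6).
def percent_change(baseline, followup):
    """Relative change from baseline, expressed in percent."""
    return 100.0 * (followup - baseline) / baseline

bmd_baseline, bmd_24m = 0.950, 0.968   # g/cm^2, hypothetical DEXA readings
print(f"BMD change over 24 months: {percent_change(bmd_baseline, bmd_24m):+.2f}%")

aci_baseline, aci_24m = 38.2, 39.6     # calcification index values, already percentages
print(f"ACI progression: {aci_24m - aci_baseline:+.1f} index points "
      f"({percent_change(aci_baseline, aci_24m):+.1f}% relative)")
```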
The First Fifty ABO Blood Group Incompatible Kidney Transplantations: The Rotterdam Experience Thu, 06 Feb 2014 00:00:00 +0000 http://www.hindawi.com/journals/jtrans/2014/913902/ This study describes the single-center experience and long-term results of ABOi kidney transplantation using a pretransplantation protocol involving immunoadsorption combined with rituximab, intravenous immunoglobulins, and triple immune suppression. Fifty patients received an ABOi kidney transplant in the period from 2006 to 2012 with a follow-up of at least one year. Eleven antibody-mediated rejections were noted, of which 5 were mixed antibody- and cellular-mediated rejections. Nine cellular-mediated rejections were recorded. Two grafts were lost due to rejection in the first year. One-year graft survival of the ABOi grafts was comparable to that of 100 matched ABO-compatible renal grafts, 96% versus 99%. At 5-year follow-up, the graft survival was 90% in the ABOi group versus 97% in the control group. Posttransplantation immunoadsorption was not an essential part of the protocol and no association was found between antibody titers and subsequent graft rejection. Steroids could be withdrawn safely 3 months after transplantation. Adverse events specifically related to the ABOi protocol were not observed. The currently used ABOi protocol shows good short- and midterm results despite a high rate of antibody-mediated rejections in the first years after the start of the program. Madelon van Agteren, Willem Weimar, Annelies E. de Weerd, Peter A. W. te Boekhorst, Jan N. M. Ijzermans, Jaqueline van de Wetering, and Michiel G. H. Betjes Copyright © 2014 Madelon van Agteren et al. All rights reserved. The Natural History of Biopsy-Negative Rejection after Heart Transplantation Wed, 18 Dec 2013 18:27:06 +0000 http://www.hindawi.com/journals/jtrans/2013/236720/ Purpose. The most recent International Society for Heart and Lung Transplantation (ISHLT) biopsy scale classifies cellular and antibody-mediated rejections. However, there are cases with an acute decline in left ventricular ejection fraction (LVEF ≤ 45%) but no evidence of rejection on biopsy. The characteristics and treatment response of this biopsy-negative rejection (BNR) have yet to be elucidated. Methods. Between 2002 and 2012, we found 12 cases of BNR, as defined above, in 11 heart transplant patients. One of the 11 patients was treated a second time for BNR. Characteristics and response to treatment were noted. Results. 12 cases (in 11 patients) were reviewed, and 11 occurred during the first year after transplant. 8 cases without heart failure symptoms were treated with an oral corticosteroid bolus and taper or intravenous immunoglobulin. Four cases with heart failure symptoms were treated with thymoglobulin, intravenous immunoglobulin, and intravenous methylprednisolone followed by an oral corticosteroid bolus and taper. Overall, 7 cases resulted in return to normal left ventricular function within a mean of 14 ± 10 days from the initial biopsy. Conclusion. BNR includes cardiac dysfunction and can be a severe form of rejection. Characteristics of these cases of rejection are described, with most cases responding to appropriate therapy. Zhaoyi Tang, Jon Kobashigawa, Matthew Rafiei, Lily Kagan Stern, and Michele Hamilton Copyright © 2013 Zhaoyi Tang et al. All rights reserved. Systemic Heparinisation in Laparoscopic Live Donor Nephrectomy Mon, 16 Dec 2013 09:01:23 +0000 http://www.hindawi.com/journals/jtrans/2013/138926/ Introduction.
Systemic heparinisation is advocated during laparoscopic live donor nephrectomy (LDN) as a preventative measure against renal vascular thrombosis during the warm ischaemic interval. This study compares the outcomes with and without the administration of systemic heparinisation. Methods. A retrospective analysis was performed on 186 consecutive LDN patients between April 2008 and November 2012. Systemic heparin (2000–3000 IU) was administered intravenously to donors (hep ). From January 2010, heparin was not used systemically in this group of LDN (no hep ). Outcome measures included donor and recipient complications, initial graft function, and 12-month graft survival. Results. The demographics of both heparinised and non-heparinised donors were similar. The warm ischaemic time (WIT) was comparable in both groups (WIT; hep versus no hep minutes; ). There was no difference in complication rates, no episodes of graft thrombosis, and no incidences of primary nonfunction in either group. Delayed graft function occurred in 4/109 and 1/77 (3.6% versus 1.2%; ) and there was no significant difference in graft survival (). Conclusion. Omitting systemic heparinisation during laparoscopic donor nephrectomy is a feasible and safe approach that does not compromise donor or recipient outcome. Charlotte Crotty, Yasmin Tabbakh, Sarah A. Hosgood, and Michael L. Nicholson Copyright © 2013 Charlotte Crotty et al. All rights reserved. Liver Transplantation without Perioperative Transfusions: Single-Center Experience Showing Better Early Outcome and Shorter Hospital Stay Thu, 12 Dec 2013 14:12:22 +0000 http://www.hindawi.com/journals/jtrans/2013/649209/ Background. Significant amounts of red blood cell (RBC) transfusions are associated with poor outcome after liver transplantation (LT). We report our series of LT without perioperative RBC (P-RBC) transfusions to evaluate its influence on early and long-term outcomes following LT. Methods. A consecutive series of LT between 2006 and 2011 was analyzed. P-RBC transfusion was defined as one or more RBC units administered during or ≤48 hours after LT. We divided the cohort into “No-Transfusion” and “Yes-Transfusion” groups. Preoperative status, graft quality, and intra- and postoperative variables were compared to assess P-RBC transfusion risk factors and postoperative outcome. Results. LT was performed in 127 patients (“No-Transfusion” = 39 versus “Yes-Transfusion” = 88). While median MELD was significantly higher in the Yes-Transfusion group (11 versus 21; ), platelet count, prothrombin time, and hemoglobin were significantly lower. On multivariate analysis, the only independent risk factor associated with P-RBC transfusions was preoperative hemoglobin (). The incidence of postoperative bacterial infections (10% versus 27%; ), median ICU stay (2 versus 3 days; ), and hospital stay (7.5 versus 9 days; ) were negatively influenced by P-RBC transfusions. However, 30-day mortality (10% versus 15%) and one-year (86% versus 70%) and 3-year (77% versus 66%) survival were equivalent in both groups. Conclusions. Recipient MELD score was not a predictive factor for P-RBC transfusion. Patients requiring P-RBC transfusions had a worse postoperative outcome. Therefore, maximum efforts must be focused on improving hemoglobin levels during time on the waiting list to prevent the use of P-RBC transfusions in LT recipients. Nicolás Goldaracena, Patricio Méndez, Emilio Quiñonez, Gustavo Devetach, Lucio Koo, Carlos Jeanes, Margarita Anders, Federico Orozco, Pablo D. Comignani, Ricardo C.
Mastai, and Lucas McCormack Copyright © 2013 Nicolás Goldaracena et al. All rights reserved. Everolimus in Heart Transplantation: An Update Thu, 05 Dec 2013 13:00:17 +0000 http://www.hindawi.com/journals/jtrans/2013/683964/ The evidence base relating to the use of everolimus in heart transplantation has expanded considerably in recent years, providing clinically relevant information regarding its use in clinical practice. Unless there are special considerations to take into account, all de novo heart transplant patients can be regarded as potential candidates for immunosuppression with everolimus and reduced-exposure calcineurin inhibitor therapy. Caution about the use of everolimus immediately after transplantation should be exercised in certain patients at risk of severe proteinuria, with poor wound healing, or with uncontrolled severe hyperlipidemia. Initiation of everolimus in the early phase after transplant is not advisable in patients with severe pretransplant end-organ dysfunction or in patients on a left ventricular assist device before transplant who are at high risk of infection or of wound healing complications. The most frequent reason for introducing everolimus in maintenance heart transplant patients is to support minimization or withdrawal of calcineurin inhibitor therapy, for example, due to impaired renal function or malignancy. Due to its potential to inhibit the progression of cardiac allograft vasculopathy and to reduce cytomegalovirus infection, everolimus should be initiated as soon as possible after heart transplantation. Immediate and adequate reduction of CNI exposure is mandatory from the start of everolimus therapy. Stephan W. Hirt, Christoph Bara, Markus J. Barten, Tobias Deuse, Andreas O. Doesch, Ingo Kaczmarek, Uwe Schulz, Jörg Stypmann, Assad Haneya, and Hans B. Lehmkuhl Copyright © 2013 Stephan W. Hirt et al. All rights reserved. Timing of Hepatic Artery Reperfusion and Biliary Strictures in Liver Transplantation Tue, 03 Dec 2013 10:19:13 +0000 http://www.hindawi.com/journals/jtrans/2013/757389/ During orthotopic liver transplantation (OLT), biliary tract perfusion occurs with hepatic artery reperfusion (HARP), commonly performed after portal vein reperfusion (PVRP). We examined whether the average time interval between PVRP and HARP affected the occurrence of postoperative biliary strictures. Patients undergoing OLT from 2007 to 2009 were included if they were ≥18 years old, had survived 3 months postoperatively, and had data for PVRP and HARP. Patients receiving allografts from DCD donors were excluded. Patients were followed for 6 months post-OLT. Seventy-five patients met the study inclusion criteria. Of these, 10 patients had a biliary stricture. There was no statistically significant difference between those with and without biliary stricture in age, gender, etiology, MELD score, graft survival, or time interval between PVRP and HARP. Ninety percent of patients with biliary stricture had a PVRP-HARP time interval >30 minutes, as opposed to 77% of patients without biliary stricture. However, this was not statistically significant. The cold ischemia time was significantly different between the two groups. The time interval for HARP after PVRP did not appear to affect the development of biliary strictures. However, 30 minutes may be suggested as a critical time after which there is an increase in biliary stricture occurrence. Ganesh Gunasekaran, Jyoti Sharma, Leandro C. Mosna, Roxana Bodin, and David C. Wolf Copyright © 2013 Ganesh Gunasekaran et al.
All rights reserved. Molecular Signatures of Recurrent Hepatocellular Carcinoma Secondary to Hepatitis C Virus following Liver Transplantation Tue, 26 Nov 2013 15:44:45 +0000 http://www.hindawi.com/journals/jtrans/2013/878297/ Chronic hepatitis C virus (HCV)-induced hepatocellular carcinoma (HCC) is a primary indication for liver transplantation (LT). In western countries, the estimated rate of HCC recurrence following LT is between 15% and 20% and is a major cause of mortality. Currently, there is no standard method to treat patients who are at high risk for HCC recurrence. The aim of this study was to investigate the molecular signatures underlying HCC recurrence that may lead to future studies on gene regulation contributing to new therapeutic options. Two groups of patients were selected, one including patients with HCV who developed HCC recurrence (HCC-R) ≤3 years from LT and the second including patients with HCV who did not have recurrent HCC (HCC-NR). Microarray analysis covering more than 29,000 known genes was performed on formalin-fixed, paraffin-embedded (FFPE) liver tissue from explanted livers. Gene expression profiling revealed 194 differentially regulated genes between the two groups. These genes belonged to cellular networks including cell cycle G1/S checkpoint regulators, RAN signaling, chronic myeloid leukemia signaling, molecular mechanisms of cancer, FXR/RXR activation, and hepatic cholestasis. A subset of molecular signatures associated with HCC recurrence was found. The expression levels of these genes were validated by quantitative PCR analysis. Trina Das, Deborah L. Diamond, Matthew Yeh, Sajida Hassan, Janine T. Bryan, Jorge D. Reyes, and James D. Perkins Copyright © 2013 Trina Das et al. All rights reserved. Should We Consider Patients with Coexistent Hepatitis B or C Infection for Orthotopic Heart Transplantation? Thu, 07 Nov 2013 15:40:19 +0000 http://www.hindawi.com/journals/jtrans/2013/748578/ Heart transplantation (HTX) is the gold standard surgical treatment for patients with advanced heart failure. The prevalence of hepatitis B and hepatitis C infection in HTX recipients is over 10%. Despite its increased prevalence, the long-term outcome in this cohort is still not clear. There is a reluctance to place these patients on the transplant waiting list given the increased incidence of viral reactivation and chronic liver disease after transplant. The emergence of new antiviral therapies to treat this cohort seems promising, but their long-term outcome is yet to be established. The aim of this paper is to review the literature and explore whether it is justifiable to list advanced heart failure patients with coexistent hepatitis B/C infection for HTX. Baskar Sekar, Pippa J. Newton, Simon G. Williams, and Steven M. Shaw Copyright © 2013 Baskar Sekar et al. All rights reserved. Neutrophil Gelatinase-Associated Lipocalin in Kidney Transplantation Is an Early Marker of Graft Dysfunction and Is Associated with One-Year Renal Function Thu, 31 Oct 2013 13:57:39 +0000 http://www.hindawi.com/journals/jtrans/2013/650123/ Urinary neutrophil gelatinase-associated lipocalin (uNGAL) has been suggested as a potential early marker of delayed graft function (DGF) following kidney transplantation (KTx). We conducted a prospective study in 40 consecutive KTx recipients to evaluate serial changes of uNGAL within the first week after KTx and assess its performance in predicting DGF (dialysis requirement during the initial posttransplant week) and graft function throughout the first year.
Urine samples were collected on post-KTx days 0, 1, 2, 4, and 7. Linear mixed and multivariable regression models, receiver-operating characteristic (ROC) analysis, and areas under ROC curves were used. At all time points, mean uNGAL levels were significantly higher in patients developing DGF (). Shortly after KTx (3–6 h), uNGAL values were higher in DGF recipients (on average +242 ng/mL, considering mean dialysis time of 4.1 years) and rose further in the following days, in contrast to recipients with prompt function. Day-1 uNGAL levels accurately predicted DGF (AUC-ROC = 0.93), with a performance higher than that of serum creatinine (AUC-ROC = 0.76) and similar to that of cystatin C (AUC-ROC = 0.95). Multivariable analyses revealed that uNGAL levels at days 4 and 7 were strongly associated with one-year serum creatinine. Urinary NGAL is an early marker of graft injury and is independently associated with dialysis requirement within one week after KTx and with one-year graft function. Isabel Fonseca, José Carlos Oliveira, Manuela Almeida, Madalena Cruz, Anabela Malho, La Salete Martins, Leonídio Dias, Sofia Pedroso, Josefina Santos, Luísa Lobato, António Castro Henriques, and Denisa Mendonça Copyright © 2013 Isabel Fonseca et al. All rights reserved. Dendritic Cell-Based Approaches for Therapeutic Immune Regulation in Solid-Organ Transplantation Thu, 24 Oct 2013 12:01:34 +0000 http://www.hindawi.com/journals/jtrans/2013/761429/ To avoid immune rejection, allograft recipients require drug-based immunosuppression, which has significant toxicity. An emerging approach is adoptive transfer of immunoregulatory cells. While mature dendritic cells (DCs) present donor antigen to the immune system, triggering rejection, regulatory DCs interact with regulatory T cells to promote immune tolerance. Intravenous injection of immature DCs of either donor or host origin at the time of transplantation has prolonged allograft survival in solid-organ transplant models. DCs can be treated with pharmacological agents before injection, which may attenuate their maturation in vivo. Recent data suggest that injected immunosuppressive DCs may inhibit allograft rejection, not by themselves, but through conventional DCs of the host. Genetically engineered DCs have also been tested. Two clinical trials in type-1 diabetes and rheumatoid arthritis have been carried out, and other trials, including one trial in kidney transplantation, are in progress or are imminent. Giuseppe Vassalli Copyright © 2013 Giuseppe Vassalli. All rights reserved. Impact of Right-Sided Nephrectomy on Long-Term Outcomes in Retroperitoneoscopic Live Donor Nephrectomy at Single Center Mon, 21 Oct 2013 13:41:26 +0000 http://www.hindawi.com/journals/jtrans/2013/546373/ Objective. To assess the long-term graft survival of right-sided retroperitoneoscopic live donor nephrectomy (RPLDN), we compared the outcomes of right- and left-sided RPLDN. Methods. Five hundred and thirty-three patients underwent live donor renal transplantation with allografts procured by RPLDN from July 2001 to August 2010 at our institute. Of these, 24 (4.5%) cases were selected for right-sided RPLDN (R-RPLDN) according to our criteria for donor kidney selection. Study variables included peri- and postoperative clinical data. Results. No significant differences were found in the recipients' postoperative graft function or in the incidence of slow graft function.
Despite a significantly increased warm ischemic time (WIT: mean 5.9 min versus 4.7 min, ) in R-RPLDN compared to that in L-RPLDN, there was no significant difference between the two groups regarding long-term patient and graft survival. The complication rate in R-RPLDN was not significantly different compared to that in L-RPLDN (17% versus 6.5%, ). No renal vein thrombosis was experienced in either group. Conclusions. Although our study was retrospective and there was only a small number of R-RPLDN patients, R-RPLDN could be an option for laparoscopic live donor nephrectomy because its results, with the sole exception of WIT, were similar to those of L-RPLDN, and its long-term graft outcomes were excellent. Kazuya Omoto, Taiji Nozaki, Masashi Inui, Tomokazu Shimizu, Toshihito Hirai, Yugo Sawada, Hideki Ishida, and Kazunari Tanabe Copyright © 2013 Kazuya Omoto et al. All rights reserved. Hypothermic Machine Perfusion Preservation of the DCD Kidney: Machine Effects Thu, 10 Oct 2013 16:03:36 +0000 http://www.hindawi.com/journals/jtrans/2013/802618/ Purpose. Kidneys from DCD donors represent a significant pool, but preservation problems exist. The study objective was to test the importance of machine type for hypothermic preservation of DCD kidneys. Methods. Adult Beagle dog kidneys underwent 45 minutes of warm in situ ischemia followed by hypothermic perfusion for 24 hours (Belzer-MPS Solution) on either an ORS LifePort or a Waters RM3 using standard perfusion protocols. Kidneys were then autotransplanted, and renal function was assessed over 7 days following contralateral nephrectomy. Results. Renal vascular resistance was not different between the two pumps. After 24 hours, the oxygen partial pressure and oxygen delivery in the LifePort perfusate were significantly lower than those in the RM3 but not low enough to change lactate production. The LifePort ran significantly colder than the RM3 (2°C versus 5°C). The arterial pressure waveform of the RM3 was qualitatively different from the waveform of the LifePort. Preservation injury after transplantation was not different between the devices. When the LifePort was changed to nonpulsatile flow, kidneys displayed significantly greater preservation injury compared to the RM3. Conclusions. Both the LifePort and the RM3 can be used for hypothermic machine perfusion preservation of DCD kidneys with equal outcomes as long as the duty cycle remains pulsatile. Susanne L. Lindell, Heather Muir, John Brassil, and Martin J. Mangino Copyright © 2013 Susanne L. Lindell et al. All rights reserved. Circulating CD4+CD28null T Cells May Increase the Risk of an Atherosclerotic Vascular Event Shortly after Kidney Transplantation Tue, 01 Oct 2013 11:50:40 +0000 http://www.hindawi.com/journals/jtrans/2013/841430/ Proinflammatory CD4+ T cells without the costimulatory molecule CD28 (CD4+CD28null T cells) are expanded in patients with end-stage renal disease (ESRD) and associated with atherosclerotic vascular events (AVE). In a prospective study, the number of circulating CD4+CD28null T cells was established in 295 ESRD patients prior to receiving a kidney allograft. Within the first year after transplantation, an AVE occurred in 20 patients. Univariate analysis showed that, besides a history of cardiovascular disease (CVDpos, HR 8.1, ), age (HR 1.04, ), dyslipidaemia (HR 8.8, ), and the % of CD4+CD28null T cells (HR 1.04 per % increase, 95% CI 1.00–1.09, ) were significantly associated with the occurrence of a posttransplantation AVE.
In a multivariate analysis, only CVDpos remained a significant risk factor, with a significant and positive interaction between the terms CVDpos and the % of CD4+CD28null T cells (HR 1.05, 95% CI 1.03–1.11, ). Within the CVDpos group, the incidence of an AVE was 13% in the lowest tertile compared to 25% in the highest tertile of % of CD4+CD28null T cells. In conclusion, the presence of circulating CD4+CD28null T cells is associated with an increased risk for a cardiovascular event shortly after kidney transplantation. Michiel G. H. Betjes, Willem Weimar, and Nicolle H. R. Litjens Copyright © 2013 Michiel G. H. Betjes et al. All rights reserved. Current Practice of Heart Donor Evaluation in Germany: Multivariable Risk Factor Analysis Confirms Practicability of Guidelines Mon, 30 Sep 2013 08:44:44 +0000 http://www.hindawi.com/journals/jtrans/2013/701854/ Background. Organ shortage has liberalised the acceptance criteria of grafts for heart transplantation, but which donor characteristics ultimately influence the decision to perform transplantation? For the first time, this was evaluated using real-time donor data from the German organ procurement organization (DSO). Observed associations are discussed with regard to international recommendations and guidelines. Methods. 5291 German donors (2006–2010) were formally eligible for heart donation. In logistic regression models, 160 donor parameters were evaluated to assess their influence on whether grafts were used for transplantation (random split of cases: 2/3 study sample, 1/3 validation sample). Results. Successful procurement was determined by lower donor age (OR 0.87 per year; 95% CI [0.85–0.89], ), greater donor height (OR 1.04 per cm; 95% CI [1.02–1.06], ), and exclusion of impaired left ventricular function or wall motion (OR 0.01; 95% CI [0.002–0.036], ), of arrhythmia (OR 0.05; 95% CI [0.009–0.260], ), and of severe coronary artery disease (OR 0.003; 95% CI [<0.001–0.01], ). Donor characteristics differed between cases where the procedure was aborted without and with allocation initiated via Eurotransplant. Sylke Ruth Zeissig, Carl-Ludwig Fischer-Froehlich, Frank Polster, Nils R. Fruehauf, Guenter Kirste, and Irene Schmidtmann Copyright © 2013 Sylke Ruth Zeissig et al. All rights reserved.
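The donor-evaluation analysis above fits logistic regression models to donor parameters on a random two-thirds study sample, keeps one third for validation, and reports effects as odds ratios (for example, OR 0.87 per year of donor age and OR 1.04 per cm of height). The sketch below illustrates that general workflow under stated assumptions: the two predictors, their coefficients (chosen so the exponentiated slopes roughly echo the quoted ORs), and the data are synthetic placeholders, not the DSO registry.

```python
# Minimal sketch: 2/3 study / 1/3 validation split and a logistic model whose
# coefficients are reported as odds ratios (exp of the log-odds slopes).
# All variables and data are synthetic placeholders, not the DSO donor registry.
import numpy as np
import statsmodels.api as sm
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 3000
age = rng.uniform(16, 75, n)        # donor age (years)
height = rng.normal(172, 10, n)     # donor height (cm)
# Slopes chosen so exp(-0.14) ~ 0.87 and exp(0.04) ~ 1.04, echoing the abstract.
logit = 4.0 - 0.14 * (age - 40) + 0.04 * (height - 172)
used = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = heart used for transplantation

X = sm.add_constant(np.column_stack([age, height]))
X_study, X_val, y_study, y_val = train_test_split(X, used, test_size=1/3, random_state=0)

fit = sm.Logit(y_study, X_study).fit(disp=0)
odds_ratios = np.exp(fit.params)    # per-year and per-cm odds ratios
print("Odds ratios [const, age, height]:", np.round(odds_ratios, 3))
print("Validation accuracy:", round(((fit.predict(X_val) > 0.5) == y_val).mean(), 3))
```

Exponentiating the fitted coefficients is what turns log-odds slopes into the per-unit odds ratios of the kind quoted in the abstract.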