Journal of Transplantation http://www.hindawi.com The latest articles from Hindawi Publishing Corporation © 2014, Hindawi Publishing Corporation. All rights reserved. Use of Adjuvant Sorafenib in Liver Transplant Recipients with High-Risk Hepatocellular Carcinoma Thu, 10 Apr 2014 09:53:01 +0000 http://www.hindawi.com/journals/jtrans/2014/913634/ The efficacy of liver transplantation (LT) for hepatocellular carcinoma (HCC) is limited by tumor recurrence rates of 10–15%. We undertook this pilot study to examine the use of sorafenib as adjuvant therapy in high-risk LT recipients. Methods. We prospectively enrolled patients transplanted for HCC into a treatment protocol utilizing sorafenib if their explant examination showed evidence of viable tumor exceeding Milan criteria. We utilized as historical controls patients transplanted previously, whose explant tumor characteristics exceeded Milan criteria, but who were not “preemptively” treated with sorafenib. The Wilcoxon two-sample test and Fisher’s exact test were used to compare survival and recurrence rates between the two groups. Results. Seven patients were treated with sorafenib and compared to 12 historical “controls.” Two of the 7 treated patients suffered from HCC recurrence. Of the comparison group, 9 experienced HCC recurrence and all succumbed to disease. Dose reduction improved tolerance of the drug. The overall rate of HCC recurrence was decreased in the adjuvant therapy group compared to historical controls (29% versus 75%, ). Disease-free 1-year survival for the treated versus untreated group was 100% versus 66%, respectively. Conclusion. Adjuvant use of sorafenib is safe and decreases the risk of HCC recurrence in high-risk LT recipients. Kirti Shetty, Chiranjeev Dash, and Jacqueline Laurin Copyright © 2014 Kirti Shetty et al. All rights reserved. The Role of Imaging in Patient Selection, Preoperative Planning, and Postoperative Monitoring in Human Upper Extremity Allotransplantation Thu, 27 Mar 2014 16:26:49 +0000 http://www.hindawi.com/journals/jtrans/2014/169546/ Objective. To describe the role of imaging in vascular composite allotransplantation based on one institution’s experience with upper extremity allotransplant patients. Methods. The institutional review board approved this review of HIPAA-compliant patient data without the need for individual consent. A retrospective review was performed of imaging from 2008 to 2011 on individuals undergoing upper extremity transplantation. Of the 19 patients initially considered, 5 patients with a mean age of 37 years underwent transplantation. Reports were correlated clinically to delineate which preoperative factors led to patient selection versus disqualification and which concerns dictated postoperative imaging. Findings were subdivided into musculoskeletal and vascular imaging criteria. Results. Within the screening phase, musculoskeletal exclusion criteria included severe shoulder arthropathy, poor native bone integrity, and marked muscular atrophy. Vascular exclusion criteria included loss of sufficient arterial or venous supply and significant distortion of the native vascular architecture. Postoperative imaging was used to document healing and hardware integrity. Postsurgical angiography and ultrasound were used to monitor for endothelial proliferation or thrombosis as signs of rejection and vascular complication. Conclusion.
Multimodality imaging is an integral component of surgical planning and surveillance in vascular composite allotransplantation, maximizing the return of form and function while minimizing possible complications. Eira S. Roth, David G. Buck, Vijay S. Gorantla, Joseph E. Losee, Daniel E. Foust, and Cynthia A. Britton Copyright © 2014 Eira S. Roth et al. All rights reserved. Midterm Experience of Ipsilateral Axillary-Axillary Arteriovenous Loop Graft as Tertiary Access for Haemodialysis Sun, 23 Mar 2014 09:09:10 +0000 http://www.hindawi.com/journals/jtrans/2014/908738/ Objectives. To present a series of ipsilateral axillary artery to axillary vein loop arm grafts as an alternative vascular access procedure for haemodialysis in patients with difficult access. Design. Retrospective case series. Methods. Patients who underwent an axillary loop arteriovenous graft from September 2009 to September 2012 were included. Preoperative venous imaging was performed to exclude central venous stenosis and to image the arm/axillary veins. A cuffed PTFE graft was anastomosed to the distal axillary artery and axillary vein and looped on the arm. Results. 25 procedures were performed on 22 patients. Median age was 51 years, with 9 males and 13 females. Median number of previous access procedures was 3 (range 0–7). Median follow-up was 16.4 months (range 1–35). Primary patency was 70% at 3 months and 36% at 1 year; secondary patency was 72% and 37%, respectively. There were 11 radiological interventions in 6 grafts, including 5 angioplasties and 6 thrombectomies. There were 19 surgical procedures in 10 grafts, including thrombectomy, revision, repair for bleeding, and excision. Conclusions. Our series demonstrates that the axillary loop arm graft yields acceptable early patency rates in a complex group of patients, but maintaining graft patency required high rates of surgical and radiological intervention, in particular graft thrombectomy. J. P. Hunter and M. L. Nicholson Copyright © 2014 J. P. Hunter and M. L. Nicholson. All rights reserved. Significance of Urinary Proteome Pattern in Renal Allograft Recipients Thu, 13 Mar 2014 13:37:58 +0000 http://www.hindawi.com/journals/jtrans/2014/139361/ Urinary proteomics has developed in recent years as a biomarker platform of immense potential. The definition of the urinary proteome in the context of the renal allograft and the characterization of different proteome patterns in various graft dysfunctions have led to the development of a distinct science around this noninvasive tool. A substantial number of studies have shown that different renal allograft disease states, both acute and chronic, can display unique urinary proteome patterns, enabling early diagnosis of graft dysfunction and appropriate adjustment of the immunosuppressive strategy, which could impact graft prognosis. The methodology of urinary proteomics is nonetheless no more complex than that of other sophisticated assays used in conventional urinary protein analysis. Moreover, the need for a centralized database is increasingly felt by researchers as more and more studies report their results from different centres and as systems for organizing these newly emerging data are being developed at international and national levels. In this context, urinary proteomics in renal allograft recipients is of significant importance in clinical transplantation. Sufi M. Suhail Copyright © 2014 Sufi M. Suhail. All rights reserved.
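As an illustrative aside on the statistics reported in the adjuvant sorafenib pilot above (2 of 7 treated patients versus 9 of 12 historical controls with HCC recurrence, compared with Fisher’s exact test), the following is a minimal sketch of how such a two-group comparison could be reproduced; it assumes Python with scipy available and is not the authors’ own analysis code.

from scipy.stats import fisher_exact

# 2x2 contingency table built from the counts given in the abstract:
# rows = (sorafenib-treated, historical controls)
# columns = (HCC recurrence, no recurrence)
table = [[2, 5],   # 2 of 7 treated patients recurred (29%)
         [9, 3]]   # 9 of 12 controls recurred (75%)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")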
Three-Year Outcomes in Kidney Transplant Patients Randomized to Steroid-Free Immunosuppression or Steroid Withdrawal, with Enteric-Coated Mycophenolate Sodium and Cyclosporine: The Infinity Study Wed, 05 Mar 2014 00:00:00 +0000 http://www.hindawi.com/journals/jtrans/2014/171898/ In a six-month, multicenter, open-label trial, de novo kidney transplant recipients at low immunological risk were randomized to steroid avoidance or steroid withdrawal with IL-2 receptor antibody (IL-2RA) induction, enteric-coated mycophenolate sodium (EC-MPS: 2160 mg/day to week 6, 1440 mg/day thereafter), and cyclosporine. Results from a 30-month observational follow-up study are presented. Of 166 patients who completed the core study on treatment, 131 entered the follow-up study (70 steroid avoidance, 61 steroid withdrawal). The primary efficacy endpoint of treatment failure (clinical biopsy-proven acute rejection (BPAR), graft loss, death, or loss to follow-up) occurred in 21.4% (95% CI 11.8–31.0%) of steroid avoidance patients and 16.4% (95% CI 7.1–25.7%) of steroid withdrawal patients by month 36 (). BPAR had occurred in 20.0% and 11.5%, respectively (). The incidence of adverse events with a suspected relation to steroids during months 6–36 was 22.9% versus 37.1% (). By month 36, 32.4% and 51.7% of patients in the steroid avoidance and steroid withdrawal groups, respectively, were receiving oral steroids. In conclusion, IL-2RA induction with early intensified EC-MPS dosing and calcineurin inhibitor (CNI) therapy in de novo kidney transplant patients at low immunological risk may achieve similar three-year efficacy regardless of whether oral steroids are withheld for at least three months. A. Thierry, G. Mourad, M. Büchler, G. Choukroun, O. Toupance, N. Kamar, F. Villemain, Y. Le Meur, C. Legendre, P. Merville, M. Kessler, A.-E. Heng, B. Moulin, S. Queré, F. Di Giambattista, A. Lecuyer, and G. Touchard Copyright © 2014 A. Thierry et al. All rights reserved. The Role of mTOR Inhibitors in Liver Transplantation: Reviewing the Evidence Tue, 25 Feb 2014 14:19:35 +0000 http://www.hindawi.com/journals/jtrans/2014/845438/ Despite the success of liver transplantation, long-term complications remain, including de novo malignancies, metabolic syndrome, and the recurrence of hepatitis C virus (HCV) and hepatocellular carcinoma (HCC). The current mainstay of treatment, calcineurin inhibitors (CNIs), can also worsen posttransplant renal dysfunction, neurotoxicity, and diabetes. Clearly there is a need for better immunosuppressive agents that maintain similar rates of efficacy and renal function whilst minimizing adverse effects. Mammalian target of rapamycin (mTOR) inhibitors, with a mechanism of action different from that of other immunosuppressive agents, have the potential to address some of these issues. In this review we surveyed the literature for reports of the use of mTOR inhibitors in adult liver transplantation with respect to renal function, efficacy, safety, neurological symptoms, de novo tumors, and the recurrence of HCC and HCV. The results of our review indicate that mTOR inhibitors are associated with efficacy comparable to that of CNIs while offering benefits for renal function in liver transplantation. We also consider newer dosing schedules that may limit side effects. Finally, we discuss evidence that mTOR inhibitors may have benefits in the oncology setting and in relation to HCV-related allograft fibrosis, metabolic syndrome, and neurotoxicity. Goran B. Klintmalm and Björn Nashan Copyright © 2014 Goran B. Klintmalm and Björn Nashan.
All rights reserved. Alendronate as an Effective Treatment for Bone Loss and Vascular Calcification in Kidney Transplant Recipients Wed, 19 Feb 2014 07:56:28 +0000 http://www.hindawi.com/journals/jtrans/2014/269613/ Kidney transplant recipients develop secondary osteoporosis induced by immunosuppressive medication, with a high risk of fracture, and abdominal aortic calcification (AC) is a known predictor of cardiovascular mortality. In this study of 12 stable kidney recipients, we estimated the preventive effect of bisphosphonate treatment on bone loss and progression of AC. We randomly divided the subjects into a treatment group with alendronate (group A: 5 subjects) and a control group (group C: 7 subjects). Group A patients received 35 mg/week of alendronate over 24 months, while group C patients were not administered any bisphosphonates. Two major endpoints were established: (1) the time-dependent change in bone mineral density (BMD) estimated with DEXA and (2) progression of abdominal AC, calculated twice as an index (ACI) using computed tomography data. Over the 2-year study period, group A patients showed a significantly increased BMD of 1.86 ± 0.85% ( versus baseline) and almost complete inhibition of ACI progression (38.2 ± 24.2% to 39.6 ± 24.3%), whereas group C patients showed a decline in BMD with bone loss and progression of ACI (32.8 ± 25.0% to 37.8 ± 29.2%, ). In conclusion, alendronate therapy was an effective treatment in kidney transplant recipients for secondary osteoporosis and for vascular calcification as a form of ectopic calcification. This clinical trial is registered with number JMA-IIA00155 of JMACCT CTR. Masanori Okamoto, Shintaro Yamanaka, Wataru Yoshimoto, and Takashi Shigematsu Copyright © 2014 Masanori Okamoto et al. All rights reserved. Donor-Recipient Size Mismatch in Paediatric Renal Transplantation Thu, 13 Feb 2014 12:45:05 +0000 http://www.hindawi.com/journals/jtrans/2014/317574/ Introduction. End-stage renal failure in children is a rare but devastating condition, and kidney transplantation remains the only permanent treatment option. The aim of this review was to elucidate the broad surgical issues surrounding the mismatch in size of adult kidney donors to their paediatric recipients. Methods. A comprehensive literature search was undertaken on PubMed, MEDLINE, and Google Scholar for all relevant scientific articles published to date in the English language. Manual search of the bibliographies was also performed to supplement the original search. Results. Size-matching kidneys for transplantation into children is not feasible due to limited organ availability from paediatric donors, resulting in prolonged waiting-list times. Transplanting a comparatively large adult kidney into a child may lead to potential challenges related to the surgical incision and approach, vessel anastomoses, wound closure, postoperative cardiovascular stability, and age-correlated maturation of the graft. Conclusion. The transplantation of an adult kidney into a size-mismatched paediatric recipient significantly reduces waiting times for surgery; however, it presents further challenges in terms of both the surgical procedure and the postoperative management of the patient’s physiological parameters. J. Donati-Bourne, H. W. Roberts, and R. A. Coleman Copyright © 2014 J. Donati-Bourne et al. All rights reserved.
The First Fifty ABO Blood Group Incompatible Kidney Transplantations: The Rotterdam Experience Thu, 06 Feb 2014 00:00:00 +0000 http://www.hindawi.com/journals/jtrans/2014/913902/ This study describes the single-center experience and long-term results of ABO-incompatible (ABOi) kidney transplantation using a pretransplantation protocol involving immunoadsorption combined with rituximab, intravenous immunoglobulins, and triple immune suppression. Fifty patients received an ABOi kidney transplant in the period from 2006 to 2012 with a follow-up of at least one year. Eleven antibody-mediated rejections were noted, of which 5 were mixed antibody- and cellular-mediated rejections. Nine cellular-mediated rejections were recorded. Two grafts were lost due to rejection in the first year. One-year graft survival of the ABOi grafts was comparable to that of 100 matched ABO-compatible renal grafts, 96% versus 99%. At 5-year follow-up, the graft survival was 90% in the ABOi versus 97% in the control group. Posttransplantation immunoadsorption was not an essential part of the protocol, and no association was found between antibody titers and subsequent graft rejection. Steroids could be withdrawn safely 3 months after transplantation. Adverse events specifically related to the ABOi protocol were not observed. The currently used ABOi protocol shows good short- and midterm results despite a high rate of antibody-mediated rejections in the first years after the start of the program. Madelon van Agteren, Willem Weimar, Annelies E. de Weerd, Peter A. W. te Boekhorst, Jan N. M. Ijzermans, Jaqueline van de Wetering, and Michiel G. H. Betjes Copyright © 2014 Madelon van Agteren et al. All rights reserved. The Natural History of Biopsy-Negative Rejection after Heart Transplantation Wed, 18 Dec 2013 18:27:06 +0000 http://www.hindawi.com/journals/jtrans/2013/236720/ Purpose. The most recent International Society for Heart and Lung Transplantation (ISHLT) biopsy scale classifies cellular and antibody-mediated rejections. However, there are cases with an acute decline in left ventricular ejection fraction (LVEF ≤ 45%) but no evidence of rejection on biopsy. The characteristics and treatment response of this biopsy-negative rejection (BNR) have yet to be elucidated. Methods. Between 2002 and 2012, we found 12 cases of BNR, as defined above, in 11 heart transplant patients. One of the 11 patients was treated a second time for BNR. Characteristics and response to treatment were noted. Results. Twelve cases (in 11 patients) were reviewed, and 11 occurred during the first year after transplant. Eight cases without heart failure symptoms were treated with an oral corticosteroid bolus and taper or intravenous immunoglobulin. Four cases with heart failure symptoms were treated with thymoglobulin, intravenous immunoglobulin, and intravenous methylprednisolone followed by an oral corticosteroid bolus and taper. Overall, 7 cases resulted in a return to normal left ventricular function within a mean of 14 ± 10 days from the initial biopsy. Conclusion. BNR includes cardiac dysfunction and can be a severe form of rejection. Characteristics of these cases of rejection are described, with most cases responding to appropriate therapy. Zhaoyi Tang, Jon Kobashigawa, Matthew Rafiei, Lily Kagan Stern, and Michele Hamilton Copyright © 2013 Zhaoyi Tang et al. All rights reserved. Systemic Heparinisation in Laparoscopic Live Donor Nephrectomy Mon, 16 Dec 2013 09:01:23 +0000 http://www.hindawi.com/journals/jtrans/2013/138926/ Introduction.
Systemic heparinisation is advocated during laparoscopic live donor nephrectomy (LDN) as a preventative measure against renal vascular thrombosis during the warm ischaemic interval. This study compares outcomes with and without the administration of systemic heparinisation. Methods. A retrospective analysis was performed on 186 consecutive LDN patients between April 2008 and November 2012. Systemic heparin (2000–3000 IU) was administered intravenously to donors (hep ). From January 2010, heparin was not used systemically in this group of LDN (no hep ). Outcome measures included donor and recipient complications, initial graft function, and 12-month graft survival. Results. The demographics of both heparinised and non-heparinised donors were similar. The warm ischaemic time (WIT) was comparable in both groups (WIT; hep versus no hep minutes; ). There was no difference in complication rates, no episodes of graft thrombosis, and no incidences of primary nonfunction in either group. Delayed graft function occurred in 4/109 and 1/77 (3.6% versus 1.2%; ), and there was no significant difference in graft survival (). Conclusion. Omitting systemic heparinisation during laparoscopic donor nephrectomy is a feasible and safe approach that does not compromise donor or recipient outcome. Charlotte Crotty, Yasmin Tabbakh, Sarah A. Hosgood, and Michael L. Nicholson Copyright © 2013 Charlotte Crotty et al. All rights reserved. Liver Transplantation without Perioperative Transfusions: Single-Center Experience Showing Better Early Outcome and Shorter Hospital Stay Thu, 12 Dec 2013 14:12:22 +0000 http://www.hindawi.com/journals/jtrans/2013/649209/ Background. Significant amounts of red blood cell (RBC) transfusions are associated with poor outcome after liver transplantation (LT). We report our series of LT without perioperative RBC (P-RBC) transfusions to evaluate their influence on early and long-term outcomes following LT. Methods. A consecutive series of LT between 2006 and 2011 was analyzed. P-RBC transfusion was defined as one or more RBC units administered during or ≤48 hours after LT. We divided the cohort into “No-Transfusion” and “Yes-Transfusion” groups. Preoperative status, graft quality, and intra- and postoperative variables were compared to assess P-RBC transfusion risk factors and postoperative outcome. Results. LT was performed in 127 patients (“No-Transfusion” = 39 versus “Yes-Transfusion” = 88). While median MELD was significantly higher in the Yes-Transfusion group (11 versus 21; ), platelet count, prothrombin time, and hemoglobin were significantly lower. On multivariate analysis, the only independent risk factor associated with P-RBC transfusions was preoperative hemoglobin (). The incidence of postoperative bacterial infections (10 versus 27%; ), median ICU stay (2 versus 3 days; ), and hospital stay (7.5 versus 9 days; ) were negatively influenced by P-RBC transfusions. However, 30-day mortality (10 versus 15%) and 1-year (86 versus 70%) and 3-year (77 versus 66%) survival were equivalent in both groups. Conclusions. Recipient MELD score was not a predictive factor for P-RBC transfusion. Patients requiring P-RBC transfusions had worse postoperative outcomes. Therefore, maximum efforts must be focused on improving hemoglobin levels during time on the waiting list to prevent the use of P-RBC transfusions in LT recipients. Nicolás Goldaracena, Patricio Méndez, Emilio Quiñonez, Gustavo Devetach, Lucio Koo, Carlos Jeanes, Margarita Anders, Federico Orozco, Pablo D. Comignani, Ricardo C.
Mastai, and Lucas McCormack Copyright © 2013 Nicolás Goldaracena et al. All rights reserved. Everolimus in Heart Transplantation: An Update Thu, 05 Dec 2013 13:00:17 +0000 http://www.hindawi.com/journals/jtrans/2013/683964/ The evidence base relating to the use of everolimus in heart transplantation has expanded considerably in recent years, providing clinically relevant information regarding its use in clinical practice. Unless there are special considerations to take into account, all de novo heart transplant patients can be regarded as potential candidates for immunosuppression with everolimus and reduced-exposure calcineurin inhibitor therapy. Caution should be exercised about the use of everolimus immediately after transplantation in patients at risk of severe proteinuria, with poor wound healing, or with uncontrolled severe hyperlipidemia. Initiation of everolimus in the early phase after transplant is not advisable in patients with severe pretransplant end-organ dysfunction or in patients on a left ventricular assist device before transplant who are at high risk of infection or of wound healing complications. The most frequent reason for introducing everolimus in maintenance heart transplant patients is to support minimization or withdrawal of calcineurin inhibitor therapy, for example, due to impaired renal function or malignancy. Due to its potential to inhibit the progression of cardiac allograft vasculopathy and to reduce cytomegalovirus infection, everolimus should be initiated as soon as possible after heart transplantation. Immediate and adequate reduction of CNI exposure is mandatory from the start of everolimus therapy. Stephan W. Hirt, Christoph Bara, Markus J. Barten, Tobias Deuse, Andreas O. Doesch, Ingo Kaczmarek, Uwe Schulz, Jörg Stypmann, Assad Haneya, and Hans B. Lehmkuhl Copyright © 2013 Stephan W. Hirt et al. All rights reserved. Timing of Hepatic Artery Reperfusion and Biliary Strictures in Liver Transplantation Tue, 03 Dec 2013 10:19:13 +0000 http://www.hindawi.com/journals/jtrans/2013/757389/ During orthotopic liver transplantation (OLT), biliary tract perfusion occurs with hepatic artery reperfusion (HARP), commonly performed after portal vein reperfusion (PVRP). We examined whether the average time interval between PVRP and HARP affected the occurrence of postoperative biliary strictures. Patients undergoing OLT from 2007 to 2009 were included if they were ≥18 years old, had survived 3 months postoperatively, and had data for PVRP and HARP. Patients receiving allografts from donation after circulatory death (DCD) donors were excluded. Patients were followed for 6 months post-OLT. Seventy-five patients met the study inclusion criteria. Of these, 10 patients had a biliary stricture. There was no statistical difference between those with and without biliary stricture in age, gender, etiology, MELD score, graft survival, or time interval between PVRP and HARP. Ninety percent of patients with biliary stricture had a PVRP-HARP time interval >30 minutes, as opposed to 77% of patients without biliary stricture; however, this was not statistically significant. The cold ischemia time was significantly different between the two groups. The time interval for HARP after PVRP did not appear to affect the development of biliary strictures. However, 30 minutes may be suggested as a critical time after which there is an increase in biliary stricture occurrence. Ganesh Gunasekaran, Jyoti Sharma, Leandro C. Mosna, Roxana Bodin, and David C. Wolf Copyright © 2013 Ganesh Gunasekaran et al.
All rights reserved. Molecular Signatures of Recurrent Hepatocellular Carcinoma Secondary to Hepatitis C Virus following Liver Transplantation Tue, 26 Nov 2013 15:44:45 +0000 http://www.hindawi.com/journals/jtrans/2013/878297/ Chronic hepatitis C virus (HCV)-induced hepatocellular carcinoma (HCC) is a primary indication for liver transplantation (LT). In Western countries, the estimated rate of HCC recurrence following LT is between 15% and 20%, and recurrence is a major cause of mortality. Currently, there is no standard method to treat patients who are at high risk for HCC recurrence. The aim of this study was to investigate the molecular signatures underlying HCC recurrence, which may inform future studies on the gene regulation contributing to new therapeutic options. Two groups of patients were selected: one included patients with HCV who developed HCC recurrence (HCC-R) ≤3 years from LT, and the second included patients with HCV who did not have recurrent HCC (HCC-NR). Microarray analysis covering more than 29,000 known genes was performed on formalin-fixed, paraffin-embedded (FFPE) liver tissue from explanted livers. Gene expression profiling revealed 194 differentially regulated genes between the two groups. These genes belonged to cellular networks including cell cycle G1/S checkpoint regulators, RAN signaling, chronic myeloid leukemia signaling, molecular mechanisms of cancer, FXR/RXR activation, and hepatic cholestasis. A subset of molecular signatures associated with HCC recurrence was found. The expression levels of these genes were validated by quantitative PCR analysis. Trina Das, Deborah L. Diamond, Matthew Yeh, Sajida Hassan, Janine T. Bryan, Jorge D. Reyes, and James D. Perkins Copyright © 2013 Trina Das et al. All rights reserved. Should We Consider Patients with Coexistent Hepatitis B or C Infection for Orthotopic Heart Transplantation? Thu, 07 Nov 2013 15:40:19 +0000 http://www.hindawi.com/journals/jtrans/2013/748578/ Heart transplantation (HTX) is the gold standard surgical treatment for patients with advanced heart failure. The prevalence of hepatitis B and hepatitis C infection in HTX recipients is over 10%. Despite this increased prevalence, the long-term outcome in this cohort is still not clear. There is a reluctance to place these patients on the transplant waiting list given the increased incidence of viral reactivation and chronic liver disease after transplant. The emergence of new antiviral therapies to treat this cohort seems promising, but their long-term outcomes are yet to be established. The aim of this paper is to review the literature and explore whether it is justifiable to list advanced heart failure patients with coexistent hepatitis B/C infection for HTX. Baskar Sekar, Pippa J. Newton, Simon G. Williams, and Steven M. Shaw Copyright © 2013 Baskar Sekar et al. All rights reserved. Neutrophil Gelatinase-Associated Lipocalin in Kidney Transplantation Is an Early Marker of Graft Dysfunction and Is Associated with One-Year Renal Function Thu, 31 Oct 2013 13:57:39 +0000 http://www.hindawi.com/journals/jtrans/2013/650123/ Urinary neutrophil gelatinase-associated lipocalin (uNGAL) has been suggested as a potential early marker of delayed graft function (DGF) following kidney transplantation (KTx). We conducted a prospective study in 40 consecutive KTx recipients to evaluate serial changes of uNGAL within the first week after KTx and to assess its performance in predicting DGF (dialysis requirement during the initial posttransplant week) and graft function throughout the first year.
Urine samples were collected on post-KTx days 0, 1, 2, 4, and 7. Linear mixed and multivariable regression models, receiver-operating characteristic (ROC) analysis, and areas under the ROC curves were used. At all time points, mean uNGAL levels were significantly higher in patients developing DGF (). Shortly after KTx (3–6 h), uNGAL values were higher in DGF recipients (on average +242 ng/mL, considering mean dialysis time of 4.1 years) and rose further in the following days, in contrast to recipients with prompt graft function. Day-1 uNGAL levels accurately predicted DGF (AUC-ROC = 0.93), with a performance higher than that of serum creatinine (AUC-ROC = 0.76) and similar to that of cystatin C (AUC-ROC = 0.95). Multivariable analyses revealed that uNGAL levels at days 4 and 7 were strongly associated with one-year serum creatinine. Urinary NGAL is an early marker of graft injury and is independently associated with dialysis requirement within one week after KTx and with one-year graft function. Isabel Fonseca, José Carlos Oliveira, Manuela Almeida, Madalena Cruz, Anabela Malho, La Salete Martins, Leonídio Dias, Sofia Pedroso, Josefina Santos, Luísa Lobato, António Castro Henriques, and Denisa Mendonça Copyright © 2013 Isabel Fonseca et al. All rights reserved. Dendritic Cell-Based Approaches for Therapeutic Immune Regulation in Solid-Organ Transplantation Thu, 24 Oct 2013 12:01:34 +0000 http://www.hindawi.com/journals/jtrans/2013/761429/ To avoid immune rejection, allograft recipients require drug-based immunosuppression, which has significant toxicity. An emerging approach is the adoptive transfer of immunoregulatory cells. While mature dendritic cells (DCs) present donor antigen to the immune system, triggering rejection, regulatory DCs interact with regulatory T cells to promote immune tolerance. Intravenous injection of immature DCs of either donor or host origin at the time of transplantation has prolonged allograft survival in solid-organ transplant models. DCs can be treated with pharmacological agents before injection, which may attenuate their maturation in vivo. Recent data suggest that injected immunosuppressive DCs may inhibit allograft rejection, not by themselves, but through conventional DCs of the host. Genetically engineered DCs have also been tested. Two clinical trials in type-1 diabetes and rheumatoid arthritis have been carried out, and other trials, including one trial in kidney transplantation, are in progress or are imminent. Giuseppe Vassalli Copyright © 2013 Giuseppe Vassalli. All rights reserved. Impact of Right-Sided Nephrectomy on Long-Term Outcomes in Retroperitoneoscopic Live Donor Nephrectomy at Single Center Mon, 21 Oct 2013 13:41:26 +0000 http://www.hindawi.com/journals/jtrans/2013/546373/ Objective. To assess the long-term graft survival of right-sided retroperitoneoscopic live donor nephrectomy (RPLDN), we compared the outcomes of right- and left-sided RPLDN. Methods. Five hundred and thirty-three patients underwent live donor renal transplantation with allografts procured by RPLDN from July 2001 to August 2010 at our institute. Of these, 24 (4.5%) cases were selected for right-sided RPLDN (R-RPLDN) according to our criteria for donor kidney selection. Study variables included peri- and postoperative clinical data. Results. No significant differences were found in the recipients' postoperative graft function or incidence of slow graft function.
Despite a significantly increased warm ischemic time (WIT: mean 5.9 min versus 4.7 min, ) in R-RPLDN compared to L-RPLDN, there was no significant difference between the two groups in long-term patient and graft survival. The complication rate in R-RPLDN was not significantly different from that in L-RPLDN (17% versus 6.5%, ). No renal vein thrombosis occurred in either group. Conclusions. Although our study was retrospective and included only a small number of R-RPLDN patients, R-RPLDN could be an option for laparoscopic live donor nephrectomy because its results were similar to those of L-RPLDN, with the sole exception of WIT, and its long-term graft outcomes were excellent. Kazuya Omoto, Taiji Nozaki, Masashi Inui, Tomokazu Shimizu, Toshihito Hirai, Yugo Sawada, Hideki Ishida, and Kazunari Tanabe Copyright © 2013 Kazuya Omoto et al. All rights reserved. Hypothermic Machine Perfusion Preservation of the DCD Kidney: Machine Effects Thu, 10 Oct 2013 16:03:36 +0000 http://www.hindawi.com/journals/jtrans/2013/802618/ Purpose. Kidneys from donation after circulatory death (DCD) donors represent a significant pool, but preservation problems exist. The study objective was to test the importance of machine type for hypothermic preservation of DCD kidneys. Methods. Adult Beagle dog kidneys underwent 45 minutes of warm in situ ischemia followed by hypothermic perfusion for 24 hours (Belzer-MPS Solution) on either an ORS LifePort or a Waters RM3 using standard perfusion protocols. Kidneys were then autotransplanted, and renal function was assessed over 7 days following contralateral nephrectomy. Results. Renal vascular resistance was not different between the two pumps. After 24 hours, the oxygen partial pressure and oxygen delivery in the LifePort perfusate were significantly lower than those in the RM3 but not low enough to change lactate production. The LifePort ran significantly colder than the RM3 (2°C versus 5°C). The arterial pressure waveform of the RM3 was qualitatively different from that of the LifePort. Preservation injury after transplantation was not different between the devices. When the LifePort was changed to nonpulsatile flow, kidneys displayed significantly greater preservation injury compared to the RM3. Conclusions. Both the LifePort and the RM3 can be used for hypothermic machine perfusion preservation of DCD kidneys with equal outcomes as long as the duty cycle remains pulsatile. Susanne L. Lindell, Heather Muir, John Brassil, and Martin J. Mangino Copyright © 2013 Susanne L. Lindell et al. All rights reserved. Circulating CD4+CD28null T Cells May Increase the Risk of an Atherosclerotic Vascular Event Shortly after Kidney Transplantation Tue, 01 Oct 2013 11:50:40 +0000 http://www.hindawi.com/journals/jtrans/2013/841430/ Proinflammatory CD4+ T cells without the costimulatory molecule CD28 (CD4+CD28null T cells) are expanded in patients with end-stage renal disease (ESRD) and associated with atherosclerotic vascular events (AVE). In a prospective study, the number of circulating CD4+CD28null T cells was established in 295 ESRD patients prior to receiving a kidney allograft. Within the first year after transplantation, an AVE occurred in 20 patients. Univariate analysis showed that, besides a history of cardiovascular disease (CVDpos, HR 8.1, ), age (HR 1.04, ), dyslipidaemia (HR 8.8, ), and the % of CD4+CD28null T cells (HR 1.04 per % increase, 95% CI 1.00–1.09, ) were significantly associated with the occurrence of a posttransplantation AVE.
In a multivariate analysis, only CVDpos remained a significant risk factor with a significant and positive interaction between the terms CVDpos and the % of CD4+CD28null T cells (HR 1.05, 95% CI 1.03–1.11, ). Within the CVDpos group, the incidence of an AVE was 13% in the lowest tertile compared to 25% in the highest tertile of % of CD4+CD28null T cells. In conclusion, the presence of circulating CD4+CD28null T cells is associated with an increased risk for a cardiovascular event shortly after kidney transplantation. Michiel G. H. Betjes, Willem Weimar, and Nicolle H. R. Litjens Copyright © 2013 Michiel G. H. Betjes et al. All rights reserved. Current Practice of Heart Donor Evaluation in Germany: Multivariable Risk Factor Analysis Confirms Practicability of Guidelines Mon, 30 Sep 2013 08:44:44 +0000 http://www.hindawi.com/journals/jtrans/2013/701854/ Background. Organ shortage has liberalised the acceptance criteria of grafts for heart transplantation, but which donor characteristics ultimately influence the decision to perform transplantation? For the first time this was evaluated using real-time donor data from the German organ procurement organization (DSO). Observed associations are discussed with regard to international recommendations and guidelines. Methods. 5291 German donors (2006–2010) were formally eligible for heart donation. In logistic regression models 160 donor parameters were evaluated to assess their influence on using grafts for transplantation (random split of cases: 2/3 study sample, 1/3 validation sample). Results. Successful procurement was determined by low donor age (OR 0.87 per year; 95% CI [0.85–0.89], ), large donor height (OR 1.04 per cm; 95% CI [1.02–1.06], ), exclusion of impaired left ventricular function or wall motion (OR 0.01; 95% CI [0.002–0.036], ), arrhythmia (OR 0.05; 95% CI [0.009–0.260], ), and of severe coronary artery disease (OR 0.003; 95% CI [<0.001–0.01], ). Donor characteristics differed between cases where the procedure was aborted without and with allocation initiated via Eurotransplant. Sylke Ruth Zeissig, Carl-Ludwig Fischer-Froehlich, Frank Polster, Nils R. Fruehauf, Guenter Kirste, and Irene Schmidtmann Copyright © 2013 Sylke Ruth Zeissig et al. All rights reserved. New Onset Diabetes Mellitus in Living Donor versus Deceased Donor Liver Transplant Recipients: Analysis of the UNOS/OPTN Database Tue, 24 Sep 2013 10:17:53 +0000 http://www.hindawi.com/journals/jtrans/2013/269096/ New onset diabetes after transplantation (NODAT) occurs less frequently in living donor liver transplant (LDLT) recipients than in deceased donor liver transplant (DDLT) recipients. The aim of this study was to compare the incidence and predictive factors for NODAT in LDLT versus DDLT recipients. The Organ Procurement and Transplant Network/United Network for Organ Sharing database was reviewed from 2004 to 2010, and 902 LDLT and 19,582 DDLT nondiabetic recipients were included. The overall incidence of NODAT was 12.2% at 1 year after liver transplantation. At 1, 3, and 5 years after transplant, the incidence of NODAT in LDLT recipients was 7.4, 2.1, and 2.6%, respectively, compared to 12.5, 3.4, and 1.9%, respectively, in DDLT recipients. LDLT recipients have a lower risk of NODAT compared to DDLT recipients (hazard ratio = 0.63 (0.52–0.75), ). Predictors for NODAT in LDLT recipients were hepatitis C (HCV) and treated acute cellular rejection (ACR). 
Risk factors in DDLT recipients were recipient male gender, recipient age, body mass index, donor age, donor diabetes, HCV, and treated ACR. LDLT recipients have a lower incidence and fewer risk factors for NODAT compared to DDLT recipients. Early identification of risk factors will assist timely clinical interventions to prevent NODAT complications. Anitha D. Yadav, Yu-Hui Chang, Bashar A. Aqel, Thomas J. Byrne, Harini A. Chakkera, David D. Douglas, David C. Mulligan, Jorge Rakela, Hugo E. Vargas, and Elizabeth J. Carey Copyright © 2013 Anitha D. Yadav et al. All rights reserved. International Heart Valve Bank Survey: A Review of Processing Practices and Activity Outcomes Sun, 15 Sep 2013 16:26:44 +0000 http://www.hindawi.com/journals/jtrans/2013/163150/ A survey of 24 international heart valve banks was conducted to acquire information on heart valve processing techniques used and outcomes achieved. The objective was to provide an overview of heart valve banking activities for tissue bankers, tissue banking associations, and regulatory bodies worldwide. Despite similarities found for basic manufacturing processes, distinct differences in procedural details were also identified. The similarities included (1) use of sterile culture media for procedures, (2) antibiotic decontamination, (3) use of dimethyl sulfoxide (DMSO) as a cryoprotectant, (4) controlled rate freezing for cryopreservation, and (5) storage at ultralow temperatures of below −135°C. Differences in procedures included (1) type of sterile media used, (2) antibiotics combination, (3) temperature and duration used for bioburden reduction, (4) concentration of DMSO used for cryopreservation, and (5) storage duration for released allografts. For most banks, the primary reasons why allografts failed to meet release criteria were positive microbiological culture and abnormal morphology. On average, 85% of allografts meeting release criteria were implanted, with valve size and type being the main reasons why released allografts were not used clinically. The wide variation in percentage of allografts meeting release requirements, despite undergoing validated manufacturing procedures, justifies the need for regular review of important outcomes as cited in this paper, in order to encourage comparison and improvements in the HVBs’ processes. Wee Ling Heng, Helmi Albrecht, Paul Chiappini, Yeong Phang Lim, and Linda Manning Copyright © 2013 Wee Ling Heng et al. All rights reserved. Renal Transplantation from Elderly Living Donors Thu, 12 Sep 2013 15:57:54 +0000 http://www.hindawi.com/journals/jtrans/2013/475964/ Acceptance of elderly living kidney donors remains controversial due to the higher incidence of comorbidity and greater risk of postoperative complications. This is a review of publications in the English language between 2000 and 2013 about renal transplantation from elderly living donors to determine trends and effects of donation, and the outcomes of such transplantation. The last decade witnessed a 50% increase in living kidney donor transplants, with a disproportionate increase in donors >60 years. There is no accelerated loss of kidney function following donation, and the incidence of established renal failure (ERF) and hypertension among donors is similar to that of the general population. The overall incidence of ERF in living donors is about 0.134 per 1000 years. Elderly donors require rigorous assessment and should have a predicted glomerular filtration rate of at least 37.5 mL/min/1.73 m2 at the age of 80. 
Though elderly donors had a lower glomerular filtration rate before donation, the proportionate decline after donation was similar in the young and elderly groups. The risks of delayed graft function, acute rejection, and graft failure in transplants from living donors >65 years are significantly higher than in transplants from younger donors. A multicentre, long-term, prospective database addressing the outcomes of kidneys from elderly living donors is recommended. Jacob A. Akoh and Umasankar Mathuram Thiyagarajan Copyright © 2013 Jacob A. Akoh and Umasankar Mathuram Thiyagarajan. All rights reserved. Preoperative Cardiac Variables of Diastolic Dysfunction and Clinical Outcomes in Lung Transplant Recipients Thu, 12 Sep 2013 08:37:14 +0000 http://www.hindawi.com/journals/jtrans/2013/391620/ Background. Orthotopic lung transplantation is now widely performed in patients with advanced lung disease. Patients with moderate or severe ventricular systolic dysfunction are typically excluded from lung transplantation; however, there is a paucity of data regarding the prognostic significance of abnormal left ventricular diastolic function and elevated pretransplant pulmonary pressures. Methods. We reviewed the characteristics of 111 patients who underwent bilateral and unilateral lung transplants from 2000 to 2009 in order to evaluate the prognostic significance of preoperative markers of diastolic function, including invasively measured pulmonary capillary wedge pressure (PCWP) and echocardiographic variables of diastolic dysfunction including mitral and . Results. Of the 111 patients, 62 were male (56%), and the average age was 54.0 ± 10.5 years. Traditional echocardiographic Doppler variables of abnormal diastolic function, including and , did not predict adverse events (). Mildly elevated pretransplant PCWP (16–20 mmHg) and moderately/severely elevated PCWP (>20 mmHg) were not associated with adverse clinical events after transplant (). Additionally, no clinical endpoint showed a statistically significant difference between the two groups. Conclusions. Pre-lung transplant invasive and echocardiographic findings of elevated pulmonary pressures and abnormal left ventricular diastolic function are not predictive of adverse posttransplant clinical events. Ajay Yadlapati, Joseph P. Lynch III, Rajan Saggar, David Ross, John A. Belperio, Stephen Weigt, Abbas Ardehali, Tristan Grogan, Eric H. Yang, and Jamil Aboulhosn Copyright © 2013 Ajay Yadlapati et al. All rights reserved. A Short Period of Ventilation without Perfusion Seems to Reduce Atelectasis without Harming the Lungs during Ex Vivo Lung Perfusion Wed, 11 Sep 2013 15:33:01 +0000 http://www.hindawi.com/journals/jtrans/2013/729286/ To evaluate the lung function of donors after circulatory death (DCD), ex vivo lung perfusion (EVLP) has been shown to be a valuable method. We present a modified EVLP technique in which lung atelectasis is removed while lung perfusion is temporarily shut down. Twelve pigs were randomized into two groups: modified EVLP and conventional EVLP. When the lungs had reached 37°C in the EVLP circuit, lung perfusion was temporarily shut down in the modified EVLP group, and positive end-expiratory pressure (PEEP) was increased to 10 cm H2O for 10 minutes. In the conventional EVLP group, PEEP was increased to 10 cm H2O for 10 minutes with unchanged lung perfusion. In the modified EVLP group, the arterial oxygen partial pressure (PaO2) was 18.5 ± 7.0 kPa before and 64.5 ± 6.0 kPa after the maneuver ().
In the conventional EVLP group, the PaO2 was 16.8 ± 3.1 kPa before and 46.8 ± 2.7 kPa after the maneuver (; ). In the modified EVLP group, the pulmonary graft weight was unchanged, while in the conventional EVLP group, the pulmonary graft weight was significantly increased. Modified EVLP with normoventilation of the lungs without ongoing lung perfusion for 10 minutes may eliminate atelectasis almost completely without harming the lungs. Sandra Lindstedt, Leif Pierre, and Richard Ingemansson Copyright © 2013 Sandra Lindstedt et al. All rights reserved. Beyond Poiseuille: Preservation Fluid Flow in an Experimental Model Mon, 26 Aug 2013 10:26:26 +0000 http://www.hindawi.com/journals/jtrans/2013/605326/ Poiseuille’s equation describes the relationship between fluid viscosity, pressure, tubing diameter, and flow, yet it is not known if cold organ perfusion systems follow this equation. We investigated these relationships in an ex vivo model and aimed to offer some rationale for equipment selection. Increasing the cannula size from 14 to 20 Fr increased flow rate by a mean (SD) of 13 (12)%. Marshall’s hyperosmolar citrate was three times less viscous than UW solution, but flows were only 45% faster. Doubling the bag pressure led to a mean (SD) flow rate increase of only 19 (13)%, not twice the rate. When external pressure devices were used, 100 mmHg of continuous pressure increased flow by a mean (SD) of 43 (17)% when compared to the same pressure applied initially only. Poiseuille’s equation was not followed; this is most likely due to “slipping” of preservation fluid within the plastic tubing. Cannula size made little difference over the ranges examined; flows are primarily determined by bag pressure and fluid viscosity. External infusor devices require continuous pressurisation to deliver high flow. Future studies examining the impact of perfusion variables on graft outcomes should include detailed equipment descriptions. Saurabh Singh, Lucy V. Randle, Paul T. Callaghan, Christopher J. E. Watson, and Chris J. Callaghan Copyright © 2013 Saurabh Singh et al. All rights reserved. Occurrence of Fatal and Nonfatal Adverse Outcomes after Heart Transplantation in Patients with Pretransplant Noncytotoxic HLA Antibodies Mon, 29 Jul 2013 11:52:34 +0000 http://www.hindawi.com/journals/jtrans/2013/519680/ HLA antibodies (HLA ab) in transplant candidates have been associated with poor outcome. However, the clinical relevance of noncytotoxic antibodies after heart transplant (HT) is controversial. Using a Luminex-based HLA screening assay, we retested pretransplant sera from HT recipients testing negative for cytotoxic HLA ab and for prospective crossmatch. Of the 173 consecutive patients assayed (; 16% females; 47% ischemic etiology), 32 (18%) showed pretransplant HLA ab, and 12 (7%) tested positive against both class I and class II HLA. Recipients with any HLA ab had poorer survival than those without ( versus %; ), accounting for a doubled independent mortality risk (). In addition, HLA-ab detection was associated with an increased prevalence of early graft failure (35 versus 15%; ) and late cellular rejection (29 versus 11%; ). Of the subgroup of 37 patients suspected of antibody-mediated rejection (AMR), the 9 with pretransplant HLA ab were more likely to display pathological AMR grade 2 (). Using an inexpensive, Luminex-based HLA-screening assay, we were able to detect noncytotoxic HLA ab predicting fatal and nonfatal adverse outcomes after heart transplant.
Allocation strategies and desensitization protocols need to be developed and prospectively tested in these patients. Luciano Potena, Andrea Bontadini, Sandra Iannelli, Fiorenza Fruet, Ornella Leone, Francesco Barberini, Laura Borgese, Valentina Manfredini, Marco Masetti, Gaia Magnani, Francesco Fallani, Francesco Grigioni, and Angelo Branzi Copyright © 2013 Luciano Potena et al. All rights reserved. Cardiac Troponin Elevation Predicts Mortality in Patients Undergoing Orthotopic Liver Transplantation Sun, 14 Jul 2013 09:28:41 +0000 http://www.hindawi.com/journals/jtrans/2013/252838/ Introduction. While patients undergoing orthotopic liver transplantation (OLT) have high cardiovascular event rates, preoperative risk stratification may not reliably identify susceptible patients. Troponin T (TnT) may help identify patients at risk for cardiovascular complications. Methods. Consecutive patients undergoing OLT at Mayo Clinic in Florida between 1998 and 2010 who had TnT obtained within 10 days following surgery were included. Three groups were compared based on TnT level: (1) normal (TnT  ng/mL), (2) intermediate (TnT 0.02–0.11 ng/mL), and (3) elevated (TnT  ng/mL). Overall and cardiovascular mortality were assessed. Results. Of the 78 patients included, there were no differences in age, gender, severity of liver disease, or echocardiographic findings. Patients in the normal and intermediate TnT groups had a lower overall mortality rate (14.3% and 0%, resp.) when compared with those with elevated TnT (50%; ). Patients in the elevated TnT group had a cardiovascular mortality rate of 37.5% compared with 1.4% in the other groups combined (). The elevated TnT group also had a much higher mortality rate when compared with the intermediate group (). Conclusion. TnT may help accurately risk-stratify patients in the early postoperative setting to better predict cardiovascular complications. David Snipelisky, Sean Donovan, Michael Levy, Raj Satyanarayana, and Brian Shapiro Copyright © 2013 David Snipelisky et al. All rights reserved.
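As a point of reference for the “Beyond Poiseuille” preservation-fluid study above, the Hagen–Poiseuille relation for laminar flow of a fluid of viscosity \mu through a rigid tube of radius r and length L under a pressure drop \Delta P is

Q = \frac{\pi \, r^{4} \, \Delta P}{8 \, \mu \, L}.

The fourth-power dependence on radius is why cannula size would be expected to dominate flow under ideal laminar conditions; in the experimental model described above, bag pressure and fluid viscosity were instead the primary determinants, which is the sense in which the equation was “not followed.”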