Abstract

Introduction. The possible risk factors for chronic kidney disease in transplant recipients after living-donor liver transplantation have not been thoroughly investigated. Material and Methods. A retrospective cohort study of consecutive adults who underwent living-donor liver transplantation between May 2004 and October 2016 in a single center was conducted. Kidney function was assessed serially for all the patients throughout the study period, with 12 months being the shortest follow-up. Postoperative renal dysfunction was defined in accordance with the Chronic Kidney Disease Epidemiology Collaboration criteria. The patients’ demographic data, preoperative and intraoperative parameters, and outcomes were recorded. A calcineurin inhibitor-based immunosuppressive regimen, either tacrolimus or cyclosporine, was used in all the patients. Results. Of the 413 patients included in the study, all of whom survived for ≥1 year, 33 (8%) developed chronic kidney disease 1 year after living-donor liver transplantation. Twenty-seven variables were studied to compare the patients with normal kidney function and those who developed chronic kidney disease 1 year after living-donor liver transplantation. Univariate regression analysis for predicting the likelihood of chronic kidney disease at 1 year revealed that the following 4 variables were significant: operative time, P < 0.0005; intraoperative blood loss, P < 0.0005; preoperative renal impairment, P = 0.001; and graft-to-recipient weight ratio (as a negative predictor), P < 0.0005. In the multivariate regression analysis, only 2 variables remained as independent predictors of chronic kidney disease at 1 year, namely, operative time with a cutoff value of ≥714 minutes and graft-to-recipient weight ratio as a negative predictor with a cutoff value of <0.91. Conclusion. In this study, prolonged operative time and small graft-to-recipient weight ratio were independent predictors of chronic kidney disease at 1 year after living-donor liver transplantation.

1. Introduction

Liver transplantation (LT) was approved as a definitive therapy for end-stage liver disease outside the experimental realm by the United States National Institutes of Health (NIH) in 1983. Since then, LT has altered the natural history of end-stage liver disease and is now considered the accepted therapy for a wide spectrum of previously fatal liver diseases [1].

Serum bilirubin level, the international normalized ratio of prothrombin time, and serum creatinine level are the 3 components of the model for end-stage liver disease (MELD), which has served as the basis for liver allocation since February 2002. This has led to the expansion of the role of renal function assessment during the pretransplant evaluation and throughout the follow-up period [2].

In the literature, kidney function abnormalities before transplantation are mostly associated with a higher likelihood of intraoperative complications, infection, prolonged postoperative hospital stay, need for dialysis, and overall financial burden [3]. Moreover, renal failure is associated with increased mortality of patients admitted to the intensive care unit in general and of liver transplant recipients in particular, with mortality ranging between 27% and 67% depending on the comorbidities [4]. Gonwa et al. reported that 35% of liver transplant recipients with hepatorenal syndrome (HRS) and only 5% without HRS needed renal replacement therapy (RRT) postoperatively [5]. Renal function after living-donor LT (LDLT), however, has not been thoroughly studied.

The aim of this study was to assess the incidence and determine the possible risk factors of chronic kidney disease (CKD) in recipients 1 year after LDLT.

2. Materials and Methods

This was a single-center retrospective cohort study. The research protocol was reviewed and approved by the Institutional Research Board and Ethical Committee of the Faculty of Medicine, Mansoura University (Protocol No. R/16.12.82). All data were collected and analyzed in a manner that ensured data integrity and patient privacy. The study was conducted in the Gastroenterology Center, Mansoura University, Egypt.

Data of all the patients who underwent LDLT in the Mansoura University Gastroenterology Center between May 2004 and October 2016 were collected from a prospectively maintained database. Data were analyzed to detect risk factors for kidney dysfunction after LDLT and its impact on 1-year graft and patient survival. The exclusion criteria were age of <18 years at the time of surgery, the need for preoperative renal replacement therapy (RRT), and/or death within the first 12 months after transplantation.

Patient selection, preoperative assessment, and perioperative management were performed by the same transplant team, including experienced hepatologists, surgeons, anesthetists, and radiologists. The preoperative data included the patients’ demographics, MELD score, basal serum creatinine level, preoperative glomerular filtration rate (GFR), liver function tests, Child-Pugh classification, presence of ascites, serum electrolyte levels, and urine analysis. Routine renal function assessment before surgery included serum creatinine level, blood urea nitrogen level, serum uric acid level, urinalysis, and renal ultrasonography. A nephrological consultation was requested for cases with elevated serum creatinine levels of ≥1.5 mg/dl, proteinuria, evidence of acute kidney injury (AKI), or abnormal renal ultrasonographic findings. Intraoperative records were screened for blood loss, hypotensive events, urine output, graft-to-recipient weight ratio (GRWR), warm ischemia time, cold ischemia time, duration of surgery, and vascular complications.

The postoperative data included daily serum creatinine levels and examination for detection of possible postoperative events in the form of sepsis, bleeding, bile leak, primary graft failure, delayed graft function, rejection, or ischemia-reperfusion injury. Kidney function was assessed at the following time points: days 1, 2, 3, 7, and 14 after surgery and 1 month, 3 months, and 1 year after surgery. Patients with diagnosed renal impairment underwent further assessment.

We disregarded the transient early postoperative fluctuations of renal function parameters that were corrected by adjustments of immunosuppressive drugs, antibiotics, and fluid balance. CKD was defined as an estimated GFR (eGFR) of <60 mL/(min·1.73 m²) for at least 3 months, or ≥60 mL/(min·1.73 m²) with parameters denoting kidney damage for at least 3 months. Postoperative renal dysfunction was defined according to the Chronic Kidney Disease Epidemiology Collaboration criteria [6].
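For illustration, the 2009 CKD-EPI creatinine equation underlying this definition can be sketched as follows; this is a minimal Python sketch for clarity only, and the example values are hypothetical rather than taken from our data:

    def ckd_epi_egfr(scr_mg_dl, age_years, female, black=False):
        """Estimated GFR in mL/(min·1.73 m²) per the 2009 CKD-EPI creatinine equation."""
        kappa = 0.7 if female else 0.9
        alpha = -0.329 if female else -0.411
        ratio = scr_mg_dl / kappa
        egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age_years
        if female:
            egfr *= 1.018
        if black:
            egfr *= 1.159
        return egfr

    # Hypothetical example: a 50-year-old man with a serum creatinine of 1.6 mg/dL
    print(round(ckd_epi_egfr(1.6, 50, female=False), 1))
    # ~49.5, i.e., <60 mL/(min·1.73 m²), consistent with CKD if sustained for >=3 months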

LDLT recipients were classified at the time of data analysis into 2 groups as follows, with the shortest follow-up being 12 months after the time of surgery: group I, patients with normal kidney function, and group II, those who developed CKD.

Most of the recipients had hepatitis C virus (HCV) infection; thus, our program adopted a steroid-free protocol apart from an initial dose of methylprednisolone administered intravenously (IV) at 10 mg/kg of the recipient’s weight immediately after reperfusion of the graft. In addition, an IV infusion of 20 mg basiliximab was given on reperfusion and on the fourth postoperative day. A calcineurin inhibitor- (CNI-) based immunosuppressive regimen, either tacrolimus or cyclosporine, was used in all the patients. Tacrolimus therapy was started within the first 12 hours after reperfusion at an oral dosage of 2 × 0.05 mg/(kg·day). The tacrolimus dose was adjusted to a target trough range of 10-15 μg/L during the first 3 months and 5-10 μg/L after the third month. If cyclosporine was used, a dose of 4 mg/(kg·day), divided twice daily, was administered, and the trough level was kept between 150 and 200 μg/L in the first 6 months and then at 100-150 μg/L. Mycophenolate mofetil was used as a part of the initial therapy or as a maintenance immunosuppressive agent; it was given orally starting from the first postoperative day at a dosage of 2 × 15 mg/(kg·day).

In cases with preoperative renal insufficiency (RI) or perioperative AKI, administration of CNIs was delayed for 72 hours after surgery and then a lower target level was adopted (5-10 μg/L) [7].

Data were entered and analyzed using IBM SPSS version 21 software. Categorical data were expressed as number (percentage) and compared using the chi-square test (or Fisher exact test). Quantitative data were initially tested for normality using the Kolmogorov-Smirnov test, where data were considered normally distributed if the P value was >0.050. Normally distributed quantitative data were expressed as mean ± SD and compared between the two groups using the independent-samples t test; non-normally distributed data were expressed as median (interquartile range [IQR]) and compared using the Mann-Whitney U test. The receiver-operating characteristic (ROC) curve was plotted as the "sensitivity" (true-positive rate) against "1 − specificity" (false-positive rate) across a series of cutoff points. The area under the ROC curve is considered an effective measure of the inherent validity of a diagnostic test, and the curve is useful for finding the optimal cutoff point that minimizes misclassification of diseased and nondiseased subjects. The likelihood of a dichotomous outcome was predicted using logistic regression analysis.
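The analysis itself was performed in SPSS; a common way to select an optimal ROC cutoff, and a reasonable assumption for how such cutoffs are derived, is to maximize Youden's J (sensitivity + specificity − 1). A minimal Python sketch on simulated, hypothetical data (all variable names and values are illustrative only, not study data):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, roc_curve

    # Simulated, hypothetical data: operative time (min) and 1-year CKD status (0/1)
    rng = np.random.default_rng(0)
    op_time = rng.normal(600.0, 120.0, 400)
    ckd = (op_time + rng.normal(0.0, 100.0, 400) > 750.0).astype(int)

    # ROC analysis: the optimal cutoff maximizes Youden's J = TPR - FPR
    fpr, tpr, thresholds = roc_curve(ckd, op_time)
    cutoff = thresholds[np.argmax(tpr - fpr)]
    print(f"AUC = {roc_auc_score(ckd, op_time):.3f}, optimal cutoff = {cutoff:.0f} min")

    # Univariate logistic regression on the dichotomized predictor;
    # exp(beta) is the crude odds ratio for exceeding the cutoff
    x = (op_time >= cutoff).astype(int).reshape(-1, 1)
    fit = LogisticRegression().fit(x, ckd)
    print(f"Crude odds ratio = {float(np.exp(fit.coef_[0, 0])):.2f}")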

3. Results

During the study period, 500 patients underwent LDLT at the Liver Transplantation Unit, Gastroenterology Center, Mansoura University, Egypt. Their mean age was 51 years (range, 10–64 years). Most of the recipients were men (446, 89.2%). Their median MELD score was 15 (range, 6–48). Most of our patients had chronic HCV infection (453, 90.6%), which was the main indication for LDLT in our study (323, 64.6%). The patients’ demographics are shown in Table 1.

We excluded 87 patients because they were aged <18 years (n = 4), required renal replacement therapy before surgery (n = 2), lacked complete follow-up after transplantation or had been referred to another hospital (n = 12), or died during the first 12 months after surgery (n = 69). Of the remaining 413 patients, 33 (8%) developed CKD at 1 year after LDLT (Figure 1).

The comparison between group I patients with normal kidney function and group II patients who developed CKD 1 year after LT is shown in Table 2.

Cutoff values of ≥714 minutes for the operative time, ≥7750 mL for the blood loss, and <0.91 for the graft-to-recipient weight ratio (as a negative predictor) were calculated using receiver-operating characteristic (ROC) curve analysis (Table 3 and Figure 2) and used in the univariate regression analysis for predicting the likelihood of chronic kidney disease at 1 year (Table 4).

Multivariate Regression Analysis for Predicting the Likelihood of CKD at 1 Year. A binomial logistic regression was performed to ascertain the effects of preoperative RI, operative time (≥714 minutes), blood loss (≥7750 mL), and GRWR (<0.91, as a negative predictor) on the likelihood that participants would have CKD at 1 year. One studentized residual had a value of 2.649 standard deviations and two had a value of −2.749 standard deviations; these cases were retained in the analysis.

The logistic regression model was statistically significant, χ²(4) = 119.599, P < 0.0005. The model explained 58.9% (Nagelkerke R²) of the variance in CKD at 1 year and correctly classified 95.4% of cases. The sensitivity was 97.6%; specificity, 69.7%; positive predictive value, 71.9%; and negative predictive value, 97.4%.

Of the 4 predictive variables, operative time (≥714 minutes) and GRWR (<0.91, as a negative predictor) were statistically significant (as shown in Table 5). The patients with operative times of ≥714 minutes had 37.7 times higher odds of exhibiting CKD at 1 year, whereas the patients with a GRWR of <0.91 had 0.072 times the odds of exhibiting CKD at 1 year; equivalently, they had 13.9 times higher odds of not exhibiting CKD at 1 year (negative predictor).
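To make the inversion of the odds ratio explicit, a brief arithmetic sketch using the values reported above:

    import math

    # For a negative (protective) predictor, the odds ratio is <1; its
    # reciprocal gives the odds of the opposite outcome.
    or_grwr = 0.072                     # odds ratio for GRWR < 0.91 vs. CKD at 1 year (Table 5)
    print(round(1.0 / or_grwr, 1))      # 13.9: odds of NOT developing CKD at 1 year
    print(round(math.log(or_grwr), 3))  # -2.631: the underlying regression coefficient, beta = ln(OR)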

4. Discussion

LT has become the only option for patients with end-stage liver disease, and in our country this procedure is permitted from living donors only because deceased donation still lacks legislation [8]. CKD remains a common disorder after LT in spite of the progress in preoperative evaluation, anesthetic medications, surgical techniques, postoperative care, and immunosuppressive therapy. The reported rate of renal dysfunction in the context of LT varies among studies in relation to the source of the graft (living or deceased donor), the timing of kidney function monitoring, and the different definitions used for kidney dysfunction [9–12]. Thus, these definitions should be standardized for better comparison of studies worldwide. We found that approximately 8% of LDLT recipients had some degree of CKD by the first postoperative year. This rate varies among studies: Barreto et al. observed a 47% prevalence of some degree of kidney dysfunction [13]; two other studies found a prevalence of approximately 30% [9, 14]; and a third group of researchers found that only 12% of patients developed renal dysfunction after orthotopic LT [15]. The aim of this study was to clarify the occurrence of renal dysfunction at the time point of 1 year after surgery rather than the time of onset; that is, we analyzed a fixed time point rather than time to event (survival analysis). We believe the lower prevalence in this study may be related to living donation, a finding shared by Atalan et al. [16].

In the present study, preoperative renal impairment was significantly more prevalent in the group II patients who developed CKD 1 year after surgery than in group I. This observation has also been reported in previous studies [9, 10, 14, 15, 17] and may be explained by multiple factors. First, preoperative hemodynamic changes may increase the risk of renal dysfunction in liver cirrhosis by impairing renal perfusion through immune-mediated vasodilatation, parietal and renal parenchymal edema, hypoalbuminemia, and disturbances of the renin-angiotensin-aldosterone axis that lead to intravascular hypovolemia. Second, biological markers may be delayed in uncovering severe renal damage [18–20]. Third, intrinsic CKD predisposes patients with end-stage liver disease to kidney dysfunction, underscoring the evident link between the 2 systems, manifested by more severe encephalopathy, shock, and higher international normalized ratios in patients with severe renal dysfunction than in those without [18–21]. Although preoperative renal impairment differed significantly between the 2 groups and was a potential predictor of CKD at 1 year (crude odds ratio = 6.274), it was mild in all cases, related to the liver disease, and corrected before the time of surgery. Accordingly, it was not an independent predictor of the likelihood of CKD at 1 year in the multivariate logistic regression analysis; prolonged operative time and graft-to-recipient weight ratio were the only independent predictors of CKD at 1 year. Owing to our long-term aims, we did not take into account the early postoperative transient changes in renal function parameters that were normalized by fluid correction and minor adjustments of antibiotics and immunosuppressive drugs, without persistent effects.

Deceased-donor liver donation has not yet been legalized in our country, and only LDLT is allowed. This technique results in a partial graft with a reduced overall parenchymal mass compared with a whole-organ allograft. Such smaller grafts may be unable to meet the metabolic and hemodynamic demands of the recipients and, with a lower GRWR, may be implicated as a cause of allograft dysfunction and complications, including renal dysfunction. Although a GRWR of ≥0.8% is universally accepted as adequate, its median value (IQR 25–75) was 0.805 (0.8–0.9) in our group of patients who experienced renal dysfunction at 1 year, as compared with 1.008 (0.9–1.2) in the other group, reaching statistical significance (P < 0.0005). This finding is in accordance with a Korean study that included 284 cases of LDLT [22]. It may be explained as part of small-for-size (SFS) graft syndrome, a group of manifestations and complications of insufficient graft size or function with a poorly defined pathogenesis and without an obvious technical explanation [23, 24]. To our knowledge, this is the first study to identify a GRWR cutoff of <0.91 as a predictor of renal dysfunction 1 year after LT.
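For clarity, GRWR is conventionally computed as graft weight in grams divided by recipient body weight in grams, expressed as a percentage; a minimal sketch follows, with hypothetical example values rather than patient data:

    def grwr_percent(graft_weight_g, recipient_weight_kg):
        """Graft-to-recipient weight ratio (%) = graft weight (g) / recipient weight (g) x 100."""
        return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

    # Hypothetical example: an 800 g graft in a 70 kg recipient
    print(round(grwr_percent(800.0, 70.0), 2))
    # 1.14 -> above both the 0.8% adequacy threshold and the 0.91 cutoff identified in this study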

We observed that the operative time was significantly longer in the renal impairment group than in the group with normal kidney function. A plausible explanation is the relationship between surgical duration and vascular clamping on the one hand and frequent hypotensive episodes, greater norepinephrine requirements, and the use of chloride-rich fluids on the other. This finding is in agreement with those reported by relatively recent studies [25–30]. Our analysis yielded an unprecedented cutoff time of ≥714 minutes as a predictor of renal dysfunction 1 year after LT.

In our study, the median (IQR 25–75) intraoperative blood loss volume was 8500 mL (7000–9750) in the group of patients who experienced renal dysfunction at 1 year, as compared with 5400 mL (3700–7600) in the other group, reaching statistical significance (P < 0.0005) [27]. This finding is consistent with the view that meticulous intraoperative control of bleeding and stabilization of hemodynamic and electrolyte disturbances, with correction of myocardial function, are crucial for the prevention of renal disturbances after LT [11].

In addition, we did not observe a significant difference between the 2 groups in relation to the immunosuppression protocol, in contrast to other studies [16, 17]. In our center, the immunosuppressive protocol adopts low-dose tacrolimus therapy (serum target of 5–8 ng/mL) with mycophenolic acid, which corresponds to the renal function-sparing immunosuppression regimens used in many studies [31–34]. This protocol could significantly reduce kidney dysfunction in comparison with standard immunosuppressant treatment with a CNI (serum tacrolimus target of 8–10 ng/mL) [16, 17].

In conclusion, our preliminary data revealed that 4 factors predicted the occurrence of CKD 1 year after LDLT: preoperative renal impairment, GRWR, intraoperative blood loss, and operative time. In the multivariate regression analysis, only 2 factors remained as independent predictors of CKD at 1 year, namely, GRWR as a negative predictor with a cutoff value of <0.91 and prolonged operative time with a cutoff value of ≥714 minutes. The retrospective design of this study is an important limitation because of the possibility of missing data. Further larger studies are recommended to better understand the tolerability and safety of different immunosuppression protocols and their role in the pathophysiology of graft-associated renal dysfunction, together with the possible factors related to poor outcome.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Ethical Approval

The study was approved by the Institutional Review Board code R/16.12.82.

Conflicts of Interest

The authors declare that they have no conflicts of interest.