How Old Is Old? An Age-Stratified Analysis of Elderly Liver Donors Above 65

In liver transplantation, older donor age is a well-known risk factor for impaired outcomes, particularly because older grafts are highly susceptible to ischemia-reperfusion injury. However, whether the factors correlating with impaired graft and patient survival following the transplantation of older grafts follow a linear trend among elderly donors remains elusive. In this study, liver transplantations performed between January 2006 and May 2018 were analyzed retrospectively. Ninety-two recipients of grafts from donors ≥65 years were identified and divided into two groups: (1) 65–69 years and (2) ≥70 years. One-year patient survival was comparable between recipients of grafts from donors aged 65–69 and ≥70 years (78.9% vs. 70.0%). One-year graft survival was 73.1% (donors 65–69) and 62.5% (donors ≥70), and multivariate analysis revealed superior one-year graft survival to be associated with a donor age of 65–69 years. No statistically significant differences were found for rates of primary non-function. Within the 65–70-year range, donor age itself does not appear to have a distinct impact on graft and patient survival. The risk of advanced donor age needs to be balanced against other risk factors, as these donors provide grafts with lifesaving function.


Introduction
In orthotopic liver transplantation (oLT), donor organ shortage is, to a variable extent, a challenge for many centers and procurement organizations. The unmet need for life-sustaining grafts demands the utilization of so-called extended criteria donors, e.g., donors with pre-existing diseases, donation after circulatory death, prolonged intensive care stay, etc. [1]. Among these criteria, elevated donor age is a matter of ongoing debate. The use of older grafts is complicated by their reduced capacity for recovery and their high susceptibility to ischemia-reperfusion injury. In sufficiently large studies, donor age has been shown to impact the outcome of oLT negatively and linearly over the entire age range, and it is part of the widely accepted donor risk index (DRI) [2][3][4]. Internationally, the maximum accepted age of deceased donors differs substantially. Lately, the utilization of older donors has captured the focus of discussion in the United States: Halazun et al. demonstrated a very unbalanced utilization of livers from donors exceeding 70 years across the various United Network for Organ Sharing (UNOS) regions [5]. At the same time, the utilization of septuagenarian and octogenarian donors has become common practice in other regions worldwide [3]. This discrepancy reflects the ongoing debate on the risks and prospects of elderly donors in oLT. Advanced donor age has been shown to add disproportionate risk in hepatitis C virus (HCV)-positive recipients before the advent of direct-acting antiviral agents (DAAs) [6]. Recipients with hepatocellular carcinoma (HCC), in contrast, seem to be less affected by the generally reduced graft survival of older donors [2]. Haugen et al. recently demonstrated that declining a graft on account of the donor's old age leads to inferior survival prospects for the potential recipient [7]. The aforementioned insights and findings are helpful in the allocation process when an elderly graft is offered. Nevertheless, center-specific and individual circumstances may modify the outcome of oLT recipients of elderly grafts. This retrospective single-center analysis was therefore designed to identify the risks and potential benefits of utilizing grafts from donors aged 65 years or older.

Study Design and Study Population
All adult patients who underwent oLT at the Department of General, Visceral and Transplant Surgery, University Hospital Münster, Germany between January 2006 and May 2018 were screened for inclusion. Ninety-two patients with donors ≥65 years were identified and included in the final analysis. Based on donor age, the eligible patients were further stratified into two groups: (1) 65–69 years and (2) ≥70 years. The study was designed as a retrospective single-center study with a follow-up period of 12 months. It was conducted in accordance with the ethical principles stated in the Declaration of Helsinki, and all patients provided consent for the routine recording of clinical data. Specific study approval was obtained from the local ethics committee (Ethik-Kommission der Ärztekammer Westfalen-Lippe und Westfälischen Wilhelms-Universität, No. 2019-473-f-S). All data were collected retrospectively from patients' charts, the Eurotransplant Network Information System (ENIS), or in-house transplant data files and were de-identified before analysis. All allografts were procured from donation after brain death donors.

Demographic Data
Baseline donor parameters were obtained from ENIS and included age, gender, body mass index (BMI), and donor center (national/international); in addition, the donor risk index (DRI) was calculated as described previously [4]. Baseline datasets for eligible recipients included age, gender, BMI, the model for end-stage liver disease (MELD) score, indication for oLT, high urgency (HU) status, hepatitis C status, cold and warm ischemia times, and the number of prior transplants. The indication for oLT was defined as the underlying cause/disease and classified as follows: acute liver failure (ALF), hepatocellular carcinoma (HCC), viral hepatitis, cholangitis (including primary sclerosing cholangitis (PSC), primary biliary cholangitis (PBC), and secondary sclerosing cholangitis (SSC)), alcoholic cirrhosis, polycystic liver disease (PLD), and others (including Caroli disease, hemochromatosis, amyloidosis, and cryptogenic cirrhosis).
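For readers who wish to reproduce the score, the DRI of Feng et al. [4] combines donor age, cause of death, race, graft type, donor height, share type, and cold ischemia time in a log-linear model. The following is a minimal Python sketch using the coefficients as we recall them from [4]; the function and variable names are ours, and the coefficients should be verified against the original publication before any reuse.

```python
import math

def donor_risk_index(age, cod, race, dcd, partial_split,
                     height_cm, share, cit_hours):
    """Illustrative sketch of the DRI after Feng et al. [4].

    cod:   cause of death, one of "trauma", "anoxia", "cva", "other"
    race:  "white", "african_american", "other"
    share: "local", "regional", "national"
    """
    x = 0.0
    # Donor age enters as a step function, not linearly
    if 40 <= age < 50:
        x += 0.154
    elif 50 <= age < 60:
        x += 0.274
    elif 60 <= age < 70:
        x += 0.424
    elif age >= 70:
        x += 0.501
    x += {"trauma": 0.0, "anoxia": 0.079, "cva": 0.145, "other": 0.184}[cod]
    x += {"white": 0.0, "african_american": 0.176, "other": 0.126}[race]
    if dcd:
        x += 0.411                       # donation after circulatory death
    if partial_split:
        x += 0.422                       # partial/split graft
    x += 0.066 * (170 - height_cm) / 10  # shorter donors raise the score
    x += {"local": 0.0, "regional": 0.105, "national": 0.244}[share]
    x += 0.010 * cit_hours               # cold ischemia time in hours
    return math.exp(x)

# Example: 72-year-old CVA donor, 165 cm, regional share, 10 h CIT
print(donor_risk_index(72, "cva", "white", False, False, 165, "regional", 10))
```

With these inputs the sketch yields a DRI of roughly 2.4, in line with the rather high cohort DRI above 2 discussed later in this paper.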

Outcome Measures
The primary outcome for this study was one-year patient survival, estimated by the Kaplan-Meier method and compared using log-rank testing. The secondary outcome parameters encompassed one-year overall and death-censored graft survival; 30-day and 90-day patient and graft survival; and primary non-function (PNF), defined as graft failure without any identifiable cause such as rejection and/or vascular thrombosis, resulting in death or re-transplantation within 30 days of the initial oLT. Additional secondary outcome measures were the rates of biopsy-proven acute rejection (BPAR); early allograft dysfunction (EAD) as defined by Olthoff et al. [8]; peak serum values of alanine transaminase (ALT) and aspartate transaminase (AST); length of stay in the intensive care unit (ICU); length of hospital stay; death during the initial hospitalization; number and length of readmissions (not oLT-specific) following discharge; the frequency of ischemic-type biliary lesions (ITBL); and the rate of re-transplantation. All re-transplantations within the observational period of 12 months were counted as a complication for the respective group of the initial oLT and not as an additional oLT case, as reported previously [9,10].
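As a point of reference, the Olthoff definition of EAD [8] can be written as a simple predicate over first-week laboratory values. This is a sketch with hypothetical parameter names, not code used in the study.

```python
def is_ead(bilirubin_day7, inr_day7, peak_alt_week1, peak_ast_week1):
    """Early allograft dysfunction per Olthoff et al. [8].

    Any one criterion suffices: bilirubin >= 10 mg/dL on postoperative
    day 7, INR >= 1.6 on day 7, or ALT/AST > 2000 IU/L within the first
    seven postoperative days.
    """
    return (bilirubin_day7 >= 10.0
            or inr_day7 >= 1.6
            or max(peak_alt_week1, peak_ast_week1) > 2000)
```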

Statistical Analysis
Continuous variables were presented as mean and standard deviation, except for the number of reoperations and the length of readmissions, which were presented as median (minimum and maximum). Categorical variables were presented as absolute and relative frequencies. Comparisons between the two groups were performed using the Wilcoxon rank-sum test for continuous variables or Fisher's exact test for categorical variables. Survival between groups was compared using the log-rank test, and p-values < 0.05 were considered statistically significant. Additionally, Cox proportional hazards regression models were used to investigate the influence of different variables on one-year patient survival, death-censored graft survival, and overall graft survival. The univariate analysis included donor age (as a continuous variable as well as a categorical variable (65–69 vs. ≥70 years)), recipient age, HCV status, cold and warm ischemia times, labMELD, PNF, BPAR, re-operation, number of readmissions, and ICU length of stay. Thereafter, stepwise selection was used to identify the relevant variables for a multivariable Cox regression analysis. Results are shown as hazard ratios (HR) with 95% confidence intervals (CI) and the p-value of the likelihood ratio test. All statistical analyses were performed with SAS 9.4 (SAS Institute, Cary, NC, USA).
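The analyses were run in SAS 9.4. For illustration only, the core survival comparisons could be reproduced along the following lines in Python with the lifelines package; the file and column names (olt_cohort.csv, time_days, death, donor_ge70, labmeld, cit_h) are hypothetical placeholders for the de-identified study data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical de-identified cohort; column names are illustrative.
df = pd.read_csv("olt_cohort.csv")

young = df[df.donor_ge70 == 0]   # donors 65-69 years
old = df[df.donor_ge70 == 1]     # donors >= 70 years

# Kaplan-Meier estimates of one-year patient survival per donor age group
kmf = KaplanMeierFitter()
for name, grp in [("donor 65-69", young), ("donor >=70", old)]:
    kmf.fit(grp.time_days, event_observed=grp.death, label=name)
    print(name, kmf.predict(365))   # estimated survival at one year

# Log-rank comparison between the two donor age strata
lr = logrank_test(young.time_days, old.time_days,
                  event_observed_A=young.death, event_observed_B=old.death)
print(lr.p_value)

# Cox proportional hazards model for one-year patient survival
cph = CoxPHFitter()
cph.fit(df[["time_days", "death", "donor_ge70", "labmeld", "cit_h"]],
        duration_col="time_days", event_col="death")
cph.print_summary()   # hazard ratios with 95% CIs
```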

Study Population Characteristics
Four hundred and six patients were included in the primary explorative analysis, representing all patients who received an oLT between January 2006 and May 2018 at the Department of General, Visceral and Transplant Surgery, University Hospital Münster, Germany. The average donor age in the overall cohort was 51.9 ± 15.1 years, with the youngest donor being 12 and the oldest 84 years old. Patients were then stratified by donor age with a cut-off of 65 years: 314 (77.3%) patients received grafts from donors younger than 65 years, while 92 (22.7%) patients were transplanted with grafts from donors ≥65 years. When comparing baseline recipient characteristics (Table 1), significant differences were found for recipient age (<65: 51.5; ≥65: 55.4 years; p = 0.005) as well as for the number of HU oLTs (<65: 22; ≥65: 1; p = 0.037). Next, the 92 patients who received a liver graft from donors ≥65 years were further stratified by donor age into two groups: (1) 65–69 years (52 patients, 12.8%) and (2) ≥70 years (40 patients, 9.8%) (Table 2). While the two groups were comparable regarding baseline recipient characteristics (age, gender, and BMI), differences were found for the underlying disease as an indication for oLT, although without reaching statistical significance: among the younger donor group, fewer recipients suffered from HCC (17.3% vs. 30%; p = 0.073) and alcoholic cirrhosis (15.3% vs. 35%; p = 0.073), whereas more cases had ALF (13.5% vs. 2.5%; p = 0.073). In addition, the mean labMELD score, warm and cold ischemia times, and the percentage of HCV-positive recipients showed no differences between the two groups (Table 2).

Effect of CIT
CIT was classified into four categories (<8 h, 8–10 h, 10–12 h, >12 h) according to Cassuto et al. [11], and the distribution among the age-stratified groups with donors ≥65 years was analyzed. Most grafts from donors aged 65–69 were transplanted after a CIT of 8–10 h (39.2%), while the majority of grafts from donors ≥70 were transplanted after a CIT of 10–12 h (39.4%) (Figure 4A). Next, one-year patient survival was analyzed with groups stratified by CIT. No differences were found between CIT subcategories among donors aged 65–69 (Figure 4B) or donors ≥70 (Figure 4C). In addition, no significant differences were found when matching CIT categories were compared across the donor subgroups.
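The stratification itself is a simple binning step. A self-contained pandas sketch with illustrative values follows; note that the handling of values falling exactly on a boundary (8, 10, or 12 h) is our assumption, as it is not specified by the categories above.

```python
import pandas as pd

# CIT strata after Cassuto et al. [11]; right-closed intervals
# (0, 8], (8, 10], (10, 12], (12, inf) are an assumption on our part.
cit_hours = pd.Series([6.5, 9.0, 10.8, 13.2], name="cit_h")  # illustrative values
cit_group = pd.cut(cit_hours,
                   bins=[0, 8, 10, 12, float("inf")],
                   labels=["<8 h", "8-10 h", "10-12 h", ">12 h"])
print(cit_group.value_counts())
```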

Discussion
It is debatable whether center-specific conclusions can be drawn from large multicenter prospective studies like the CTS or from analyses of national registries like UNOS [2,5]. Therefore, this study aimed to clarify the extent to which general considerations regarding high donor age in oLT apply to a single-institution LT program in which donors exceeding 65 years represent 22.6% of all donors and 9.8% are older than 70 years. Our cohort included fewer donors exceeding 70 years (9.8% vs. 13%) and 65 years (22.6% vs. 25.2%) compared to recently published data from the Eurotransplant region and the CTS [2,3]. As expected, our rate of utilization of septuagenarian donors was notably higher than that reported for the UNOS region (9.8% vs. 4.3%) [5]. Recipients' baseline characteristics were in general comparable between both elderly donor age groups. One difference was the rate of ALF as an indication for oLT (13.5% (65–69 years) vs. 2.5% (≥70 years); p = 0.069) between the two elderly donor age categories, even though the difference was not significant, most likely due to the limited number of cases. This finding is consistent with the CTS and Eurotransplant data and could be attributed to the prioritization of ALF candidates in the allocation process. The percentage of HCV-positive recipients of grafts from donors exceeding 69 years was higher than among recipients of grafts from donors aged 65–69 years (17.5% vs. 11.5%; p = 0.55), again without reaching significance. Since DAAs became available only toward the end of the study period, this finding runs counter to the traditionally avoided combination of HCV and older donors. With the introduction of DAAs, HCV-positive recipients achieve outcomes comparable to those of HCV-negative patients after oLT [12]. Moreover, it was recently shown that even the use of viremic HCV-positive donors in HCV-negative recipients does not impair the outcome in the post-DAA era [13].
The higher rate of HCC recipients in the older donor category in our study is also consistent with the CTS and Eurotransplant data. In our opinion, allocating older grafts to HCC recipients is justified, as it has been shown that older HCC recipients in particular show the least impairment of graft survival when receiving elderly grafts [2]. Allocating elderly grafts to specific indication categories, e.g., HCC, is a matter of ongoing debate, and further analyses from large studies and registries may be necessary to clarify this issue with a focus on the individual benefit of older grafts for the recipient [14,15]. Unfortunately, the outstanding work by Haugen et al., which revealed a substantial survival benefit for waitlist candidates accepting an elderly graft, could not provide benefit information for specific indications such as HCC [7]. In our population, grafts from donors exceeding 70 years were allocated more frequently to HCC and alcoholic cirrhosis recipients, whereas ALF recipients preferentially received livers from younger donors. Even though the differences did not reach the level of significance, this distribution reflects our intended recipient selection policy at the time. Careful selection of recipients of older grafts appears reasonable to avoid adding risks that result in unfavorable outcomes. Through such measures, even the use of octogenarian donors has been reported to provide results comparable to those achieved with younger donors [16][17][18]. For donors exceeding 70 years, Bertuzzo et al. even reported 5-year graft survival rates in unselected recipients comparable to those of younger donors [19].
As our study did not reveal any entirely new, previously unreported findings, the interpretation of the data obviously does not lead to changes in the clinical routine. Nevertheless, as mentioned above, the acceptance of grafts in our program is decided based on individual consideration of recipient- and donor-based risk factors. Therefore, we believe it is essential to know the exact impact of increased donor age under our individual circumstances. This is especially relevant as we are confronted with a severe donor shortage, forcing us to evaluate marginal grafts regularly, which is reflected by the rather high DRI above 2 in our cohort. Notably, the impact of the routine implementation of normothermic machine perfusion (NMP) on the outcome of older grafts, and likewise marginal grafts, will be a future focus of the transplant community [20]. There is currently one randomized controlled trial demonstrating reduced ischemia-reperfusion injury (IRI) in older liver grafts (donors ≥70 years) undergoing NMP [21]. Whether reduced IRI translates into superior clinical outcomes (e.g., reduced PNF) currently remains unknown. However, viability assessment during NMP can undoubtedly provide significant insights into graft quality and the prediction of liver function and thus might help to expand the donor pool by assessing older liver grafts under near-physiological conditions.
Since older liver grafts are highly susceptible to cold-storage-elicited injury, it is recommended to keep the CIT as short as possible (preferably below 8 h [22]) to minimize deleterious effects on the post-transplant outcome. The CIT reported here (donors 65–69: 10.5 h; donors ≥70: 10.8 h) is rather long compared to previous studies [23,24] and must be considered when comparing the results. However, our analysis also shows that longer ischemia times can be accepted, even for older grafts, when the remaining risk factors are well balanced. In addition, our analysis found no evidence of a higher CIT susceptibility among donors exceeding 70 years compared to donors aged 65–69 years, which indicates that the functional reserve and the capacity for response and recovery of older liver grafts do not follow a strictly linear pattern.
Our study has not yielded insights that differ fundamentally from previously published work on this matter. One-year patient survival in our study was unimpaired when comparing donors below 65 years and donors aged 65–69 years (77.1% vs. 78.5%). The survival rates of recipients of grafts from donors exceeding 70 years were inferior to those of recipients of grafts from donors aged 65–69 years, even though the difference was not significant due to the limited numbers in this single-center analysis. Having established elevated donor age as a risk factor for graft failure in our Cox proportional hazards regression models, it appears conceivable that well-conducted risk balancing might mitigate donor-associated risk factors, since the age-stratified log-rank analysis revealed no significant differences in graft survival between donors aged 65–69 and ≥70 years. Given the aforementioned observation that the impact of increasing donor age is linear over the entire age range, one has to conclude that the differences between donors aged 65–69 years and those exceeding 70 years are very small. Accordingly, the meta-analysis by Dasari et al. did not identify 70 years as a donor age threshold predicting impaired graft or patient survival rates in LT [24]. Confirming the linearity of the influence of donor age on the outcome of oLT in their recent analysis of Eurotransplant data, Pratschke et al. likewise concluded that there is no donor age threshold for the prediction of graft failure in oLT [25].

Conclusions
In summary, our results confirm previous findings that an absolute cut-off age threshold for donors does not exist. Elevated donor age should rather be considered one of several parameters in the decision-making process for allocation and individualized acceptance of grafts for oLT. Elderly donors, especially those in the range of 65–69 years, contribute substantially to reducing the burden of donor organ shortage and waitlist mortality.

Informed Consent Statement:
Informed consent was obtained from all subjects involved in the study.

Data Availability Statement:
The datasets generated and/or analyzed during the current study are available from the corresponding author upon reasonable request.