Leveraging Blood-Based Diagnostics to Predict Tumor Biology and Extend the Application and Personalization of Radiotherapy in Liver Cancers

While the incidence of primary liver cancers has been increasing worldwide over the last few decades, mortality has remained consistently high. Most patients present with underlying liver disease and have limited treatment options. In recent years, radiotherapy has emerged as a promising approach for some patients; however, the risk of radiation-induced liver disease (RILD) remains a limiting factor. Thus, the discovery and validation of biomarkers to measure treatment response and toxicity are critical to making progress in personalizing radiotherapy for liver cancers. While tissue biomarkers are optimal, hepatocellular carcinoma (HCC) is typically diagnosed radiographically, so tumor tissue is not readily available. Alternatively, blood-based diagnostics may be a more practical option, as blood draws are minimally invasive, widely available, and may be performed serially during treatment. Possible blood-based diagnostics include the indocyanine green (ICG) test, plasma or serum levels of HGF or cytokines, circulating blood cells, and genomic biomarkers. The albumin–bilirubin (ALBI) score incorporates albumin and bilirubin to subdivide patients with well-compensated underlying liver dysfunction (Child–Pugh class A) into two distinct prognostic groups. This review provides an overview of the current knowledge on circulating biomarkers and blood-based scores in patients with malignant liver disease undergoing radiotherapy and outlines potential future directions.


Introduction
Liver malignancies are a leading cause of cancer-related death worldwide. Hepatocellular carcinoma (HCC) is the most common liver malignancy, and its incidence has more than tripled in the US since 1980 [1,2]. In recent years, radiotherapy has emerged as a promising modality for patients with liver malignancies, aided by advances in target immobilization, localization, and treatment delivery. Intensity-modulated radiation therapy (IMRT) and volumetric arc therapy (VMAT) have allowed for highly conformal radiation plans that maximally spare the surrounding healthy liver. MRI-based online adaptive treatment is another promising technique to further spare healthy liver tissue [3][4][5]. Radiotherapy is generally a safe and highly effective treatment, with local-control rates for HCC of up to 95% [6][7][8][9][10]. However, the optimal dose and fractionation scheme, as well as the radiotherapy technique (e.g., photon versus proton irradiation), remain unclear. In addition, the potential for severe toxicities such as radiation-induced liver disease (RILD) in patients with underlying liver diseases (cirrhosis, steatosis) may be dose-limiting in some patients [11,12]. These challenges could be addressed by identifying accurate and meaningful biomarkers for liver cancer radiotherapy.
As new treatments emerge, the investigation of biomarkers and clinical scores to predict tumor biology is essential to guide patient management. Among several methods, blood-based diagnostics have unique advantages. Peripheral blood can be easily obtained and collected throughout the treatment and recovery periods, allowing for the measurement of toxicity, treatment response or disease progression. In addition, as blood draws are widely available without elaborate setups or imaging services, their use can be easily adopted into clinical practice. As virtual visits become increasingly common, blood-based biomarkers may facilitate remote monitoring without requiring patients to travel, potentially alleviating financial toxicities for patients undergoing cancer therapy [13]. The identification of circulating biomarkers in patients with HCC would be of particular use because the diagnosis is typically made radiographically; tissue biopsies are uncommon due to the risk of bleeding and needle-tract seeding [14]. This review provides an overview of current approaches to blood-based diagnostics for patients treated with radiotherapy for liver malignancies.

Clinical Scores: Child-Pugh, Barcelona Clinic Liver Cancer (BCLC), and Albumin-Bilirubin (ALBI) Grades
Historically, the Child-Pugh score (CPS) system (also referred to as the Child-Turcotte-Pugh score) has been used to assess hepatic function in a variety of liver malignancies, either by itself or as part of the Barcelona Clinic Liver Cancer (BCLC) staging system [15]. Although widely used, there are major concerns about the objectivity of the CPS system and its ability to stratify patients into prognostic groups. First, the presence and severity of both ascites and encephalopathy are subjective, often assessed by a single physician. Second, both parameters are strongly influenced by extrinsic factors such as medication, making them prone to fluctuations that may affect the clinical grade. Finally, some variables are interdependent; for example, low albumin may lead to increased ascites, resulting in a potentially disproportionate impact on the total score [16]. The BCLC staging system incorporates the CPS as a measure of underlying hepatic function, along with disease extent and Eastern Cooperative Oncology Group (ECOG) performance status, to create five stages of HCC. These five stages range from very early stage (BCLC 0) to terminal stage (BCLC D) and are predictive of prognosis. The BCLC score is widely used in clinical management.
Similarly, the albumin-bilirubin (ALBI) grade has been used to predict prognosis in HCC. Initially developed to assess liver function after surgery [17,18], it was designed to be more objective than the CPS system. The ALBI grade is calculated from plasma albumin and bilirubin using the formula ALBI = (log10 bilirubin [µmol/L] × 0.66) + (albumin [g/L] × −0.0852), which defines the following three groups: grade 1: ≤−2.60, grade 2: >−2.60 to ≤−1.39 and grade 3: >−1.39 [19]. Since first proposed in 2015 by Johnson and colleagues as a new and less biased approach towards the assessment of liver function, there has been increasing interest in validating the ALBI score for liver cancer patients undergoing radiotherapy. Since 2015, more than 1000 patients have been included in studies of pre-treatment ALBI scores, predominantly in retrospective analyses (Table 1). In line with previously published data for patients undergoing liver resection or other liver-directed therapies [20][21][22], several studies showed a strong correlation between overall survival and ALBI score [23][24][25][26][27]. However, the threshold at which an increase in ALBI score reached statistical significance varied between the studies (e.g., Murray et al. [23]: 0.1 increase vs. Toesca et al. [25]: 0.5), making cross-study comparison difficult. Of note, the ALBI score outperformed the CPS system in predicting mortality in some cases [23,[26][27][28]. Five studies showed a correlation between pre-treatment ALBI score and a decline in liver function following radiotherapy [23][24][25]28,29]. As previously described for other liver cancer therapies, the ALBI score was able to further subdivide patients in CPS class A into two distinct subgroups (ALBI grade 1 and 2) correlating both with overall survival and, more importantly, with the development of liver toxicity after radiotherapy. In this context, Lo and colleagues showed that the risk of developing RILD increases from 2.4% to 15.1% between ALBI grades 1 and 2 [24].
It has therefore been suggested that CPS class A patients with an ALBI grade of 2 should be treated like CPS class B patients. In line with these data, Gkika et al. proposed a shorter follow-up interval (~1 month after irradiation) and a more restrictive mean liver dose for patients with a higher baseline ALBI score, to account for their higher risk of early and late toxicity [29]. These findings may not translate to cholangiocarcinoma: Toesca and colleagues found no correlation between ALBI grade and toxicity following radiotherapy in patients with cholangiocarcinoma (CCA) [25]. The authors attributed the observed discrepancy between patients with HCC and CCA to the smaller patient cohort (n = 20 vs. 40 patients) and to potential differences in underlying liver fibrosis (15 vs. 65% of patients with fibrosis) between the cohorts. Similarly, ALBI did not have predictive power in patients with CPS class B in two studies [24,27], which again could be due to the small sample sizes but could also reflect a potential inaccuracy of the ALBI grade in detecting more severe underlying liver damage. Further prospective studies focusing on these patient subgroups should clarify these inconsistencies.
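The ALBI calculation described above can be sketched in a few lines of code; the function names are illustrative, but the coefficients and grade cut-offs follow the published formula by Johnson and colleagues:

```python
import math

def albi_score(bilirubin_umol_l: float, albumin_g_l: float) -> float:
    """Linear ALBI predictor:
    (log10 bilirubin [umol/L] * 0.66) + (albumin [g/L] * -0.0852)."""
    return math.log10(bilirubin_umol_l) * 0.66 + albumin_g_l * -0.0852

def albi_grade(score: float) -> int:
    """Map the ALBI score to grade 1-3 using the published cut-offs
    (grade 1: <= -2.60; grade 2: > -2.60 to <= -1.39; grade 3: > -1.39)."""
    if score <= -2.60:
        return 1
    elif score <= -1.39:
        return 2
    return 3
```

For example, a well-compensated patient (bilirubin 10 µmol/L, albumin 40 g/L) scores −2.748 and falls into ALBI grade 1, whereas a drop in albumin to 30 g/L at the same bilirubin shifts the score to −1.896 and the patient to grade 2, illustrating how the score can further stratify CPS class A patients.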

Indocyanine Green (ICG) Test
Indocyanine Green (ICG) is an FDA approved tricarbocyanine dye that has been utilized since the 1980s to evaluate hepatic function after several procedures [30][31][32][33]. After intravenous infusion, ICG binds to plasma proteins and subsequently gets excreted into the bile without biotransformation, providing a kinetic estimation of liver function. The speed with which ICG is eliminated depends on several parameters including hepatic blood-flow, allowing for an individualized and sensitive estimation of global liver function [34].
A group at the University of Michigan developed a treatment strategy based on the ICG retention rate 15 min after injection (ICGR15) to individually tailor radiation dose to a patient's underlying liver function (Table 2, I). In two independent studies (with 48 and 131 patients), they observed that ICGR15 levels mid-treatment, but not prior to irradiation, reflected functional liver reserve after radiotherapy [35,36]. Building upon these findings, they built a toxicity model using ICGR15 levels, which led to superior prediction of liver toxicity following radiation (AUC = 0.86 vs. 0.75; p = 0.04) [37]. Based on these results, the group designed a phase II study in which patients were re-assessed for changes in ICGR15 4 weeks after the third of five planned irradiation fractions. If hepatic function was unchanged at this time point, the remaining two fractions were given as initially planned. If liver function had declined, the radiation plan was either adjusted or patients were re-tested after another 4 weeks to allow for more recovery time [38]. The proposed adjustment of radiation dose based on ICGR15 was feasible and led to high local control rates (1y = 99%; 2y = 95%) with low complication rates (increase in CPS of 1 or 2 points within 6 months: 14%/7%). Interestingly, a recent study by the same group found that an ALBI-centric model performed equivalently to the ICGR15 model (AUC = 0.79 for both), supporting the use of the less labor-intensive ALBI grade as a biomarker [39]. More importantly, only pre- and mid-treatment levels of ALBI were associated with liver toxicity following radiation. Nevertheless, this approach may be of particular benefit to patients with CPS class B and C liver function, who currently are either excluded from radiotherapy or receive insufficient dosing due to concerns for further hepatic toxicity.
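The mid-course decision logic of the phase II design described above can be sketched as follows; this is a minimal illustration, and the `tolerance` parameter and function names are assumptions, not the trial's actual criteria:

```python
from enum import Enum

class Action(Enum):
    COMPLETE_AS_PLANNED = "deliver remaining two fractions as planned"
    ADAPT_OR_DELAY = "adjust the plan or re-test after another 4 weeks"

def mid_course_decision(icgr15_baseline: float, icgr15_mid: float,
                        tolerance: float = 0.0) -> Action:
    """Re-assessment 4 weeks after the third of five fractions:
    if ICGR15 (the fraction of dye retained at 15 min) has not worsened
    beyond `tolerance`, the course is completed; otherwise the plan is
    adapted or the patient is re-tested after a recovery period."""
    if icgr15_mid - icgr15_baseline <= tolerance:
        return Action.COMPLETE_AS_PLANNED
    return Action.ADAPT_OR_DELAY
```

For instance, a patient whose ICGR15 is stable at 15% retention would complete the planned five fractions, while a rise to 30% would trigger adaptation or a treatment break.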

Hepatocyte Growth Factor (HGF)
Hepatocyte growth factor (HGF) is the ligand of the mesenchymal-epithelial transition factor (MET) tyrosine-kinase receptor and is produced in the liver mainly by activated stromal cells called hepatic stellate cells. The activation of the HGF/MET pathway promotes cell proliferation, survival and invasion [55]. The MET pathway is physiologically activated during liver tissue regeneration; however, its constitutive activation in cancer cells can lead to a more aggressive and metastatic phenotype [56,57]. Based on these effects, HGF has been investigated as a biomarker for a variety of cancers, with most of these studies showing a negative correlation between plasma HGF levels and patient survival [58][59][60][61]. In patients with HCC, serum HGF levels were significantly elevated compared to healthy individuals [62].
The specific predictive value of circulating HGF for radiotherapy/SBRT response and RILD remains to be fully characterized [45,62,63] (Table 2, II). In a cohort of 104 patients treated with radiotherapy for liver malignancies, Cuneo and colleagues observed a correlation between pre-treatment HGF levels and overall survival (HGF high vs. low: 14.5 vs. 28.1 months). More interestingly, both pre- and mid-treatment levels of HGF predicted liver toxicity after treatment; however, multivariate analysis showed that mid-treatment HGF levels had the greatest predictive value [40]. Similarly, in a cohort of HCC and ICC patients treated with proton radiotherapy (n = 43), pre-treatment HGF levels were predictive of liver damage after irradiation [41]. Of note, the definition of significant liver damage differed between the studies (CTP increase of at least 2 points versus 1 point), as the number of events was small in proton-treated patients. Interestingly, a study by Ng and colleagues did not find a correlation between blood HGF levels and liver toxicity at any time point after radiotherapy [14]. Of note, the reported median HGF levels in these studies differed greatly (1.4 ng/mL (range not reported) [40] vs. 2.31 ng/mL (range 1.037-8 ng/mL) [64] vs. 0.824 ng/mL (range 0.16-11 ng/mL) [45]). In addition, 18 out of 47 patient samples were outside the detection range of the assay [45]. Thus, the observed discrepancy might arise from differences in the patient cohorts investigated or the assays used. Additional insights should be gained from an ongoing randomized phase III trial of proton versus photon SBRT in HCC patients (NCT03186898), which will investigate plasma HGF as an integrated biomarker for susceptibility to RILD.

Cytokines
The release of damage-associated molecular patterns (DAMPs), including cytokines and interleukins, has been reported following irradiation via NF-κB [64,65] (Figure 1). It is well known that irradiation leads to an increase in vessel permeability, resulting in both extravasation and recruitment of different leukocyte populations, which in turn creates a pro-inflammatory microenvironment [66]. This inflamed environment induces the release of cytokines, chemokines, and several other factors into the bloodstream, which can be measured by plasma assays such as ELISA. Several studies have evaluated various cytokines and interleukins as potential biomarkers for patients with HCC undergoing radiotherapy (Table 2, III). The following section focuses on markers that have been reported so far.
CD40 Ligand (CD40L), also known as CD154, is mainly expressed on the surface of platelets, endothelial cells and T-cells. It is cleaved by matrix metalloproteases, creating the pro-inflammatory molecule soluble CD40 ligand (sCD40L) [67,68]. It has been suggested that patients with cancer have higher levels of sCD40L than patients without cancer [69], and that there may be an association between higher plasma levels of sCD40L and the presence of metastatic disease [70]. For HCC patients undergoing radiotherapy, the reported association between OS and plasma levels of sCD40L has been inconsistent. Ng and colleagues reported that lower levels of sCD40L before treatment were associated with a higher risk of death three months after treatment [45]. In contrast, no correlation was found in a study by Cuneo et al. (sCD40L baseline: HR = 0.876; p = 0.122 and sCD40L 1 month: HR = 0.907; p = 0.568).
Interestingly, both studies showed that a decrease in sCD40L at mid-treatment was predictive of liver dysfunction following radiotherapy. These results suggest that monitoring sCD40L may help mitigate toxicity by facilitating a tailored radiation dose during therapy.

Interleukin-6 (IL-6) is a pro-inflammatory cytokine that is known to be elevated in patients with HCC, reflecting chronic inflammation in the liver [71]. Moreover, plasma IL-6 levels have been shown to stratify patients into early vs. late-stage disease and correlate with treatment outcomes after resection or trans-arterial chemoembolization [72][73][74]. With radiotherapy, high levels of circulating IL-6 seem to be associated with overall negative treatment outcomes. Ng and colleagues reported that high levels of soluble IL-6 receptor (sIL-6R) at baseline or mid-treatment, as well as baseline soluble gp130 (sgp130), were correlated with both an increased risk of death three months after treatment and an increased risk of liver toxicity. sIL-6R is the cleaved version of the membrane-bound IL-6R, which plays an important role in trans-signaling of the cytokine IL-6, whereas sgp130 inhibits the signaling of this pathway [75]. In addition, two other groups reported that high circulating IL-6 levels at baseline or mid-treatment were predictive of local failure after radiotherapy (HR 1.15 (95% CI 1.04-1.25) [43] and RR 1.019 (95% CI 1.011-1.028) [76]). Of note, for patients with HCC, this correlation might only be evident in previously treated, rather than treatment-naïve, patients [44]. Interestingly, Ajdari et al. found a correlation between the volume of liver irradiated with low doses (V5, the volume receiving ≥5 Gy) and both the risk of liver toxicity and higher plasma IL-6 levels during treatment [43]. These results indicate that clinically significant increases in circulating IL-6 levels may be detectable even earlier than mid-treatment. These findings are in line with previously published data on increased IL-6 levels following radiation in patients with other cancer types [76][77][78][79][80][81].
Tumor necrosis factor alpha (TNFα) is another inflammatory cytokine that can promote tumor progression, survival, and metastasis. TNFα is synthesized as pro-TNFα, mainly by NK-cells, T-cells and macrophages, and is activated upon enzymatic cleavage. There are two TNFα receptors. TNFα receptor 1 (TNFR1) is ubiquitously expressed, whereas TNFα receptor 2 (TNFR2) is mainly expressed on immune cells and shows a higher affinity for TNFα than TNFR1 [79]. Most biological activity of TNFα is thought to occur via TNFR1 [80]. Although TNFα has been shown to play a major role in the development and maintenance of liver inflammation and cirrhosis [81], reports have been inconsistent. Cha and colleagues found no correlation between overall survival and plasma levels of TNFα in patients with HCC undergoing radiotherapy (univariate analysis: p = 0.573) [44]. This may be related to the short half-life of TNFα in plasma (ca. 20-70 min) [82], as plasma samples were collected before and after the completion of radiotherapy. Interestingly, two other studies that investigated plasma levels of TNFα receptors over the course of radiotherapy found a correlation with the development of liver toxicity [45,46]. Cousins and colleagues found an association between liver toxicity and TNFR1, while Ng and colleagues found an association with mid-treatment TNFR2 levels. Additionally, Ng and colleagues found that TNFR2 levels were associated with a higher risk of death both at baseline and during treatment. TNFR1 levels were not detectable and thus not evaluated in this study [45]. Further studies are needed to clarify the relationship between blood levels of TNFα and its receptors and survival and radiation-induced toxicity in patients undergoing liver radiotherapy.

Circulating Blood Cells
Radiation-induced lymphopenia (RIL) has recently been identified as a negative prognostic biomarker. A recent meta-analysis reported that patients who experienced grade ≥3 lymphopenia after irradiation had a 65% increased risk of death compared to patients who did not [83]. Multiple factors have been reported to influence the risk of developing RIL, including patient age, sex, baseline absolute lymphocyte count (ALC), irradiated tumor volume and fractionation [49,50]. For patients undergoing radiotherapy for liver malignancies, several groups have reported transient decreases in lymphocyte counts [47][48][49][50][51][52] (Table 2, IV). However, the reported duration of RIL differed between studies and ranged from 1 month [48] up to 1 year [51] following irradiation.
Studies have been inconsistent regarding the type of lymphocyte most affected by radiotherapy. Gustafson and colleagues observed a decrease in CD3+ T-cells in a more diverse patient cohort including HCC, CCA and patients with liver metastases [48]. Zhang et al. reported the most prominent change in CD19+ B-cell counts in a cohort of HCC patients [49]. In contrast, Zhuang et al. did not observe any association between CD19+ B-cell, CD3+ T-cell or CD4+ T-cell counts and survival in their patient cohort [51]. Interestingly, the severity of RIL may depend on radiation technique. Zhang and colleagues showed that patients treated with SBRT had the smallest decline in lymphocytes compared to patients treated with conventional radiotherapy [49], suggesting that hypofractionation may reduce the risk of developing RIL. Furthermore, it has been proposed that the unique physical properties of proton radiation may reduce the risk and severity of RIL, as protons come to rest within a pre-specified range according to the Bragg peak, reducing the low-dose radiation to healthy liver tissue [84][85][86]. In this context, a recent study comparing lymphocyte levels of patients undergoing either photon or proton radiation for HCC showed that proton therapy led to a higher ALC nadir and significantly longer overall survival (33 vs. 13 months, p = 0.002) [54].
Several reports showed a correlation between overall survival and low lymphocyte counts in liver cancer patients, although the specific time points and parameters evaluated differed between studies [49][50][51][52]. Of note, markers correlating with overall survival might be specific to cancer histology, as we have previously reported for HCC versus CCA patients [47]. Three studies also reported inverse correlations between the baseline neutrophil-to-lymphocyte ratio (NLR) and treatment outcome (Byun et al.: HR 1.03 (95% CI 1.01-1.06); p = 0.016) [48,50,53]. Hsiang and colleagues additionally reported an association between liver toxicity and the maximal change in NLR (liver toxicity rate for ΔNLR <1.9 vs. >1.9: 7.5% vs. 35.1%).
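The NLR-based metrics reported above are straightforward to compute; in this minimal sketch, the 1.9 cut-off for the maximal NLR change is taken from the Hsiang data quoted in the text, while the helper names are illustrative:

```python
def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio from absolute counts (10^9 cells/L)."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

def high_toxicity_risk(nlr_baseline: float, nlr_max: float,
                       threshold: float = 1.9) -> bool:
    """Flag patients whose maximal NLR change exceeds the 1.9 cut-off
    associated with a higher liver toxicity rate (35.1% vs. 7.5%)."""
    return (nlr_max - nlr_baseline) > threshold
```

For example, a patient with a baseline neutrophil count of 4.0 and lymphocyte count of 2.0 (NLR 2.0) whose NLR peaks at 4.5 during treatment has a ΔNLR of 2.5 and would be flagged as higher risk under this cut-off.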
In summary, the available data suggest that the immune system should be recognized as an organ-at-risk in radiation treatment planning. Investigating the effect of radiation on circulating blood cells and their potential as biomarkers will be crucial in developing strategies for radio-immunotherapy approaches in the future [87].

Genomic Biomarkers
The advent of next-generation sequencing has provided new opportunities for utilizing liquid biopsies for diagnostic purposes [88] (Table 2, V). A promising biomarker in this context is circulating cell-free DNA (cfDNA), which consists of double-stranded DNA fragments of 150 to 200 base pairs with a short half-life. Levels of cfDNA have been reported to be increased after surgery, in autoimmune disease and in other types of inflammation [89]. In a first study of HCC patients undergoing radiotherapy, Park and colleagues observed quantitative changes in cfDNA and stratified patients into low and high cfDNA subgroups [90]. Baseline cfDNA levels correlated with tumor size and stage, as previously reported for other cancer subtypes [91]. Patients in the low cfDNA group following treatment showed improved local control compared to patients with high cfDNA (HR = 2.405 (95% CI 1.059-5.460)). The authors hypothesized that continuously elevated cfDNA plasma levels might indicate residual disease or tumor progression outside the treated volume. However, cfDNA levels may be less reliable in patients with acute or chronic inflammatory conditions, as inflammation may also increase cfDNA levels, and results must be interpreted accordingly [92].

Other Soluble Factors
Several other factors have been evaluated by single groups as potential biomarkers in the plasma of patients with both primary and secondary liver cancers undergoing radiotherapy (Table 2, VI). Suh and colleagues observed that plasma levels of vascular endothelial growth factor (VEGF) normalized to platelet count (VEGF/plt) were elevated after radiotherapy [93]. Additionally, higher baseline levels of VEGF/plt were associated with reduced progression-free survival (PFS) and a higher risk of out-of-field intrahepatic failure. Based on these observations, the authors hypothesized that a combination of radiotherapy with anti-angiogenic therapy might have a synergistic effect. A randomized phase III trial (RTOG 1112) is currently investigating the effect of sorafenib alone versus sorafenib in combination with SBRT (NCT01730937), and its results will provide insights into potential synergy. Similarly, Kim and colleagues studied soluble programmed death-ligand 1 (sPD-L1) in plasma and found that increased levels following radiotherapy correlated with poor survival and the development of metastasis in HCC patients. However, on multivariate analysis, only tumor size remained statistically significant in this study [94]. Radiation-induced upregulation of PD-L1 on cancer cells has been reported in both in vitro and in vivo models [94][95][96]. Several trials are currently investigating the combination of radiotherapy with various immune checkpoint blockade therapies [96,97]. In an exploratory study, Ng and colleagues observed changes in the plasma metabolomic profiles, including lipid and amino acid metabolism, of patients undergoing radiotherapy. Several metabolites were associated with underlying liver function, receipt of radiotherapy and, most interestingly, liver function decline following radiotherapy [98].
Two small trials investigated potential biomarkers in patients undergoing chemoradiation for primary and secondary liver malignancies. One study of patients undergoing chemoradiation for liver and lung metastases of colorectal cancer (n = 35) found a mid-treatment increase in serum ceramide, a pro-apoptotic sphingolipid. Interestingly, high versus low ceramide levels were able to categorize patients into responders and non-responders, respectively [99]. Another promising marker, explored in the plasma samples of 10 HCC patients, is the inter-alpha inhibitor heavy chain H4 isoform 2 precursor (ITIH4) [100]. Patients were stratified by prognosis, and those in the favorable-prognosis group showed significantly higher levels of ITIH4 at baseline compared to those with a poor prognosis. Both ceramide and ITIH4 need to be assessed in larger randomized studies to establish their clinical potential.

Summary and Future Directions
Radiotherapy is a mainstay of treatment for many cancers, and its integration with other therapies holds great promise for liver cancers. Further progress will be facilitated by new strategies for treatment personalization. However, these strategies will depend on the discovery and validation of biomarkers to guide individualized patient management. This need is particularly evident in patients with HCC, who often have underlying liver insufficiency and poorer overall health that limit management options. In recent years, irradiation of the liver has emerged as a valuable treatment option for these patients, but for those with more severe liver dysfunction, its applicability is hampered by concerns that RILD may result in further impairment of liver function. The availability of predictive and prognostic biomarkers, particularly ones that allow for the adjustment of treatment parameters such as dose and field size as treatment progresses, would increase the availability and safety of radiation therapy for a larger group of patients. Blood-based biomarkers may be most practical, as they are widely available and can be easily and frequently measured. However, blood-based biomarkers rely on the assumption that local therapy will result in a measurable systemic effect. Several prognostic and predictive biomarkers have recently been identified to bridge this gap (Table 3). However, most studies have been small and retrospective in nature, with heterogeneous treatment regimens that limit comparability. Interestingly, most markers predictive of liver toxicity following radiotherapy were measured mid-treatment, whereas prognostic biomarkers were more typically measured prior to treatment. In addition, most studies have focused on patients with HCC.
The unique features of HCC, a highly vascularized tumor often found in the setting of underlying liver fibrosis or cirrhosis, may limit the applicability of these findings to other primary and secondary liver cancers. To validate these biomarkers and strengthen their evidence base, well-powered, prospective randomized controlled trials need to be conducted. Studies are particularly needed for patients with CTP B and C scores, as they currently have limited treatment options and may particularly benefit from radiotherapy. Given the short treatment times of some regimens such as stereotactic body radiation therapy (SBRT), predictive tests with a short turnaround time are favored. Alternatively, a test could be performed after half the dose has been delivered, with a planned break, as proposed by colleagues at the University of Michigan [37,38].

Table 3. Potential prognostic and predictive scores and biomarkers for patients undergoing radiotherapy for liver malignancies.

Conclusions
Blood-based diagnostics hold great potential for personalizing radiotherapy in patients with malignant liver disease. Strong evidence supports the notion that patients with CPS class A baseline liver function should be further stratified by ALBI score prior to radiation treatment and treated according to their assigned subgroup. Other promising biomarkers are currently being investigated in prospective randomized trials and may provide additional insights into their prognostic or predictive value.

Acknowledgments: Illustration in Figure 1 was created using BioRender.com (accessed on 2 July 2021).

Conflicts of Interest:
F.H. received nonfinancial research support (provision of activity trackers to institution for studies) from Beurer GmbH outside the submitted work; F.H.'s department (Radiation Oncology, University Hospital Tuebingen, Germany) has research agreements with Elekta, Philips, Siemens, Sennewald Medizintechnik, Kaiku Health, TheraPanacea, PTW, and ITV. D.G.D. received consultant fees from Simcere, Surface Oncology, Sophia Bioscience, Innocoll and BMS and research grants from Exelixis, BMS and Surface Oncology. No reagents from these companies were used in this study.