1. Introduction
Malnutrition is a common yet underestimated risk factor for poor prognosis in hospitalized patients [1], affecting approximately 20–50% of them [1,2,3]. In the absence of appropriate treatment, up to two-thirds of those already malnourished experience further deterioration during hospitalization, while about one-third of patients admitted in a normal nutritional state develop malnutrition during their hospital stay [4]. Among patients with cardiovascular diseases (CVD), the prevalence of malnutrition reaches up to 70%, depending on the assessment method used. This condition is associated with disease progression, reduced exercise tolerance, increased rates of rehospitalization, and poorer overall prognosis [5].
Optimization of a patient’s health status prior to planned surgery can significantly reduce the risk of postoperative complications [2]. However, such a prehabilitation strategy has limited applicability in two dimensions. The first relates to the urgency of the procedure, or the potential for urgency to arise within an unpredictable time frame. In this situation, the limiting factor is time, which may be insufficient to achieve a satisfactory therapeutic effect.
The second aspect concerns comorbidities, the extent to which they are controlled, and their impact on current functioning—factors that can be assessed, for example, using the American Society of Anesthesiologists Physical Status (ASA-PS) classification [3]. In this case, the challenge lies in the patient’s adaptive capacity to undergo nutritional intervention, particularly when combined with the recommended physical activity aimed at improving global metabolism. This issue is particularly relevant to hospitalized older adults, who belong to a high-risk group for adverse health outcomes. Limited physical activity in this population promotes the development of sarcopenia. Frailty syndrome is observed in approximately 40–80% of patients with heart failure (HF), and its prevalence increases with age and disease severity [6]. Chronic inflammation accompanying HF, characterized by elevated levels of pro-inflammatory cytokines such as TNF-α and interleukin-6, enhances muscle catabolism, thereby contributing to sarcopenia and the development of frailty. Frailty and malnutrition frequently coexist, forming a vicious cycle of mutually reinforcing metabolic and functional impairments. Their coexistence is associated with poorer clinical outcomes, more frequent rehospitalizations, reduced quality of life, and increased mortality. The shared pathophysiological mechanism underlying these disorders is chronic inflammation, which impairs muscle function, suppresses appetite, and increases protein degradation, thereby contributing to disease progression [7]. Additionally, reduced mobility and impaired muscle coordination hinder chewing and swallowing, significantly limiting the intake of essential nutrients [8]. Another factor increasing the risk of malnutrition is pharmacotherapy, which may lead to decreased appetite, irritation of the gastrointestinal mucosa, impaired intestinal motility, and reduced saliva production, all of which can adversely affect nutritional status. Furthermore, steroid therapy following heart transplantation alters body composition and metabolism, contributing to weight gain, and glucocorticoids may indirectly affect albumin levels by modulating inflammation and protein metabolism [9,10,11]. Special attention should be given to patients with obesity, who may be paradoxically malnourished. This condition, known as sarcopenic obesity, is characterized by a loss of muscle mass accompanied by an excess of adipose tissue [6]. These patients may also experience qualitative malnutrition, defined as deficiencies in vitamins and minerals despite a high BMI.
Additional factors, although difficult to evaluate parametrically, include organizational and administrative aspects of the healthcare system, which are associated with prolonged hospital stays and an increased risk of potentially preventable hospital complications such as pressure ulcers, infections, and surgical site infections [12]. Equally important are patient cooperation, treatment tolerance, and the risk of interactions between medications and nutrients.
It is equally problematic in clinical practice to determine nutritional status accurately. Currently, there is no unified definition of malnutrition and no consistent nutritional guidelines for patients with HF [5] that would take into account both the course of the disease and the presence of comorbidities.
The diagnosis of nutritional disorders is further complicated by the lack of an ideal tool for assessing the degree of malnutrition. Nutritional status may be assessed using anthropometric parameters (e.g., body weight, body mass index (BMI), body composition, and their temporal changes) as well as laboratory measures (e.g., prealbumin, transferrin, albumin, total protein, lymphocyte count, and cholesterol concentration) [13,14,15,16,17]. Each of these approaches has significant limitations, and the risk of misclassifying malnutrition is substantial [16,17,18,19]. These markers are nonspecific, and laboratory reference ranges do not necessarily reflect underlying metabolic disequilibrium. Therefore, it is increasingly recommended to integrate information from all available methods in a given case to reliably identify patients for whom nutritional intervention is essential [20].
All these problems converge, as if through a lens, on an extremely specific group of patients, namely heart recipients. In patients debilitated by heart disease, in whom the timing of surgery cannot be planned, comprehensive nutritional intervention is challenging to implement due to the complexity of multi-organ interactions and the use of pharmacotherapy, which significantly affects commonly used nutritional markers [21,22].
This study primarily focused on identifying laboratory biomarkers that could facilitate and expedite the detection of malnutrition in patients following orthotopic heart transplantation (OHT). The aim was to investigate the associations between selected routinely measured venous blood parameters during hospitalization and the risk of malnutrition assessed based on BMI.
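As a minimal illustration of the BMI-based risk definition used for this aim, the classification step can be sketched as follows (hypothetical weights and heights; the 22 kg/m² cut-off is the one applied in this study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def undernourished(weight_kg: float, height_m: float, cutoff: float = 22.0) -> bool:
    """Flag undernutrition using the BMI < 22 kg/m^2 cut-off cited in the text."""
    return bmi(weight_kg, height_m) < cutoff

print(round(bmi(70, 1.75), 1))   # 22.9
print(undernourished(60, 1.75))  # True (BMI ~19.6)
```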
3. Results
The study group consisted of 53 patients: 7 women and 46 men. The median age was 54 years (IQR 44–62), and the median BMI was 26.3 kg/m² (IQR 23–29). Eight patients (15%) were undernourished, defined as BMI < 22 kg/m². The characteristics of the subjects, including laboratory parameters of nutritional status, are presented in Table 1.
Only albumin, hemoglobin, and total protein were normally distributed in the evaluated population, while the other parameters deviated from normality. Depending on the accepted laboratory standard, the prevalence of malnutrition among the subjects ranged from 13% to 100% (Table 2), a discrepancy of 87 percentage points (relative difference of >500%) in estimating malnutrition with these biomarkers.
Differences in the values of the analyzed nutritional status parameters and in the prevalence of malnutrition according to gender (women vs. men), age (older vs. younger), BMI (<22 vs. ≥22 kg/m²), and Nutritional Risk Screening 2002 (NRS-2002) score (<3 vs. ≥3) are presented in Table 3, Table 4, Table 5 and Table 6. Cholesterol levels were significantly higher in women, whereas creatinine and BUN were higher in men. In the other subgroups analyzed (age, BMI, NRS-2002), the values of the studied parameters did not differ significantly.
No significant differences in the analyzed markers were observed between the low- and high-risk malnutrition groups defined according to the NRS-2002 score (Table 6).
Table 7 presents the areas under the ROC curves (AUROCs) for predicting malnutrition based on the studied variables, using the recommended BMI cut-off of 22 kg/m². Only CRP predicted the risk of malnutrition, and only with weak accuracy (Figure 2), although CRP values themselves did not differ significantly between patients with higher and lower BMI (Table 5).
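For readers less familiar with AUROC, it can be computed via its Mann–Whitney interpretation: the probability that a randomly chosen malnourished patient has a higher marker value than a randomly chosen well-nourished one. A minimal pure-Python sketch, using hypothetical CRP values for illustration only (not study data):

```python
from itertools import product

def auroc(pos_scores, neg_scores):
    """AUROC via the Mann-Whitney interpretation: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case, counting ties as 0.5."""
    pairs = list(product(pos_scores, neg_scores))
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs)
    return wins / len(pairs)

# Hypothetical CRP values (mg/L), illustrative only.
crp_malnourished = [12.0, 25.0, 8.0, 30.0]
crp_well_nourished = [5.0, 9.0, 3.0, 7.0, 11.0]
print(round(auroc(crp_malnourished, crp_well_nourished), 3))  # 0.9
```

An AUROC of 0.5 corresponds to chance-level discrimination; values around 0.6–0.7 are conventionally regarded as weak, consistent with the accuracy reported for CRP here.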
Finally, correlations between the analyzed parameters (Spearman’s rank correlation coefficients) and agreement in malnutrition classification using these variables (contingency coefficient) were evaluated (Table 8; Table 9). Negative correlations were found between CRP and total protein (−0.342; p = 0.012), albumin (−0.666; p < 0.0001), cholesterol (−0.287; p = 0.037), and hemoglobin (−0.383; p = 0.0046). A positive correlation was observed between CRP and NLR (0.333; p = 0.014). Other statistically significant correlations are highlighted in bold in Table 2.
4. Discussion
Malnutrition is an important clinical problem with a significant impact on both in-hospital and long-term prognosis. It is particularly important in patients with severe chronic heart failure, including those being evaluated for organ transplantation [8]. The effects of malnutrition, metabolic dysfunction, and postoperative complications should be considered when developing a nutrition care plan for patients in the immediate post-transplant phase [25]. The aim of this study was to determine which routinely measured laboratory parameters—including composite indices derived from them—can aid in identifying patients with malnutrition or wasting. Patients with severe heart failure develop various forms of frailty syndrome, or rather the anorexia-cachexia-asthenia syndrome originally described in the oncology patient population [26]. We have shown that assessing nutritional status and defining malnutrition in patients with severe heart failure who become heart recipients is difficult and should not be based on the laboratory parameters classically used in clinical practice. It is necessary to combine several of them or to search for new biomarkers to increase the accuracy of diagnosis, which will be particularly important for designing reliable studies assessing the impact of nutritional interventions on prognosis. Implementing and adhering to a nutrition support protocol for OHT patients increases nutrient delivery and is associated with a reduced risk of complications [27].
Numerous studies have focused on the assessment of malnutrition risk in chronic heart disease [28,29,30,31]. Malnutrition is not only unintended weight loss but, more importantly, changes in body composition (with the most detrimental effect from muscle mass loss) and progressive metabolic disequilibrium (including impaired central hormonal regulation). The most commonly cited parameters, either individually or in combination, that may help identify patients at risk include total protein, albumin, prealbumin, total cholesterol, hemoglobin, leukocyte parameters (leukocytes, lymphocytes, NLR), and markers of inflammation (CRP). Various cut-off points for identifying malnutrition risk have been reported depending on the study population or methodology, and they often differ from standard laboratory reference ranges, as demonstrated in our analysis. BMI is widely used in clinical practice to assess malnutrition and is considered a more reliable tool than individual laboratory measurements. It is also an integral component of validated screening tools such as MNA, MUST, and NRS-2002 [23]. For many years, it has been emphasized that BMI is not a reliable indicator, although no superior routine anthropometric assessment method has yet been proposed. The routine cut-offs recommended by the World Health Organization (WHO) do not correspond to the clinical risk of malnutrition or to patient survival. The arbitrarily defined “normal” BMI range (18.5–25 kg/m²) does not correspond to the clinical norm (22–27.5 kg/m²), the range associated with the most favourable prognosis—a phenomenon also referred to as the “obesity paradox” [32]. Duerksen et al. highlight the importance of physical examination in the assessment and diagnosis of malnutrition. A well-validated bedside tool used for this purpose is the Subjective Global Assessment (SGA), which relies on features obtained from the medical history and physical examination (including loss of subcutaneous fat, muscle wasting, and the presence of edema) and classifies patients on a scale ranging from well-nourished to severely malnourished [33]. The sensitivity and specificity of SGA for diagnosing malnutrition in geriatric patients were reported to be 82% and 72%, respectively [34].
In our study, we demonstrated that only CRP, with weak accuracy, predicted the risk of malnutrition. However, as mentioned, CRP levels can also be influenced by factors unrelated to nutrition (e.g., cardiovascular diseases) and by other inflammatory conditions (e.g., infections). Therefore, from a clinical perspective, CRP should be interpreted solely as a complementary marker within a detailed assessment of nutritional status [23]. Similar findings were reported by Mądra-Gackowska et al., who observed that patients classified as being at high risk of malnutrition according to the Geriatric Nutritional Risk Index (GNRI) had a significantly higher median CRP level compared with the other groups. Notably, the group of well-nourished patients also included individuals with the highest CRP values, suggesting that relying solely on C-reactive protein concentration to assess nutritional status may lead to misclassification [35]. The results of our study are partly consistent with observations from a meta-analysis evaluating various markers of malnutrition in older adults [23]. That analysis demonstrated that no single parameter is ideal, although among the available markers, BMI, hemoglobin, and total cholesterol were found to be the most useful. In the context of acute illness, BMI, hemoglobin, total cholesterol, and total protein showed better predictive potential than CRP and leukocyte parameters. These negative findings highlight that reliance on single markers alone is inadequate in the population of heart transplant recipients, emphasizing the need for more comprehensive strategies to accurately identify patients at risk of malnutrition. Consequently, future research should aim to identify novel laboratory biomarkers to support the development of integrated models for detecting malnutrition in this population, focusing on approaches that are practical for routine clinical use and incorporate thorough patient assessment.
For this analysis, prealbumin and transferrin were not assessed as potentially more reliable markers of nutritional status, as these tests were not routinely performed during the study period. In heart transplant patients, inflammatory states are associated with an inverse relationship between hepatic protein synthesis and CRP levels, resulting in frequent deficiencies of visceral proteins in those with elevated acute-phase reactants [36]. Prealbumin screening should be performed only after excluding an acute inflammatory state (CRP > 15 mg/L). In patients not admitted to the intensive care unit and without signs of inflammation, prealbumin can serve as a useful prognostic marker of nutritional status; an increase in prealbumin of less than 0.04 g/L per week may indicate unsuccessful nutritional therapy [16]. Conversely, other studies have shown that serum visceral proteins (albumin and prealbumin) are not reliable predictors of nutrient deficiencies and should not be used as the sole criterion for guiding nutritional therapy, particularly in cohorts consisting mainly of patients with anorexia nervosa [37]. This may be because serum visceral proteins, including albumin and prealbumin, primarily reflect the body’s response to inflammation and disease severity and are influenced by fluid status rather than nutrient intake alone. According to American Society for Parenteral and Enteral Nutrition (ASPEN) guidelines, they are best considered markers of “nutrition-related risk” associated with inflammation [38]. In the study by Yeh et al., involving adult surgical intensive care unit (ICU) patients receiving enteral nutrition, it was shown that initial serum albumin levels and their changes over time are inversely associated with inflammation. Although baseline albumin levels may reflect the patient’s nutritional status, neither albumin levels nor prealbumin trends correlate with calorie or protein deficits, and they should not be used to assess the adequacy of nutritional support [39]. In addition, low serum albumin levels may be associated with HF itself. Albumin plays a key role in maintaining oncotic pressure and modulating the body’s antioxidant and anti-inflammatory responses. In patients with heart failure, hypoalbuminemia promotes fluid shift into the extravascular space, exacerbating fluid retention and edema, and has also been associated with myocardial fibrosis. Reduced ejection fraction in advanced heart failure leads to organ hypoperfusion and activation of the renin–angiotensin–aldosterone system, further accelerating disease progression [10,39]. Consequently, heart failure and malnutrition mutually reinforce each other, resulting in poorer prognosis. Interestingly, hypoalbuminemia appears to be particularly strongly associated with heart failure with preserved ejection fraction. This association may be mediated by chronic inflammation, which contributes to myocardial diastolic dysfunction, whereas inhibition of inflammatory pathways has been shown to improve cardiac function in experimental models [40]. Regarding transferrin, the literature presents conflicting evidence on its utility in assessing nutritional status: although serum transferrin decreases in severe malnutrition, it has been shown to be an unreliable marker of mild malnutrition and of lean body mass in older patients [16,41].
In a cohort of 60 patients assessed five years after OHT, Prenner et al. found that only serum creatinine (AUROC = 0.698) and albumin (AUROC = 0.606) were statistically significant predictors of malnutrition, in contrast to BMI (AUROC = 0.515) [18]. The Nutritional Risk Index (NRI) was found to be independently associated with the risk of postoperative infection (OR = 0.97) and prolonged postoperative ventilator support (OR = 0.96) [7]. In the study by Almutawa et al., participants with a higher post–heart transplant NRI had an 18% lower risk of death compared with those with a lower post-transplant NRI (HR = 0.82; 95% CI, 0.75–0.89; p < 0.001) [9]. Similarly, another study demonstrated a significant independent association between lower pre–heart transplant NRI and shorter post-transplant survival [42]. Interestingly, NRI may be affected by gender [9]. A pilot study in Italian heart transplant recipients concluded that malnutrition assessed with the MUST score does not seem to be associated with short-term mortality or major postoperative complications but may affect long-term mortality [22], which is probably related to the plethora of covariates that affect short-term prognosis. Bayram et al., in a cohort of 195 patients after OHT with a median follow-up of 503.5 days, demonstrated that malnutrition defined by a CONUT score ≥ 2 was an independent predictor of mortality, whereas malnutrition defined by the Prognostic Nutritional Index (PNI) was not [43]. Yoo and co-workers found that GNRI, PNI, and CONUT at the time of HT surgery were not associated with subsequent mortality and showed prognostic utility only at the time of HT listing in patients with severe heart failure [20]. In turn, in lung transplant recipients, preoperative albumin < 3.5 g/dL (HR = 2.723) and hemoglobin < 13 g/dL were independent predictors of death [44].
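For orientation, the composite indices discussed above are simple functions of routine laboratory values. The sketch below uses the formulas and thresholds as they are commonly cited in the literature (GNRI, Onodera's PNI, and the CONUT score); readers should verify them against the original publications before any clinical use, and the patient values are hypothetical:

```python
def gnri(albumin_g_l, weight_kg, ideal_weight_kg):
    """Geriatric Nutritional Risk Index, in its commonly cited form."""
    return 1.489 * albumin_g_l + 41.7 * (weight_kg / ideal_weight_kg)

def pni(albumin_g_dl, lymphocytes_per_mm3):
    """Prognostic Nutritional Index (Onodera's commonly cited form)."""
    return 10 * albumin_g_dl + 0.005 * lymphocytes_per_mm3

def conut(albumin_g_dl, lymphocytes_per_mm3, cholesterol_mg_dl):
    """CONUT score (0 = normal, higher = worse); commonly cited thresholds."""
    a = 0 if albumin_g_dl >= 3.5 else 2 if albumin_g_dl >= 3.0 else 4 if albumin_g_dl >= 2.5 else 6
    l = 0 if lymphocytes_per_mm3 >= 1600 else 1 if lymphocytes_per_mm3 >= 1200 else 2 if lymphocytes_per_mm3 >= 800 else 3
    c = 0 if cholesterol_mg_dl >= 180 else 1 if cholesterol_mg_dl >= 140 else 2 if cholesterol_mg_dl >= 100 else 3
    return a + l + c

# Hypothetical patient: albumin 3.2 g/dL (32 g/L), lymphocytes 1400/mm^3,
# cholesterol 150 mg/dL, weight 70 kg, ideal weight 72 kg.
print(conut(3.2, 1400, 150))     # 2 + 1 + 1 = 4 (exceeds the >= 2 cut-off cited above)
print(round(pni(3.2, 1400), 1))  # 39.0
print(round(gnri(32, 70, 72), 1))
```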
All the above-mentioned studies underline the need for further investigation of the role of available biomarkers and potential confounders. Given the widespread use of laboratory biomarkers, our findings add valuable insight into their utility for detecting malnutrition in OHT patients in everyday clinical practice.