Search Results (6,884)

Search Parameters:
Keywords = nutritional status

25 pages, 783 KB  
Article
Field-Based Evaluation of Reactive Oxygen Species Treatments and Fungicide Protections in Potato: Effects on Late Blight, Plant Nutritional Status, Yield, and Tuber Quality
by Karol Skrobacz, Małgorzata Szostek and Maciej Balawejder
Agronomy 2026, 16(9), 912; https://doi.org/10.3390/agronomy16090912 - 30 Apr 2026
Abstract
The aim of the study was to determine, under field conditions, the effects of O3, H2O2, and fungicide protection on potato late blight severity, SPAD values, tuber yield, and mineral composition, and additionally to assess whether the number of ozone applications modifies selected tuber quality traits. Two complementary field experiments were conducted in 2017 and 2018. In the main experiment, control, fungicide protection, ozone fumigation, and foliar H2O2 treatments were compared with respect to late blight severity, SPAD response, yield, and macro- and micronutrient contents in tuber peel and flesh. In the supplementary experiment, single, double, and triple ozonation were compared in relation to starch content, vitamin C concentration, and tuber mineral composition. Fungicide treatment most effectively limited late blight symptoms, particularly at later assessment dates, and was associated with the highest tuber yield. SPAD values, yield, and several mineral traits were strongly dependent on the study year, indicating a major contribution of environmental conditions. The response to O3 and H2O2 was selective and less stable than that observed under fungicide protection. In the supplementary experiment, the number of ozone applications did not significantly affect starch content. Vitamin C concentration depended mainly on the study year, whereas tuber mineral composition depended mainly on year and tissue type. The results indicate that, under field conditions, fungicide protection remained the most effective option for limiting late blight and achieving the highest tuber yield, whereas O3 and H2O2 should be regarded as factors capable of modifying selected plant and tuber traits, but not as direct substitutes for standard chemical protection. Full article
(This article belongs to the Special Issue Harnessing Reactive Oxygen Species (ROS) for Crop Performance)
23 pages, 3141 KB  
Review
From Growth Trajectory to Functional Decline: Age-Contextualized Nutritional Strategies for Muscle Vulnerability. A Narrative Review
by Luisa Malaguarnera, Vincenzo Sortino, Sofia Surdo and Salvatore Piro
Nutrients 2026, 18(9), 1437; https://doi.org/10.3390/nu18091437 - 30 Apr 2026
Abstract
Muscle vulnerability occurs at both extremes of the human lifespan, although its biological significance differs substantially between developmental growth and late-life decline. During childhood and adolescence, insufficient muscle accretion reflects disruption of physiological anabolic trajectories driven by inadequate energy availability, inflammatory burden, endocrine imbalance, or disease-associated catabolism. In older adults, muscle deterioration is characterized by anabolic resistance, neuromuscular remodeling, chronic low-grade inflammation, and hormonal decline, culminating in sarcopenia and loss of functional independence. The absence of harmonized diagnostic frameworks across age groups limits direct translational extrapolation. A lifespan-informed perspective distinguishing growth-supportive from function-preserving nutritional approaches is, therefore, required. This narrative review examines how major classes of nutritional bioactives interact with molecular pathways regulating skeletal muscle homeostasis in fragile populations across the lifespan. The analysis encompasses energy adequacy, protein quantity and quality, amino acid-dependent anabolic signaling, vitamin D status, lipid-derived mediators, redox-modulating phytochemicals, and micronutrients supporting mitochondrial bioenergetics. In pediatric contexts, nutritional interventions primarily aim to restore anabolic permissiveness within a structurally intact growth environment. In aging individuals, strategies focus on mitigating anabolic resistance through optimized protein intake, correction of micronutrient insufficiencies, and integration with resistance exercise to preserve functional capacity. This narrative review emphasizes the need to distinguish mechanistic rationale from clinically validated interventions, as improvements in molecular pathways do not consistently translate into meaningful functional outcomes. Full article
(This article belongs to the Section Geriatric Nutrition)

13 pages, 851 KB  
Article
Dietary Iron Sources Among 9-Month-Old Infants from Low-Income Households
by Elizabeth F. Acquah, Jeffrey D. Labban, Seth M. Armah, Maureen M. Black, Marjorie Jenkins, Deborah Clarice Andoh and Jigna M. Dharod
Nutrients 2026, 18(9), 1417; https://doi.org/10.3390/nu18091417 - 30 Apr 2026
Abstract
Background: The 2025–2030 Dietary Guidelines for Americans recommend that 6–12-month-old infants receive 11 mg iron/day. The contribution of iron-rich foods in meeting guidelines is unclear. Objectives: The aims were to: (1) determine the contribution of iron-fortified cereal, infant formula and heme-iron sources to infants’ total dietary iron intake; (2) examine differences in iron adequacy by milk-feeding type; and (3) identify feeding patterns associated with meeting daily iron requirements through dietary sources. Methods: Mothers of infants were recruited from a pediatric clinic and 24 h feeding recalls were conducted to estimate infants’ iron intake. Infants’ milk-feeding types were: breastmilk only (BF), mixed (MF), or infant formula only (FF). Main outcomes were: meeting/not meeting daily iron requirement (11 mg) overall and by milk-feeding type; contribution of iron-fortified infant cereal, formula and meat to daily iron intake. Descriptive statistics, bivariate chi-square tests, and multivariate logistic regression analyses were conducted. Results: Most participants identified as African American or Hispanic (76%) and were enrolled in the Special Supplemental Nutrition Program for Women, Infants, and Children (84%). Thirty-nine percent consumed < 11 mg iron/day from dietary sources. By milk-feeding type, inadequate iron intake was significantly higher among the BF (72%) and MF (74%) groups vs. the FF group (24%, p < 0.05). Iron-fortified cereals were consumed by 46% of infants and provided a median iron intake of 6.75 mg. Among the FF group, infant formula provided 63% of the daily iron requirement. Conclusions: Inadequate dietary iron intake is common. Iron-fortified cereal is an important dietary iron source. Future research is warranted to understand the relations among infants’ daily iron intake, iron sources (heme vs. non-heme), and iron status. Full article
(This article belongs to the Special Issue Infant and Toddler Feeding and Development)
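
A minimal sketch of the adequacy classification this abstract describes: summing iron from 24 h recall items per infant and comparing against the 11 mg/day requirement, stratified by milk-feeding type. The column names and values below are invented for illustration, not the study's data.

```python
import pandas as pd

# Hypothetical 24 h recall data: one row per food item consumed by an infant.
recall = pd.DataFrame({
    "infant_id": [1, 1, 2, 2, 3],
    "milk_type": ["BF", "BF", "FF", "FF", "MF"],  # breastmilk-only / formula-only / mixed
    "iron_mg":   [6.75, 0.3, 7.1, 4.2, 2.0],      # iron contributed by each item
})

DAILY_IRON_REQ_MG = 11  # requirement cited in the abstract

# Total daily iron per infant, then an adequacy flag
daily = recall.groupby(["infant_id", "milk_type"], as_index=False)["iron_mg"].sum()
daily["meets_requirement"] = daily["iron_mg"] >= DAILY_IRON_REQ_MG

# Proportion with inadequate intake, by milk-feeding type
print(1 - daily.groupby("milk_type")["meets_requirement"].mean())
```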

10 pages, 847 KB  
Article
RDW-to-Albumin Ratio as a Simple Biomarker for Early Mortality Risk After LVAD Implantation
by İbrahim Demir, Bilge Ecemiş, Ayşe Zorba, Selinsu Güleşce, Yahya Yıldız, İbrahim Oğuz Karaca and Korhan Erkanlı
Medicina 2026, 62(5), 853; https://doi.org/10.3390/medicina62050853 - 30 Apr 2026
Abstract
Background and Objectives: Early risk stratification remains challenging in patients undergoing left ventricular assist device (LVAD) implantation. Red cell distribution width (RDW) and serum albumin reflect systemic stress and nutritional reserve; their ratio (RDW-to-albumin ratio, RAR) may provide a simple preoperative index. We evaluated whether preoperative RAR is associated with early mortality after LVAD implantation. Materials and Methods: We conducted a retrospective cohort study of LVAD recipients (2019–2025). RAR was calculated as RDW (%) divided by albumin (g/dL) from preoperative blood tests obtained 24–48 h before surgery. The primary endpoint was in-hospital mortality. The secondary endpoint was 90-day survival. In-hospital mortality was analyzed using logistic regression with parsimonious adjustment for INTERMACS high-risk status (profiles 1–2 vs. 3–7); penalized regression was used to reduce small-sample bias. Discrimination was assessed using receiver operating characteristic (ROC) analysis. Ninety-day survival was evaluated using Cox proportional hazards models. Results: Forty-seven patients were included (37 survivors; 10 in-hospital deaths). Higher RAR was associated with increased odds of in-hospital mortality and remained significant after adjustment for INTERMACS high-risk status (OR 1.68, 95% CI 1.04–2.90). INTERMACS high-risk status was strongly associated with in-hospital mortality (OR 17.89, 95% CI 3.19–138.07). RAR demonstrated good discrimination for in-hospital mortality (AUC 0.801, 95% CI 0.648–0.955). For 90-day survival, RAR showed a borderline association in unadjusted analysis (HR 1.28, 95% CI 0.98–1.68) and was not significant after adjustment (HR 1.20, 95% CI 0.89–1.63). Conclusions: In this small single-center cohort, preoperative RAR was independently associated with in-hospital mortality after LVAD implantation. These findings should be considered hypothesis-generating and require external validation. Full article
(This article belongs to the Special Issue New Insights into Heart Failure Management and Treatment)
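
The index is a one-line computation; a minimal sketch of the RAR definition quoted above (RDW in % divided by albumin in g/dL), with invented example values:

```python
def rdw_albumin_ratio(rdw_pct: float, albumin_g_dl: float) -> float:
    """RAR as defined in the abstract: RDW (%) divided by albumin (g/dL)."""
    if albumin_g_dl <= 0:
        raise ValueError("albumin must be positive")
    return rdw_pct / albumin_g_dl

# e.g., RDW 16.2% and albumin 3.1 g/dL give RAR ~ 5.23
print(round(rdw_albumin_ratio(16.2, 3.1), 2))
```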

15 pages, 723 KB  
Review
Vitamin K Biochemistry and Pharmacokinetics: The Basis of Late Vitamin K Deficiency Intracranial Bleeding in Early Infancy
by Serafina Perrone, Virginia Beretta, Vincenzo Raitano, Liana Cerioni and Silvia Carloni
Int. J. Mol. Sci. 2026, 27(9), 4000; https://doi.org/10.3390/ijms27094000 - 29 Apr 2026
Abstract
Vitamin K is a fat-soluble vitamin essential for the activation of vitamin K-dependent proteins involved in coagulation and other physiological processes. Neonates are particularly vulnerable to vitamin K deficiency due to limited placental transfer, low hepatic stores, immature liver function, and insufficient dietary intake, especially in exclusively breastfed infants. This review summarizes the biochemistry and pharmacokinetics of vitamin K, focusing on their role in the pathogenesis of late vitamin K deficiency bleeding (VKDB), including intracranial hemorrhage in early infancy. The limitations of conventional coagulation tests are discussed, highlighting the importance of functional biomarkers such as PIVKA-II (Proteins Induced by Vitamin K Absence or Antagonist-II) for the early detection of subclinical deficiency. Despite effective prophylaxis at birth, late VKDB cases still occur, likely due to declining vitamin K levels over time and nutritional factors. These findings underscore the need for prolonged vitamin K supplementation following adequate prophylaxis at birth, particularly to protect high-risk newborns from late VKDB. Strategies may include vitamin K-containing multivitamin supplementation in preterm infants, as well as daily oral vitamin K supplementation (150 µg/day) in exclusively breastfed infants, in order to ensure adequate vitamin K status during early infancy. Full article
(This article belongs to the Special Issue Drug and Non-Drug Treatment of Cerebral Diseases)
34 pages, 847 KB  
Article
Dietary and Oral Hygiene Behaviors Associated with Prevalent Caries Status in School-Aged Children of Northern Italy
by Virginia Troiani, Edoardo Ratti, Daniel Gonnella, Maria Cristina Panzeri, Paola Palestini and Emanuela Cazzaniga
Nutrients 2026, 18(9), 1416; https://doi.org/10.3390/nu18091416 - 29 Apr 2026
Abstract
Background/Objectives: Unhealthy dietary behaviors and suboptimal oral hygiene practices remain common among Italian children, potentially affecting both nutritional and oral health. Dental caries, a preventable yet highly prevalent condition in pediatric populations, has a multifactorial etiology in which lifestyle factors play a key role. This study aimed to assess the prevalence of dental caries, dietary habits, and oral hygiene behaviors in school-aged children in Lombardy, and to identify factors associated with prevalent caries status. Methods: A cross-sectional study was conducted on 307 schoolchildren aged 9–10 years from ten schools in Northern Italy. Oral health status was evaluated through the plaque index and the DMFT/dmft index during school-based dental examinations. Dietary habits, lifestyle, and oral hygiene practices were collected through structured questionnaires. A mixed-effects logistic regression model was developed to explore potential associations between variables and prevalent caries status. Results: Dietary patterns, weight status, oral hygiene behaviors, and oral health conditions were generally consistent with national data. Higher plaque index, skipping breakfast, consuming mid-morning snacks, and parental reports of previous caries experiences were retained in the final model. Internal validation suggested reasonable discriminatory ability overall, whereas calibration showed heterogeneity across schools. Conclusions: The findings highlight suboptimal dietary and oral hygiene behaviors among Lombardy schoolchildren and confirm their association with dental caries. Lifestyle-related factors, particularly oral hygiene practices and eating patterns, showed a relevant association with prevalent caries status in the analyzed sample. These results underscore the need for targeted preventive strategies integrating nutritional education and oral health promotion in pediatric populations. Full article
13 pages, 520 KB  
Article
The Hereditary Angioedema Frailty and Inflammation Score (HAE-FIS): A Preliminary Study on the Inter-Attack Chronic Burden
by Ayşe Melike Gerek, Fatih Çölkesen and Şevket Arslan
J. Clin. Med. 2026, 15(9), 3417; https://doi.org/10.3390/jcm15093417 - 29 Apr 2026
Abstract
Background/Objectives: Current management of Hereditary Angioedema (HAE) predominantly focuses on acute attack control and prophylaxis. However, the cumulative “inter-attack” burden driven by chronic low-grade inflammation and its impact on physical frailty remain under-investigated. This preliminary, proof-of-concept study proposes a novel composite score, the “Hereditary Angioedema Frailty and Inflammation Score” (HAE-FIS), to quantify this inter-attack frailty-like inflammatory burden. Methods: In this single-center retrospective study, 46 patients with C1-inhibitor deficiency were evaluated. The HAE-FIS (range 0–5) was constructed using five routinely available biomarkers reflecting inflammation and nutritional status: C-reactive protein (CRP), albumin, hemoglobin, body mass index (BMI), and inter-attack symptom burden. Patients were categorized based on annual attack frequency (Infrequent ≤ 6 vs. Frequent > 6 attacks/year). Results: The median diagnostic delay was 12.0 years. Patients with frequent attacks had significantly higher median HAE-FIS scores compared to the infrequent group (p = 0.008). Component analysis revealed that ‘High CRP’ (46.4% vs. 16.7%; p = 0.039) and ‘Low BMI’ (25.0% vs. 0.0%; p = 0.032) were significantly more prevalent in patients with frequent attacks. Notably, low BMI was observed exclusively in the frequent attack group, suggesting a specific phenotype at risk for sarcopenia. While multivariable logistic regression was constrained by the sample size, the annual attack count emerged as the strongest predictor for a high HAE-FIS score (OR: 1.049; p = 0.070). Conclusions: High attack frequency in HAE is associated with a measurable cumulative systemic burden characterized by inflammation and nutritional risk. These findings support the development of an “HAE Rehabilitation” framework integrating functional preservation into long-term management. Full article
(This article belongs to the Section Immunology & Rheumatology)
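
A minimal sketch of the composite score as described (one point per abnormal component, range 0–5). The abstract names the five components but not their numeric cut-offs, so the boolean inputs below stand in for thresholds that would come from the paper's methods.

```python
def hae_fis(crp_high: bool, albumin_low: bool, hemoglobin_low: bool,
            bmi_low: bool, symptom_burden_high: bool) -> int:
    """HAE-FIS: one point per abnormal component, total range 0-5."""
    return sum([crp_high, albumin_low, hemoglobin_low, bmi_low, symptom_burden_high])

# A patient with high CRP, low BMI, and high inter-attack symptom burden scores 3
print(hae_fis(True, False, False, True, True))
```
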
27 pages, 3988 KB  
Review
Sustainable Insect-Based Diets in Sub-Saharan Africa: A Review of Prevalence, Acceptability and Impact on Nutritional Status
by Maria Rouco, Charity Chinonso Ugwu, Gabriel Reina and Silvia Carlos
Nutrients 2026, 18(9), 1414; https://doi.org/10.3390/nu18091414 - 29 Apr 2026
Abstract
Malnutrition—including undernutrition, micronutrient deficiencies and overweight—remains a major public health concern in sub-Saharan Africa, largely driven by food insecurity. Edible insects have been proposed as a sustainable, nutrient-dense dietary alternative with potential to improve food security and nutritional outcomes. This review analyses studies published until January 2024 in PubMed and Google Scholar assessing the prevalence, acceptability and nutritional impact of insect-based diets in sub-Saharan Africa. Thirteen original studies, predominantly qualitative, conducted in 8 of 47 countries in the region, met inclusion criteria. Two reviews provided additional evidence. Most studies focused on acceptability, which was strongly influenced by cultural and religious norms. Higher acceptance was observed among older individuals and those with lower educational attainment, while younger and more urbanized populations showed greater reluctance. Reported motivations for consumption included tradition, taste and perceived nutritional value. Some studies highlighted potential health risks related to food safety and the need for improved regulatory frameworks. The available nutritional analyses showed that edible insects are rich in protein and essential micronutrients, particularly iron and zinc, suggesting their potential to address common deficiencies. Although evidence on long-term nutritional impact remains limited, current findings support the feasibility and potential public health relevance of promoting insect-based diets in low-income settings. Full article

16 pages, 317 KB  
Article
Optimizing Public Health Screening: Population-Specific BMI Thresholds for Targeted Body Composition Assessment in Hungary
by Tamas Jarecsny, Nadim Al-Muhanna, Dora Rebeka Fabian, Roland Kosik, Richard Schwab, Gergo Jozsef Szollosi, Laszlo Schandl, Gyula Tomasics, Eszter Melinda Pazmandi, Andras Folyovich, Ferenc Fazekas and Monika Fekete
Nutrients 2026, 18(9), 1410; https://doi.org/10.3390/nu18091410 - 29 Apr 2026
Abstract
Background: Body mass index (BMI) is widely used as a proxy of nutritional status and related lifestyle risk patterns in public health, yet it does not capture body composition–related heterogeneity in cardiometabolic risk. Evidence on whether a more detailed body composition assessment improves population-level screening efficiency remains inconsistent, particularly in Central European populations. Methods: We conducted a cross-sectional analysis of 868 Hungarian adults participating in a nationwide mobile screening program. Locally weighted regression identified sex-specific BMI inflection points for cardiometabolic risk. Stratified receiver operating characteristic (ROC) analyses compared BMI with bioelectrical impedance-derived parameters across five outcomes. Cost- and time-effectiveness of scalable screening strategies were modeled at the population level. Results: Cardiometabolic risk increased at BMI levels below current WHO thresholds (females: 21.8–22.3 kg/m2; males: 23.8–24.3 kg/m2). Overall, body composition parameters did not outperform BMI in the full population. Subgroup-specific differences were observed, particularly among men with BMI 24–36 kg/m2 for atherosclerosis risk, suggesting limited and outcome-specific added value rather than broad superiority over BMI. Together, non-linear risk patterns, stratified performance, and population-level modeling converged on mid-range BMI intervals (females: 22–30 kg/m2; males: 24–30 kg/m2) as likely screening windows of phenotypic heterogeneity. Within these ranges, targeted InBody assessment may help refine risk assessment for selected individuals. A mixed screening strategy covering 52% of the population would cost 178.4% of BMI-only screening, while reducing throughput by 24.3%. Conclusions: Population-specific BMI thresholds may more accurately reflect early deviations in nutritional and cardiometabolic risk than current universal cutoffs. BMI remains a useful first-line marker, and body composition assessment may add complementary information in selected BMI ranges. Overall, these findings support a potentially useful, subgroup-specific screening approach, but the modeled cost and time trade-offs should be considered hypothesis-generating and require further validation. Full article
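
A minimal sketch of the triage step implied by the mixed screening strategy, using the mid-range BMI windows reported above (females 22–30 kg/m2, males 24–30 kg/m2); the function name and simple two-branch form are illustrative, not the authors' code.

```python
def in_targeted_assessment_window(bmi: float, sex: str) -> bool:
    """True if BMI falls in the window where follow-up body composition
    assessment may add value (per the abstract); sex is "F" or "M"."""
    low = 22.0 if sex == "F" else 24.0
    return low <= bmi <= 30.0

print(in_targeted_assessment_window(25.4, "M"))  # True -> candidate for InBody assessment
```
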
21 pages, 580 KB  
Article
Maternal Diet, Lifestyle Factors, and Gestational Weight Gain: A Single-Center Case–Control Study in Hungary
by Edit Paulik, Anita Sisák, Anna Szolnoki, Evelin Olteán-Polanek, Márió Gajdács, Regina Molnár, Andrea Szabó, Gábor Németh and Hajnalka Orvos
Nutrients 2026, 18(9), 1403; https://doi.org/10.3390/nu18091403 - 29 Apr 2026
Abstract
Background/Objectives: Preterm birth (PTB) is a major public health concern worldwide, which may lead to detrimental maternal and neonatal outcomes. Maternal nutritional status, gestational weight gain (GWG), and lifestyle factors are potentially modifiable determinants of adverse pregnancy outcomes. This study examined the association between PTB and maternal GWG and assessed whether maternal dietary habits and lifestyle factors were related to GWG in women delivering preterm versus at term. Methods: A retrospective case–control study was conducted at a tertiary center in Hungary (MANOR Study, 2019). The case group included n = 100 women with PTB, while n = 200 matched term deliveries served as controls (1:2 ratio). Data were collected using a self-administered questionnaire and medical records. Pre-pregnancy body mass index (BMI) was categorized using standard definitions, while GWG was classified as inadequate, recommended, or excessive according to the US 2009 Institute of Medicine guidelines. A 7-item dietary index score was calculated based on gestational dietary habits. Results: Pre-pregnancy BMI distribution did not considerably differ between groups (p > 0.05); over one-third of women in both groups were overweight or had obesity (38.7% vs. 36.7%). Previous PTB (p < 0.001) and gestational hypertension (GHT) (p = 0.003) were more common among current PTB cases, while smoking, alcohol consumption, and gestational diabetes mellitus (GDM) showed negligible differences (p > 0.05). Overall, 28.0% of cases and 34.5% of controls were classified as having healthy dietary habits based on the calculated dietary index score. Inadequate GWG was more prevalent among PTB cases (49.0% vs. 26.8%), whereas excessive GWG was less frequent among cases (21.9% vs. 38.4%). Being within the recommended GWG range and the manifestation of gestational hypertension were associated with lower (aOR: 0.39; 95% CI: 0.18–0.87; p = 0.020) and higher (aOR: 3.43; 95% CI: 1.44–8.19; p = 0.005) odds of PTB, respectively. Conclusions: Inadequate GWG was more common in PTB, while excessive GWG was more frequent in term pregnancies. Fast-food consumption was associated with excessive GWG among term births. Optimizing GWG and improving maternal diet quality should be included as key, cross-cutting interventions targeting the improvement of antenatal care. Full article
(This article belongs to the Special Issue Effects of Nutrition and BMI on Obstetric–Gynecological Pathologies)
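
A minimal sketch of the GWG classification the study applied, assuming the published 2009 IOM recommended total-gain ranges by pre-pregnancy BMI category (12.5–18, 11.5–16, 7–11.5, and 5–9 kg for underweight, normal weight, overweight, and obesity, respectively); verify these values against the guideline itself.

```python
def classify_gwg(prepreg_bmi: float, total_gain_kg: float) -> str:
    """Inadequate / recommended / excessive GWG per the 2009 IOM ranges."""
    if prepreg_bmi < 18.5:
        lo, hi = 12.5, 18.0   # underweight
    elif prepreg_bmi < 25.0:
        lo, hi = 11.5, 16.0   # normal weight
    elif prepreg_bmi < 30.0:
        lo, hi = 7.0, 11.5    # overweight
    else:
        lo, hi = 5.0, 9.0     # obesity
    if total_gain_kg < lo:
        return "inadequate"
    return "recommended" if total_gain_kg <= hi else "excessive"

print(classify_gwg(23.0, 10.0))  # normal-weight BMI, 10 kg gain -> "inadequate"
```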

26 pages, 1654 KB  
Article
Effectiveness of a Comprehensive Program Including a Novel Concentrated High-Protein, High-Calorie Oral Nutritional Supplement to Enhance Nutritional and Morphofunctional Recovery in Malnourished Patients with Cancer: The ONAVIDA Study
by José Manuel García-Almeida, Rocío Fernández-Jiménez, Ana Hernández-Moreno, Gabriel Olveira, Mercedes Vázquez-Gutiérrez, Carolina Dassen, Pedro Pablo García-Luna, Amalia González-Jiménez, Josefina Olivares, María García-Duque, Mª José Martínez-Ramírez, Juan Manuel Guardia-Baena, María I. Rebollo-Pérez, Miguel Civera, Visitación Álvarez-de Frutos, Vicente Faus, Lucía Díaz-Naya, José Joaquín Alfaro-Martínez and Alejandro Sanz-París
Nutrients 2026, 18(9), 1398; https://doi.org/10.3390/nu18091398 - 29 Apr 2026
Abstract
Background/Objectives: Malnutrition in cancer adversely affects treatment outcomes and survival. Early intervention through oral nutritional supplements (ONSs) and dietary counseling can improve outcomes. This study evaluated the evolution of nutritional and morphofunctional parameters over three months in malnourished patients with cancer undergoing a comprehensive nutritional support program comprising dietary counseling, physical activity, and a novel concentrated high-protein, high-calorie ONS (cHPHC-ONS) with a high intrinsic leucine content. Methods: A prospective, observational, multicenter cohort study was conducted across 18 public hospitals in Spain. Two hundred thirty malnourished patients with cancer were enrolled: 147 naïve (no ONS treatment in the last three months) and 83 non-naïve (who transitioned to cHPHC-ONS after inadequate response to initial ONSs). Nutritional status was assessed using Global Leadership Initiative on Malnutrition (GLIM) criteria and morphofunctional parameters via bioelectrical impedance analysis, nutritional ultrasound, handgrip strength, the Timed Up and Go (TUG) test, and analysis of biochemical parameters. Results: After three months, 23.8% achieved normal GLIM nutritional status (p < 0.0001), with a greater improvement seen in non-naïve patients (28.4%, p < 0.0001). Weight loss ceased in 42.6% (p < 0.0001), and inflammation resolved in 10.3% (p = 0.0015). Non-naïve patients experienced a significant increase in fat-free mass index (p = 0.0159), appendicular skeletal muscle index (p = 0.0248), and rectus femoris cross-sectional area (p = 0.0016). Muscle strength increased significantly by +1.7 kg (p = 0.0025), and TUG test time decreased by 1.13 s (p = 0.0003) overall. Conclusions: The comprehensive nutritional support program—including a novel cHPHC-ONS, along with dietary and physical activity guidance—significantly improved the nutritional and morphofunctional status of malnourished patients with cancer, with benefits particularly evident in non-naïve individuals. Limitations: Observational design, no control group, short follow-up, and unadjusted non-multivariable comparisons, limiting causal inference. Full article
(This article belongs to the Section Clinical Nutrition)

14 pages, 780 KB  
Article
Early Body Mass Index Trajectory as a Marker of Metabolic and Nutritional Changes in Critically Ill Patients
by Ah Young Leem, Shihwan Chang, Chanho Lee, Mindong Sung, Hye Young Hong, Geun In Lee, Youngmok Park, Seung Hyun Yong, Ala Woo, Sang Hoon Lee, Song Yee Kim, Kyung Soo Chung, Eun Young Kim, Ji Ye Jung, Young Ae Kang, Moo Suk Park, Young Sam Kim and Su Hwan Lee
Nutrients 2026, 18(9), 1396; https://doi.org/10.3390/nu18091396 - 29 Apr 2026
Abstract
Background: Body mass index (BMI) is a common nutritional marker, but admission-only measurements present limitations. Early dynamic BMI changes may better reflect metabolic stress and fluid balance. However, the clinical significance of early BMI trajectory during critical illness remains poorly understood. This study evaluated the impact of early BMI trajectory on mortality and ventilator weaning in critically ill patients. Methods: This retrospective cohort study included 1355 adult patients (ICU stay ≥ 7 days) admitted to the medical ICU between 2019 and 2025. BMI trajectory was defined as the percentage change from admission to day 7 and was categorized into three groups: decrease (>5% reduction), stable (±5%), and increase (>5% gain). Multivariable Cox proportional hazard and logistic regression analyses were performed to evaluate the association between BMI trajectory and clinical outcomes. Results: Of the 1355 patients, 15.9%, 57.7%, and 26.4% were in the decrease, stable, and increase groups, respectively. The increase group demonstrated significantly higher hospital mortality (52.5%) than the decrease (41.9%) and stable (40.0%) groups (p = 0.001). Multivariable analysis revealed that an increasing BMI trajectory was independently associated with higher hospital mortality (HR 1.25, 95% CI 1.05–1.48). A decreasing BMI trajectory strongly predicted successful ventilator weaning (OR 2.76, 95% CI 1.81–4.21). Conclusions: Early BMI trajectory significantly predicted ICU outcomes. Increasing and decreasing BMI were associated with higher mortality and improved ventilator weaning, respectively. These findings suggest that BMI trajectory may be a simple surrogate marker of metabolic stress, nutritional status, and fluid balance during early critical illness. Full article
(This article belongs to the Special Issue Nutritional Support for Critically Ill Patients)
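
A minimal sketch of the three-group trajectory definition quoted above (>5% gain = increase, >5% loss = decrease, otherwise stable):

```python
def bmi_trajectory(bmi_admission: float, bmi_day7: float) -> str:
    """Percentage change from admission to ICU day 7, categorized as in the abstract."""
    pct_change = 100.0 * (bmi_day7 - bmi_admission) / bmi_admission
    if pct_change > 5.0:
        return "increase"
    if pct_change < -5.0:
        return "decrease"
    return "stable"

print(bmi_trajectory(24.0, 25.5))  # +6.25% -> "increase"
```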

24 pages, 941 KB  
Review
Artificial Intelligence-Guided Artificial Nutrition in Critical Illness: Integrating Indirect Calorimetry and BIVA for Metabolic Precision
by Marialaura Scarcella, Antonella Cotoia, Luigi Vetrugno, Emidio Scarpellini, Gian Marco Petroni, Cristian Deana, Rachele Simonte, Riccardo Monti, Rita Commissari, Edoardo De Robertis and Elena Bignami
Nutrients 2026, 18(9), 1387; https://doi.org/10.3390/nu18091387 - 28 Apr 2026
Abstract
Background: Critical illness is characterized by profound and rapidly evolving metabolic derangements driven by systemic inflammation, hypercatabolism, fluid shifts, and endocrine dysregulation. These dynamic changes markedly limit the accuracy of predictive equations, increasing the risk of both underfeeding and overfeeding. Indirect calorimetry (IC) represents the gold standard for measuring energy expenditure, while bioelectrical impedance vector analysis (BIVA) provides complementary insights into hydration status, cellular integrity, and body cell mass. In palliative care, AI-supported integration of indirect calorimetry and BIVA enables goal-concordant artificial nutrition by aligning energy delivery with real-time metabolic status while minimizing symptom burden. Artificial intelligence (AI) has emerged as a promising tool to integrate these heterogeneous data streams and support adaptive nutritional strategies. Methods: We conducted a structured narrative review of the literature published between 2000 and 2025 using PubMed, Scopus, Embase, and Web of Science. Artificial intelligence was not used to perform the literature search or study selection. Instead, AI was analyzed as a clinical and technological component within the included studies and explored as a future enabling strategy. Eligible publications involved adult critically ill patients and addressed indirect calorimetry, BIVA-derived parameters, or AI-based metabolic modeling applied to nutritional support. Given the heterogeneity of study designs and outcomes, findings were synthesized qualitatively. Results: Predictive equations showed substantial inaccuracy in unstable metabolic states, with errors frequently exceeding ±20–40%. Indirect calorimetry enabled individualized assessment of energy expenditure but remained limited by intermittent availability. Serial BIVA assessments consistently identified clinically relevant alterations in hydration status, body cell mass, and phase angle, the latter being strongly associated with adverse outcomes. Studies incorporating AI demonstrated improved integration of calorimetry, BIVA, and clinical variables, allowing identification of metabolic phenotypes, anticipation of metabolic shifts, and generation of adaptive nutritional recommendations. Conclusions: This narrative review highlights the complementary roles of IC and BIVA in characterizing metabolic needs in critical illness. Artificial intelligence does not replace these tools but enhances their clinical utility by integrating multidimensional data into dynamic, patient-specific nutritional strategies. The combined AI–IC–BIVA approach represents a promising framework for metabolic precision nutrition in the ICU, warranting prospective validation. Full article
(This article belongs to the Special Issue Nutritional Support for Critically Ill Patients)
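
To make the ±20–40% error claim concrete, a toy comparison of a predictive equation against a hypothetical indirect-calorimetry measurement; Mifflin–St Jeor is used here only as a representative formula (the review does not single it out), and the numbers are invented.

```python
def mifflin_st_jeor(weight_kg: float, height_cm: float, age_y: float, male: bool) -> float:
    """Mifflin-St Jeor resting energy expenditure estimate (kcal/day)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_y
    return base + 5 if male else base - 161

def pct_error(predicted: float, measured: float) -> float:
    return 100.0 * (predicted - measured) / measured

pred = mifflin_st_jeor(80, 175, 60, male=True)       # ~1599 kcal/day
print(round(pred), round(pct_error(pred, 2100), 1))  # about -23.9% vs. measured EE
```
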
17 pages, 813 KB  
Article
Pretreatment Lactate Dehydrogenase-to-Albumin Ratio and Clinical Outcomes in Extensive-Stage Small Cell Lung Cancer: A Multicenter Real-World Study
by Ahmet Unlu, Asim Armagan Aydin, Esra Sazimet Kars, Ozden Ozturk, Mehmet Acun, Mehmet Nuri Baser, Mahmut Kara, Sati Sena Coraoglu, Nurbanu Inci, Muhammet Ali Kaplan, Bilgin Demir, Senar Ebinc, Okan Avci, Hacer Boztepe Yesilcay, Banu Ozturk and Mustafa Yildiz
J. Clin. Med. 2026, 15(9), 3353; https://doi.org/10.3390/jcm15093353 - 28 Apr 2026
Abstract
Background: Reliable biomarkers that capture tumor–host interactions and predict treatment resistance in extensive-stage small cell lung cancer (SCLC) remain limited. We evaluated the prognostic and predictive value of the pretreatment lactate dehydrogenase-to-albumin ratio (LAR), an integrative biomarker reflecting metabolic activity, systemic inflammation, and host nutritional status. Methods: This multicenter, retrospective cohort study included patients with extensive-stage SCLC treated at five tertiary centers between 2016 and 2024. Pretreatment LAR was calculated using baseline serum lactate dehydrogenase and albumin levels and dichotomized using a Youden index-derived cut-off at the 12-month overall survival (OS) horizon. Time-dependent receiver operating characteristic (ROC) analyses using inverse probability weighting were performed to assess discriminative performance. Survival outcomes were evaluated using Kaplan–Meier estimates and Cox proportional hazards models. Associations with platinum resistance and lack of objective treatment benefit (defined as progressive disease as best response) were examined using logistic regression models. Results: A total of 223 patients were included. Elevated LAR was associated with inferior OS (median, 15.8 vs. 25.2 months; log-rank p < 0.001) and progression-free survival (7.9 vs. 11.5 months; p < 0.001). In multivariable analysis, LAR remained independently associated with OS (HR, 1.43; 95% CI, 1.04–1.95; p = 0.028). LAR demonstrated modest but consistently superior discriminative performance compared with other inflammatory indices for both 12-month OS (area under the curve [AUC], 0.692) and 6-month progression-free survival (PFS) (AUC, 0.646), with statistically significant differences in DeLong comparisons. Higher LAR was independently associated with increased odds of platinum resistance (adjusted odds ratio [aOR], 2.31; 95% CI, 1.41–3.81; p = 0.001) and lack of objective treatment benefit (adjusted OR, 2.04; 95% CI, 1.33–3.14; p = 0.001). Conclusions: Pretreatment LAR is a clinically accessible and biologically integrative biomarker associated with survival and treatment resistance in extensive-stage SCLC. By capturing tumor–host interactions, LAR may support risk stratification and identify patients at increased risk of early treatment failure. Prospective validation is warranted to define its role in biomarker-driven clinical decision-making. Full article
(This article belongs to the Section Oncology)
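
A minimal sketch of the dichotomization step described above: compute LAR as LDH divided by albumin, then take the Youden-index cut-off from a plain ROC curve. This ignores the time-dependent, inverse-probability-weighted machinery the study actually used, and the data are synthetic.

```python
import numpy as np
from sklearn.metrics import roc_curve

def youden_cutoff(y_true: np.ndarray, marker: np.ndarray) -> float:
    """Cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    fpr, tpr, thresholds = roc_curve(y_true, marker)
    return float(thresholds[np.argmax(tpr - fpr)])

rng = np.random.default_rng(0)
ldh = rng.normal(300, 80, 200)        # synthetic LDH, U/L
albumin = rng.normal(3.8, 0.5, 200)   # synthetic albumin, g/dL
lar = ldh / albumin
outcome = (lar + rng.normal(0, 30, 200) > 90).astype(int)  # toy 12-month mortality
print(youden_cutoff(outcome, lar))
```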

19 pages, 764 KB  
Article
Nutritional Status, Body Composition, and Frailty in Community-Dwelling and Institutionalized Albanian Older Adults: A Cross-Sectional Study
by Sadmira Gjergji, Stefania Moramarco, Angela Andreoli, Fabian Cenko, Ersilia Buonomo, Alketa Bicja and Leonardo Palombi
Nutrients 2026, 18(9), 1379; https://doi.org/10.3390/nu18091379 - 28 Apr 2026
Abstract
Background: Albania has undergone a rapid demographic transition characterized by pronounced population aging. Comprehensive geriatric assessment—functional performance, validated nutritional screening tools, and systematic evaluation of morbidities—is essential for accurately characterizing frailty and identifying the risk of malnutrition in its early stages. The objective of the present study was to improve the assessment of the health status of Albanian older adults, both community-dwelling and residing in long-term care facilities, by addressing both functional and nutritional components. Methods: This observational study included Albanian older adults aged ≥ 65 years, both institutionalized and community-dwelling. Frailty and nutritional status were assessed using validated questionnaires (Grauer Geriatric Functional Evaluation and Mini Nutritional Assessment—MNA), alongside body composition analysis performed by bioelectrical impedance analysis (BIA). Results: Data for 123 older adults were analyzed (56.9% female; mean age 71.3 ± 7.4 years; 54.5% institutionalized vs. 45.5% community-dwelling). A high prevalence of frailty and multimorbidity was observed, particularly among institutionalized older adults. With regard to nutritional status, marked age-related differences were identified among females, with a pronounced deterioration in those aged over 75 years. Body-composition-derived parameters identified a substantially higher proportion of individuals at risk of malnutrition compared with other conventional anthropometric measures. Low body cell mass index (BCMI) and institutionalization were the factors with the strongest independent associations with frailty (AOR 5.02, 95% CI 1.69–14.87, p = 0.004, and AOR 5.71, 95% CI 1.76–18.54, p = 0.004, respectively), while low BCMI was the only variable associated with an increased risk of malnutrition (AOR 4.88, 95% CI 1.78–13.40, p = 0.002). Conclusions: These exploratory findings suggest that incorporating body composition parameters into geriatric assessment may provide complementary information alongside traditional screening tools to support the development of targeted preventive and therapeutic strategies. Full article
(This article belongs to the Section Nutrition and Public Health)
