Search Results (5,089)

Search Parameters:
Keywords = multi-component model

16 pages, 928 KB  
Article
Integrated Multi-Scale Risk Assessment of Reservoir Bank Collapse: A Case Study of Xiluodu Reservoir, China
by Xiaodong Wang, Zihan Wang, Hongjian Liu and Yunchang Liang
Appl. Sci. 2026, 16(3), 1304; https://doi.org/10.3390/app16031304 - 27 Jan 2026
Abstract
Reservoir bank collapse is a critical geological hazard during the operation of large-scale water conservancy projects, controlled by unique hydrodynamic mechanisms induced by reservoir impoundment, and differs significantly from ordinary landslides. Traditional risk assessment methods, however, often struggle to achieve effective integration between macro-regional zoning and micro-mechanical analysis. To address this limitation, this study proposes a GIS-integrated multi-scale risk screening framework to achieve the preliminary integration of qualitative regional evaluation and quantitative site-specific analysis. Compared with traditional multi-scale studies, the innovations of this research are as follows: (1) a customized GIS component was developed to realize semi-automatic profile extraction from high-resolution DEMs and batch Bishop stability calculations, overcoming the bottleneck of spatializing micro-models over large areas; (2) a “bottom-up” dynamic feedback mechanism was established, utilizing the quantitative safety factor from site-specific evaluations as an explicit indicator for the conservative screening correction of the macro-regional risk map. Applied to the Xiluodu Reservoir, this framework illustrates a potential approach for cross-scale risk screening driven by physical–mechanical mechanisms. It provides both a global perspective and a localized physical basis, offering a strategic screening tool for reservoir management. By linking failure mechanisms directly to spatial impacts, the framework provides a plausible conservative feedback rule for risk-informed decision-making in complex reservoir settings.
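The batch stability calculations in innovation (1) rest on the simplified Bishop method, which iterates on the factor of safety F for each slope profile extracted from the DEM. A minimal sketch under standard assumptions (pore pressure omitted, slice data hypothetical), not the authors' GIS component:

```python
import math

def bishop_safety_factor(slices, c, phi_deg, tol=1e-6, max_iter=100):
    """Simplified Bishop method: fixed-point iteration on the factor of safety F.

    slices: list of (slice weight W, base inclination alpha in degrees, base width b)
    c: effective cohesion (kPa); phi_deg: effective friction angle (degrees).
    Pore pressure is omitted for brevity.
    """
    tan_phi = math.tan(math.radians(phi_deg))
    F = 1.0  # initial guess
    for _ in range(max_iter):
        num, den = 0.0, 0.0
        for W, alpha_deg, b in slices:
            alpha = math.radians(alpha_deg)
            # m_alpha = cos(alpha) * (1 + tan(alpha) * tan(phi) / F)
            m = math.cos(alpha) * (1.0 + math.tan(alpha) * tan_phi / F)
            num += (c * b + W * tan_phi) / m
            den += W * math.sin(alpha)
        F_new = num / den
        if abs(F_new - F) < tol:
            return F_new
        F = F_new
    return F

# One profile extracted from a DEM, discretized into slices (hypothetical numbers).
profile = [(120.0, 35.0, 2.0), (180.0, 25.0, 2.0), (150.0, 12.0, 2.0)]
print(bishop_safety_factor(profile, c=15.0, phi_deg=28.0))
```

In the framework described above, a safety factor like this would be computed in batch for many profiles and fed back as the conservative screening correction to the regional risk map.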
14 pages, 36585 KB  
Article
Integrated Multi-Omics and Spatial Transcriptomics Identify FBLL1 as a Malignant Transformation Driver in Hepatocellular Carcinoma
by Junye Xie, Shujun Guo, Yujie Xiao, Yibo Zhang, An Hong and Xiaojia Chen
Cells 2026, 15(3), 246; https://doi.org/10.3390/cells15030246 - 27 Jan 2026
Abstract
Background: Hepatocellular carcinoma (HCC) is characterized by marked intratumoral heterogeneity and poor clinical outcomes. Dysregulated ribosome biogenesis has emerged as a fundamental hallmark of tumor initiation and progression; however, the specific molecular drivers linking this machinery to HCC pathogenesis remain largely undefined. Methods: By integrating multi-omics data from the TCGA and ICGC cohorts, FBLL1 was identified as a key prognostic candidate gene. Its cellular and spatial distribution was analyzed using single-cell RNA sequencing and spatial transcriptomics. Its biological functions in vitro and in vivo were validated through functional experiments, including lentivirus-mediated ectopic expression and siRNA-mediated gene knockdown. Finally, its molecular mechanism was elucidated through transcriptomic analysis and Western blotting. Results: FBLL1 was significantly upregulated in HCC and correlated with poor patient survival. Spatial and single-cell analyses showed that FBLL1 expression was preferentially enriched in malignant hepatocytes within the tumor region. Functionally, knockdown of FBLL1 inhibited the proliferation and clonogenic capacity of HCC cells, while overexpression of FBLL1 in non-tumorigenic hepatocytes promoted the tumorigenic phenotype in xenograft models. Transcriptomic analysis indicated that FBLL1 overexpression was associated with the synergistic upregulation of c-Myc and multiple EGFR ligands, as well as decreased expression of hepatocyte functional markers. Consistently, modulation of FBLL1 expression affected the activity of the EGFR–MAPK signaling pathway. Conclusions: Our study identifies FBLL1 as a previously unrecognized regulator associated with malignant state transition in HCC. Rather than acting as a direct regulator of core signaling components, FBLL1 is associated with ligand-dependent activation of the EGFR–MAPK pathway in conjunction with c-Myc upregulation. These findings indicate that FBLL1 represents a promising therapeutic target for disrupting oncogenic signaling programs in liver cancer.
(This article belongs to the Special Issue How Does Gene Regulation Affect Cancer Development?)
24 pages, 47010 KB  
Article
Real-Time Multi-Step Prediction Method of TBM Cutterhead Torque Based on Fusion Signal Decomposition Mechanism and Physical Constraints
by Junnan Feng, Yuzhe Hou, Youqian Liu, Shijia Chen and Ying You
Appl. Sci. 2026, 16(3), 1285; https://doi.org/10.3390/app16031285 - 27 Jan 2026
Abstract
The cutterhead torque of a full-face tunnel boring machine (TBM) is a pivotal parameter that characterises the rock–machine interaction. Its dynamic prediction is of considerable significance for achieving intelligent regulation of the boring parameters and enhancing construction efficiency and safety. To achieve high-precision time-series prediction of cutterhead torque under complex geological conditions, this study proposes an intelligent prediction method (VBGAP) that integrates a signal decomposition mechanism with physical constraints. At the data preprocessing level, a multi-step data cleaning process is designed, comprising the processing of invalid values, the detection of outliers, and normalisation. The non-smooth torque time-series signal is decomposed by variational mode decomposition (VMD) into narrow-band sub-signals that serve as data-driven, frequency-specific inputs for subsequent modelling, and a hybrid deep learning model based on Bi-GRU and a self-attention mechanism is built for each sub-signal. Finally, the prediction results of each component are linearly superimposed to achieve signal reconstruction. Concurrently, a novel modal energy conservation loss function is proposed, with the objective of effectively constraining the information entropy decay in the decomposition–reconstruction process. The validity of the proposed method is supported by empirical evidence from a real tunnel project dataset in Northeast China, which demonstrates an average accuracy of over 90% in a multi-step prediction task with a time step of 30 s. This suggests that the proposed method exhibits superior adaptability and prediction accuracy in comparison to existing mainstream deep learning models. The findings provide novel concepts and methodologies for the intelligent regulation of TBM boring parameters.
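The decompose-predict-superimpose pipeline can be outlined compactly. A hedged sketch using the third-party vmdpy package (assumed available), with a naive persistence forecast standing in for the paper's per-mode Bi-GRU + self-attention model:

```python
import numpy as np
from vmdpy import VMD  # third-party VMD implementation; assumed available

# Hypothetical torque series sampled once per second (synthetic stand-in data).
torque = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.3 * np.random.randn(2000)

# Decompose into K narrow-band modes, as in the paper's preprocessing stage.
K = 4
u, u_hat, omega = VMD(torque, alpha=2000, tau=0.0, K=K, DC=0, init=1, tol=1e-7)

def predict_mode(mode, horizon=30):
    """Placeholder for the per-mode Bi-GRU + self-attention predictor:
    a naive persistence forecast stands in for the learned model here."""
    return np.repeat(mode[-1], horizon)

# Forecast each mode separately, then linearly superimpose (signal reconstruction).
horizon = 30
forecast = sum(predict_mode(u[k], horizon) for k in range(K))
print(forecast.shape)  # (30,)
```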
21 pages, 1967 KB  
Article
Unified Promptable Panoptic Mapping with Dynamic Labeling Using Foundation Models
by Mohamad Al Mdfaa, Raghad Salameh, Geesara Kulathunga, Sergey Zagoruyko and Gonzalo Ferrer
Robotics 2026, 15(2), 31; https://doi.org/10.3390/robotics15020031 - 27 Jan 2026
Abstract
Panoptic maps enable robots to reason about both geometry and semantics. However, open-vocabulary models repeatedly produce closely related labels that split panoptic entities and degrade volumetric consistency. The proposed UPPM advances open-world scene understanding by leveraging foundation models to introduce a panoptic Dynamic Descriptor that reconciles open-vocabulary labels with a unified category structure and geometric size priors. Fusion of these dynamic descriptors is performed within a multi-resolution multi-TSDF map using language-guided open-vocabulary panoptic segmentation and semantic retrieval, resulting in a persistent and promptable panoptic map without additional model training. In our evaluation experiments, UPPM shows the best overall performance in terms of map reconstruction accuracy and panoptic segmentation quality. The ablation study investigates the contribution of each component of UPPM (custom NMS, blurry-frame filtering, and unified semantics) to the overall system performance. Consequently, UPPM preserves open-vocabulary interpretability while delivering strong geometric and panoptic accuracy.
(This article belongs to the Section AI in Robotics)
18 pages, 4213 KB  
Article
Accelerated Optimization of Superalloys by Integrating Thermodynamic Calculation Data with Machine Learning Models: A Reference Alloy Approach
by Yubing Pei, Zhenhuan Gao, Junjie Wu, Liping Nie, Song Lu, Jiaxin Tan, Ziyun Wu, Longfei Li and Xiufang Gong
Metals 2026, 16(2), 154; https://doi.org/10.3390/met16020154 - 27 Jan 2026
Abstract
The multi-objective optimization of multicomponent superalloys has long been impeded not only by the complex interactions among multiple elements but also by the low efficiency and high cost of traditional trial-and-error methods. To address this issue, this study proposed a thermodynamic-calculation-data-driven optimization framework that integrates machine learning (ML) and multi-objective screening based on domain knowledge. The core of this methodology involves introducing a commercial reference alloy and rapidly generating a large-scale thermodynamic dataset through ML models. After training, the ML models were verified to be more efficient at predicting phase transition temperatures and γ′ volume fractions than the CALPHAD methods. Focusing on mechanical properties, critical strength indices, including solid solution strengthening, precipitation strengthening, and creep resistance based on the calculated γ/γ′ two-phase compositions, were compared with the reference alloy and set as the critical screening criteria. Optimal alloys were selected from the 388,000 candidates. Compared with the reference alloy, two new alloys were experimentally verified to have superior or comparable compressive yield strength and creep resistance at 900 °C, at the expense of oxidation resistance and density, while maintaining comparable cost. This work demonstrates the significant potential of combining high-throughput thermodynamic data with intelligent multi-objective optimization to accelerate the development of new alloys with tailored property profiles.
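Stripped of the specific thermodynamic quantities, the reference-alloy screening loop amounts to training a surrogate on calculation data and filtering a large candidate pool against the reference value. A hedged sketch with synthetic stand-in data (all features, targets, and numbers hypothetical, not the paper's dataset):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the CALPHAD-generated training set:
# rows = candidate compositions, target = gamma-prime volume fraction.
X_train = rng.random((5000, 8))
y_train = rng.random(5000)

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_train, y_train)

# Screen a large candidate pool against the reference alloy's value,
# mirroring the reference-alloy screening criterion described above.
candidates = rng.random((388_000, 8))
reference_value = 0.55  # hypothetical gamma-prime fraction of the reference alloy
predicted = surrogate.predict(candidates)
shortlist = candidates[predicted >= reference_value]
print(len(shortlist))
```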
35 pages, 2414 KB  
Article
Hierarchical Caching for Agentic Workflows: A Multi-Level Architecture to Reduce Tool Execution Overhead
by Farhana Begum, Craig Scott, Kofi Nyarko, Mansoureh Jeihani and Fahmi Khalifa
Mach. Learn. Knowl. Extr. 2026, 8(2), 30; https://doi.org/10.3390/make8020030 - 27 Jan 2026
Abstract
Large Language Model (LLM) agents depend heavily on multiple external tools such as APIs, databases and computational services to perform complex tasks. However, these tool executions create latency and introduce costs, particularly when agents handle similar queries or workflows. Most current caching methods focus on LLM prompt–response pairs or execution plans and overlook redundancies at the tool level. To address this, we designed a multi-level caching architecture that captures redundancy at both the workflow and tool level. The proposed system integrates four key components: (1) hierarchical caching that operates at both the workflow and tool level to capture coarse and fine-grained redundancies; (2) dependency-aware invalidation using graph-based techniques to maintain consistency when write operations affect cached reads across execution contexts; (3) category-specific time-to-live (TTL) policies tailored to different data types, e.g., weather APIs, user location, database queries and filesystem and computational tasks; and (4) session isolation to ensure multi-tenant cache safety through automatic session scoping. We evaluated the system using synthetic data with 2.25 million queries across ten configurations in fifteen runs. In addition, we conducted four targeted evaluations—write intensity robustness from 4 to 30% writes, personalized memory effects under isolated vs. shared cache modes, workflow-level caching comparison and workload sensitivity across five access distributions—on an additional 2.565 million queries, bringing the total experimental scope to 4.815 million executed queries. The architecture achieved 76.5% caching efficiency, reducing query processing time by 13.3× and lowering estimated costs by 73.3% compared to a no-cache baseline. Multi-tenant testing with fifteen concurrent tenants confirmed robust session isolation and 74.1% efficiency under concurrent workloads. Our evaluation used controlled synthetic workloads following Zipfian distributions, which are commonly used in caching research. While absolute hit rates vary by deployment domain, the architectural principles of hierarchical caching, dependency tracking and session isolation remain broadly applicable.
(This article belongs to the Section Learning)
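Two of the four components, category-specific TTLs and session isolation, can be illustrated compactly. A minimal sketch with a hypothetical API and hypothetical TTL values, not the paper's implementation:

```python
import time

# Category-specific TTLs in seconds, echoing the paper's policy classes
# (the specific values here are hypothetical).
TTLS = {"weather": 300, "location": 600, "database": 60, "filesystem": 3600}

class ToolCache:
    """Tool-level cache with per-category TTLs and session-scoped keys."""

    def __init__(self):
        self._store = {}

    def _key(self, session_id, tool, args):
        # Including the session id in the key gives automatic multi-tenant isolation.
        return (session_id, tool, args)

    def get(self, session_id, tool, args, category):
        entry = self._store.get(self._key(session_id, tool, args))
        if entry is None:
            return None
        value, stored_at = entry
        if time.time() - stored_at > TTLS[category]:
            return None  # expired under the category's TTL policy
        return value

    def put(self, session_id, tool, args, value):
        self._store[self._key(session_id, tool, args)] = (value, time.time())

cache = ToolCache()
cache.put("tenant-a", "weather_api", ("Berlin",), {"temp_c": 4})
print(cache.get("tenant-a", "weather_api", ("Berlin",), "weather"))  # hit
print(cache.get("tenant-b", "weather_api", ("Berlin",), "weather"))  # None: isolated
```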
24 pages, 9471 KB  
Article
Algorithmic Complexity vs. Market Efficiency: Evaluating Wavelet–Transformer Architectures for Cryptocurrency Price Forecasting
by Aldan Jay and Rafael Berlanga
Algorithms 2026, 19(2), 101; https://doi.org/10.3390/a19020101 - 27 Jan 2026
Abstract
We investigate whether sophisticated deep learning architectures justify their computational cost for short-term cryptocurrency price forecasting. Our study evaluates a 2.1M-parameter wavelet-enhanced transformer (M denotes millions, e.g., 2.1M = 2,100,000 parameters; all RMSE values are reported in USD) that decomposes the Fear and Greed Index (FGI) into multiple timescales before integrating these signals with technical indicators. Using Diebold–Mariano tests with HAC-corrected variance, we find that all models—including our wavelet–transformer, ARIMA, XGBoost, LSTM, and vanilla Transformer—fail to significantly outperform the O(1) naive persistence baseline at the 1-day horizon (DM statistic = +19.13, p < 0.001, naive preferred). Our model achieves an RMSE of USD 2005 versus USD 1986 for naive (ratio 1.010), requiring 3909× more inference time (2.43 ms vs. 0.0006 ms) for statistically worse performance. These results provide strong empirical support for the Efficient Market Hypothesis in cryptocurrency markets: even sophisticated multi-scale architectures combining wavelet decomposition, cross-attention, and auxiliary technical indicators cannot extract profitable short-term signals. Through systematic ablation, we identify positional encoding as the only critical architectural component; its removal causes 30% RMSE degradation. Our findings carry the following implications: (1) short-term crypto forecasting faces fundamental predictability limits, (2) architectural complexity provides negative ROI in efficient markets, and (3) rigorous statistical validation reveals that apparent improvements often represent noise rather than signal.
(This article belongs to the Special Issue Machine Learning for Pattern Recognition (3rd Edition))
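The Diebold–Mariano comparison against the persistence baseline is simple to reproduce in outline. A hedged sketch on synthetic errors, assuming squared-error loss and a Newey–West style HAC variance (the paper's exact estimator may differ):

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1):
    """DM test on squared-error loss with HAC variance up to lag h-1.

    e1: forecast errors of the candidate model, e2: errors of the baseline.
    Positive DM means the candidate's loss exceeds the baseline's.
    """
    d = e1**2 - e2**2            # loss differential series
    T = len(d)
    d_bar = d.mean()
    gamma0 = np.mean((d - d_bar) ** 2)
    var = gamma0
    for k in range(1, h):        # add autocovariances for multi-step horizons
        cov = np.mean((d[k:] - d_bar) * (d[:-k] - d_bar))
        var += 2 * cov
    dm = d_bar / np.sqrt(var / T)
    p = 2 * (1 - stats.norm.cdf(abs(dm)))
    return dm, p

# Hypothetical 1-day-ahead errors: model vs. naive persistence.
rng = np.random.default_rng(1)
e_model, e_naive = rng.normal(0, 1.01, 500), rng.normal(0, 1.0, 500)
print(diebold_mariano(e_model, e_naive, h=1))
```

With this sign convention, a large positive DM statistic (as reported above) indicates the naive baseline is preferred.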
14 pages, 2846 KB  
Article
Valorization of Plant Biomass Through the Synthesis of Lignin-Based Hydrogels for Drug Delivery
by Natalia Cárdenas-Vargas, Nazish Jabeen, Jose Huerta-Recasens, Francisco Pérez-Pla, Clara M. Gómez, Maurice N. Collins, Leire Ruiz-Rubio, Rafael Muñoz-Espí and Mario Culebras
Gels 2026, 12(2), 104; https://doi.org/10.3390/gels12020104 - 27 Jan 2026
Abstract
This study focuses on obtaining lignin-based hydrogels from pruning residues of orange trees in the Safor region (Valencia) using an alkaline extraction method. The structural analysis of the obtained lignin was carried out using Fourier-transform infrared spectroscopy (FTIR), which revealed the characteristic functional groups of lignin, as well as its structural monolignols: syringyl and guaiacyl. The thermal properties were analyzed using differential scanning calorimetry (DSC) and thermogravimetric analysis. The DSC thermogram revealed a relatively low glass transition temperature (Tg) of 67 °C, which may be attributed to partial lignin chain degradation during alkaline extraction. Soda lignin was obtained at 190 °C with an approximate yield of 10.8% relative to the initial biomass and subsequently used to synthesize poly(vinyl alcohol) (PVA)-based hydrogels for ibuprofen encapsulation. Finally, release experiments on the encapsulated ibuprofen were carried out in an aqueous phosphate buffer medium (pH = 7) at room temperature. A multi-curve response analysis (MCR) algorithm using the Korsmeyer–Peppas (KP) concentration model was applied to the release curves; the analysis concluded that the drug and the water-soluble lignin fraction (SLF) were released at different rates. For both components, a good correlation was obtained between the measured responses and those provided by the KP model. The release profile indicated that approximately 87% of the initial ibuprofen load was released from the hydrogel within 5 h, highlighting the promising potential of lignin-based hydrogels for drug delivery applications.
(This article belongs to the Special Issue Design and Optimization of Pharmaceutical Gels (2nd Edition))
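The Korsmeyer–Peppas model itself is a two-parameter power law, Mt/M∞ = k·t^n, conventionally fit to the early portion of a release curve. A minimal fitting sketch on hypothetical release data (not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def korsmeyer_peppas(t, k, n):
    """Korsmeyer-Peppas model: fractional release M_t / M_inf = k * t**n."""
    return k * t**n

# Hypothetical release data: time (h) vs. fraction of ibuprofen released.
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
frac = np.array([0.28, 0.42, 0.60, 0.72, 0.81, 0.87])

(k, n), _ = curve_fit(korsmeyer_peppas, t, frac, p0=(0.3, 0.5))
# The exponent n characterises the release mechanism (Fickian vs. anomalous),
# with the interpretation depending on the sample geometry.
print(f"k = {k:.3f}, n = {n:.3f}")
```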
28 pages, 1315 KB  
Article
SFD-ADNet: Spatial–Frequency Dual-Domain Adaptive Deformation for Point Cloud Data Augmentation
by Jiacheng Bao, Lingjun Kong and Wenju Wang
J. Imaging 2026, 12(2), 58; https://doi.org/10.3390/jimaging12020058 - 26 Jan 2026
Abstract
Existing 3D point cloud enhancement methods typically rely on artificially designed geometric transformations or local blending strategies, which are prone to introducing illogical deformations, struggle to preserve global structure, and exhibit insufficient adaptability to diverse degradation patterns. To address these limitations, this paper proposes SFD-ADNet, an adaptive deformation framework based on a dual spatial–frequency domain. It achieves 3D point cloud augmentation by explicitly learning deformation parameters rather than applying predefined perturbations. By jointly modeling spatial structural dependencies and spectral features, SFD-ADNet generates augmented samples that are both structurally aware and task-relevant. In the spatial domain, a hierarchical sequence encoder coupled with a bidirectional Mamba-based deformation predictor captures long-range geometric dependencies and local structural variations, enabling adaptive position-aware deformation control. In the frequency domain, a multi-scale dual-channel mechanism based on adaptive Chebyshev polynomials separates low-frequency structural components from high-frequency details, allowing the model to suppress noise-sensitive distortions while preserving the global geometric skeleton. The two deformation predictions are dynamically fused to balance structural fidelity and sample diversity. Extensive experiments on ModelNet40-C and ScanObjectNN-C cover synthetic CAD models and real-world scanned point clouds under diverse perturbation conditions. As a universal augmentation module, SFD-ADNet reduces the mCE metrics of PointNet++ and other backbone networks by over 20%. The experiments demonstrate that SFD-ADNet achieves state-of-the-art robustness while preserving critical geometric structures. Furthermore, models enhanced by SFD-ADNet demonstrate consistently improved robustness against diverse point cloud attacks, validating the efficacy of adaptive spatial–frequency deformation in robust point cloud learning.
(This article belongs to the Special Issue 3D Image Processing: Progress and Challenges)
26 pages, 9070 KB  
Article
Research on a General-Type Hydraulic Valve Leakage Diagnosis Method Based on CLAF-MTL Feature Deep Integration
by Chengbiao Tong, Yu Xiong, Xinming Xu and Yihua Wu
Sensors 2026, 26(3), 821; https://doi.org/10.3390/s26030821 - 26 Jan 2026
Abstract
As control and execution components within hydraulic systems, hydraulic valves are critical to system efficiency and operational safety. However, existing research primarily focuses on specific valve designs, exhibiting limitations in versatility and task coordination that constrain comprehensive diagnostic capability. To address these issues, this paper proposes a multi-modal feature deep fusion multi-task prediction (CLAF-MTL) model. The model employs a core architecture based on a CNN-LSTM-Additive Attention module and a fully connected network (FCN) for multi-domain features, while embedding a multi-task learning mechanism. It resolves the joint prediction challenge of leakage rate regression and fault type classification, significantly enhancing diagnostic efficiency and practicality. The model introduces a complementary fusion mechanism of “deep auto-features + multi-domain features”, overcoming the limitations of single-modality representation. It integrates leakage rate regression and fault type classification into a unified modeling framework, dynamically optimizing the dual-task weights via the MGDA-UB algorithm to achieve bidirectional complementarity between tasks. Experimental results demonstrate that the proposed method achieves an R2 of 0.9784 for leakage rate prediction and a fault type identification accuracy of 92.23% on the test set. Compared to traditional approaches, this method is the first to simultaneously address the challenge of accurately predicting both leakage rate and fault type. It exhibits superior robustness and applicability across generic valve scenarios, providing an effective solution for intelligent monitoring of valve leakage faults in hydraulic systems.
(This article belongs to the Section Fault Diagnosis & Sensors)
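The dual-task structure, one regression head and one classification head on a shared fused representation, is easy to sketch. A hedged PyTorch outline with hypothetical dimensions, using fixed task weights in place of the paper's MGDA-UB weighting:

```python
import torch
import torch.nn as nn

class DualHead(nn.Module):
    """Shared trunk over fused features with two task-specific heads."""

    def __init__(self, feat_dim=64, n_fault_types=5):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(feat_dim, 32), nn.ReLU())
        self.leak_head = nn.Linear(32, 1)               # leakage-rate regression
        self.fault_head = nn.Linear(32, n_fault_types)  # fault-type classification

    def forward(self, fused_features):
        h = self.shared(fused_features)
        return self.leak_head(h).squeeze(-1), self.fault_head(h)

model = DualHead()
x = torch.randn(8, 64)                  # stand-in for fused deep + multi-domain features
leak_true = torch.rand(8)
fault_true = torch.randint(0, 5, (8,))
leak_pred, fault_logits = model(x)
# Fixed 0.5/0.5 weights here; MGDA-UB would adapt these weights per step.
loss = 0.5 * nn.functional.mse_loss(leak_pred, leak_true) \
     + 0.5 * nn.functional.cross_entropy(fault_logits, fault_true)
loss.backward()
```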
28 pages, 710 KB  
Review
Nurse-Led Interventions Targeting Clinical Correlates of Immunosenescence in Older Adults: A Scoping Review
by Gianluca Azzellino, Patrizia Vagnarelli, Ernesto Aitella, Luca Mengoli, Lia Ginaldi and Massimo De Martinis
Medicina 2026, 62(2), 262; https://doi.org/10.3390/medicina62020262 - 26 Jan 2026
Abstract
Background and Objectives: Immunosenescence is a complex biological process associated with aging, characterized by a progressive decline in immune function and increased chronic inflammation (“inflammaging”), with clinical implications such as frailty, functional decline, multimorbidity, and a higher risk of adverse events in older adults. Nurses in community and primary care settings play a central role in preventive and health promotion interventions that may indirectly influence these processes. However, the available literature remains fragmented. Therefore, this scoping review aims to map and synthesise nursing interventions targeting older adults (≥60 years) that may indirectly influence immunosenescence by acting on its clinical correlates and modifiable determinants, organising the evidence within a four-pillar conceptual framework. Materials and Methods: A scoping review was conducted following JBI methodology and the PRISMA-ScR checklist. We included primary studies on nurse-led interventions in community, home care, primary care, territorial, or long-term care settings. PubMed, Scopus, and Web of Science were searched (English; last 10 years). Interventions were classified into four pillars: nursing nutrition and immunonutrition support, physical activity and exercise support, nursing vaccination coaching, and frailty monitoring and prevention of functional decline. Results: Twenty-five primary studies were included, mostly randomised or cluster-randomised trials in community, primary care, home care, and transitional care settings. Interventions mapped mainly to Pillar 4 and Pillar 2, while Pillar 1 was less frequent and usually part of multicomponent programmes; no primary studies targeted Pillar 3. Overall, effectiveness appeared driven more by intervention intensity and integration than by frailty identification alone: structured, multicomponent nurse-led programmes combining exercise with nutritional and psychosocial components showed the most consistent benefits on frailty, functional outcomes, and well-being, whereas low-intensity preventive consultations and Comprehensive Geriatric Assessment (CGA)-based models often showed limited improvements over usual care. Conclusions: This scoping review highlights the key role of community and primary care nurses in preventive interventions targeting clinical correlates of immunosenescence. Multicomponent nurse-led programmes integrating physical activity, nutrition, and psychosocial support appear most promising for frailty and functional outcomes, while low-intensity interventions show limited effectiveness. No primary studies addressed nurse-led vaccination coaching, representing an evidence gap. Future research should include biological/immunological markers alongside clinical outcomes.
(This article belongs to the Special Issue Personal and Pervasive Health Care for the Elderly)
19 pages, 9109 KB  
Systematic Review
Influence of Self-Care on the Quality of Life of Elderly People with Chronic Non-Communicable Diseases: A Systematic Review
by Poliana Martins Ferreira, Jonas Paulo Batista Dias, Monica Barbosa, Teresa Martins, Rui Pedro Gomes Pereira, Murilo César do Nascimento and Namie Okino Sawada
Healthcare 2026, 14(3), 308; https://doi.org/10.3390/healthcare14030308 - 26 Jan 2026
Abstract
Background/Objectives: Self-care is a cornerstone of healthy aging and chronic disease management; however, evidence on the most effective intervention models for improving quality of life in older adults with chronic non-communicable diseases (NCDs) remains fragmented. This review aimed to evaluate the effectiveness of self-care interventions in promoting quality of life and health outcomes in older adults with NCDs. Methods: A systematic review was conducted in accordance with PRISMA 2020 guidelines and registered in PROSPERO (CRD420251040613). Randomized and non-randomized clinical trials published between 2019 and 2024 were retrieved from Scopus, Web of Science, and EBSCOhost. Eligible studies included adults aged ≥60 years with NCDs receiving self-care interventions. Data extraction and risk of bias assessment were independently performed using Joanna Briggs Institute tools. Results: Twenty-nine studies involving 7241 older adults were included. Self-care interventions comprised nurse-led educational programs, digital health strategies, community- and peer-based approaches, and person-centered care models. Multicomponent and continuous interventions demonstrated consistent improvements in physical and psychological domains of quality of life, self-efficacy, autonomy, symptom management, and treatment adherence. Digital interventions enhanced monitoring and engagement, although their effectiveness varied according to sensory and health literacy limitations. Conclusions: Structured, person-centered, and nurse-led self-care interventions are effective in improving quality of life and autonomy among older adults with NCDs. These findings support their integration into primary and community-based care, reinforcing their relevance for clinical practice, care planning, and the development of assistive and educational strategies in aging care.
(This article belongs to the Special Issue Advances in Public Health and Healthcare Management for Chronic Care)
25 pages, 2206 KB  
Article
Adaptive Bayesian System Identification for Long-Term Forecasting of Industrial Load and Renewables Generation
by Lina Sheng, Zhixian Wang, Xiaowen Wang and Linglong Zhu
Electronics 2026, 15(3), 530; https://doi.org/10.3390/electronics15030530 - 26 Jan 2026
Abstract
The expansion of renewables in modern power systems and the coordinated development of upstream and downstream industrial chains are promoting a shift on the utility side from traditional settlement by energy toward operation driven by data and models. Industrial electricity consumption data exhibit pronounced multi-scale temporal structures and sectoral heterogeneity, which makes unified long-term load and generation forecasting that maintains accuracy, interpretability, and scalability a challenge. From a modern system identification perspective, this paper proposes a System Identification in Adaptive Bayesian Framework (SIABF) for medium- and long-term industrial load forecasting based on daily freeze electricity time series. Combining daily aggregation of high-frequency data, frequency-domain analysis, sparse identification, and long-term extrapolation, we first construct daily freeze series from 15 min measurements, then apply discrete Fourier transforms and a spectral complexity index to extract dominant periodic components and build an interpretable sinusoidal basis library. A sparse regression formulation with ℓ1 regularization is employed to select a compact set of key basis functions, yielding concise representations of sector and enterprise load profiles and naturally supporting multivariate and joint multi-sector modeling. Building on this structure, we implement a state-space-implicit physics-informed Bayesian forecasting model and evaluate it on real data from three representative sectors, namely steel, photovoltaics, and chemicals, using one year of 15 min measurements. Under a one-month-ahead evaluation, the proposed framework attains a Mean Absolute Percentage Error (MAPE) of 4.5% for a representative PV-related customer case and achieves low single-digit MAPE for high-inertia sectors, often outperforming classical statistical models, sparse learning baselines, and deep learning architectures. These results should be interpreted as indicative given the limited time span and sample size, and broader multi-year, population-level validation is warranted.
(This article belongs to the Section Systems & Control Engineering)
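The basis-selection step pairs a Fourier scan for candidate periods with ℓ1-regularised regression over a sinusoidal library. A hedged sketch on a synthetic daily series (all periods and data hypothetical, not the paper's measurements):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

# Hypothetical daily-freeze load series for one enterprise (365 days):
# a weekly and an annual cycle plus noise.
days = np.arange(365)
load = (100 + 20 * np.sin(2 * np.pi * days / 7)
        + 5 * np.sin(2 * np.pi * days / 365)
        + rng.normal(0, 2, 365))

# Sinusoidal basis library built from candidate periods (as found via the DFT).
periods = [7, 14, 30, 91, 182, 365]
basis = np.column_stack(
    [np.sin(2 * np.pi * days / p) for p in periods]
    + [np.cos(2 * np.pi * days / p) for p in periods]
)

# L1-regularised regression selects a compact set of basis functions.
model = Lasso(alpha=0.5).fit(basis, load - load.mean())
kept = [(p, c) for p, c in zip(periods * 2, model.coef_) if abs(c) > 1e-3]
print(kept)  # surviving (period, coefficient) pairs, sin terms then cos terms
```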
19 pages, 2293 KB  
Article
Automated Identification of Heavy BIM Library Components: A Multi-Criteria Analysis Tool for Model Optimization
by Andrzej Szymon Borkowski
Smart Cities 2026, 9(2), 22; https://doi.org/10.3390/smartcities9020022 - 26 Jan 2026
Abstract
This study addresses the challenge of identifying heavy Building Information Modeling (BIM) library components that disproportionately degrade model performance. While BIM has become standard in the construction industry, heavy components, characterized by excessive geometric complexity, numerous instances, or inefficient optimization, cause extended file loading times, interface lag, and coordination difficulties, particularly in large cross-industry projects. Current identification methods rely primarily on designer experience and manual inspection, lacking systematic evaluation frameworks. This research develops a multi-criteria evaluation method based on Multi-Criteria Decision Analysis (MCDA) that quantifies component performance impact through five weighted criteria: instance count (20%), geometry complexity (30%), face count (20%), edge count (10%), and estimated file size (20%). These metrics are aggregated into a composite Weight Score, with components exceeding a threshold of 200 classified as requiring optimization attention. The method was implemented as HeavyFamilies, a pyRevit plugin for Autodesk Revit featuring a graphical interface with tabular results, CSV export functionality, and direct model visualization. Validation on three real BIM projects of varying scales (133–680 families) demonstrated effective identification of heavy components within 8–165 s of analysis time. User validation with six BIM specialists achieved a 100% task completion rate, with automatic color coding and direct model highlighting particularly valued. The proposed approach enables a shift from reactive troubleshooting to proactive quality control, supporting routine diagnostics and objective prioritization of optimization efforts in federated and multi-disciplinary construction projects.
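With the published weights, the Weight Score is a plain weighted sum. A minimal sketch (per-criterion scores hypothetical; the plugin's normalization of raw Revit metrics is not reproduced here):

```python
# Weights as published in the abstract; metric values below are hypothetical
# normalized per-criterion scores for one family.
WEIGHTS = {
    "instance_count": 0.20,
    "geometry_complexity": 0.30,
    "face_count": 0.20,
    "edge_count": 0.10,
    "file_size": 0.20,
}
THRESHOLD = 200  # families above this Weight Score need optimization attention

def weight_score(metrics: dict) -> float:
    """Composite Weight Score: weighted sum over the five criteria."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

family = {
    "instance_count": 310,
    "geometry_complexity": 450,
    "face_count": 120,
    "edge_count": 90,
    "file_size": 200,
}
score = weight_score(family)
print(score, "-> optimize" if score > THRESHOLD else "-> ok")  # 270.0 -> optimize
```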
24 pages, 2078 KB  
Article
SymXplorer: Symbolic Analog Topology Exploration of a Tunable Common-Gate Bandpass TIA for Radio-over-Fiber Applications
by Danial Noori Zadeh and Mohamed B. Elamien
Electronics 2026, 15(3), 515; https://doi.org/10.3390/electronics15030515 - 25 Jan 2026
Abstract
While circuit parameter optimization has matured significantly, the systematic discovery of novel circuit topologies remains a bottleneck in analog design automation. This work presents SymXplorer, an open-source Python framework designed for automated topology exploration through symbolic modeling of analog components. The framework enables a component-agnostic approach to architecture-level synthesis, integrating stability analysis and higher-order filter exploration within a streamlined API. By modeling non-idealities as lumped parameters, the framework accounts for physical constraints directly within the symbolic analysis. To facilitate circuit sizing, SymXplorer incorporates a multi-objective optimization toolbox featuring Bayesian optimization and evolutionary algorithms for simulation-in-the-loop evaluation. Using this framework, we conduct a systematic search for differential Common-Gate (CG) Bandpass Transimpedance Amplifier (TIA) topologies tailored for 5G New Radio (NR) Radio-over-Fiber applications. We propose a novel, orthogonally tunable Bandpass TIA architecture identified by the tool. Implementation in 65 nm CMOS technology demonstrates the efficacy of the framework. Post-layout results exhibit a tunable gain of 30–50 dBΩ, a center frequency of 3.5 GHz, and a tuning range of 500 MHz. The design maintains a power consumption of less than 400 μW and an input-referred noise density of less than 50 pA/√Hz across the passband. Finally, we discuss how this symbolic framework can be integrated into future agentic EDA workflows to further automate the analog design cycle. SymXplorer is open-sourced to encourage innovation in symbolic-driven analog design automation.
(This article belongs to the Section Circuit and Signal Processing)
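Symbolic topology exploration of this kind boils down to deriving transfer functions with a computer algebra system and inspecting their poles. A hedged sympy sketch of an idealized common-gate TIA (not SymXplorer's actual API; all symbols are illustrative):

```python
import sympy as sp

# Idealized common-gate (CG) TIA: photodiode capacitance C_in at the source
# node, output load R_L || C_L. The CG device presents ~1/g_m at its source.
s, gm, Cin, RL, CL = sp.symbols("s g_m C_in R_L C_L", positive=True)

# Current divider at the input node: the fraction g_m/(g_m + s*C_in) of the
# input current reaches the output branch.
i_frac = gm / (gm + s * Cin)
Z_load = RL / (1 + s * RL * CL)
Zt = sp.simplify(i_frac * Z_load)   # transimpedance Z_T(s) = v_out / i_in

print(Zt)
print(sp.limit(Zt, s, 0))           # DC transimpedance: R_L
# Poles of the idealized stage sit at s = -g_m/C_in and s = -1/(R_L*C_L):
print(sp.solve(sp.denom(sp.together(Zt)), s))
```

A framework in SymXplorer's spirit would generate many such symbolic transfer functions across candidate topologies (with non-idealities as extra lumped symbols) and screen them against gain, bandwidth, and stability criteria.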