Search Results (525)

Search Parameters:
Keywords = metamodelling

70 pages, 5036 KB  
Review
A Review of Mathematical Reduced-Order Modeling of PCM-Based Latent Heat Storage Systems
by John Nico Omlang and Aldrin Calderon
Energies 2026, 19(9), 2017; https://doi.org/10.3390/en19092017 - 22 Apr 2026
Viewed by 435
Abstract
Phase change material (PCM)-based latent heat storage (LHS) systems help address the mismatch between renewable energy supply and thermal demand. However, their practical implementation is constrained by the strongly nonlinear and multiphysics nature of phase change, which makes high-fidelity simulations and real-time applications computationally expensive. This review examines mathematical reduced-order modeling (ROM) as an effective strategy to overcome this limitation by combining physics-based simplifications, projection methods, interpolation techniques, and data-driven models for PCM-based LHS systems. While physical simplifications (such as dimensional reduction and effective property approximations) represent an important first layer of model reduction, the primary focus of this work is on the mathematical ROM methodologies that operate on the governing equations after such physical simplifications have been applied. The review covers approaches including two-temperature non-equilibrium and analytical thermal-resistance models, Proper Orthogonal Decomposition (POD), CFD-derived look-up tables, kriging and ε-NTU grey/black-box metamodels, and machine-learning methods such as artificial neural networks and gradient-boosted regressors trained from CFD data. These ROM techniques have been applied to packed beds, PCM-integrated heat exchangers, finned enclosures, triplex-tube systems, and solar thermal components, achieving speed-ups from tens to over 80,000 times faster than full CFD simulations while maintaining prediction errors typically below 5% or within sub-Kelvin temperature deviations. A critical comparative analysis exposes the fundamental trade-off between interpretability, data dependence, and computational efficiency, leading to a practical decision-making framework that guides method selection for specific applications such as design optimization, real-time control, and system-level simulation. 
Remaining challenges—including accurate representation of phase change nonlinearity, moving phase boundaries, multi-timescale dynamics, generalization across geometries, experimental validation, and integration into industrial workflows—motivate a structured roadmap for future hybrid physics–machine learning developments, standardized validation protocols, and pathways toward industrial deployment. Full article
(This article belongs to the Section D: Energy Storage and Application)

32 pages, 5238 KB  
Review
Simulation- and Metamodel-Based Multi-Objective Optimization for Sustainable Building Retrofit Across Climatic Conditions
by Sk. Reza-E-Rabbi, Muhammed A. Bhuiyan, Guomin Zhang, Shanuka Dodampegama and Kanishka Atapattu
Materials 2026, 19(8), 1649; https://doi.org/10.3390/ma19081649 - 20 Apr 2026
Viewed by 179
Abstract
Building retrofit optimization has gained increasing attention as a pathway to improve energy performance and support sustainability. This review examines 162 studies and synthesizes simulation-based (SBMOO) and metamodel-based (MBMOO) multi-objective optimization techniques for building retrofit across climatic conditions. The review also analyzes passive, active, and combined retrofit strategies and evaluates how climatic context influences their suitability and performance. Passive strategies typically involve envelope- or material-related upgrades, whereas active strategies focus on building systems. Energy efficiency, comfort, cost-effectiveness, and environmental impact are identified as the major performance metrics for retrofit evaluation. Sustainability metrics such as life cycle assessment (LCA) have yet to be used adequately to evaluate retrofit measures, and social objectives remain underexplored. SBMOO provides robust optimization but can be computationally intensive, whereas MBMOO improves computational efficiency through surrogate modeling but depends strongly on dataset quality, sampling strategy, and surrogate model selection. In contrast to earlier reviews that usually emphasize either optimization techniques or retrofit measures independently, this study integrates optimization pathway comparison with climate-based analysis of retrofit strategies. The review also finds that most studies are highly case-specific, limiting transferability across climates, building types, and retrofit contexts. Therefore, this work proposes a synthesized framework to support structured selection of baseline modeling and optimization pathways for future retrofit studies. Overall, the review identifies current methodological trends, key research gaps, and future directions for more consistent and climate-responsive retrofit decision-making. Full article
(This article belongs to the Special Issue Eco-Friendly Materials for Sustainable Buildings)

43 pages, 2312 KB  
Article
Classification Model of Emotional Tone in Hate Speech and Its Relationship with Inequality and Gender Stereotypes, Using NLP and Machine Learning Algorithms
by Aymé Escobar Díaz, Ricardo Rivadeneira, Walter Fuertes and Washington Loza
Future Internet 2026, 18(4), 218; https://doi.org/10.3390/fi18040218 - 20 Apr 2026
Viewed by 163
Abstract
Hate speech on social media reproduces norms of inequality and gender stereotypes, disproportionately affecting women. This study proposes a hybrid approach that integrates emotional tone classification with explicit hostility detection to strengthen preventive moderation. We constructed a corpus from three open data sets (1,236,371 records; 1,003,991 after ETL) and represented the text using TF-IDF and contextual RoBERTa embeddings. We trained individual models (RoBERTa fine-tuned, Random Forest, and XGBoost) and a stacking metamodel (Gradient Boosting) that combines their probabilities. On the test set, the ensemble outperformed the base classifiers, achieving accuracy of 0.93 in hate detection and 0.90 in emotion classification, with an AUC of 0.98 for emotion classification. We implemented a RESTful API and a web client to validate the moderation flow before publication, along with an administration panel for auditing. Performance tests in a prototype deployment (Google Colab exposed through an Ngrok tunnel) provided proof-of-concept validation, revealing concurrency limitations from around 300 users due to infrastructure constraints. In general, the results indicate that incorporating emotional tone analysis improves the model’s ability to identify implicit hostility and offers a practical way to promote safer digital environments. The probabilistic outputs produced by the ensemble model were subsequently analyzed using the Bayesian Calibration and Optimal Design under Asymmetric Risk (BACON-AR) framework, which serves as a mathematical post hoc decision layer for evaluating classification behaviour under unequal error costs. Rather than modifying the trained architecture or improving its predictive performance, the framework identifies a cost-sensitive operating threshold that minimizes the total expected risk under the selected asymmetric cost configuration. 
The experiments were conducted using an English-language data set; therefore, the findings of this study are limited to hate speech detection in English. Full article
(This article belongs to the Section Techno-Social Smart Systems)
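The stacking arrangement this abstract describes — base classifiers whose class probabilities feed a gradient-boosting meta-learner — can be sketched generically with scikit-learn. This is an illustrative sketch on synthetic data, not the authors' pipeline: their base models were a fine-tuned RoBERTa, Random Forest, and XGBoost, which are swapped here for lighter stand-ins.

```python
# Generic stacking meta-model sketch (synthetic data): base classifiers
# emit class probabilities; a gradient-boosting meta-learner combines
# them. Illustrative only -- not the paper's actual pipeline.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    final_estimator=GradientBoostingClassifier(random_state=0),  # meta-learner
    stack_method="predict_proba",  # combine probabilities, not hard labels
    cv=5,
)
stack.fit(X_tr, y_tr)
print(f"test accuracy: {stack.score(X_te, y_te):.2f}")
```

Passing `stack_method="predict_proba"` is what makes this a probability-combining meta-model, as in the abstract, rather than a vote over hard labels.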

8 pages, 1376 KB  
Proceeding Paper
Metamodeling Approach for Comparison of Linear Flux-Switching and Permanent Magnet Synchronous Machines for Electric Aircraft Propulsion
by Enrico Teichert, Matthias Lang, Ilja Koch and Stefan Kazula
Eng. Proc. 2026, 133(1), 49; https://doi.org/10.3390/engproc2026133049 - 17 Apr 2026
Viewed by 25
Abstract
The increasing demand for electric, direct-drive propulsion systems with high torque density and high efficiency is driving the development of novel topologies in aviation. Conventional surface-mounted permanent magnet (SPM) machines offer high efficiency with medium gravimetric shear force density. Flux-switching machines (FSMs) have a significantly higher specific force density and offer attractive advantages such as structural robustness, favorable permanent magnet utilization and simplified cooling options. In this work, two FSM variants and an SPM benchmark are investigated. A metamodel-based optimization framework is employed to efficiently explore a parameterized design space, allowing the identification of Pareto-optimal solutions. Selected designs are analyzed in detail and compared with each other. The results show that high-pole FSM configurations are particularly suitable for torque-dense electric machines in aviation due to their high shear force density and scalability. Full article

26 pages, 1776 KB  
Article
Regression Meta-Model for Predicting Temperature-Humidity Index in Mechanically Ventilated Broiler Houses Using Building Energy Simulation in South Korea
by Taehwan Ha, Kyeongseok Kwon, Se-Woon Hong and Uk-Hyeon Yeo
Agriculture 2026, 16(8), 824; https://doi.org/10.3390/agriculture16080824 - 8 Apr 2026
Viewed by 362
Abstract
Heat stress is a major challenge for broiler production worldwide and is expected to intensify with more frequent heatwaves. This study focuses on mechanically ventilated broiler houses in South Korea, where heatwaves have become increasingly frequent. Three regression meta-models were developed to predict the indoor temperature–humidity index (THI) directly from weather forecast data, using simulated results from a validated building energy simulation (BES) model. A TRNSYS-based BES model was validated against field measurements from four rearing cycles in a commercial broiler house (RMSE 1.31–2.16; MAPE < 2.00%). Using 3072 simulation cases that combined multiple sites, thermal-transmittance levels, cooling conditions, building sizes, and broiler body weights, three regression meta-model approaches were evaluated: a condition-specific regression meta-model for each condition set, a unified regression meta-model with categorical predictors, and a single-variable meta-model using only external THI as a predictor. All three showed strong predictive performance, and the unified regression meta-model achieved R2 = 0.978, RMSE = 0.817, and MAPE = 0.829, providing the best balance between accuracy and simplicity. This unified model offers a practical tool to link weather forecasts with broiler-house design and environmental-control decisions for heat-stress risk management. Full article

26 pages, 3093 KB  
Article
Research on Model-Based Systems Engineering Approach for Information Technology Standard System Planning
by Yangyang Zhang, Xiuming Yu, Xiaojian Liu, Jianxun Guo, Wenyuan Zhang and Can Zhang
Systems 2026, 14(4), 380; https://doi.org/10.3390/systems14040380 - 1 Apr 2026
Viewed by 350
Abstract
Planning an information technology (IT) standard system requires balancing multiple complex factors. These include industrial chain layout, technological iteration, practitioner application scenarios and cross-domain integration. Such planning demands high industrial adaptability, technical foresight and implementation operability, yet mature and systematic methods are currently lacking in this field. To address this issue, this paper proposes a Model-Based Systems Engineering (MBSE) approach for IT standard system planning by integrating complex system decomposition and integration principles. A multi-perspective (industry, practitioner, business, product, standard) and multi-view (industrial chain, practitioner, product, technical process, management process, standardization object, standard) modeling framework is constructed, and an MOF-based meta-model system for each view is designed to realize full-process visual modeling from industrial ecosystem analysis to standard system implementation. As a conceptual and methodological study, this approach compensates for the perspective limitations of traditional planning methods. Multi-view hierarchical and collaborative analysis ensures that the standard system aligns with industrial reality while providing foresight and operability, offering systematic methodological support for standard-setting organizations, industrial alliances and enterprises in the IT field, and expanding the application boundary of MBSE in standard system planning. Full article
(This article belongs to the Section Artificial Intelligence and Digital Systems Engineering)

59 pages, 18673 KB  
Article
Characterization and Predictive Modeling of Diatomite Mortar Performance: A Hybrid Framework Based on Experimental Analysis and Machine Learning Meta-Models
by Sihem Brahimi, Miloud Hamadache and Mhand Hifi
Buildings 2026, 16(7), 1281; https://doi.org/10.3390/buildings16071281 - 24 Mar 2026
Viewed by 297
Abstract
Decarbonizing the construction sector requires high-volume replacement of Portland clinker with non-calcined supplementary cementitious materials (SCMs). This study investigates white cement pastes incorporating raw Algerian diatomite—a silica-rich biogenic mineral—at substitution levels from 40% to 95% (5% increments) and a fixed water-to-binder ratio of 0.5. The target application is ultra-lightweight, multifunctional composites for non-structural uses such as decorative panels and partition elements. Increasing diatomite content progressively reduced bulk density from 1.483 g/cm3 (D40) to 0.557 g/cm3 (D95) and increased porosity. The 28-day compressive strength decreased monotonically from 16 MPa (D40) to 2.4 MPa (D95) as clinker dilution intensified. Ultrasonic pulse velocity dropped from 6205 m/s to 1495 m/s, reflecting progressive pore development and confirming the material’s lightweight potential. Statistically significant strength gains beyond 28 days were recorded (+25.87% for compression, p-value < 0.05), evidencing delayed pozzolanic activity. These results confirm that raw, non-calcined diatomite is a viable SCM for eco-efficient, low-density construction systems. To overcome the extrapolation instability of purely data-driven approaches, a Meta-Avrami Hybrid Framework was developed. It anchors Gradient Boosting residual learning to a sigmoidal Avrami hydration kernel. The model achieved high predictive accuracy (R2 = 0.999, RMSE = 0.010) under 10-fold cross-validation. Generalization was well-controlled, with a low overfitting gap (ΔR2 = 0.0226) and stable fold-to-fold performance (Std = 0.0204). These metrics confirm suitability for unseen mix designs. This is particularly relevant for service-life assessment of partition panels and lightweight façade elements, where long-term performance guarantees are required. The physics-informed architecture ensures asymptotic strength stabilization up to a 10-year horizon (amplification ratios 1.03–1.05). 
This prevents the non-physical divergence observed in polynomial and power-law hybrids (ratios 1.36–1.70). The framework provides a reliable and interpretable tool for service-life design of sustainable low-carbon cementitious systems. Full article
(This article belongs to the Section Building Materials, and Repair & Renovation)
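The residual-hybrid idea in this abstract — a physics kernel predicts the trend and a boosted regressor learns only the residuals — can be sketched as follows. Both the Avrami-type kernel parameters and the synthetic "measurements" are assumptions for illustration; this is not the paper's calibrated model or data.

```python
# Residual-hybrid sketch: a physics kernel (Avrami-type sigmoid) gives the
# baseline prediction; a gradient-boosted regressor is fitted only to the
# residuals. Kernel parameters and data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def avrami_kernel(t, s_inf=16.0, k=0.05, n=1.5):
    # Sigmoidal strength development: S(t) = S_inf * (1 - exp(-k * t^n))
    return s_inf * (1.0 - np.exp(-k * t ** n))

rng = np.random.default_rng(1)
t = rng.uniform(1.0, 90.0, size=(500, 1))                # curing age, days
y = avrami_kernel(t[:, 0]) + rng.normal(0.0, 0.3, 500)   # "measured" strength

base = avrami_kernel(t[:, 0])                            # physics prediction
resid_model = GradientBoostingRegressor(random_state=0).fit(t, y - base)

def hybrid_predict(t_new):
    # Physics baseline plus learned residual correction.
    return avrami_kernel(t_new[:, 0]) + resid_model.predict(t_new)

pred = hybrid_predict(t)
print(f"mean abs error: {np.mean(np.abs(pred - y)):.3f}")
```

Because the data-driven part only corrects residuals, extrapolation reverts toward the bounded physics kernel — the property the abstract credits with preventing non-physical divergence at long horizons.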

38 pages, 4516 KB  
Article
A Formal Modeling Framework for Time-Aware Cyber–Physical Systems of Systems
by Riad Helal, Faiza Belala, Nabil Hameurlain and Akram Seghiri
Systems 2026, 14(3), 312; https://doi.org/10.3390/systems14030312 - 16 Mar 2026
Cited by 1 | Viewed by 449
Abstract
Cyber–Physical Systems of Systems (CPSoS) integrate autonomous constituent systems to accomplish complex missions. Nonetheless, decentralized coordination and continuous evolution create intricate dependencies that make behavior difficult to analyze. Current semi-formal modeling approaches, despite being easy to understand and widely accessible, lack semantic precision and are not computationally checkable to guarantee time-critical properties. Furthermore, current formal methods are often fragmented: they analyze behavior either at the individual CPS level or the collective CPSoS level, failing to provide a multi-level specification. To address these limitations, we propose an integrated framework combining SysML and Maude rewriting logic. SysML provides structural and behavioral specification capabilities, while Maude enables rigorous semantics, executable models, and formal verification. First, our approach proposes MM-CPSoS, a meta-model that unifies CPS and CPSoS entities with explicit temporal constraints. Dynamic behavior is captured through evolution patterns governing mission progression across both levels. Then, we encode SysML models into Maude as object-oriented configurations and conditional rewrite rules, enabling linear temporal logic (LTL) model checking of temporal properties. Finally, we demonstrate our approach through a Time-Aware Road Crisis Management System (TaRCiMaS2). Full article
(This article belongs to the Section Systems Engineering)

25 pages, 3685 KB  
Article
Explainable Meta-Learning Ensemble Framework for Predicting Insulin Dose Adjustments in Diabetic Patients: A Comparative Machine Learning Approach with SHAP-Based Clinical Interpretability
by Emek Guldogan, Burak Yagin, Hasan Ucuzal, Abdulmohsen Algarni, Fahaid Al-Hashem and Mohammadreza Aghaei
Medicina 2026, 62(3), 502; https://doi.org/10.3390/medicina62030502 - 9 Mar 2026
Viewed by 617
Abstract
Background and Objectives: Diabetes mellitus represents one of the most prevalent chronic metabolic disorders worldwide, necessitating precise insulin dose management to prevent both acute and long-term complications. The optimization of insulin dosing remains a significant clinical challenge, as inappropriate dosing can lead to hypoglycemia or hyperglycemia, each carrying substantial morbidity risks. Machine learning approaches have emerged as promising tools for developing clinical decision support systems; however, their practical implementation requires both high predictive accuracy and model interpretability. This study aimed to develop and evaluate an explainable machine learning framework for predicting insulin dose adjustments in diabetic patients. We sought to compare multiple ensemble learning approaches and identify the optimal model configuration that balances predictive performance with clinical interpretability through comprehensive SHAP and LIME analyses. Materials and Methods: A comprehensive dataset comprising 10,000 patient records with 12 clinical and demographic features was utilized. We implemented and compared nine machine learning models, including gradient boosting variants (XGBoost, LightGBM, CatBoost, GradientBoosting), AdaBoost, and four ensemble strategies (Voting, Stacking, Blending, and Meta-Learning). Model interpretability was achieved through SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME) analyses. Performance was evaluated using accuracy, weighted F1-score, area under the receiver operating characteristic curve (AUC-ROC), precision-recall AUC (PR-AUC), sensitivity, specificity, and cross-entropy loss. Results: The Meta-Learning Ensemble achieved superior performance across all evaluation metrics, attaining an accuracy of 81.35%, weighted F1-score of 0.8121, macro-averaged AUC-ROC of 0.9637, and PR-AUC of 0.9317. 
The model demonstrated exceptional sensitivity (86.61%) and specificity (91.79%), with particularly high performance in detecting dose reduction requirements (100% sensitivity for the ‘down’ class). SHAP analysis revealed insulin sensitivity, previous medications, sleep hours, weight, and body mass index as the most influential predictors across different insulin adjustment categories. The meta-model feature importance analysis indicated that LightGBM probability estimates contributed most significantly to the ensemble predictions. Conclusions: The proposed explainable Meta-Learning Ensemble framework demonstrates robust predictive capability for insulin dose adjustment recommendations while maintaining clinical interpretability. The integration of SHAP-based explanations facilitates clinician understanding of model predictions, supporting transparent and informed decision-making in diabetes management. This approach represents a significant advancement toward the clinical implementation of artificial intelligence in personalized insulin therapy. Full article

47 pages, 10831 KB  
Article
DS PRO-S: A Success Assessment Model and Methodology for Data Science Projects
by Gonca Tokdemir Gökay, Ebru Gökalp and P. Erhan Eren
Appl. Sci. 2026, 16(5), 2551; https://doi.org/10.3390/app16052551 - 6 Mar 2026
Viewed by 522
Abstract
There is a persistent paradox in the data science domain: despite the growing recognition of data as a strategic asset, many projects designed to leverage its value still suffer from high failure rates. To address this challenge, this study introduces the Data Science Projects Success Assessment Model (DS PRO-S), developed using a Design Science Research approach to make data science project success explicit, measurable, and comparable. DS PRO-S functions as a meta-model and an instantiation toolkit, complete with an operational methodology that supports success and health assessments using critical success factors (CSFs) and success criteria at both the phase and project levels through four distinct modules. This modular structure enables evaluations at any point in the data science lifecycle and informs timely, data-driven interventions before issues propagate. The measurement and evaluation framework within DS PRO-S aligns with ISO/IEC 15939, incorporating mathematical formulations for aggregating success criteria and CSFs into upper-level scores. To demonstrate its instantiability, completeness, and operational utility, case studies were conducted in a predictive analytics project of a large energy enterprise and a generative AI project of a vendor. The findings indicate that DS PRO-S is applicable in diverse project contexts in the data science domain and offers a robust solution for assessments. Full article

43 pages, 1927 KB  
Article
A Large-Scale Empirical Study of LLM Orchestration and Ensemble Strategies for Sentiment Analysis in Recommender Systems
by Konstantinos I. Roumeliotis, Dionisis Margaris, Dimitris Spiliotopoulos and Costas Vassilakis
Future Internet 2026, 18(2), 112; https://doi.org/10.3390/fi18020112 - 20 Feb 2026
Viewed by 1662
Abstract
This paper presents a comprehensive empirical evaluation comparing meta-model aggregation strategies with traditional ensemble methods and standalone models for sentiment analysis in recommender systems beyond standalone large language model (LLM) performance. We investigate whether aggregating multiple LLMs through a reasoning-based meta-model provides measurable performance advantages over individual models and standard statistical aggregation approaches in zero-shot sentiment classification. Using a balanced dataset of 5000 verified Amazon purchase reviews (1000 reviews per rating category from 1 to 5 stars, sampled via two-stage stratified sampling across five product categories), we evaluate 12 different leading pre-trained LLMs from four major providers (OpenAI, Anthropic, Google, and DeepSeek) in both standalone and meta-model configurations. Our experimental design systematically compares individual model performance against GPT-based meta-model aggregation and traditional ensemble baselines (majority voting, mean aggregation). Results show statistically significant improvements (McNemar’s test, p < 0.001): the GPT-5 meta-model achieves 71.40% accuracy (10.15 percentage point improvement over the 61.25% individual model average), while the GPT-5 mini meta-model reaches 70.32% (9.07 percentage point improvement). These observed improvements surpass traditional ensemble methods (majority voting: 62.64%; mean aggregation: 62.96%), suggesting potential value in meta-model aggregation for sentiment analysis tasks. Our analysis reveals empirical patterns including neutral sentiment classification challenges (3-star ratings show 64.83% failure rates across models), model influence hierarchies, and cost-accuracy trade-offs ($130.45 aggregation cost vs. $0.24–$43.97 for individual models per 5000 predictions). 
This work provides evidence-based insights into the comparative effectiveness of LLM aggregation strategies in recommender systems, demonstrating that meta-model aggregation with natural language reasoning capabilities achieves measurable performance gains beyond statistical aggregation alone. Full article
(This article belongs to the Special Issue Intelligent Agents and Their Application)

30 pages, 7851 KB  
Article
Integrating Machine Learning and Simulation for Integrated Mine-to-Mill Flowsheet Modelling: A Meta-Modelling Framework
by Pouya Nobahar, Chaoshui Xu and Peter Dowd
Minerals 2026, 16(2), 216; https://doi.org/10.3390/min16020216 - 20 Feb 2026
Viewed by 531
Abstract
The growing global demand for mineral resources is challenging mining operations to maintain productivity while processing lower-grade ores and increasingly complex deposits. This study presents an integrated framework that leverages machine learning (ML) and high-fidelity simulation to model and support scenario-based decision-making for the blasting–crushing–SAG (semi-autogenous grinding) milling chain using a calibrated flowsheet. Using publicly available data from the Barrick Cortez Mine (Nevada, USA), more than three million operational scenarios were generated using the Integrated Extraction Simulator (IES) to capture system variability and sensitivity. Machine learning meta-models, built using Random Forest and XGBoost methods, were trained on the simulated data and achieved coefficients of determination (R2) exceeding 0.90 across all key outputs, including P20, P50, P80, and mass flow rates at different operational stages. The meta-models accurately reproduced plant-scale behaviour while reducing computational requirements by several orders of magnitude compared with full-scale simulations. SHapley Additive exPlanations (SHAP) analysis revealed that blast-hole diameter, explosive energy parameters, screen cut-size, crusher feed characteristics, and SAG mill operating conditions are the dominant factors impacting downstream particle size distributions. The proposed framework enables near-real-time evaluation of “what-if” operational scenarios and provides transparent, quantitative decision-support for integrated mine-to-mill optimisation. Full article
(This article belongs to the Section Mineral Processing and Extractive Metallurgy)
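The surrogate workflow this abstract describes — train a fast regressor on input/output pairs sampled from an expensive simulator, then query the regressor instead — can be sketched generically. The "simulator" below is a cheap analytic stand-in (an assumption for illustration), not IES or the Cortez dataset.

```python
# Meta-model (surrogate) sketch: fit a fast regressor to samples from an
# expensive simulator, then evaluate the surrogate in its place. The
# "simulator" is an analytic stand-in, not the Integrated Extraction
# Simulator used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

def expensive_simulation(X):
    # Placeholder for a slow, high-fidelity model of the process chain.
    return np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 0] * X[:, 1]

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(3000, 2))    # sampled operating scenarios
y = expensive_simulation(X)                    # simulator outputs

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_tr, y_tr)

r2 = r2_score(y_te, surrogate.predict(X_te))   # held-out surrogate accuracy
print(f"held-out R^2: {r2:.3f}")
```

Once trained, each surrogate query costs microseconds instead of a full simulation run, which is what makes "what-if" scenario sweeps of the kind described here tractable.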

20 pages, 1739 KB  
Systematic Review
Systematic Review and Model-Based Meta-Analysis of Targeted Drugs for Systemic Sclerosis
by Marina Vaskeikina, Yaroslav Ugolkov, Boris Kireev, Kirill Peskov and Alina Volkova
Pharmaceutics 2026, 18(2), 250; https://doi.org/10.3390/pharmaceutics18020250 - 18 Feb 2026
Viewed by 1132
Abstract
Background: Systemic sclerosis (SSc) is a complex autoimmune fibrotic disorder marked by heterogeneous clinical features and multiple pathophysiological mechanisms. The rapid emergence of targeted therapies, aimed at selectively modulating molecular targets, has expanded treatment options; however, making direct efficacy comparisons remains challenging due to the variability in trial designs, endpoints, and patient populations. Methods: A systematic search of PubMed and ClinicalTrials.gov identified randomized controlled trials (RCTs) evaluating targeted therapies in SSc. A longitudinal mixed-effect meta-model incorporating Emax structural functions characterized treatment response trajectories for the modified Rodnan skin score (mRSS) and forced vital capacity (FVC). Between-study and between-treatment-arm variability were explicitly modeled to account for heterogeneity. Results: A total of 32 RCTs with 2036 patients and 23 targeted agents were analyzed. Guselkumab, an anti-IL-23 antibody, showed the greatest effect on mRSS, followed by tofacitinib, inebilizumab, and baricitinib. For FVC, B-cell-targeted therapies, with belimumab and rituximab, demonstrated the highest efficacy, while tocilizumab and nintedanib had more moderate effects. Time to 50% maximal response was approximately 27.5 weeks, indicating that half of the maximal treatment response develops over roughly 6.3 months. Conclusions: This model-based meta-analysis provides a broad comparison of targeted therapies in SSc, highlighting distinct efficacy patterns for skin versus lung involvement and offering hypothesis-generating insights that may support treatment selection and the design of future clinical trials. Full article
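The Emax structural function used in such longitudinal meta-models has a simple closed form: response grows hyperbolically with time and reaches half of its asymptote at the half-maximal time T50 (≈27.5 weeks per the abstract). A minimal sketch follows; the sign convention and the e_max value are illustrative assumptions, and the real model adds between-study and between-arm random effects.

```python
def emax_response(t_weeks, e_max, t50=27.5):
    """Emax time-course: e_max * t / (t50 + t).

    t50 is the time to half-maximal response (~27.5 weeks per the
    meta-analysis); e_max is the asymptotic treatment effect
    (e.g., a change from baseline in mRSS, so negative = improvement).
    """
    return e_max * t_weeks / (t50 + t_weeks)

# At t = t50 the response is exactly half of e_max.
print(emax_response(27.5, e_max=-10.0))  # -5.0
```

This is the structural backbone only; fitting it across 32 RCTs requires the mixed-effects machinery described in the abstract.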

16 pages, 3373 KB  
Article
Intelligent Assessment Framework of Unmanned Air Vehicle Health Status Based on Bayesian Stacking
by Junfu Qiao, Jinqin Guo, Yu Zhang and Yongwei Li
Batteries 2026, 12(2), 62; https://doi.org/10.3390/batteries12020062 - 14 Feb 2026
Viewed by 457
Abstract
This paper proposes a stacking-based ensemble model to replace the traditional single-model machine learning prediction approach, significantly improving the efficiency of state-of-charge (SoC) and state-of-health (SoH) evaluation for lithium batteries. Firstly, a dataset was constructed including three input variables (temperature, current, and voltage) and two output variables (SoC and SoH). Pearson correlation coefficients and histograms were used for preliminary analysis of the correlations and distributions of the dataset. The multi-layer perceptron (MLP), support vector machine (SVM), random forest (RF), and extreme gradient boosting tree (XGB) were used as base prediction models. Bayesian optimization (BO) was used to fine-tune the parameters of these models, then three statistical indicators were compared to assess the prediction accuracy of the four ML models. Furthermore, MLP, SVM, and RF were selected as base models, while XGB was used as the meta-model, enhancing the integrated performance of the prediction models. SHAP was used to quantify the influence of the input variables on SoC. Finally, linked measures for the prediction model were proposed to achieve autonomous monitoring of drones. The results showed that XGB exhibited superior prediction accuracy, with R2 of 0.93 and RMSE of 0.14. The ensemble model obtained using stacking reduced the number of outliers by 89.4%. Current was identified as the key variable influencing both SoC and SoH. Furthermore, the intelligent prediction model proposed in this paper can be integrated with controllers, visualization web pages, and other systems to enable the health status assessment of drones. Full article
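Stacking works by feeding each base model's prediction into a second-level meta-learner. A minimal pure-Python sketch of the data flow follows; the toy heuristics and fixed weights below are illustrative assumptions, whereas the paper's base models are Bayesian-optimized MLP/SVM/RF and its meta-learner is XGBoost trained on the base outputs.

```python
# Toy base models: each maps (temperature, current, voltage) -> SoC guess.
def base_a(x): return 0.5 * x[2]           # voltage-driven heuristic
def base_b(x): return 1.0 - 0.1 * x[1]     # current-driven heuristic
def base_c(x): return 0.2 * x[0] / 25.0    # temperature-driven heuristic

BASES = [base_a, base_b, base_c]

def stack_features(x):
    # Level-1 feature vector: the base models' predictions.
    return [m(x) for m in BASES]

# Meta-model: here a fixed weighted blend; in the paper these weights are
# effectively learned by the XGBoost meta-learner from the base outputs.
WEIGHTS = [0.6, 0.3, 0.1]

def stacked_predict(x):
    feats = stack_features(x)
    return sum(w * f for w, f in zip(WEIGHTS, feats))

print(round(stacked_predict((25.0, 2.0, 1.8)), 3))  # 0.8
```

The key design point, preserved in this sketch, is that the meta-model sees only base-model outputs, not the raw inputs, so it learns how to arbitrate between the base learners.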
(This article belongs to the Section Energy Storage System Aging, Diagnosis and Safety)

29 pages, 2919 KB  
Article
A Model-Driven Engineering Approach to AI-Powered Healthcare Platforms
by Mira Raheem, Neamat Eltazi, Michael Papazoglou, Bernd Krämer and Amal Elgammal
Informatics 2026, 13(2), 32; https://doi.org/10.3390/informatics13020032 - 11 Feb 2026
Viewed by 962
Abstract
Artificial intelligence (AI) has the potential to transform healthcare by supporting more accurate diagnoses and personalized treatments. However, its adoption in practice remains constrained by fragmented data sources, strict privacy rules, and the technical complexity of building reliable clinical systems. To address these challenges, we introduce a model-driven engineering (MDE) framework designed specifically for healthcare AI. The framework relies on formal metamodels, domain-specific languages (DSLs), and automated transformations to move from high-level specifications to running software. At its core is the Medical Interoperability Language (MILA), a graphical DSL that enables clinicians and data scientists to define queries and machine learning pipelines using shared ontologies. When combined with a federated learning architecture, MILA allows institutions to collaborate without exchanging raw patient data, ensuring semantic consistency across sites while preserving privacy. We evaluate this approach in a multi-center cancer immunotherapy study. The generated pipelines delivered strong predictive performance, with best-performing models achieving up to 98.5% accuracy on selected prediction tasks, while substantially reducing manual coding effort. These findings suggest that MDE principles—metamodeling, semantic integration, and automated code generation—can provide a practical path toward interoperable, reproducible, and reliable digital health platforms. Full article
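MILA itself is a graphical DSL, but the federated architecture it targets can be sketched as FedAvg-style parameter averaging: each site trains on its own records and shares only model parameters, never raw patient data. The local update rule, learning rate, and site names below are illustrative assumptions, not the paper's implementation.

```python
# FedAvg-style aggregation sketch: sites exchange parameters, not records.
def local_update(params, site_data):
    # Hypothetical one-step update pulling each parameter toward the
    # site's mean label; real sites would run local SGD epochs instead.
    target = sum(site_data) / len(site_data)
    lr = 0.5
    return [p + lr * (target - p) for p in params]

def federated_average(param_sets, site_sizes):
    # Weight each site's parameters by its number of samples.
    total = sum(site_sizes)
    dim = len(param_sets[0])
    return [
        sum(ps[i] * n for ps, n in zip(param_sets, site_sizes)) / total
        for i in range(dim)
    ]

# Two hypothetical sites with private local data.
sites = {"hospital_a": [1.0, 1.0], "hospital_b": [3.0, 3.0, 3.0, 3.0]}
global_params = [0.0]
for _ in range(10):  # federated rounds
    updates = [local_update(global_params, d) for d in sites.values()]
    global_params = federated_average(updates, [len(d) for d in sites.values()])

print(round(global_params[0], 3))  # converges toward the pooled mean 7/3
```

The server only ever sees the parameter lists, which is the privacy property the abstract attributes to the federated learning architecture; semantic consistency across sites is what the shared MILA ontologies add on top.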
(This article belongs to the Section Health Informatics)
