Search Results (31,507)

Search Parameters:
Keywords = accuracy prediction model

20 pages, 1815 KB  
Article
Modelling, Optimisation, and Construction of a High-Temperature Superconducting Maglev Demonstrator
by Chenxuan Zhang, Qian Dong, Hongye Zhang and Markus Mueller
Machines 2026, 14(1), 108; https://doi.org/10.3390/machines14010108 (registering DOI) - 16 Jan 2026
Abstract
To achieve global carbon-neutrality goals, magnetic levitation (maglev) technologies offer a promising pathway toward sustainable, energy-efficient transportation systems. In this study, a comprehensive methodology was developed to analyse and optimise the levitation performance of high-temperature superconducting (HTS) maglev systems. Several permanent magnet guideway (PMG) configurations were compared, and an optimised PMG Halbach array design was identified that enhances flux concentration and significantly improves levitation performance. To accurately model the electromagnetic interaction between the HTS bulk and the external magnetic field, finite element models based on the H-formulation were established in both two dimensions (2D) and three dimensions (3D). An HTS maglev demonstrator was built using YBCO bulks, and an experimental platform was constructed to measure levitation force. While the 2D model offers fast computation, it shows deviations from the measurements due to geometric simplifications, whereas the 3D model predicts levitation forces for the cylindrical bulk with much higher accuracy, with errors remaining below 10%. The strong agreement between experimental measurements and the 3D simulation across the entire force–height cycle confirms that the proposed model reliably reproduces the electromagnetic coupling and resulting levitation forces in HTS maglev systems. The paper provides a practical and systematic reference for the optimal design and experimental validation of HTS bulk-based maglev systems. Full article
(This article belongs to the Section Vehicle Engineering)
29 pages, 9144 KB  
Article
PhysGraphIR: Adaptive Physics-Informed Graph Learning for Infrared Thermal Field Prediction in Meter Boxes with Residual Sampling and Knowledge Distillation
by Hao Li, Siwei Li, Xiuli Yu and Xinze He
Electronics 2026, 15(2), 410; https://doi.org/10.3390/electronics15020410 (registering DOI) - 16 Jan 2026
Abstract
Infrared thermal field (ITF) prediction for meter boxes is crucial for the early warning of power system faults, yet this task faces three major challenges: data sparsity, complex geometry, and resource constraints in edge computing. Existing physics-informed neural network-graph neural network (PINN-GNN) approaches suffer from redundant physics residual calculations (over 70% of flat regions contain little information) and poor model generalization (requiring retraining for new box types), making them inefficient for deployment on edge devices. This paper proposes the PhysGraphIR framework, which employs an Adaptive Residual Sampling (ARS) mechanism to dynamically identify hotspot region nodes through a physics-aware gating network, calculating physics residuals only at critical nodes to reduce computational overhead by over 80%. In this study, a 'hotspot region' is explicitly defined as a localized area exhibiting significant temperature elevation relative to the background—typically concentrated around electrical connection terminals or wire entrances—which is critical for identifying potential thermal faults under sparse data conditions. Additionally, it utilizes a Physics Knowledge Distillation Graph Neural Network (Physics-KD GNN) to decouple physics learning from geometric learning, transferring universal heat conduction knowledge to specific meter box geometries through a teacher–student architecture. Experimental results demonstrate that on both synthetic and real-world meter box datasets, PhysGraphIR achieves a hotspot region mean absolute error (MAE) of 11.8 °C when 60% of the infrared data are missing, representing a 22% improvement over traditional PINN-GNN. The training speed is accelerated by 3.1 times, and only five infrared samples are required to adapt to new box types. The experiments show that this method significantly enhances prediction accuracy and computational efficiency under sparse infrared data while maintaining physical consistency, providing a feasible solution for edge intelligence in power systems. Full article
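As a rough illustration of the adaptive residual sampling idea (not the authors' code), the sketch below assumes per-node features, a precomputed graph Laplacian, and PyTorch: a small gating network scores nodes, and a heat-conduction residual is evaluated only at the top-scoring "hotspot" nodes.

```python
import torch
import torch.nn as nn

class ResidualGate(nn.Module):
    """Hypothetical physics-aware gate: score nodes, keep the top fraction."""
    def __init__(self, in_dim: int, hidden: int = 32):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                   nn.Linear(hidden, 1))

    def forward(self, node_feats: torch.Tensor, keep_ratio: float = 0.2):
        scores = self.score(node_feats).squeeze(-1)        # (N,)
        k = max(1, int(keep_ratio * node_feats.shape[0]))
        return torch.topk(scores, k).indices               # hotspot node indices

def sampled_physics_loss(pred_T, laplacian, idx):
    # Steady-state conduction residual L @ T ~ 0, evaluated only at sampled nodes.
    residual = laplacian @ pred_T
    return (residual[idx] ** 2).mean()

# Toy usage with random features and an identity matrix standing in for the Laplacian.
gate = ResidualGate(in_dim=8)
feats, temps = torch.randn(500, 8), torch.randn(500)
idx = gate(feats)
loss = sampled_physics_loss(temps, torch.eye(500), idx)
```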
32 pages, 22265 KB  
Article
A Hybrid Ensemble Learning Framework for Accurate Photovoltaic Power Prediction
by Wajid Ali, Farhan Akhtar, Asad Ullah and Woo Young Kim
Energies 2026, 19(2), 453; https://doi.org/10.3390/en19020453 (registering DOI) - 16 Jan 2026
Abstract
Accurate short-term forecasting of solar photovoltaic (PV) power output is essential for efficient grid integration and energy management, especially given the widespread global adoption of PV systems. To address this need, the present study introduces a scalable, interpretable ensemble learning model for PV power prediction built on the large PVOD v1.0 dataset, which encompasses more than 270,000 points representing ten PV stations. The proposed methodology involves data preprocessing, feature engineering, and a hybrid ensemble model consisting of Random Forest, XGBoost, and CatBoost. Temporal features, including hour, day, and month, were created to reflect diurnal and seasonal characteristics, whereas feature importance analysis identified global irradiance, temperature, and temporal indices as key indicators. The proposed hybrid ensemble model shows high predictive power, with an R2 of 0.993, a Mean Absolute Error (MAE) of 0.227 kW, and a Root Mean Squared Error (RMSE) of 0.628 kW when applied to the PVOD v1.0 dataset to predict short-term PV power. These results were obtained on standardized, multi-station, open-access data and are therefore not strictly comparable to previous studies that may have used other datasets, forecasting horizons, or feature sets. Rather than asserting numerical dominance over other approaches, this paper focuses on the practical utility of integrating well-known tree-based ensemble techniques with temporal feature engineering to derive reliable, interpretable, and computationally efficient PV power prediction models that can be used in smart grid applications. The paper shows that combining conventional ensemble methods with extensive temporal feature engineering yields consistent accuracy in PV forecasting. The framework is reproducible and computationally efficient, which makes it applicable to smart grid integration. Full article
(This article belongs to the Special Issue Advanced Control Strategies for Photovoltaic Energy Systems)
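For readers curious how such a pipeline is typically wired together, here is a minimal sketch under assumed conventions (a 'timestamp' column, a simple averaging fusion rule, illustrative hyperparameters); it is not the authors' implementation.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from xgboost import XGBRegressor
from catboost import CatBoostRegressor

def add_temporal_features(df: pd.DataFrame) -> pd.DataFrame:
    # Hour/day/month capture diurnal and seasonal structure in PV output.
    ts = pd.to_datetime(df["timestamp"])
    return df.assign(hour=ts.dt.hour, day=ts.dt.day, month=ts.dt.month)

def fit_hybrid(X_train, y_train):
    models = [RandomForestRegressor(n_estimators=300, random_state=0),
              XGBRegressor(n_estimators=500, learning_rate=0.05),
              CatBoostRegressor(iterations=500, verbose=False)]
    for m in models:
        m.fit(X_train, y_train)
    return models

def predict_hybrid(models, X):
    # One plausible fusion rule: a simple average of the three base learners.
    return sum(m.predict(X) for m in models) / len(models)
```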
41 pages, 1444 KB  
Article
A Physics-Informed Combinatorial Digital Twin for Value-Optimized Production of Petroleum Coke
by Vladimir V. Bukhtoyarov, Alexey A. Gorodov, Natalia A. Shepeta, Ivan S. Nekrasov, Oleg A. Kolenchukov, Svetlana S. Kositsyna and Artem Y. Mikhaylov
Energies 2026, 19(2), 451; https://doi.org/10.3390/en19020451 (registering DOI) - 16 Jan 2026
Abstract
Petroleum coke quality strongly influences refinery economics and downstream energy use, yet real-time control is constrained by slow quality assays and a 24–48 h lag in laboratory results. This study introduces a physics-informed combinatorial digital twin for value-optimized coking, aimed at improving energy efficiency and environmental performance through adaptive quality forecasting. The approach builds a modular library of 32 candidate equations grouped into eight quality parameters and links them via cross-parameter dependencies. A two-level optimization scheme is applied: a genetic algorithm selects the best model combination, while a secondary loop tunes parameters under a multi-objective fitness function balancing accuracy, interpretability, and computational cost. Validation on five clustered operating regimes (industrial patterns augmented with noise-perturbed synthetic data) shows that optimal model ensembles outperform single best models, achieving typical cluster errors of ~7–13% NMAE. The developed digital twin framework enables accurate prediction of coke quality parameters that are critical for its energy applications, such as volatile matter and sulfur content, which serve as direct proxies for estimating the net calorific value and environmental footprint of coke as a fuel. Full article
(This article belongs to the Special Issue AI-Driven Modeling and Optimization for Industrial Energy Systems)
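The equation-selection loop might resemble the genetic-algorithm sketch below; the model_bank objects with .nmae() and .n_params are hypothetical stand-ins for the paper's library of 32 candidate equations grouped by quality parameter, and the fitness weighting is illustrative.

```python
import random

def fitness(combo, model_bank, data, alpha=0.1):
    # combo[i] picks one candidate equation for quality parameter i.
    nmae = sum(model_bank[i][j].nmae(data) for i, j in enumerate(combo)) / len(combo)
    complexity = sum(model_bank[i][j].n_params for i, j in enumerate(combo))
    return nmae + alpha * complexity            # lower is better

def ga_select(model_bank, data, pop=40, gens=100, mut=0.2):
    n_groups = len(model_bank)                  # e.g., eight quality parameters
    popn = [[random.randrange(len(model_bank[i])) for i in range(n_groups)]
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda c: fitness(c, model_bank, data))
        parents = popn[: pop // 2]
        children = []
        while len(parents) + len(children) < pop:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_groups)
            child = a[:cut] + b[cut:]           # one-point crossover
            if random.random() < mut:           # point mutation
                i = random.randrange(n_groups)
                child[i] = random.randrange(len(model_bank[i]))
            children.append(child)
        popn = parents + children
    return min(popn, key=lambda c: fitness(c, model_bank, data))
```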
18 pages, 773 KB  
Article
A Radiomics-Based Machine Learning Model for Predicting Pneumonitis During Durvalumab Treatment in Locally Advanced NSCLC
by Takeshi Masuda, Daisuke Kawahara, Wakako Daido, Nobuki Imano, Naoko Matsumoto, Kosuke Hamai, Yasuo Iwamoto, Yusuke Takayama, Sayaka Ueno, Masahiko Sumii, Hiroyasu Shoda, Nobuhisa Ishikawa, Masahiro Yamasaki, Yoshifumi Nishimura, Shigeo Kawase, Naoki Shiota, Yoshikazu Awaya, Soichi Kitaguchi, Yuji Murakami, Yasushi Nagata and Noboru Hattori
AI 2026, 7(1), 32; https://doi.org/10.3390/ai7010032 (registering DOI) - 16 Jan 2026
Abstract
Introduction: Pneumonitis is one of the clinically significant adverse events observed in patients with non-small-cell lung cancer (NSCLC) who receive durvalumab as consolidation therapy after chemoradiotherapy (CRT). Although clinical factors such as radiation dose (e.g., V20) and interstitial lung abnormalities (ILAs) have been reported as risk predictors, accurate and objective prognostication remains difficult. This study aimed to develop a radiomics-based machine learning model to predict grade ≥ 2 pneumonitis. Methods: This retrospective study included patients with unresectable NSCLC who received CRT followed by durvalumab. Radiomic features, including first-order, texture-based, and shape-based features with wavelet transformation, were extracted from whole-lung regions on pre-durvalumab computed tomography (CT) images. Machine learning models (support vector machines, k-nearest neighbor, neural networks, and naïve Bayes classifiers) were developed and evaluated using a testing cohort. Model performance was assessed using five-fold cross-validation. Conventional predictors, including V20 and ILAs, were also assessed using logistic regression and receiver operating characteristic analysis. Results: Among 123 patients, 44 (35.8%) developed grade ≥ 2 pneumonitis. The best-performing model, a support vector machine, achieved an AUC of 0.88 and an accuracy of 0.81, whereas the conventional model showed lower performance, with an AUC of 0.71 and an accuracy of 0.64. Conclusions: Radiomics-based machine learning demonstrated superior performance over clinical parameters in predicting pneumonitis. This approach may enable individualized risk stratification and support early intervention in patients with NSCLC. Full article
(This article belongs to the Section Medical & Healthcare AI)
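The headline evaluation pattern (a support vector machine scored by five-fold cross-validated AUC) can be sketched as below; the random matrix merely stands in for extracted radiomic features, so a near-chance AUC is expected on this placeholder data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(123, 50))                  # placeholder radiomic features
y = (rng.random(123) < 0.36).astype(int)        # ~36% event rate, as reported

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"five-fold AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```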
24 pages, 1025 KB  
Article
Hallucination-Aware Interpretable Sentiment Analysis Model: A Grounded Approach to Reliable Social Media Content Classification
by Abdul Rahaman Wahab Sait and Yazeed Alkhurayyif
Electronics 2026, 15(2), 409; https://doi.org/10.3390/electronics15020409 (registering DOI) - 16 Jan 2026
Abstract
Sentiment analysis (SA) has become an essential tool for analyzing social media content in order to monitor public opinion and support digital analytics. Although transformer-based SA models exhibit remarkable performance, they lack mechanisms to mitigate hallucinated sentiment, which refers to the generation of unsupported or overconfident predictions without explicit linguistic evidence. To address this limitation, this study presents a hallucination-aware SA model by incorporating semantic grounding, interpretability-congruent supervision, and neuro-symbolic reasoning within a unified architecture. The proposed model is based on a fine-tuned Open Pre-trained Transformer (OPT) model, using three fundamental mechanisms: a Sentiment Integrity Filter (SIF), a SHapley Additive exPlanations (SHAP)-guided regularization technique, and a confidence-based lexicon-deep fusion module. The experimental analysis was conducted on two multi-class sentiment datasets that contain Twitter (now X) and Reddit posts. In Dataset 1, the suggested model achieved an average accuracy of 97.6% and a hallucination rate of 2.3%, outperforming the current transformer-based and hybrid sentiment models. With Dataset 2, the framework demonstrated strong external generalization with an accuracy of 95.8%, and a hallucination rate of 3.4%, which is significantly lower than state-of-the-art methods. These findings indicate that it is possible to include hallucination mitigation into transformer optimization without any performance degradation, offering a deployable, interpretable, and linguistically complex social media SA framework, which will enhance the reliability of neural systems of language understanding. Full article
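One plausible reading of the confidence-based lexicon-deep fusion module is sketched below: when the transformer's class confidence falls below a threshold, the neural probabilities are blended with a lexicon-derived distribution. The threshold, the blending weight, and the lexicon mapping are all illustrative assumptions.

```python
import numpy as np

def fuse(neural_probs: np.ndarray, lexicon_score: float, threshold: float = 0.7):
    """neural_probs: probabilities over (negative, neutral, positive);
    lexicon_score: polarity in [-1, 1] from any sentiment lexicon."""
    lexicon_probs = np.array([max(-lexicon_score, 0.0),
                              1.0 - abs(lexicon_score),
                              max(lexicon_score, 0.0)])
    confidence = neural_probs.max()
    if confidence >= threshold:
        return neural_probs                     # strong evidence: trust the model
    w = confidence / threshold                  # weak evidence: lean on the lexicon
    return w * neural_probs + (1.0 - w) * lexicon_probs

print(fuse(np.array([0.4, 0.35, 0.25]), lexicon_score=0.8))
```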
22 pages, 18812 KB  
Article
Integration of X-Ray CT, Sensor Fusion, and Machine Learning for Advanced Modeling of Preharvest Apple Growth Dynamics
by Weiqun Wang, Dario Mengoli, Shangpeng Sun and Luigi Manfrini
Sensors 2026, 26(2), 623; https://doi.org/10.3390/s26020623 - 16 Jan 2026
Abstract
Understanding the complex interplay between environmental factors and fruit quality development requires sophisticated analytical approaches linking cellular architecture to environmental conditions. This study introduces a novel application of dual-resolution X-ray computed tomography (CT) for the non-destructive characterization of apple internal tissue architecture in relation to fruit growth, thereby advancing beyond traditional methods that are primarily focused on postharvest analysis. By extracting detailed three-dimensional structural parameters, we reveal tissue porosity and heterogeneity influenced by crop load, maturity timing and canopy position, offering insights into internal quality attributes. Employing correlation analysis, Principal Component Analysis, Canonical Correlation Analysis, and Structural Equation Modeling, we identify temperature as the primary environmental driver, particularly during early developmental stages (45 Days After Full Bloom, DAFB), and uncover nonlinear, hierarchical effects of preharvest environmental factors such as vapor pressure deficit, relative humidity, and light on quality traits. Machine learning models (Multiple Linear Regression, Random Forest, XGBoost) achieve high predictive accuracy (R² > 0.99 for Multiple Linear Regression), with temperature as the key predictor. These baseline results represent findings from a single growing season and require validation across multiple seasons and cultivars before operational application. Temporal analysis highlights the importance of early-stage environmental conditions. Integrating structural and environmental data through innovative visualization tools, such as anatomy-based radar charts, facilitates comprehensive interpretation of complex interactions. This multidisciplinary framework enhances predictive precision and provides a baseline methodology to support precision orchard management under typical agricultural variability. Full article
(This article belongs to the Special Issue Feature Papers in Sensing and Imaging 2025&2026)
28 pages, 2027 KB  
Article
Dynamic Resource Games in the Wood Flooring Industry: A Bayesian Learning and Lyapunov Control Framework
by Yuli Wang and Athanasios V. Vasilakos
Algorithms 2026, 19(1), 78; https://doi.org/10.3390/a19010078 - 16 Jan 2026
Abstract
Wood flooring manufacturers face complex challenges in dynamically allocating resources across multi-channel markets, characterized by channel conflicts, demand uncertainty, and long-term cumulative effects of decisions. Traditional static optimization or myopic approaches struggle to address these intertwined factors, particularly when critical market states like brand reputation and customer base cannot be precisely observed. This paper establishes a systematic and theoretically grounded online decision framework to tackle this problem. We first model the problem as a Partially Observable Stochastic Dynamic Game. The core innovation lies in introducing an unobservable market position vector as the central system state, whose evolution is jointly influenced by firm investments, inter-channel competition, and macroeconomic randomness. The model further captures production lead times, physical inventory dynamics, and saturation/cross-channel effects of marketing investments, constructing a high-fidelity dynamic system. To solve this complex model, we propose a hierarchical online learning and control algorithm named L-BAP (Lyapunov-based Bayesian Approximate Planning), which innovatively integrates three core modules. It employs particle filters for Bayesian inference to nonparametrically estimate latent market states online. Simultaneously, the algorithm constructs a Lyapunov optimization framework that transforms long-term discounted reward objectives into tractable single-period optimization problems through virtual debt queues, while ensuring stability of physical systems like inventory. Finally, the algorithm embeds a game-theoretic module to predict and respond to rational strategic reactions from each channel. We provide theoretical performance analysis, rigorously proving the mean-square boundedness of system queues and deriving the performance gap between long-term rewards and optimal policies under complete information. This bound clearly quantifies the trade-off between estimation accuracy (determined by particle count) and optimization parameters. Extensive simulations demonstrate that our L-BAP algorithm significantly outperforms several strong baselines—including myopic learning and decentralized reinforcement learning methods—across multiple dimensions: long-term profitability, inventory risk control, and customer service levels. Full article
(This article belongs to the Section Analysis of Algorithms and Complexity Theory)
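The Bayesian inference layer rests on particle filtering of the latent market state; the sketch below shows a generic bootstrap particle-filter step for a toy scalar state, with the transition and observation models assumed purely for illustration (the Lyapunov queues and the game-theoretic module are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, obs, invest,
                         a=0.95, b=0.1, proc_std=0.05, obs_std=0.2):
    # Propagate: latent market position drifts and responds to investment.
    particles = a * particles + b * invest + rng.normal(0, proc_std, particles.shape)
    # Reweight by the likelihood of the noisy demand observation.
    weights = weights * np.exp(-0.5 * ((obs - particles) / obs_std) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

particles, weights = rng.normal(0.0, 1.0, 1000), np.full(1000, 1e-3)
particles, weights = particle_filter_step(particles, weights, obs=0.4, invest=0.2)
print("posterior mean market position:", np.average(particles, weights=weights))
```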
20 pages, 5656 KB  
Article
Reading the Himalayan Treeline in 3D: Species Turnover and Structural Thresholds from UAV LiDAR
by Niti B. Mishra and Paras Bikram Singh
Remote Sens. 2026, 18(2), 309; https://doi.org/10.3390/rs18020309 - 16 Jan 2026
Abstract
Mountain treelines are among the most climate-sensitive ecosystems on Earth, yet their fine-scale structural and species-level dynamics remain poorly resolved in the Himalayas. In particular, the absence of three-dimensional, crown-level measurements has hindered the detection of structural thresholds and species turnover that often precede treeline shifts. To bridge this gap, we introduce UAV LiDAR—applied for the first time in the Hindu Kush Himalayas—to quantify canopy structure and tree species distributions across a steep treeline ecotone in the Manang Valley of central Nepal. High-density UAV-LiDAR data acquired over elevations of 3504–4119 m were used to quantify elevation-dependent changes in canopy stature and cover from a canopy height model derived from the 3D point cloud, while individual tree segmentation and species classification were performed directly on the 3D, height-normalized point cloud at the crown level. Individual trees were delineated using a watershed-based segmentation algorithm, while tree species were classified using a random forest model trained on LiDAR-derived structural and intensity metrics, supported by field-validated reference data. Results reveal a sharply defined treeline characterized by an abrupt collapse in canopy height and cover within a narrow ~60–80 m vertical interval. The treeline “threshold” was quantified as a breakpoint elevation from a piecewise model of tree cover versus elevation, and as the elevation span over which modeled cover and height distributions rapidly declined from forest values to near zero. Segmented regression identified a distinct structural breakpoint near 3995 m elevation. Crown-level species predictions aggregated by elevation quantified an ordered turnover in dominance, with Pinus wallichiana most frequent at lower elevations, Abies spectabilis peaking mid-slope, and Betula utilis concentrated near the upper treeline. Species classification achieved high overall accuracy (>85%), although performance varied among taxa, with broadleaf Betula more difficult to discriminate than conifers. These findings underscore UAV LiDAR’s value for resolving sharp ecological thresholds, identifying elevation-driven simplification in forest structure, and bridging observation gaps in remote, rugged mountain ecosystems. Full article
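The structural-breakpoint idea can be illustrated with a simple two-segment piecewise-linear grid search over candidate elevations, as below; the synthetic cover profile and the 20 m edge margins are assumptions rather than the authors' exact procedure.

```python
import numpy as np

def piecewise_sse(elev, cover, bp):
    # Fit separate lines below and above the candidate breakpoint.
    sse = 0.0
    for mask in (elev <= bp, elev > bp):
        if mask.sum() < 2:
            return np.inf
        coef = np.polyfit(elev[mask], cover[mask], 1)
        sse += np.sum((cover[mask] - np.polyval(coef, elev[mask])) ** 2)
    return sse

def find_breakpoint(elev, cover):
    candidates = np.linspace(elev.min() + 20, elev.max() - 20, 200)
    return min(candidates, key=lambda bp: piecewise_sse(elev, cover, bp))

# Toy profile: stable cover that collapses above ~3995 m, mimicking the reported pattern.
rng = np.random.default_rng(0)
elev = np.linspace(3504, 4119, 300)
cover = np.clip(80 - 0.5 * np.maximum(elev - 3995, 0), 0, None) + rng.normal(0, 2, 300)
print(f"estimated breakpoint elevation: {find_breakpoint(elev, cover):.0f} m")
```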
22 pages, 4895 KB  
Article
Carbon Convenience Yields and Probability Density Forecasts for Carbon Returns
by Meng Han, Jia You and Min Lin
Mathematics 2026, 14(2), 315; https://doi.org/10.3390/math14020315 - 16 Jan 2026
Abstract
We explore the role of carbon convenience yields in forecasting the probability density of carbon returns. While theory suggests that convenience yields contain forward-looking information, their predictive content for carbon returns—especially in a density forecasting framework—remains underexplored. We propose a probability density forecasting approach that combines a mixed data sampling (MIDAS) regression with a non-parametric bootstrap and kernel density estimation. Using data from the European carbon market, we find that convenience yields significantly predict carbon returns. It takes approximately 19 days for a disturbance in carbon convenience yields to affect carbon returns, with the impact persisting for around 27 days. Moreover, our approach outperforms existing benchmark models in predicting the probability density of carbon returns, showing superior predictive accuracy and robustness. Full article
(This article belongs to the Special Issue Mathematical Problems in Financial Fluctuations and Forecasting)
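A stripped-down sketch of the density-forecast step is given below: an ordinary one-lag predictive regression stands in for the MIDAS component, residuals are bootstrapped around the point forecast, and a kernel density estimate forms the forecast density. All data and coefficients are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Toy data: carbon return r_t driven by the lagged convenience yield cy_{t-1}.
cy = rng.normal(0, 1, 500)
eps = rng.normal(0, 0.5, 500)
r = np.empty(500)
r[0] = eps[0]
r[1:] = 0.3 * cy[:-1] + eps[1:]

# Regress r_t on cy_{t-1} and collect residuals.
X = np.column_stack([np.ones(499), cy[:-1]])
beta, *_ = np.linalg.lstsq(X, r[1:], rcond=None)
resid = r[1:] - X @ beta

# Bootstrap draws: point forecast plus resampled residuals, then a KDE density.
point = beta[0] + beta[1] * cy[-1]
draws = point + rng.choice(resid, size=5000, replace=True)
density = gaussian_kde(draws)
grid = np.linspace(draws.min(), draws.max(), 200)
print("forecast density mode near:", grid[np.argmax(density(grid))])
```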
31 pages, 1550 KB  
Article
Valuation of New Carbon Asset CCER
by Hua Tang, Jiayi Wang, Yue Liu, Hanxiao Li and Boyan Zou
Sustainability 2026, 18(2), 940; https://doi.org/10.3390/su18020940 - 16 Jan 2026
Abstract
As a critical carbon offset mechanism, China’s Certified Emission Reduction (CCER) plays a pivotal role in achieving the “dual carbon” targets. With the relaunch of its trading market, refining the CCER valuation framework has become imperative. This study develops a multidimensional CCER valuation methodology based on both the income and market approaches. Under the income approach, two probabilistic models—discrete and continuous emission distribution frameworks—are proposed to quantify CCER value. Under the market approach, a Geometric Brownian Motion (GBM) model and a Long Short-Term Memory (LSTM) neural network model are constructed to capture nonlinear temporal dynamics in CCER pricing. Through a systematic comparative analysis of the outputs and methodologies of these models, this study identifies optimal pricing strategies to enhance CCER valuation. Results reveal significant disparities among models in predictive accuracy, computational efficiency, and adaptability to market dynamics. Each model exhibits distinct strengths and limitations, necessitating scenario-specific selection based on data availability, application context, and timeliness requirements to strike a balance between precision and efficiency. These findings offer both theoretical and practical insights to support the development of the CCER market. Full article
(This article belongs to the Special Issue Sustainable Development: Integrating Economy, Energy and Environment)
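Of the models compared, the geometric Brownian motion component is the easiest to illustrate; the sketch below simulates GBM price paths with purely illustrative drift and volatility, not values calibrated to CCER data.

```python
import numpy as np

def simulate_gbm(s0, mu, sigma, days, n_paths, seed=0):
    # dS = mu*S*dt + sigma*S*dW, simulated in log space on daily steps.
    rng = np.random.default_rng(seed)
    dt = 1.0 / 252.0
    shocks = rng.normal(0.0, 1.0, size=(n_paths, days))
    log_paths = np.cumsum((mu - 0.5 * sigma ** 2) * dt
                          + sigma * np.sqrt(dt) * shocks, axis=1)
    return s0 * np.exp(log_paths)

paths = simulate_gbm(s0=60.0, mu=0.05, sigma=0.3, days=252, n_paths=10000)
print("mean simulated price after one year:", paths[:, -1].mean())
```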
26 pages, 1924 KB  
Article
Mathematically Grounded Neuro-Fuzzy Control of IoT-Enabled Irrigation Systems
by Nikolay Hinov, Reni Kabakchieva, Daniela Gotseva and Plamen Stanchev
Mathematics 2026, 14(2), 314; https://doi.org/10.3390/math14020314 - 16 Jan 2026
Abstract
This paper develops a mathematically grounded neuro-fuzzy control framework for IoT-enabled irrigation systems in precision agriculture. A discrete-time, physically motivated model of soil moisture is formulated to capture the nonlinear water dynamics driven by evapotranspiration, irrigation, and drainage in the crop root zone. A Mamdani-type fuzzy controller is designed to approximate the optimal irrigation strategy, and an equivalent Takagi–Sugeno (TS) representation is derived, enabling a rigorous stability analysis based on Input-to-State Stability (ISS) theory and Linear Matrix Inequalities (LMIs). Online parameter estimation is performed using a Recursive Least Squares (RLS) algorithm applied to real IoT field data collected from a drip-irrigated orchard. To enhance prediction accuracy and long-term adaptability, the fuzzy controller is augmented with lightweight artificial neural network (ANN) modules for evapotranspiration estimation and slow adaptation of membership-function parameters. This work provides one of the first mathematically certified neuro-fuzzy irrigation controllers integrating ANN-based estimation with Input-to-State Stability (ISS) and LMI-based stability guarantees. Under mild Lipschitz continuity and boundedness assumptions, the resulting neuro-fuzzy closed-loop system is proven to be uniformly ultimately bounded. Experimental validation in an operational IoT setup demonstrates accurate soil-moisture regulation, with a tracking error below 2%, and approximately 28% reduction in water consumption compared to fixed-schedule irrigation. The proposed framework is validated on a real IoT deployment and positioned relative to existing intelligent irrigation approaches. Full article
(This article belongs to the Special Issue Advances in Fuzzy Logic and Artificial Neural Networks, 2nd Edition)
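The flavour of the discrete-time soil-moisture balance and the recursive least squares (RLS) update can be sketched as below; the model form and the coefficient values are assumptions for illustration, not the identified model from the paper.

```python
import numpy as np

def soil_moisture_step(theta, irrigation, et, a=0.97, b=0.02, c=0.9):
    # theta_{k+1} = a*theta_k + b*irrigation_k - c*et_k (drainage folded into a).
    return a * theta + b * irrigation - c * et

class RLS:
    """Recursive least squares for the parameter vector [a, b, c]."""
    def __init__(self, n=3, lam=0.99):
        self.P = np.eye(n) * 1e3      # large initial covariance
        self.w = np.zeros(n)          # parameter estimates
        self.lam = lam                # forgetting factor

    def update(self, phi, y):
        # phi = [theta_k, irrigation_k, -et_k], y = measured theta_{k+1}.
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)
        self.w = self.w + k * (y - phi @ self.w)
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.w

est = RLS()
print(est.update(np.array([0.30, 5.0, -0.05]), soil_moisture_step(0.30, 5.0, 0.05)))
```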
23 pages, 773 KB  
Article
Predicting Employee Turnover Based on Improved ADASYN and GS-CatBoost
by Shuigen Hu and Kai Dong
Mathematics 2026, 14(2), 313; https://doi.org/10.3390/math14020313 - 16 Jan 2026
Abstract
In corporate management practices, human resources are among the most active and critical elements, and frequent employee turnover can impose substantial losses on firms. Accurately predicting employee turnover dynamics and identifying turnover propensity in advance is therefore of significant importance for organizational development. To improve turnover prediction performance, this study proposes an employee turnover prediction model that integrates an improved ADASYN data rebalancing algorithm with a grid-search-optimized CatBoost classifier. In practice, turnover instances typically constitute a minority class; severe class imbalance may lead to overfitting or underfitting and thus degrade predictive performance. To mitigate imbalance, we employ ADASYN oversampling to reduce skewness in the dataset. However, because ADASYN is primarily designed for continuous features, it may generate invalid or meaningless values when discrete variables are present. Accordingly, we improve ADASYN by introducing a new distance metric and an enhanced sample generation strategy, making it applicable to turnover data with mixed (continuous and discrete) features. Given CatBoost’s strong predictive capability in high-dimensional settings, we adopt CatBoost as the base learner. Nonetheless, CatBoost performance is highly sensitive to hyperparameter choices, and different parameter combinations can yield markedly different results. Therefore, we apply grid search (GS) to efficiently optimize CatBoost hyperparameters and obtain the best-performing configuration. Experimental results on three datasets demonstrate that the proposed improved-ADASYN GS-CatBoost model effectively enhances turnover prediction performance, exhibiting strong robustness and adaptability. Compared with existing models, our approach improves predictive accuracy by approximately 4.6112%. Full article
(This article belongs to the Section E5: Financial Mathematics)
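For orientation, the sketch below combines the standard ADASYN from imbalanced-learn with a grid-searched CatBoost classifier on synthetic imbalanced data; it does not implement the paper's improved distance metric or the mixed-feature sample generation strategy.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from imblearn.over_sampling import ADASYN
from catboost import CatBoostClassifier

# Synthetic imbalanced data standing in for an employee-turnover table.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.85],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Rebalance the training split only, then tune CatBoost by grid search.
X_bal, y_bal = ADASYN(random_state=0).fit_resample(X_tr, y_tr)
grid = {"depth": [4, 6, 8], "learning_rate": [0.03, 0.1], "iterations": [300, 600]}
search = GridSearchCV(CatBoostClassifier(verbose=False), grid, cv=5, scoring="f1")
search.fit(X_bal, y_bal)
print("best params:", search.best_params_, "| test F1:", search.score(X_te, y_te))
```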
31 pages, 4308 KB  
Article
A Study into Aspect Ratio and the Influence of Platen Restraint on the Compressive Strength of Jute Fibre-Reinforced Compressed Earth Composites
by Jack Andrew Cottrell, Muhammad Ali, D. Brett Martinson and D. Lavorato
Fibers 2026, 14(1), 13; https://doi.org/10.3390/fib14010013 - 16 Jan 2026
Abstract
This study investigates the behaviour of Compressed Earth Cylinders (CECs) and Compressed Earth Blocks (CEBs) during direct compression tests and examines the influence of aspect ratio and the effects of platen restraint. The experimental investigation utilises two soil types and examines the impact of jute fibre reinforcement on the failure mechanism of CECs with aspect ratios ranging from 0.50 to 2.00. Through experimental analysis and numerical modelling, the effects of platen restraint are examined, and a novel hypothesis of intersecting cones is presented. The results show that specimens with a lower aspect ratio exhibited higher compressive strength due to confinement caused by platen restraint. Moreover, this research has derived new aspect ratio correction factors that enable conversion from Apparent Compressive Strength (ACS) to Unconfined Compressive Strength (UCS) of unstabilised and fibre-reinforced CECs. The experimental results indicate that the derived conversion factor of 0.861 allows for the prediction of CEB strength from CEC specimens with an accuracy of 2.7%. Furthermore, the addition of jute fibres at a 0.25% dosage increased the Apparent Compressive Strength across all aspect ratios. The outcome of this research recommends a standard approach to the application of aspect ratio correction factors when interpreting and reporting the compressive strength of CECs and CEBs. Full article
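As an illustrative reading of the reported factor (assuming it is applied multiplicatively to the cylinder strength): a CEC specimen testing at, say, 3.0 MPa would imply a predicted CEB strength of roughly 0.861 × 3.0 ≈ 2.6 MPa, with the stated 2.7% figure describing the accuracy of that prediction.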
22 pages, 1881 KB  
Article
Heterogeneous Spatiotemporal Graph Attention Network for Karst Spring Discharge Prediction: Advancing Sustainable Groundwater Management Under Climate Change
by Chunmei Ma, Ke Xu, Ying Li, Yonghong Hao, Huazhi Sun, Shuai Gao, Xiangfeng Fan and Xueting Wang
Sustainability 2026, 18(2), 933; https://doi.org/10.3390/su18020933 - 16 Jan 2026
Abstract
Reliable forecasting of karst spring discharge is critical for sustainable groundwater resource management under the dual pressures of climate change and intensified anthropogenic activities. This study proposes a Heterogeneous Spatiotemporal Graph Attention Network (H-STGAT) to predict spring discharge dynamics at Shentou Spring, Shanxi Province, China. Unlike conventional spatiotemporal networks that treat all relationships uniformly, our model derives its heterogeneity from a graph structure that explicitly categorizes spatial, temporal, and periodic dependencies as unique edge classes. Specifically, a dual-layer attention mechanism is designed to independently extract hydrological features within each relational channel while dynamically assigning importance weights to fuse these multi-source dependencies. This architecture enables the adaptive capture of spatial heterogeneity, temporal dependencies, and multi-year periodic patterns in karst hydrological processes. Results demonstrate that H-STGAT outperforms both traditional statistical and deep learning models in predictive accuracy, achieving an RMSE of 0.22 m3/s and an NSE of 0.77. The model reveals a long-distance recharge pattern dominated by high-altitude regions, a finding validated by independent isotopic evidence, and accurately identifies an approximately 4–6 month lag between precipitation and spring discharge, which is consistent with the characteristic hydrological lag identified through statistical cross-covariance analysis. This research enhances the understanding of complex mechanisms in karst hydrological systems and provides a robust predictive tool for sustainable groundwater management and ecological conservation, while offering a generalizable methodological framework for similar complex karst hydrological systems. Full article
(This article belongs to the Section Sustainable Water Management)