Search Results (539)

Search Parameters:
Keywords = dynamic state forecasting

41 pages, 8140 KB  
Article
A Hierarchical Signal-to-Policy Learning Framework for Risk-Aware Portfolio Optimization
by Jiayang Yu and Kuo-Chu Chang
Int. J. Financial Stud. 2026, 14(3), 75; https://doi.org/10.3390/ijfs14030075 - 13 Mar 2026
Abstract
This study proposes a hierarchical signal-to-policy learning framework for risk-aware portfolio optimization that integrates model-based return forecasting, explainable machine learning, and deep reinforcement learning (DRL) within a unified architecture. In the first stage, next-period returns are estimated using gradient-boosted tree models, and SHAP-based feature attributions are extracted to provide transparent, factor-level explanations of the predictive signals. In the second stage, a Proximal Policy Optimization (PPO) agent incorporates both predictive forecasts and explanatory signals into its state representation and learns dynamic allocation policies under a mean–CVaR reward function that explicitly penalizes tail risk while controlling trading frictions. By separating signal extraction from policy learning, the proposed architecture allows economically interpretable predictive signals to be incorporated into the policy’s state representation while preserving the flexibility and adaptability of reinforcement learning. Empirical evaluations on U.S. sector ETFs and Dow Jones Industrial Average constituents show that the hierarchical framework delivers higher and more stable out-of-sample risk-adjusted returns relative to a single-layer DRL agent trained solely on technical indicators, a mean–CVaR-optimized portfolio using the same parameters as the proposed hierarchical model, and standard equal-weight and index-based benchmarks. These results demonstrate that integrating explainable predictive signals with risk-sensitive reinforcement learning improves the robustness and stability of data-driven portfolio strategies.
(This article belongs to the Special Issue Financial Markets: Risk Forecasting, Dynamic Models and Data Analysis)
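The mean–CVaR reward shaping described in this abstract can be illustrated with a short sketch. This is not the authors' implementation; the empirical CVaR estimator, the risk-aversion weight `lam`, and the `cost` friction term are assumptions made for illustration.

```python
def cvar(returns, alpha=0.05):
    """Empirical CVaR: average loss over the worst alpha-fraction of returns."""
    losses = sorted((-r for r in returns), reverse=True)  # largest losses first
    k = max(1, int(len(losses) * alpha))
    return sum(losses[:k]) / k

def mean_cvar_reward(returns, lam=1.0, cost=0.0):
    """Risk-aware reward: mean return minus a CVaR tail-risk penalty and frictions."""
    mu = sum(returns) / len(returns)
    return mu - lam * cvar(returns) - cost
```

A larger `lam` makes the agent more tail-risk averse, which is the lever such a reward exposes to the policy designer.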

19 pages, 880 KB  
Article
A Hybrid Model for Copper Futures Price Forecasting Utilizing Complexity-Aware Variational Mode Decomposition and Reconstruction and Multi-Behavior-Triggered Interaction Modeling
by Yan Li and Dezhi Liu
Entropy 2026, 28(3), 320; https://doi.org/10.3390/e28030320 - 12 Mar 2026
Abstract
Accurate forecasting of copper futures prices is crucial for risk management and investment decisions. However, existing approaches primarily rely on historical prices and incorporate behavioral signals without a unified modeling framework. To address this limitation, we propose MBTI-Net (Multi-source Behavior-Triggered Interaction Network), a behavior-aware forecasting framework for heterogeneous copper market data. We first construct a compact behavioral factor from Baidu search indices via a multi-view projection strategy that preserves structural and predictive information. We then develop a complexity-aware reconstruction mechanism that aggregates intrinsic mode functions into multi-frequency components based on fuzzy entropy and energy. To accommodate distributional and volatility differences between behavioral and market variables, we introduce VB-ReVIN (Volatility- and Behavior-aware Reversible Instance Normalization). Building upon these representations, MBTI-Net models dynamic multi-source interactions triggered by behavioral intensity and market conditions, enabling adaptive cross-source information fusion. Experiments on LME and SHFE copper futures datasets demonstrate consistent improvements over state-of-the-art benchmarks, highlighting the importance of explicitly modeling behavior-driven dependencies in financial forecasting.
(This article belongs to the Special Issue Time Series Analysis for Signal Processing)

23 pages, 26061 KB  
Article
FATE-Net: An Optimization-Enhanced Attention-Driven Temporal Evolution Framework for Stock Price Forecasting
by Zhizhe Lin, Pengbo Li, Zhibo Zhao, Fei Wang, Teng Zhou and Chunjie Cao
Mathematics 2026, 14(6), 964; https://doi.org/10.3390/math14060964 - 12 Mar 2026
Abstract
Stock price forecasting remains challenging due to the nonlinear, volatile, multi-scale dynamics of financial time series. This study addresses two core limitations of existing models: incomplete capture of full-spectrum multi-scale temporal dependencies and severe hyperparameter sensitivity caused by inefficient manual tuning. To solve these issues, we propose FATE-Net, an optimization-enhanced attention-driven forecasting framework. FATE-Net first integrates LSTM-based local encoding and Transformer-based global refinement to model multi-scale temporal dependencies. To address hyperparameter sensitivity, we embed a multi-objective particle swarm optimization (MOPSO) strategy, which formulates hyperparameter configuration as a dual-objective problem minimizing MAPE and RMSE, automatically exploring the hyperparameter space to find optimal configurations and enhance model generalization. Experiments on BYD stock data show that FATE-Net achieves state-of-the-art performance, with an MAE of 1.051, RMSE of 1.435, MAPE of 0.37%, and R² of 0.997, verifying our framework’s effectiveness.
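The dual-objective formulation used by the MOPSO stage is just a pair of standard error metrics evaluated per candidate hyperparameter configuration. A minimal sketch of that objective vector (illustrative only, not the paper's code):

```python
import math

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def objectives(y_true, y_pred):
    """The two-element cost vector a MOPSO particle would try to minimize."""
    return mape(y_true, y_pred), rmse(y_true, y_pred)
```

Each particle encodes a hyperparameter configuration; the swarm keeps a Pareto archive of configurations that are non-dominated on this (MAPE, RMSE) pair.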

23 pages, 4778 KB  
Article
A Dual-Attentional Gated Residual Framework for Robust Travel Time Prediction
by Jiajun Wu, Yongchuan Zhang, Yiduo Bai, Jun Xia and Yong He
ISPRS Int. J. Geo-Inf. 2026, 15(3), 120; https://doi.org/10.3390/ijgi15030120 - 12 Mar 2026
Abstract
Travel time prediction (TTP) is a fundamental pillar of intelligent transportation systems (ITS). However, deploying highly parameterized deep learning models in data-scarce environments—referred to as the “cold-start” problem—remains a critical bottleneck, frequently leading to overfitting and severe error accumulation on ultra-long trajectories. To surmount these limitations, this study proposes the Dual-Attentional Gated Residual Network (DAGRN), a data-efficient forecasting framework driven by a novel topology-temporal coordination mechanism. Specifically, the framework introduces three integrated innovations: (1) transforming the primal network into a physics-aware Line Graph to explicitly filter out illegal movements and dynamically modulating topological propagation via Feature-wise Linear Modulation (FiLM); (2) coupling a Bidirectional GRU backbone with a Multi-Head Attention module to simultaneously capture global trends and localized intersection delays; (3) employing a Gated Residual Fusion mechanism that preserves dimensional consistency and facilitates gradient flow in extensive sequences. To rigorously validate the model’s robustness, we conduct evaluations on a highly constrained, stratified dataset comprising merely 2000 trajectories. Experimental results demonstrate that DAGRN achieves state-of-the-art predictive precision with an RMSE of 415.485 s and an R² of 0.848, significantly outperforming 12 advanced baseline models and reducing error by up to 13.8% against the strongest graph baseline. Comprehensive ablation studies confirm the absolute necessity of the Multi-Head Attention module, whose removal causes the most severe performance degradation (RMSE surging to 521.495 s). Ultimately, DAGRN presents a readily deployable solution for sparse-data ITS regimes, actively paving the way for future hybrid integrations with microscopic traffic simulations and evolutionary road network optimization algorithms.
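The FiLM operation the abstract mentions for modulating topological propagation is a small, general building block: a conditioning network produces a per-channel scale and shift that are applied to a feature vector. A one-function sketch of the core transform (the conditioning network itself is paper-specific and omitted):

```python
def film(features, gamma, beta):
    """Feature-wise Linear Modulation: per-channel scale (gamma) and shift (beta).

    gamma and beta would normally be produced by a conditioning network;
    here they are passed in directly for illustration.
    """
    return [g * x + b for x, g, b in zip(features, gamma, beta)]
```

With `gamma = 1` and `beta = 0` for every channel, FiLM is the identity, which is why it is a safe, learnable way to gate information flow.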

20 pages, 13437 KB  
Article
Motion Prediction of Moored Platform Using CNN–LSTM for Eco-Friendly Operation
by Omar Jebari, Chungkuk Jin, Byungho Kang, Seong Hyeon Hong, Changhee Lee and Young Hun Jeon
J. Mar. Sci. Eng. 2026, 14(6), 531; https://doi.org/10.3390/jmse14060531 - 12 Mar 2026
Abstract
Predicting the motion of ships and floating structures is essential for ensuring economical and environmentally friendly operations in the ocean. In this study, we propose a hybrid encoder–decoder Convolutional Neural Network–Long Short-Term Memory (CNN–LSTM) architecture to predict motions of a moored Floating Production Storage and Offloading (FPSO) vessel under varying sea conditions. The model integrates a CNN for spatial wave-field feature extraction and an LSTM encoder–decoder to capture temporal dependencies in vessel motion. Synthetic datasets were generated using mid-fidelity dynamics simulations of a coupled FPSO–mooring–riser system subjected to wave excitations. Five sea states ranging from calm to severe were considered to evaluate the model’s robustness. A key preprocessing step involved determining the optimal spatial domain for the wave field input, and a wave field size of 600 m × 600 m was identified as the most cost-effective configuration while maintaining accuracy. The model was validated using the Root Mean Square Error (RMSE) and the relative RMSE (RRMSE). Despite low RRMSE values in low sea states, predictions were noisier due to high-frequency, low-amplitude responses. In contrast, higher sea states yielded more stable predictions despite higher RRMSE values. The proposed method offers high-resolution motion forecasting capability, which can enhance the operational safety and energy efficiency of offshore platforms, particularly when integrated with stereo camera-based wave monitoring systems.
(This article belongs to the Special Issue Intelligent Solutions for Marine Operations)
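The RRMSE metric used for validation above admits several conventions; one common choice normalizes the RMSE by the root-mean-square of the true signal, sketched below. The paper may normalize differently (e.g., by the range or mean of the response), so treat this as an illustrative definition only.

```python
import math

def rrmse(y_true, y_pred):
    """Relative RMSE: RMSE divided by the RMS of the true signal (one common convention)."""
    n = len(y_true)
    err = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    rms = math.sqrt(sum(t * t for t in y_true) / n)
    return err / rms
```

This normalization explains the abstract's observation: in low sea states the denominator (response amplitude) is small, so even small absolute errors can produce noticeable relative noise.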

16 pages, 1565 KB  
Article
Shrimp Market Under Innovation Schemes: Hidden Markov Modeling
by Johnny Javier Triviño-Sanchez, Alexander Fernando Haro-Sarango, Julián Coronel-Reyes, Carlos Alfredo De Loor-Platón and Dayanna Soria-Encalada
J. Risk Financial Manag. 2026, 19(3), 214; https://doi.org/10.3390/jrfm19030214 - 12 Mar 2026
Abstract
This article models the Ecuadorian shrimp market as a nonlinear system with recurring latent regimes that affect margins and planning decisions. A multivariate Hidden Markov Model (HMM) with Gaussian emissions in log space is estimated via the Baum–Welch algorithm to segment the joint dynamics of pounds produced, dollars invoiced, and average price. The analysis uses monthly data from January 2017 to May 2025 (T = 101). The selected four-state specification shows strong fit and outperforms linear alternatives (log-likelihood = 480.9; AIC = −859.8; BIC = −729.5). The dominant regime (State 2) concentrates high prices (~USD 2.97/lb) with intermediate production and acts as an attractor (stationary probability ≈ 1), while States 0 and 1 capture orderly expansion and oversupply conditions, and State 3 reflects episodic demand rallies. Adverse regimes (States 0–1) exhibit expected durations of 6–8 months, suggesting natural reversion toward the profitable regime. These estimates enable probabilistic regime forecasting and Monte Carlo scenario simulation to support hedging, inventory management, and financial stress testing. Overall, the proposed HMM framework provides an operational decision tool for producers, traders, and policymakers seeking to anticipate regime shifts, mitigate oversupply cycles, and stabilize margins.
(This article belongs to the Section Mathematics and Finance)
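Two quantities the abstract reports, expected regime durations and stationary probabilities, follow directly from the HMM's transition matrix: a regime with self-transition probability a_ii has expected sojourn time 1/(1 − a_ii), and the stationary distribution is the fixed point of the row-stochastic matrix. A minimal sketch (illustrative, not the authors' estimation code; the example matrix is invented):

```python
def expected_duration(a_ii):
    """Expected sojourn time (in periods) of a regime with self-transition prob a_ii."""
    return 1.0 / (1.0 - a_ii)

def stationary(P, iters=1000):
    """Stationary distribution of a row-stochastic transition matrix, by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi
```

For example, a self-transition probability of 0.875 implies the 8-month expected duration at the upper end of the 6–8 month range quoted above.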

82 pages, 6468 KB  
Article
Correction Functions and Refinement Algorithms for Enhancing the Performance of Machine Learning Models
by Attila Kovács, Judit Kovácsné Molnár and Károly Jármai
Automation 2026, 7(2), 45; https://doi.org/10.3390/automation7020045 - 6 Mar 2026
Abstract
The aim of this study is to investigate and demonstrate the role of correction functions and optimisation-based refinement algorithms in enhancing the performance of machine learning models, particularly in predictive anomaly detection tasks applied in industrial environments. The performance of machine learning models is highly dependent on the quality of data preprocessing, model architecture, and post-processing methodology. In many practical applications—particularly in time-series forecasting and anomaly detection—the conventional training pipeline alone is insufficient, because model uncertainty, structural bias and the handling of rare events require specialised post hoc calibration and refinement mechanisms. This study provides a systematic overview of the role of correction functions (e.g., Principal Component Analysis (PCA), Squared Prediction Error (SPE)/Q-statistics, Hotelling’s T², Bayesian calibration) and adaptive improvement algorithms (e.g., Genetic Algorithms (GA), Particle Swarm Optimisation (PSO), Simulated Annealing (SA), Gaussian Mixture Models (GMM) and ensemble-based techniques) in enhancing the performance of machine learning pipelines. The models were trained on a real industrial dataset compiled from power network analytics and harmonic-injection-based loading conditions. Model validation and equipment-level testing were performed using a large-scale harmonic measurement dataset collected over a five-year period. The reliability of the approach was confirmed by comparing predicted state transitions with actual fault occurrences, demonstrating its practical applicability and suitability for integration into predictive maintenance frameworks. The analysis demonstrates that correction functions introduce deterministic transformations in the data or error space, whereas improvement algorithms apply adaptive optimisation to fine-tune model parameters or decision boundaries. The combined use of these approaches significantly reduces overfitting, improves predictive accuracy and lowers false alarm rates. This work introduces the concept of an Organically Adaptive Predictive (OAP) ML model. The proposed model presents organic adaptivity, continuously adjusting its predictive behaviour in response to dynamic variations in network loading and harmonic spectrum composition. The introduced terminology characterises the organically emergent nature of the adaptive learning mechanism.
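Two of the PCA-based correction statistics named above have compact textbook forms: Hotelling's T² scores a sample's position inside the retained principal subspace, while SPE (the Q-statistic) measures what falls outside it. A minimal sketch of both, assuming PCA scores and component variances are already available (this is the standard formulation, not necessarily the exact variant used in the study):

```python
def hotelling_t2(scores, eigvals):
    """Hotelling's T^2 for one sample: sum of squared PCA scores scaled by component variances."""
    return sum(t * t / lam for t, lam in zip(scores, eigvals))

def spe(residuals):
    """Squared Prediction Error (Q-statistic): squared norm of the PCA reconstruction residual."""
    return sum(e * e for e in residuals)
```

In anomaly detection, a sample is typically flagged when either statistic exceeds a control limit calibrated on normal-operation data.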

25 pages, 913 KB  
Article
Sustainable Development in the Regional Economic Security System: Assessment Methodology and Management Tools
by Anna Polukhina, Marina Y. Sheresheva, Dmitry Napolskikh and Vladimir Lezhnin
Sustainability 2026, 18(5), 2577; https://doi.org/10.3390/su18052577 - 6 Mar 2026
Abstract
The paper presents a comprehensive methodological system for assessing the level of economic security of Russian regions, based on the synthesis of several complementary approaches and accounting for regional specifics. The central idea is a shift from static monitoring to dynamic analysis, which allows not only for capturing the current state but also for identifying the direction and stability of trends over time. The proposed methodology is based on four stages: forming a set of indicators, normalizing their values, aggregating them into integral indices, and then visualizing them for operational decision-making. An important feature is the introduction of mechanisms to account for regional specifics through the clustering of regions and adjustment coefficients, which helps to mitigate the influence of geographical and structural differences on the comparability of results. Together, these elements form an integrated system for diagnosing, planning, and monitoring the economic security of regions. The paper provides examples of threshold values for indicators such as the share of households with internet access, the length of the road network, the birth rate, the volume of building commissioning, and innovation expenditures. A classification of regions into stability zones and recommendations for policy measures within each zone accompany the threshold analysis. In particular, for digitalization and transport infrastructure, measures are proposed to enhance monitoring, improve service accessibility, and invest in infrastructure; for the demographic component, measures are proposed to support families and improve quality of life. The practical significance of the research lies in creating a universal yet flexible toolkit for monitoring, ranking, and planning regional policy in the field of economic security. The proposed system is designed for application both at the federal level and for interregional analysis, including scenario planning and modeling the impact of management decisions. Thus, this study contributes to the literature by bridging the theory of economic security, the imperatives of sustainable regional development, and the practical potential of information technologies. It offers a concrete, scalable methodology for transforming regional economic security management into a data-driven, forward-looking, and context-sensitive process. In the future, the authors intend to further develop the methodology by considering the sectoral specialization of regions, integrating with medium- and long-term forecasting systems, and creating an automated monitoring platform.
(This article belongs to the Special Issue Innovative Development and Application of Sustainable Management)

25 pages, 2213 KB  
Article
Adaptive Subsidy Policies for Shore Power Promotion: An Integrated Game Theory–System Dynamics Approach
by Huilin Lin and Lei Dai
Mathematics 2026, 14(5), 860; https://doi.org/10.3390/math14050860 - 3 Mar 2026
Abstract
Shore power (SP) is a critical solution for decarbonizing maritime transport, yet its adoption is hindered by the “high investment, low utilization” paradox, driven by high initial costs and misaligned incentives between ports and ships. While government subsidies are essential, traditional static policy designs often fail to adapt to the complex, non-linear dynamics of technology diffusion. To address this, the study proposes a dynamic evaluation framework combining System Dynamics (SD) with Evolutionary Game Theory (EGT), embedding a Rolling Horizon Optimization algorithm. Using Shanghai Port as a case study, simulation results demonstrate that optimal subsidies are highly state-dependent. Specifically, effective promotion requires prioritizing ship-side incentives during the early start-up phase, followed by facility subsidies supporting the coordinated evolution of both ships and berths, and finally a market-driven exit. Furthermore, the proposed dynamic strategy demonstrates superior robustness against oil price volatility and demand shocks compared to static policies, while strictly complying with fiscal budget caps. This framework provides a foundation for the adaptive management of green port infrastructure, facilitating the advancement of energy-saving and environmental protection initiatives within the maritime industry. Additionally, it contributes to the forecasting and evaluation of the policy outcomes of green technology adoption.
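The evolutionary-game component of such SD–EGT frameworks is usually driven by the replicator dynamic, in which a strategy's share grows when its payoff exceeds the population average. The sketch below is a generic two-strategy replicator step (adopt SP vs. wait), not the paper's model; the payoff values and the `dt` step size are invented for illustration.

```python
def replicator_step(x, payoff_adopt, payoff_wait, dt=0.01):
    """One Euler step of the two-strategy replicator dynamic.

    x is the current fraction of adopters; its growth rate is proportional
    to the gap between the adopters' payoff and the population average.
    """
    avg = x * payoff_adopt + (1.0 - x) * payoff_wait
    return x + dt * x * (payoff_adopt - avg)
```

A subsidy enters such a model by raising `payoff_adopt`, which is why subsidy timing changes the trajectory of adoption shares rather than just their endpoint.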

17 pages, 467 KB  
Article
Staying Young at the Edge: A Software Aging Perspective for Foundation Models as a Service
by Benedetta Picano and Romano Fantacci
Computers 2026, 15(3), 158; https://doi.org/10.3390/computers15030158 - 3 Mar 2026
Abstract
Nowadays, the emergence of Foundation Models as a Service (FMaaS) enables mobile users to access powerful capabilities such as inference and fine-tuning on demand and without incurring local computational overhead. This paper introduces a software-aging-aware offloading framework for FMaaS that allows edge nodes to forecast software aging and prevent service degradation. Each node employs a lightweight Echo State Network to predict its software age, with tasks dynamically assigned based on communication cost, inference delay, and forecast reliability. Simulation results, including ablation studies, confirm the effectiveness of software age forecasting in reducing task failures and improving session continuity.
(This article belongs to the Special Issue Best Practices, Challenges and Opportunities in Software Engineering)

27 pages, 687 KB  
Article
Chaotic Scaling and Network Turbulence in Crude Oil-Equity Systems Using a Coupled Multiscale Chaos Index
by Arash Sioofy Khoojine, Lin Xiao, Hao Chen and Congyin Wang
Int. J. Financial Stud. 2026, 14(3), 63; https://doi.org/10.3390/ijfs14030063 - 3 Mar 2026
Abstract
Financial markets often display nonlinear and turbulent dynamics during periods of stress, and crude-oil and global equity systems frequently demonstrate closely connected forms of instability. Earlier studies report multifractality, chaotic features and regime-dependent spillovers across commodities and equities, yet existing approaches rarely succeed in capturing both the intrinsic complexity of oil-market behavior and the changing structure of cross-asset dependence. This limitation reduces the ability to distinguish calm from turbulent regimes and weakens short-horizon risk assessment. The present study introduces a unified framework that quantifies and predicts systemic instability within the coupled oil–equity system. The analysis constructs a crude-oil complexity index based on multifractal fluctuation analysis, permutation and approximate entropy, and Lyapunov-based indicators of chaotic dynamics. At the same time, it develops an information-theoretic network of global equity and energy-sector returns and summarizes its instability through measures of edge turnover, spectral radius, degree entropy and strength dispersion. These components are combined to form the Coupled Multiscale Chaos Index (CMCI), a scalar state variable that distinguishes calm, transitional and chaotic market regimes. Empirical results indicate that Brent and WTI exhibit pronounced multifractality, elevated entropy and positive Lyapunov exponents, while the dependence network becomes more centralized, more clustered and more capable of shock amplification during high-CMCI states. The CMCI moves closely with realized volatility and provides significant predictive content for five-day variance across major global equity benchmarks, with performance superior to models that rely only on macro-financial controls. Out-of-sample evaluation shows that forecasts incorporating measures of complexity record substantially lower MSE and QLIKE losses. The findings indicate that systemic instability reflects the interaction between local chaotic dynamics in crude-oil markets and turbulence in the global dependence network. The CMCI offers a practical early-warning indicator that supports risk management, forecasting and macroprudential supervision.
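Permutation entropy, one of the complexity measures feeding the crude-oil index above, ranks each length-m window of the series by the ordinal pattern of its values and measures the Shannon entropy of the pattern distribution. A compact sketch of the standard (Bandt–Pompe) definition, shown here for illustration rather than as the authors' exact implementation:

```python
import math

def permutation_entropy(x, m=3):
    """Normalized permutation entropy of order m: 0 for a monotone series, up to 1 for noise."""
    counts = {}
    for i in range(len(x) - m + 1):
        # Ordinal pattern: the argsort of the window's values.
        pattern = tuple(sorted(range(m), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))  # normalize by log(m!)
```

A perfectly trending price series scores 0, while a series whose ordinal patterns are uniformly distributed scores 1, which is the sense in which elevated entropy signals turbulent regimes.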

19 pages, 5177 KB  
Article
Maritime Trajectory Forecasting via CNN–SOFTS-Based Coupled Spatio-Temporal Features
by Yongfeng Suo, Chunyu Yang, Gaocai Li, Qiang Mei and Lei Cui
Sensors 2026, 26(5), 1547; https://doi.org/10.3390/s26051547 - 1 Mar 2026
Abstract
Spatio-temporal features are crucial for maritime trajectory forecasting, especially in scenarios involving curved waterways or abrupt changes in ship motion patterns. Although Automatic Identification System (AIS) data, which are widely used for trajectory prediction, inherently include temporal and spatial information, effectively strengthening these features and integrating them into prediction models remains challenging. To address this challenge, we propose a Convolutional Neural Network (CNN)–Series-cOre Fused Time Series forecaster (SOFTS)-based framework that explicitly couples spatial and temporal features to achieve high-fidelity maritime trajectory forecasting, especially in scenarios with complex spatial patterns. We first employ a CNN-based spatial encoder to hierarchically abstract spatial density distributions through convolution and pooling operations, thereby learning global spatial structure patterns of ship movements. This encoder emphasizes overall spatial morphology rather than precise individual trajectory points. Second, we employ the SOFTS model to incorporate angular velocity, acceleration, and angular acceleration as input features to characterize ship motion states, which can capture the temporal dependencies of ship motion states from multivariate time series. Finally, the spatial embedding features extracted by the CNN are concatenated with the temporal feature representations learned by SOFTS along the feature dimension to form a joint spatio-temporal representation. This representation is then fed into a fusion regression module composed of fully connected layers to predict future ship trajectories. Experimental results on the validation dataset show that the proposed method achieves an MSE of 0.020 and an MAE of 0.060, outperforming several advanced time series forecasting models in prediction accuracy and computational efficiency. The introduction of angular velocity, acceleration, and angular acceleration features reduces the MSE and MAE by approximately 10.22% and 9.49%, respectively, validating the effectiveness of the introduced dynamic features in improving trajectory prediction performance. These results underscore the proposed method’s potential for intelligent navigation and traffic management systems by effectively enhancing inland river navigation safety and strengthening waterborne traffic monitoring capabilities.
(This article belongs to the Section Navigation and Positioning)

25 pages, 10445 KB  
Article
Temporal Trend and Fluctuation Learning via Enhanced Attention Mamba for Carbon Price Interval Forecasting
by Lijun Duan, Jin Chen, Qiankun Zuo, Yanfei Zhu, Yi Di and Ruiheng Li
Entropy 2026, 28(3), 270; https://doi.org/10.3390/e28030270 - 28 Feb 2026
Viewed by 177
Abstract
Accurate carbon price forecasting is essential for transforming complex carbon trading markets into efficiently managed and stably operating systems. Existing long-term time series forecasting methods struggle to capture the nonlinear and non-stationary characteristics inherent in carbon prices. To address this limitation, we propose [...] Read more.
Accurate carbon price forecasting is essential for transforming complex carbon trading markets into efficiently managed and stably operating systems. Existing long-term time series forecasting methods struggle to capture the nonlinear and non-stationary characteristics inherent in carbon prices. To address this limitation, we propose the Temporal Trend and Fluctuation Learning (TTFL) model for interval-valued carbon price forecasting. The model first uses wavelet decomposition to separate the forecasting task into two branches: Price Trend Learning (PTL) and Price Fluctuation Learning (PFL). The PTL branch adopts a forward–backward enhanced Mamba architecture to extract low-frequency, long-term trend features. This design facilitates price interactions across time steps. The enhanced Mamba module leverages a state space model (SSM) to preserve historical information selectively and employs a forgetting gate to recover missing information. As a result, the model captures complementary dependencies across different price points, improving prediction reliability. The PFL branch integrates an attention mechanism with the standard Mamba architecture to model high-frequency temporal dynamics. It provides fine-grained short-term volatility information essential for market participants. We also introduce an interval-valued recovery loss function. This loss quantifies the overlap between predicted and actual interval prices, emphasizes trend learning, and stabilizes model training. We evaluate the TTFL model on three real-world carbon trading markets. Comparative experiments demonstrate that TTFL achieves superior prediction accuracy and robustness relative to baseline methods. Through collaborative learning and selective state space modeling, our approach not only outperforms traditional forecasting models but also offers stakeholders a practical tool for navigating complex carbon market environments. 
This work contributes a novel forecasting paradigm that integrates multivariate collaborative learning with selective state space modeling. It provides actionable insights for policymaking, investment strategy development, and risk management in the energy and environmental sectors. Full article
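The abstract describes an interval-valued recovery loss that "quantifies the overlap between predicted and actual interval prices." The paper's exact formula is not reproduced here; the sketch below assumes an IoU-style overlap ratio, and the function names `interval_overlap_ratio` and `interval_recovery_loss` are illustrative, not taken from the paper.

```python
import numpy as np

def interval_overlap_ratio(pred_low, pred_high, true_low, true_high):
    """Overlap of predicted vs. actual price intervals: 0 (disjoint) to 1 (identical)."""
    inter = np.maximum(0.0, np.minimum(pred_high, true_high) - np.maximum(pred_low, true_low))
    union = np.maximum(pred_high, true_high) - np.minimum(pred_low, true_low)
    return inter / union

def interval_recovery_loss(pred_low, pred_high, true_low, true_high):
    """Mean (1 - overlap ratio); reaches 0 when every predicted interval matches exactly."""
    return float(np.mean(1.0 - interval_overlap_ratio(pred_low, pred_high, true_low, true_high)))
```

For example, a predicted interval [1, 3] against an actual [2, 4] overlaps on [2, 3], giving a ratio of 1/3; a loss built this way rewards both correct interval width and correct placement, consistent with the abstract's emphasis on trend learning.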
29 pages, 5948 KB  
Article
Carbon Price Forecasting for Sustainable Low-Carbon Investment Decisions: A Hybrid Transformer—sLSTM Model
by Aiying Zhao, Qian Chen, Yang Zhao, Ruiyi Wu, Jiamin Xu and Yongpeng Tong
Sustainability 2026, 18(5), 2324; https://doi.org/10.3390/su18052324 - 27 Feb 2026
Abstract
Under the framework of the Paris Agreement, carbon trading has emerged as a pivotal market-based instrument for achieving carbon neutrality. Following years of pilot programs, China has taken a critical step toward establishing a unified national carbon market. Consequently, accurate carbon price forecasting is essential for constructing a stable and effective carbon pricing mechanism. However, the 2017 reform of the EU Emissions Trading System (EU ETS) significantly altered the carbon price formation mechanism, exacerbating price volatility and uncertainty. This shift further underscores the urgent need for research into high-precision carbon price forecasting. Existing deep learning models struggle to simultaneously capture short-term high-frequency fluctuations and long-term evolutionary trends within complex carbon market data, a limitation that compromises their prediction accuracy and stability. To address these challenges, this paper proposes a Transformer-based carbon price forecasting model that incorporates an sLSTM structure. By enhancing sequence memory and state update mechanisms, this model effectively improves the capability to model both short-term volatility characteristics and long-term evolutionary patterns of carbon prices. In the data preprocessing phase, Variational Mode Decomposition (VMD) is employed to perform multi-scale decomposition of carbon price sequences, effectively mitigating the issue of overlapping fluctuations across different time scales. Furthermore, the Whale Optimization Algorithm (WOA) is utilized to optimize the number of decomposition modes and the penalty factor, thereby resolving the parameter sensitivity issues inherent in modal decomposition. Experimental results on real-world carbon price datasets demonstrate that the model achieves an average coefficient of determination (R²) of 0.9862 and a Mean Absolute Percentage Error (MAPE) of only 0.5607%.
These findings indicate that the proposed method possesses significant advantages in characterizing the complex dynamic features of time series, thereby effectively enhancing prediction accuracy. The proposed model can serve as a supportive tool for carbon-market risk monitoring and policy evaluation by identifying abnormal fluctuations and mitigating market inefficiencies caused by information asymmetry. This enhances the stability and predictability of carbon price signals as incentives for emissions reduction, enabling firms to plan abatement pathways and low-carbon investments, and strengthening the sustainable role of carbon markets in achieving carbon neutrality. Full article
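The abstract reports its results as a coefficient of determination (R²) of 0.9862 and a MAPE of 0.5607%. These are the standard definitions of both metrics, sketched in NumPy for readers who want to reproduce the evaluation; the code is not from the paper.

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - residual sum of squares / total sum of squares."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent; assumes no zero targets."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)
```

Note that MAPE is undefined at zero targets and scale-dependent near them, which is one reason carbon price series (strictly positive) suit it well.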
38 pages, 6586 KB  
Article
Fuzzy Modeling Strategies for Groundwater Level Forecasting: Comparing Local, Integrated, and Behavioral Frameworks for a Data-Limited Coastal Aquifer in the Eastern Mediterranean
by Mahmoud Ahmad, Katalin Bene and Richard Ray
Water 2026, 18(5), 566; https://doi.org/10.3390/w18050566 - 27 Feb 2026
Abstract
Groundwater modeling in semi-arid regions presents significant challenges due to complex aquifer dynamics, limited data availability, and heterogeneous hydrogeological conditions. This study presents a comprehensive comparative analysis of three fuzzy expert system strategies for monthly groundwater level forecasting in the Al-Hsain Basin, Syria: localized models based on hydrogeographical grouping, a unified basin-wide approach, and an innovative behavioral clustering methodology. Using synchronized rainfall and temperature data from 35 monitoring wells over four years (2020–2024), we developed fuzzy inference systems and evaluated them using directional classification accuracy as the primary performance metric, categorizing groundwater level changes into rise, stable, and decline states rather than predicting continuous values. This choice reflects the qualitative nature of fuzzy expert systems and their suitability for groundwater management under data-limited conditions. The behavioral clustering approach achieved the best overall performance with a mean accuracy of 0.74, outperforming localized models (0.71) and unified models (0.67). Behavioral clustering demonstrated effectiveness in 66% of wells, with individual accuracy improvements reaching up to 0.23, while reducing model complexity from five group-specific systems to three behaviorally coherent clusters. Localized models achieved optimal performance in 29% of wells where hydrogeological conditions aligned with spatial assumptions, whereas unified models provided consistent moderate performance across 89% of locations. The incorporation of lagged variables and seasonal indices in behavioral clustering models proved essential for capturing temporal complexity in semi-arid groundwater responses.
Statistical analysis revealed lower intra-group variability in behavioral clusters (standard deviation 0.06–0.09) than in geographical groupings (0.08–0.14), confirming improved functional homogeneity through response-based organization. These findings indicate that fuzzy modeling strategy selection should be context-dependent, with behavioral clustering offering an effective balance between accuracy, interpretability, and generalization for regional groundwater management applications. The novelty of this work lies in isolating the effect of fuzzy system organization logic (localized, unified, and behavioral) on forecasting performance, robustness, and transferability, evaluated under an identical inference and time-series validation framework. Full article
(This article belongs to the Special Issue Artificial Intelligence (AI) Solutions for Hydrogeological Challenges)
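The abstract's primary metric classifies month-to-month groundwater level changes into rise, stable, and decline states instead of predicting continuous values. A minimal sketch of that scoring scheme follows; the tolerance band `tol` (in the same units as the level series) and the function names are assumptions, since the paper's exact banding rule is not given here.

```python
import numpy as np

def direction_states(levels, tol=0.05):
    """Classify month-to-month level changes: +1 rise, 0 stable, -1 decline.

    Changes within +/- tol of zero count as 'stable'.
    """
    diff = np.diff(levels)
    states = np.zeros(diff.shape, dtype=int)
    states[diff > tol] = 1
    states[diff < -tol] = -1
    return states

def directional_accuracy(observed, predicted, tol=0.05):
    """Fraction of months where forecast and observation agree on the state."""
    return float(np.mean(direction_states(observed, tol) == direction_states(predicted, tol)))
```

For instance, observed levels [10.0, 10.2, 10.2, 9.9] yield the state sequence rise, stable, decline; a forecast that instead rises in the second month scores 2/3 on this metric.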
