Search Results (52)

Search Parameters:
Keywords = unified time series algorithm

20 pages, 15544 KB  
Article
The Potential Use of a Land Trend Algorithm for Regional Landslide Mapping in Indonesia
by Tubagus Nur Rahmat Putra, Muhammad Aufaristama, Khaled Ahmed, Mochamad Candra Wirawan Arief, Rahmihafiza Hanafi, Bambang Wijatmoko and Irwan Ary Dharmawan
Appl. Sci. 2026, 16(6), 3090; https://doi.org/10.3390/app16063090 - 23 Mar 2026
Viewed by 135
Abstract
Indonesia is among the most landslide-prone countries in the world, with thousands of fatalities and widespread infrastructure damage recorded over recent decades. Despite this high hazard level, regional-scale landslide monitoring remains constrained by the limitations of conventional bitemporal satellite imagery, which is susceptible to cloud contamination, dependent on precise acquisition timing, and unable to capture the full temporal dynamics of landslide occurrence and recovery. While the LandTrendr (Landsat-based Detection of Trends in Disturbance and Recovery) algorithm has been widely applied for detecting vegetation disturbances such as forest loss and land-use change, its potential for landslide detection in tropical environments has not been sufficiently explored. This study aims to evaluate the applicability of LandTrendr to long-term Landsat time series imagery for automated regional-scale landslide detection and mapping in Indonesia. The method integrates temporal segmentation of the Normalized Difference Vegetation Index (NDVI) derived from Landsat imagery spanning 2000–2022 with slope information from the Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM) to identify the characteristic drop-recovery spectral signature associated with landslide events. The algorithm was applied and evaluated in two geologically distinct study areas: Lombok, West Nusa Tenggara, and Pasaman, West Sumatra. Detection accuracies of 25.9% by location and 20.3% by area were achieved in Lombok, and 76.3% by location and 85.3% by area in Pasaman. The lower accuracy in Lombok is primarily attributed to the predominance of small landslides below the sensor’s spatial resolution and rapid vegetation recovery.
The proposed approach demonstrates the unique capability of LandTrendr to model the entire life cycle of a mass movement event, from pre-event stability through abrupt disturbance to ecological recovery within a single unified framework, providing a scalable and cost-effective tool for long-term landslide monitoring applicable to other tropical, landslide-prone regions.
(This article belongs to the Section Environmental Sciences)
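The drop-recovery screening described in the abstract can be sketched per pixel as follows. This is a minimal illustration, not the authors' LandTrendr implementation: the function name, the NDVI-drop threshold, and the slope cutoff are assumptions chosen for the example.

```python
def is_landslide_candidate(ndvi, slope_deg, drop_thresh=0.3, min_slope_deg=15.0):
    """Flag a pixel whose NDVI time series shows an abrupt drop followed by
    gradual recovery, restricted to sufficiently steep terrain (SRTM slope).
    Thresholds are illustrative assumptions, not the paper's values."""
    if slope_deg < min_slope_deg:
        return False
    # Largest single-step NDVI loss and the epoch at which it occurs.
    drops = [ndvi[i] - ndvi[i + 1] for i in range(len(ndvi) - 1)]
    worst = max(drops)
    t = drops.index(worst) + 1
    if worst < drop_thresh:
        return False          # no abrupt disturbance
    post = ndvi[t:]
    if len(post) < 3:
        return False          # not enough epochs to judge recovery
    # Require a positive mean trend after the disturbance (revegetation).
    mean_step = (post[-1] - post[0]) / (len(post) - 1)
    return mean_step > 0
```

A yearly series such as `[0.8, 0.81, 0.79, 0.35, 0.45, 0.55, 0.65]` on a 25° slope matches the signature; a stable series or a flat pixel does not.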

23 pages, 2990 KB  
Article
Forecasting-Aware Digital Twin Calibration for Reliable Multi-Horizon Traffic Prediction
by Zeyad AlJundi, Taqwa A. Alhaj, Fatin A. Elhaj, Inshirah Idris and Tasneem Darwish
Network 2026, 6(1), 13; https://doi.org/10.3390/network6010013 - 6 Mar 2026
Viewed by 349
Abstract
Digital twin systems are becoming an important tool in intelligent transportation management, as they provide simulation-based environments for monitoring, analyzing, and predicting traffic behavior. However, the predictive performance of traffic digital twins is often limited by the quality and temporal consistency of sensor-level data generated from microscopic simulations. Most current calibration methods focus mainly on matching macroscopic traffic indicators, such as vehicle count and speed, without explicitly addressing the requirements of multi-horizon forecasting. This creates a gap between achieving realistic simulations and building reliable predictive models. This research proposes a forecasting-aware digital traffic twin framework that integrates microscopic SUMO simulation, controlled sensor-level observation modeling through geometric misalignment and noise injection, behavioral calibration, and deep temporal forecasting within a unified end-to-end structure. Unlike traditional calibration approaches, the proposed framework uses a Genetic Algorithm (GA) to reformulate calibration as a multi-step predictive optimization task. Simulation parameters are optimized by minimizing the forecasting error produced by a lightweight proxy sequence model embedded within the calibration loop. In this way, calibration moves beyond simple statistical matching and instead emphasizes temporal learnability and forecasting stability, enabling the digital twin to generate traffic patterns more suitable for long-term prediction. Based on the calibrated traffic time series, both convolutional and recurrent deep learning models are evaluated under single-step and multi-step forecasting scenarios. To further examine generalizability, external validation is performed using the real-world PEMS-BAY dataset.
The experimental findings demonstrate that forecasting-aware calibration reduces macroscopic traffic signal errors by around 50% for vehicle count and around 40% for average speed, improves temporal stability, and significantly enhances forecasting accuracy across both short-term and long-term horizons.
(This article belongs to the Special Issue Emerging Trends and Applications in Vehicular Ad Hoc Networks)
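A calibration loop of the kind described, a genetic algorithm minimizing a forecasting-error objective over simulation parameters, can be sketched as follows. The toy quadratic objective stands in for the proxy sequence model's forecasting error on a simulated trace, and every operator choice (tournament selection, blend crossover, Gaussian mutation, rates) is an illustrative assumption, not the paper's configuration.

```python
import random

def calibrate_ga(objective, bounds, pop_size=20, generations=30, seed=0):
    """Minimal real-coded GA: minimize `objective` over box-bounded
    parameters via tournament selection, blend crossover, and Gaussian
    mutation, with elitism so the best individual is never lost."""
    rng = random.Random(seed)
    dim = len(bounds)
    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=objective)
    for _ in range(generations):
        new_pop = [best[:]]                                    # elitism
        while len(new_pop) < pop_size:
            parent_a = min(rng.sample(pop, 3), key=objective)  # tournament
            parent_b = min(rng.sample(pop, 3), key=objective)
            w = rng.random()                                   # blend crossover
            child = [w * a + (1 - w) * b for a, b in zip(parent_a, parent_b)]
            if rng.random() < 0.3:                             # Gaussian mutation
                i = rng.randrange(dim)
                child[i] += rng.gauss(0.0, 0.1 * (bounds[i][1] - bounds[i][0]))
            new_pop.append(clip(child))
        pop = new_pop
        best = min(pop, key=objective)
    return best
```

In the paper's setting, `objective` would run SUMO with the candidate parameters and return the proxy model's multi-step forecasting error; here any callable works.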

29 pages, 5948 KB  
Article
Carbon Price Forecasting for Sustainable Low-Carbon Investment Decisions: A Hybrid Transformer—sLSTM Model
by Aiying Zhao, Qian Chen, Yang Zhao, Ruiyi Wu, Jiamin Xu and Yongpeng Tong
Sustainability 2026, 18(5), 2324; https://doi.org/10.3390/su18052324 - 27 Feb 2026
Viewed by 321
Abstract
Under the framework of the Paris Agreement, carbon trading has emerged as a pivotal market-based instrument for achieving carbon neutrality. Following years of pilot programs, China has taken a critical step toward establishing a unified national carbon market. Consequently, accurate carbon price forecasting is essential for constructing a stable and effective carbon pricing mechanism. However, the 2017 reform of the EU Emissions Trading System (EU ETS) significantly altered the carbon price formation mechanism, exacerbating price volatility and uncertainty. This shift further underscores the urgent need for research into high-precision carbon price forecasting. Existing deep learning models struggle to simultaneously capture short-term high-frequency fluctuations and long-term evolutionary trends within complex carbon market data, a limitation that compromises their prediction accuracy and stability. To address these challenges, this paper proposes a Transformer-based carbon price forecasting model that incorporates an sLSTM structure. By enhancing sequence memory and state update mechanisms, this model effectively improves the capability to model both short-term volatility characteristics and long-term evolutionary patterns of carbon prices. In the data preprocessing phase, Variational Mode Decomposition (VMD) is employed to perform multi-scale decomposition of carbon price sequences, effectively mitigating the issue of overlapping fluctuations across different time scales. Furthermore, the Whale Optimization Algorithm (WOA) is utilized to optimize the number of decomposition modes and the penalty factor, thereby resolving the parameter sensitivity issues inherent in modal decomposition. Experimental results on real-world carbon price datasets demonstrate that the model achieves an average coefficient of determination (R²) of 0.9862 and a Mean Absolute Percentage Error (MAPE) of only 0.5607%.
These findings indicate that the proposed method possesses significant advantages in characterizing the complex dynamic features of time series, thereby effectively enhancing prediction accuracy. The proposed model can serve as a supportive tool for carbon-market risk monitoring and policy evaluation by identifying abnormal fluctuations and mitigating market inefficiencies caused by information asymmetry. This enhances the stability and predictability of carbon price signals as incentives for emissions reduction, enabling firms to plan abatement pathways and low-carbon investments, and strengthening the sustainable role of carbon markets in achieving carbon neutrality.
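For reference, the two reported evaluation metrics can be computed as follows; this is standard textbook arithmetic, not code from the paper, and it assumes all actual values are nonzero.

```python
def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent (assumes nonzero actuals)."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot
```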

28 pages, 15959 KB  
Article
A Proof of Concept for an Agrifood Data Space Based on Open Data and Interoperability
by Cristina Martinez-Ruedas, Adela Pérez-Galvín and Rafael Linares-Burgos
Appl. Sci. 2026, 16(4), 1831; https://doi.org/10.3390/app16041831 - 12 Feb 2026
Viewed by 343
Abstract
The creation of unified, open, secure, reliable, and agile data spaces is essential for collecting, storing, and sharing data in a standardized and accessible manner, promoting data reuse and addressing current interoperability limitations. In this context, this research presents a proof of concept for a unified agronomic data space based on the structured integration of heterogeneous open data sources. The central hypothesis is that the automated acquisition, preprocessing, and harmonization of publicly available agronomic data can significantly improve accessibility, usability, and interoperability for agricultural decision support applications. To this end, a comprehensive analysis of relevant open data sources was conducted, followed by the design and implementation of configurable algorithms for automated data downloading, cleaning, validation, and integration. The proposed approach explicitly addresses key challenges such as heterogeneous data formats, inconsistent spatial and temporal resolutions, missing values, and outlier detection. As a result, a unified access point was developed, providing reliable agronomic information, including (i) preprocessed climatological time series, (ii) crop and phytosanitary data, (iii) high-resolution aerial orthophotography, (iv) remote-sensing imagery, (v) pest-related information, and (vi) time series of major vegetation indices. The proof of concept was implemented for olive groves in the Andalusian region of Spain; however, the methodology is fully transferable to other crops, regions, and institutional contexts where comparable open data sources are available. The results demonstrate the potential of shared agronomic data spaces to enhance data reuse, support scalable analytics, and facilitate interoperable, data-driven agricultural management beyond the specific regional case study.
(This article belongs to the Special Issue Sustainable and Smart Agriculture)
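The cleaning and validation steps mentioned above (gap filling, outlier handling) can be sketched minimally for a climatological time series; both functions and the Tukey-fence rule are illustrative assumptions, not the project's actual algorithms.

```python
from statistics import quantiles

def fill_gaps(values):
    """Linearly interpolate None entries between known neighbours;
    gaps at the edges copy the nearest known value."""
    known = [i for i, v in enumerate(values) if v is not None]
    out = list(values)
    for i in range(len(out)):
        if out[i] is not None:
            continue
        left = max((j for j in known if j < i), default=None)
        right = min((j for j in known if j > i), default=None)
        if left is None:
            out[i] = values[right]
        elif right is None:
            out[i] = values[left]
        else:
            w = (i - left) / (right - left)
            out[i] = values[left] * (1 - w) + values[right] * w
    return out

def clip_outliers(values, k=1.5):
    """Winsorize points outside the Tukey fences [Q1 - k*IQR, Q3 + k*IQR];
    run after fill_gaps so every value is numeric."""
    q1, _, q3 = quantiles(values, n=4)
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [min(max(v, lo), hi) for v in values]
```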

19 pages, 3017 KB  
Article
When Will the Next Shock Happen? A Dynamic Framework for Event Probability Estimation
by Konstantinos Pantelidis, Ioannis Karakostas and Odysseas Pavlatos
FinTech 2026, 5(1), 13; https://doi.org/10.3390/fintech5010013 - 2 Feb 2026
Viewed by 442
Abstract
Extreme movements in financial time series pose challenges for risk management and forecasting, particularly when their timing is irregular and difficult to anticipate. This study aims to develop a probabilistic framework for detecting and predicting such events using daily Bitcoin returns as a case study. We first identify extreme positive and negative return events using the Isolation Forest algorithm and estimate their empirical recurrence patterns using a dynamic frequency table to derive baseline parametric probabilities. A 7-day Hawkes excitation kernel is then applied to capture short-run self-exciting dynamics, and both components are integrated using logistic regression to produce real-time probability forecasts. The results show that positive events occur more frequently than negative ones and that prediction accuracy improves over time: Brier scores, which measure the accuracy of probabilistic predictions, decrease as additional event data accumulate, and log loss values exhibit a consistent downward trend. Overall, by combining anomaly detection, empirical inter-arrival estimation, and excitation dynamics into a unified structure, the proposed framework offers a transparent and adaptable tool for forecasting extreme events in the financial market.
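The combination of a baseline probability with a 7-day excitation kernel can be sketched as follows. The logistic weights here are fixed illustrative constants, whereas the paper fits them by logistic regression; every parameter value below is an assumption.

```python
import math

def event_probability(event_days, t, base_rate=0.05, alpha=1.2, window=7,
                      b0=-3.0, b1=10.0, b2=0.8):
    """Probability of an extreme event on day t: a baseline rate plus an
    exponentially decaying sum over events within the last `window` days,
    combined through a logistic link with fixed weights (b0, b1, b2)."""
    excitation = sum(math.exp(-(t - s)) for s in event_days if 0 <= t - s < window)
    z = b0 + b1 * base_rate + b2 * alpha * excitation
    return 1.0 / (1.0 + math.exp(-z))
```

Recent events raise the forecast relative to a quiet history, and the effect fades as they leave the 7-day window.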

25 pages, 1979 KB  
Article
Classifying and Predicting Household Energy Consumption Using Data Analytics and Machine Learning
by David Cordon, Antonio Pita and Angel A. Juan
Algorithms 2026, 19(2), 114; https://doi.org/10.3390/a19020114 - 1 Feb 2026
Viewed by 438
Abstract
Growing pressure on electricity grids and the increasing availability of smart meter data have intensified the need for accurate, interpretable, and scalable methods to analyze and forecast household electricity consumption. In this context, this study presents a general, data-agnostic methodology for predicting and classifying household energy consumption. The proposed workflow unifies data preparation, feature engineering, and machine learning techniques (including clustering, classification, regression, and time series forecasting) within a single interpretable pipeline that supports actionable insights. Rather than proposing new prediction algorithms, this work contributes a fully reproducible, end-to-end methodological pipeline that enables the controlled evaluation of the impact of contextual variables, customer segmentation, and cold-start conditions on household energy forecasting. A distinctive aspect of the pipeline is the explicit use of household- and dwelling-level contextual variables to derive customer typologies via clustering and to enrich forecasting models. The models are evaluated for predictive accuracy, reliability under varying conditions, and suitability for operational use. The results show that incorporating contextual variables and clustering significantly improves forecasting accuracy, particularly in cold-start scenarios where no historical consumption data are available. Although numerous public datasets of residential electricity consumption exist, they rarely provide, in an openly accessible form, both detailed load histories and rich contextual attributes, while many are subject to privacy or licensing restrictions. To ensure full reproducibility and to enable controlled experiments where contextual variables can be switched on and off, the experiments are conducted on a synthetically generated dataset that reproduces realistic behavior and seasonal usage patterns. 
However, the proposed methodology is independent of the specific data source and can be directly applied to any real or synthetic dataset with a similar structure. The approach enables applications such as short- and long-term demand forecasting, estimation of household energy costs, and forecasting demand for new customers. These findings demonstrate that the proposed pipeline provides a transparent and effective framework for end-to-end analysis of household electricity consumption.
(This article belongs to the Section Algorithms for Multidisciplinary Applications)
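The cold-start idea can be sketched minimally: assume contextual similarity is a usable proxy for load similarity, and forecast a brand-new household from its contextually nearest neighbours. This k-nearest-household average is a simplification of the paper's cluster-then-forecast pipeline, and all names are illustrative.

```python
def cold_start_forecast(histories, contexts, new_context, k=2):
    """Forecast a new household's load profile as the mean profile of the
    k households whose contextual features (e.g. occupants, floor area)
    are closest, in Euclidean distance, to the new customer's."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    ranked = sorted(range(len(contexts)), key=lambda i: dist(contexts[i], new_context))
    chosen = ranked[:k]
    horizon = len(histories[0])
    return [sum(histories[i][t] for i in chosen) / len(chosen) for t in range(horizon)]
```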

20 pages, 5434 KB  
Article
A Wavenumber Domain Consistent Imaging Method Based on High-Order Fourier Series Fitting Compensation for Optical/SAR Co-Aperture System
by Ke Wang, Yinshen Wang, Chong Song, Bingnan Wang, Li Tang, Xuemei Wang and Maosheng Xiang
Remote Sens. 2026, 18(2), 315; https://doi.org/10.3390/rs18020315 - 16 Jan 2026
Viewed by 344
Abstract
Optical and SAR image registration and fusion are pivotal in the remote sensing field, as they leverage the complementary advantages of both modalities. However, achieving this with high accuracy and efficiency remains challenging. This challenge arises because traditional methods are confined to the image domain, applied after independent image formation. They attempt to correct geometric mismatches that are rooted in fundamental physical differences, an approach that inherently struggles to achieve both precision and speed. Therefore, this paper introduces a co-designed system and algorithm framework to overcome these fundamental challenges. At the system level, we pioneer an innovative airborne co-aperture system to ensure synchronous data acquisition. At the algorithmic level, we derive a theoretical model within the wavenumber domain imaging process, attributing optical/SAR pixel deviations to the deterministic phase errors introduced by its core Stolt interpolation operation. This model enables a signal-domain compensation technique, which employs high-order Fourier series fitting to correct these errors during the SAR image formation itself. This co-design yields a unified processing pipeline that achieves direct, sub-pixel co-registration, thereby establishing a foundational paradigm for real-time multi-source data processing. The experimental results on both multi-point and structural targets confirm that our method achieves sub-pixel registration accuracy across diverse scenarios, accompanied by a marked gain in computational efficiency over the time-domain approach.
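The fitting step alone can be illustrated: approximating a sampled phase-error curve with a truncated Fourier series, assuming the error is sampled on a uniform angular grid so the basis is orthogonal and coefficients are direct projections. This is a generic sketch, not the paper's wavenumber-domain processing chain.

```python
import math

def fourier_fit(samples, order):
    """Fit samples on a uniform grid over [0, 2*pi) with the series
    a0 + sum_n (a_n cos(n*theta) + b_n sin(n*theta)); on such a grid the
    least-squares coefficients are the discrete projections below."""
    N = len(samples)
    a0 = sum(samples) / N
    coeffs = []
    for n in range(1, order + 1):
        an = 2.0 / N * sum(s * math.cos(2 * math.pi * n * k / N)
                           for k, s in enumerate(samples))
        bn = 2.0 / N * sum(s * math.sin(2 * math.pi * n * k / N)
                           for k, s in enumerate(samples))
        coeffs.append((an, bn))
    def model(theta):
        return a0 + sum(a * math.cos(n * theta) + b * math.sin(n * theta)
                        for n, (a, b) in zip(range(1, order + 1), coeffs))
    return model
```

A compensated phase would then be `raw_phase - model(theta)` at each sample; in the paper this correction is applied inside SAR image formation.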

43 pages, 1151 KB  
Review
Clustering of Temporal and Visual Data: Recent Advancements
by Priyanka Mudgal
Data 2026, 11(1), 7; https://doi.org/10.3390/data11010007 - 4 Jan 2026
Cited by 1 | Viewed by 1225
Abstract
Clustering plays a central role in uncovering latent structure within both temporal and visual data. It enables critical insights in various domains including healthcare, finance, surveillance, autonomous systems, and many more. With the growing volume and complexity of time-series and image-based datasets, there is an increasing demand for robust, flexible, and scalable clustering algorithms. Although these modalities differ—time-series being inherently sequential and vision data being spatial—they exhibit common challenges such as high dimensionality, noise, variability in alignment and scale, and the need for interpretable groupings. This survey presents a comprehensive review of recent advancements in clustering methods that are adaptable to both time-series and vision data. We explore a wide spectrum of approaches, including distance-based techniques (e.g., DTW, EMD), feature-based methods, model-based strategies (e.g., GMMs, HMMs), and deep learning frameworks such as autoencoders, self-supervised learning, and graph neural networks. We also survey hybrid and ensemble models, as well as semi-supervised and active clustering methods that leverage minimal supervision for improved performance. By highlighting both the shared principles and the modality-specific adaptations of clustering strategies, this work outlines current capabilities and open challenges, and suggests future directions toward unified, multimodal clustering systems.
(This article belongs to the Section Featured Reviews of Data Science Research)
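As a concrete example of the distance-based family the survey covers, here is the classic dynamic-programming DTW distance (textbook form with absolute pointwise cost, not tied to any particular surveyed method):

```python
def dtw_distance(a, b):
    """O(len(a) * len(b)) Dynamic Time Warping distance: D[i][j] is the
    cheapest alignment cost of a[:i] against b[:j], built from the three
    admissible warping moves (match, insert, delete)."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Unlike Euclidean distance, DTW tolerates local stretching, so `[1, 2, 3]` and `[1, 2, 2, 3]` are at distance zero.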

28 pages, 6632 KB  
Article
Reliable Crack Evolution Monitoring from UAV Remote Sensing: Bridging Detection and Temporal Dynamics
by Canwei Wang and Jin Tang
Remote Sens. 2026, 18(1), 51; https://doi.org/10.3390/rs18010051 - 24 Dec 2025
Cited by 2 | Viewed by 858
Abstract
Surface crack detection and temporal evolution analysis are fundamental tasks in remote sensing and photogrammetry, providing critical information for slope stability assessment, infrastructure safety inspection, and long-term geohazard monitoring. However, current unmanned aerial vehicle (UAV)-based crack detection pipelines typically treat spatial detection and temporal change analysis as separate processes, leading to weak geometric consistency across time and limiting the interpretability of crack evolution patterns. To overcome these limitations, we propose the Longitudinal Crack Fitting Network (LCFNet), a unified and physically interpretable framework that achieves, for the first time, integrated time-series crack detection and evolution analysis from UAV remote sensing imagery. At its core, the Longitudinal Crack Fitting Convolution (LCFConv) integrates Fourier-series decomposition with affine Lie group convolution, enabling anisotropic feature representation that preserves equivariance to translation, rotation, and scale. This design effectively captures the elongated and oscillatory morphology of surface cracks while suppressing background interference under complex aerial viewpoints. Beyond detection, a Lie-group-based Temporal Crack Change Detection (LTCCD) module is introduced to perform geometrically consistent matching between bi-temporal UAV images, guided by a partial differential equation (PDE) formulation that models the continuous propagation of surface fractures, providing a bridge between discrete perception and physical dynamics. Extensive experiments on the constructed UAV-Filiform Crack Dataset (10,588 remote sensing images) demonstrate that LCFNet surpasses advanced detection frameworks such as You only look once v12 (YOLOv12), RT-DETR, and RS-Mamba, achieving superior performance (mAP50:95 = 75.3%, F1 = 85.5%, and CDR = 85.6%) while maintaining real-time inference speed (88.9 FPS). 
Field deployment on a UAV–IoT monitoring platform further confirms the robustness of LCFNet in multi-temporal remote sensing applications, accurately identifying newly formed and extended cracks under varying illumination and terrain conditions. This work establishes the first end-to-end paradigm that unifies spatial crack detection and temporal evolution modeling in UAV remote sensing, bridging discrete deep learning inference with continuous physical dynamics. The proposed LCFNet provides both algorithmic robustness and physical interpretability, offering a new foundation for intelligent remote sensing-based structural health assessment and high-precision photogrammetric monitoring.
(This article belongs to the Special Issue Advances in Remote Sensing Technology for Ground Deformation)

41 pages, 1862 KB  
Article
Algorithm for Describing Neuronal Electric Operation
by János Végh
Algorithms 2026, 19(1), 6; https://doi.org/10.3390/a19010006 - 20 Dec 2025
Viewed by 680
Abstract
The development of neuroanatomy and neurophysiology has revealed many new details about neurons’ operation over the past few decades, requiring modifications to their theoretical models. The development of computing technology enables us to consider the fine details the new model requires, but it necessitates a different approach. To achieve that goal, the disciplinarity of science must be revisited for living matter, the theoretical model must be updated, and a series of processes instead of states must be considered; furthermore, new mathematics, algorithms, and computing technologies for the new view are also needed. We provide an algorithm implementing the mathematics of the updated theoretical model, which considers the neuronal current to consist of charged ions (and so accounts for thermodynamic effects) and opens the way to explaining the mechanical, optical, and other phenomena that accompany the electrical operation. We use a new technology in this effort: a tool designed to achieve extreme accuracy in simulating high-speed electronic circuits. The algorithm applies the cross-disciplinary unified electrical/thermodynamic model, along with an unusual programming method, to provide new insights into neuronal operations, describe the processes that take place in living matter, and determine their computing implementation. As has long been suspected, the faithful simulation of biological processes requires accurately mapping biological time to technical computing time. Therefore, the paper focuses on time handling in biology-targeting computations, especially in large-scale tasks. We also touch on the question of simulating the operation of neuronal networks, which is contrasted with that of Spiking Neural Networks. The way technical computing works inhibits efforts to achieve the required accuracy in reproducing the temporal behavior of biological operations using conventional computer programs.
(This article belongs to the Collection Feature Papers in Algorithms for Multidisciplinary Applications)

33 pages, 4070 KB  
Article
Dynamic Risk Assessment of Wind Farms Under Extreme Gust Disturbances Using BiLSTM, Adaboost, and Adaptive Kernel Density Estimation for Enhanced Prediction
by Ke Shang, Yanbin Yang, Jinsheng Wang, Meng Shuai, Zhongjie Dang and Qing Liu
Electronics 2025, 14(23), 4584; https://doi.org/10.3390/electronics14234584 - 23 Nov 2025
Viewed by 450
Abstract
Under extreme gusty weather conditions, wind power output fluctuates significantly, and the operational risks of wind farms exhibit strong randomness and complexity. Addressing the issue of limited sample data under such extreme wind speed conditions, which leads to low accuracy in risk assessment, this study proposes a multi-dimensional risk assessment method for wind farms based on the BiLSTM-Adaboost-ABKDE algorithm framework and ITA-TPW (Integrated Trend Analysis—Time-Series Probability Weighting) weighted fusion. First, a wind farm operational risk assessment model based on a multi-dimensional risk indicator system is constructed to assess the impact of these extreme weather conditions on wind farm operational risks. Second, to enhance the robustness of wind power output prediction under small-sample conditions, a BiLSTM (Bidirectional Long Short-Term Memory) network is introduced to capture the bidirectional temporal dependencies of wind power output. Through the Adaboost weighted ensemble of sub-models, combined with the ABKDE (Adaptive Bandwidth Kernel Density Estimation) algorithm for probability density estimation of prediction outputs, high-precision point prediction and uncertainty quantification are unified. Furthermore, we combine trend-based sensitivity analysis using the ITA method with time-series probability weighting using the TPW method to assess key risks such as voltage overlimit, power flow overlimit, and load imbalance in wind farms under extreme wind speeds from multiple perspectives. The dataset used in this study comes from Hebei Province, covering wind power data from 1 March 2020 to 31 May 2023, and the simulation platform used is MATLAB R2023a. Simulation analysis was conducted using the IEEE RTS-79 node system to validate the effectiveness of the proposed method. The results showed that the proposed method improved the accuracy of load shedding risk and voltage overlimit risk indicators by 6.15% and 4.79%, respectively.
Additionally, the reliability of the system’s comprehensive risk indicators was significantly enhanced, validating the effectiveness of this method in improving the accuracy and reliability of operational risk assessment for wind farms under extreme weather conditions.
(This article belongs to the Special Issue Advanced Online Monitoring and Fault Diagnosis of Power Equipment)
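Adaptive-bandwidth kernel density estimation can be sketched in the Abramson style: a fixed-bandwidth pilot estimate sets a local bandwidth per sample, so sparse regions smooth more and dense regions less. This generic one-dimensional version is only a stand-in for the paper's ABKDE, whose exact bandwidth rule the abstract does not specify; the Silverman pilot and sensitivity exponent are assumptions.

```python
import math

def adaptive_kde(samples, x, sensitivity=0.5):
    """Abramson-style adaptive Gaussian KDE evaluated at x: local
    bandwidth h_i = h0 * (g / pilot_i) ** sensitivity, where g is the
    geometric mean of the pilot densities at the samples."""
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / n) ** 0.5
    h0 = 1.06 * sd * n ** -0.2                       # Silverman pilot bandwidth
    def gauss(u):
        return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    # Pilot (fixed-bandwidth) density at every sample point.
    pilot = [sum(gauss((s - t) / h0) for t in samples) / (n * h0) for s in samples]
    g = math.exp(sum(math.log(p) for p in pilot) / n)  # geometric mean
    h = [h0 * (g / p) ** sensitivity for p in pilot]   # per-sample bandwidths
    return sum(gauss((x - s) / h[i]) / h[i] for i, s in enumerate(samples)) / n
```

The pilot pass is recomputed on every call here for brevity; a real implementation would cache it.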

24 pages, 850 KB  
Article
Spatio-Temporal Artificial Intelligence for Multi-Hazard-Aware Renewable Energy Site Selection Using Integrated Geospatial and Climate Data
by Katleho Moloi, Kwabena Addo and Ernest Mnkandla
Processes 2025, 13(11), 3728; https://doi.org/10.3390/pr13113728 - 19 Nov 2025
Viewed by 840
Abstract
The siting of renewable energy systems (RESs) in regions vulnerable to multiple climate hazards presents a critical challenge for sustainable infrastructure planning. Traditional approaches, primarily driven by static assessments of solar and wind potential, often neglect the compounded risks posed by floods, droughts, and windstorms, resulting in investments that are operationally vulnerable and economically unsustainable. This study proposes a novel spatio-temporal artificial intelligence (AI) framework for multi-objective RES deployment that integrates satellite-derived resource maps, high-resolution hazard data, and dynamic climate time series into a unified optimization pipeline. The methodology employs a gated recurrent unit (GRU)-based encoder to capture temporal hazard dynamics, combined with a multi-objective evolutionary algorithm (NSGA-II) to balance energy yield and resilience. A case study in South Africa’s Vhembe District demonstrates the framework’s effectiveness: the proposed model reduces the average hazard exposure by 31.6% while preserving 96.4% of the baseline energy output. Attention-based saliency analysis reveals that flood and windstorm hazards are the dominant drivers of site exclusion. Compared to conventional siting methods, the proposed framework achieves superior trade-offs between performance and risk, ensuring alignment with South Africa’s Just Energy Transition and Climate Adaptation strategies. The results confirm the value of spatio-temporal embeddings and hazard-aware multi-objective optimization in guiding resilient, data-driven energy infrastructure development. This model offers direct benefits to energy planners, climate adaptation agencies, and policymakers seeking to implement resilient, data-driven renewable energy strategies in hazard-prone regions.
(This article belongs to the Section Energy Systems)

31 pages, 9036 KB  
Article
Algorithmic Investigation of Complex Dynamics Arising from High-Order Nonlinearities in Parametrically Forced Systems
by Barka Infal, Adil Jhangeer and Muhammad Muddassar
Algorithms 2025, 18(11), 681; https://doi.org/10.3390/a18110681 - 25 Oct 2025
Viewed by 2523
Abstract
Characterizing the geometry of chaos in high-order, multistable nonlinear systems is computationally challenging. In this study we introduce a unified algorithmic framework to address this difficulty, taking a parametrically forced oscillator with cubic–quintic nonlinearities as a test case. The framework begins with the Sparse Identification of Nonlinear Dynamics (SINDy) algorithm, a data-driven method that extracts an interpretable, parsimonious model directly from time-series data. The resulting model is carefully validated and, beyond its predictive accuracy, provides a solid basis for further investigation. Building on this validated model, we propose a unified diagnostic pathway comprising bifurcation analysis, Lyapunov exponent computation, power spectral analysis, and recurrence mapping to formally characterize the system's dynamical features. A central component of the framework is an efficient computational basin analysis that maps attractor basins and exposes the fine-scale riddled and fractal structures indicative of extreme sensitivity to initial conditions. The primary contribution of this work is a comprehensive dynamical analysis of the DM-CQDO, revealing the intricate structure of its stability landscape and its multistability. The integrated workflow identifies the period-doubling cascade as the primary route to chaos and quantifies the stabilizing effects of key system parameters. This study demonstrates a systematic methodology that combines data-driven discovery with classical analysis to investigate the complex dynamics of parametrically forced, high-order nonlinear systems. Full article
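The SINDy step described above amounts to sparse regression of measured derivatives onto a candidate function library. A minimal sketch of its core sequential thresholded least-squares (STLSQ) loop on noise-free synthetic data; the cubic rule and its coefficients are illustrative, not taken from the paper:

```python
import numpy as np

# Sequential thresholded least squares (STLSQ), the core of SINDy:
# regress dx/dt onto a candidate function library, then iteratively
# zero out small coefficients and refit on the surviving terms.
# Synthetic data follow dx/dt = 1.5 x - 0.5 x^3 (hypothetical coefficients).

def stlsq(Theta, dxdt, threshold=0.1, iters=10):
    """Sparse regression: ordinary least squares with iterative hard thresholding."""
    xi, *_ = np.linalg.lstsq(Theta, dxdt, rcond=None)
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        active = ~small
        # Refit only the surviving library terms.
        xi[active], *_ = np.linalg.lstsq(Theta[:, active], dxdt, rcond=None)
    return xi

x = np.linspace(-2.0, 2.0, 200)
dxdt = 1.5 * x - 0.5 * x**3
Theta = np.column_stack([x**p for p in range(6)])  # library: 1, x, ..., x^5
xi = stlsq(Theta, dxdt)
print(np.round(xi, 3))  # nonzero weights only on the x and x^3 terms
```

With real, noisy time-series data the derivatives must be estimated numerically and the threshold tuned, which is what dedicated implementations such as pysindy automate.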

7 pages, 394 KB  
Proceeding Paper
An Approach to Prediction Using Networked Multimedia
by Vladislav Hinkov and Georgi Krastev
Eng. Proc. 2025, 104(1), 90; https://doi.org/10.3390/engproc2025104090 - 8 Sep 2025
Viewed by 386
Abstract
One of the tasks of statistical analysis is the development of forecasts with different horizons. The results of modeling a development trend can also be used for prognostic purposes, under the assumption that during the forecast period the phenomenon under study will exhibit the same patterns of development it exhibited during the base period. Network multimedia is the unifying link in the parallel development of multimedia and communication technologies, and the integrated interaction of multimedia and computer-network technologies is a condition for achieving a greater application effect in the presentation of information. Experimental studies of modern network multimedia under operational conditions are important for revealing bottlenecks in their functioning; on this basis, recommendations can be made to improve indicators such as performance, reliability, and mode of service. This publication is devoted to an experimental study of trends in network multimedia and the possibility of forecasting them with time series. The implemented algorithm for automated trend determination examines a pre-set family of trends (linear, quadratic, cubic, hyperbolic, fractional-rational, logarithmic, exponential, and combined) and chooses the most effective of them. Full article
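The automated trend determination described above can be sketched as fitting each candidate form by least squares and keeping the one with the smallest residual sum of squares. The model set below is a simplified subset of the family listed in the abstract, and the data are synthetic:

```python
import numpy as np

def fit_sse(X, y):
    """Least-squares fit of design matrix X to y; return residual sum of squares."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ coef) ** 2))

def best_trend(t, y):
    """Fit several pre-set trend forms and return the name of the best one."""
    models = {
        "linear":      np.column_stack([np.ones_like(t), t]),
        "quadratic":   np.column_stack([np.ones_like(t), t, t**2]),
        "cubic":       np.column_stack([np.ones_like(t), t, t**2, t**3]),
        "hyperbolic":  np.column_stack([np.ones_like(t), 1.0 / t]),
        "logarithmic": np.column_stack([np.ones_like(t), np.log(t)]),
    }
    sse = {name: fit_sse(X, y) for name, X in models.items()}
    return min(sse, key=sse.get)

t = np.arange(1.0, 21.0)
y = 3.0 + 2.5 * np.log(t)   # synthetic series with a logarithmic trend
print(best_trend(t, y))     # → logarithmic
```

In practice one would compare models by an information criterion (AIC/BIC) rather than raw residuals, since richer models (e.g. cubic) always fit at least as well as their nested special cases.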

24 pages, 4895 KB  
Article
Research on Gas Concentration Anomaly Detection in Coal Mining Based on SGDBO-Transformer-LSSVM
by Mingyang Liu, Longcheng Zhang, Zhenguo Yan, Xiaodong Wang, Wei Qiao and Longfei Feng
Processes 2025, 13(9), 2699; https://doi.org/10.3390/pr13092699 - 25 Aug 2025
Cited by 1 | Viewed by 977
Abstract
Methane concentration anomalies during coal mining operations are important factors triggering major safety accidents. This study addresses two key issues: the insufficient adaptability of existing detection methods in dynamic, complex underground environments, and their limited ability to characterize non-uniformly sampled data. Specifically, an intelligent diagnostic model is proposed that integrates an improved Dung Beetle Optimization algorithm (SGDBO) with a Transformer-LSSVM hybrid. A dual-path feature fusion architecture was constructed: first, samples of unequal length are unified by interpolation to suit the deep learning model's input, while statistical features of the samples (such as kurtosis and differential standard deviation) are extracted to characterize local mutations; the Transformer network then automatically captures the temporal dependencies of the concentration time series, and its output features are concatenated with the manual statistical features and fed to the LSSVM classifier, forming a complementary diagnostic mechanism. Sine chaotic mapping initialization and a golden sine search mechanism are integrated into DBO, and the resulting SGDBO algorithm is used to optimize the hyperparameters of the Transformer-LSSVM hybrid model, avoiding the tendency of traditional parameter optimization to become trapped in local optima. Experiments show that the model significantly improves the classification accuracy and robustness of anomaly curve discrimination, providing core technical support for coal mine safety monitoring systems and demonstrating practical value for safe energy production. Full article
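The preprocessing described above (resampling unequal-length windows to a uniform length, plus hand-crafted statistics) might look as follows; the target length, the use of linear interpolation, and the exact feature definitions (excess kurtosis, standard deviation of first differences) are assumptions for illustration:

```python
import numpy as np

# Sketch: resample gas-concentration windows of unequal length onto a
# fixed-length grid by linear interpolation, then extract simple
# statistics that capture local mutations in the signal.

def unify_length(series, target_len=64):
    """Linearly interpolate a 1-D series onto a fixed-length grid."""
    series = np.asarray(series, dtype=float)
    old = np.linspace(0.0, 1.0, len(series))
    new = np.linspace(0.0, 1.0, target_len)
    return np.interp(new, old, series)

def stat_features(series):
    """Excess kurtosis and standard deviation of first differences."""
    s = np.asarray(series, dtype=float)
    m, sd = s.mean(), s.std()
    kurt = np.mean((s - m) ** 4) / sd**4 - 3.0
    diff_std = np.diff(s).std()
    return kurt, diff_std

# A short window with one sharp spike (hypothetical readings).
x = unify_length([0.4, 0.5, 0.45, 2.1, 0.5, 0.48], target_len=8)
print(len(x), stat_features(x))
```

The resampled sequence feeds the Transformer branch, while the statistics join the second branch; concatenating both recovers information (like the spike's sharpness) that resampling alone can smear out.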
(This article belongs to the Section Process Control and Monitoring)
