Search Results (1,335)

Search Parameters:
Keywords = RNN—recurrent neural network

20 pages, 5461 KiB  
Article
Design and Implementation of a 3D Korean Sign Language Learning System Using Pseudo-Hologram
by Naeun Kim, HaeYeong Choe, Sukwon Lee and Changgu Kang
Appl. Sci. 2025, 15(16), 8962; https://doi.org/10.3390/app15168962 - 14 Aug 2025
Abstract
Sign language is a three-dimensional (3D) visual language that conveys meaning through hand positions, shapes, and movements. Traditional sign language education methods, such as textbooks and videos, often fail to capture the spatial characteristics of sign language, leading to limitations in learning accuracy and comprehension. To address this, we propose a 3D Korean Sign Language Learning System that leverages pseudo-hologram technology and hand gesture recognition using Leap Motion sensors. The proposed system provides learners with an immersive 3D learning experience by visualizing sign language gestures through pseudo-holographic displays. A Recurrent Neural Network (RNN) model, combined with Diffusion Convolutional Recurrent Neural Networks (DCRNNs) and ProbSparse Attention mechanisms, is used to recognize hand gestures from both hands in real-time. The system is implemented using a server–client architecture to ensure scalability and flexibility, allowing efficient updates to the gesture recognition model without modifying the client application. Experimental results show that the system enhances learners’ ability to accurately perform and comprehend sign language gestures. Additionally, a usability study demonstrated that 3D visualization significantly improves learning motivation and user engagement compared to traditional 2D learning methods. Full article
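The paper's recognizer combines an RNN with DCRNN layers and ProbSparse attention; as a rough illustration of the recurrent core only (not the authors' implementation), here is a minimal vanilla RNN step folding a sequence of hand-keypoint feature vectors into a hidden state. All weights and dimensions are hypothetical, in pure Python for clarity:

```python
import math

def rnn_step(x, h, W_xh, W_hh, b_h):
    """One vanilla RNN step: h' = tanh(W_xh @ x + W_hh @ h + b_h)."""
    n = len(h)
    return [
        math.tanh(
            sum(W_xh[i][j] * x[j] for j in range(len(x)))
            + sum(W_hh[i][j] * h[j] for j in range(n))
            + b_h[i]
        )
        for i in range(n)
    ]

def run_rnn(seq, hidden_size, W_xh, W_hh, b_h):
    """Fold a sequence of feature vectors into a final hidden state."""
    h = [0.0] * hidden_size
    for x in seq:
        h = rnn_step(x, h, W_xh, W_hh, b_h)
    return h
```

A classifier head over the final hidden state would then map it to a sign label; the tanh nonlinearity keeps every hidden activation in [-1, 1].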

23 pages, 2132 KiB  
Article
Ontology Matching Method Based on Deep Learning and Syntax
by Jiawei Lu and Changfeng Yan
Big Data Cogn. Comput. 2025, 9(8), 208; https://doi.org/10.3390/bdcc9080208 - 14 Aug 2025
Abstract
Ontology technology addresses data heterogeneity challenges in Internet of Everything (IoE) systems enabled by Cyber Twin and 6G, yet the subjective nature of ontology engineering often leads to differing definitions of the same concept across ontologies, resulting in ontology heterogeneity. To solve this problem, this study introduces a hybrid ontology matching method that integrates a Recurrent Neural Network (RNN) with syntax-based analysis. The method first extracts representative entities by leveraging in-degree and out-degree information from ontological tree structures, which reduces training noise and improves model generalization. Next, a matching framework combining RNN and N-gram is designed: the RNN captures medium-distance dependencies and complex sequential patterns, supporting the dynamic optimization of embedding parameters and semantic feature extraction; the N-gram module further captures local information and relationships between adjacent characters, improving the coverage of matched entities. The experiments were conducted on the OAEI benchmark dataset, where the proposed method was compared with representative baseline methods from OAEI as well as a Transformer-based method. The results demonstrate that the proposed method achieved an 18.18% improvement in F-measure over the best-performing baseline. This improvement was statistically significant, as validated by the Friedman and Holm tests. Moreover, the proposed method achieves the shortest runtime among all the compared methods. Compared to other RNN-based hybrid frameworks that adopt classical structure-based and semantics-based similarity measures, the proposed method further improved the F-measure by 18.46%. Furthermore, a comparison of time and space complexity with the standalone RNN model and its variants demonstrated that the proposed method achieved high performance while maintaining favorable computational efficiency. 
These findings confirm the effectiveness and efficiency of the method in addressing ontology heterogeneity in complex IoE environments. Full article
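The abstract does not specify the exact N-gram similarity used; a common choice for character-level matching of entity labels, and a plausible sketch of the N-gram module's role, is the Dice coefficient over character bigrams (N = 2):

```python
def bigrams(s):
    """Character bigrams of a lowercase string (an N-gram with N = 2)."""
    s = s.lower()
    return [s[i:i + 2] for i in range(len(s) - 1)]

def dice_similarity(a, b):
    """Dice coefficient over character bigrams: 2*|common| / (|A| + |B|).
    Captures local information shared by adjacent characters."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 1.0 if a.lower() == b.lower() else 0.0
    common = 0
    rest = list(bb)
    for g in ba:
        if g in rest:          # count each shared bigram once
            rest.remove(g)
            common += 1
    return 2.0 * common / (len(ba) + len(bb))
```

In a hybrid matcher of this kind, such a syntax score would be combined with the RNN's embedding similarity before thresholding candidate entity pairs.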

20 pages, 5393 KiB  
Article
Resource-Efficient Decoding of Topological Color Codes via Neural-Guided Union-Find Optimization
by Minghao Fu, Cewen Tian, Zaixu Fan and Hongyang Ma
Appl. Sci. 2025, 15(16), 8937; https://doi.org/10.3390/app15168937 - 13 Aug 2025
Viewed by 147
Abstract
Quantum error correction (QEC) is crucial for achieving reliable quantum computation. Among topological QEC codes, color codes can correct bit-flip and phase-flip errors simultaneously, enabling efficient resource utilization. However, existing decoders such as the Union–Find (UF) algorithm exhibit limited accuracy under high noise levels. We propose a hybrid decoding framework that augments a modified UF algorithm—enhanced with a secondary growth strategy—with a lightweight recurrent neural network (RNN). The RNN refines the error chains identified by UF, improving resolution without significantly increasing computational overhead. The simulation results show that our method achieves notable accuracy gains over baseline UF decoding, particularly in high-error regimes, while preserving the near-linear runtime scaling and low memory footprint of UF. At higher physical error rates, RNN-based path optimization improves UF decoding accuracy by approximately 4.7%. The decoding threshold of the color code reaches 0.1365, representing an increase of about 2% compared to UF without RNN optimization. With its simple data structure and low space complexity, the proposed method is well suited for low-latency, resource-constrained quantum computing environments. Full article
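The core data structure behind Union-Find decoders is the disjoint-set forest, which merges syndrome clusters in near-linear time. A generic sketch (the paper's secondary growth strategy and RNN refinement are not shown):

```python
class UnionFind:
    """Disjoint-set forest with path compression and union by size,
    as used to merge error clusters in UF-style decoders."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, v):
        root = v
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[v] != root:      # path compression
            self.parent[v], v = root, self.parent[v]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:  # union by size
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
```

These two optimizations give the almost-constant amortized cost per operation that underlies UF decoding's near-linear runtime scaling.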
(This article belongs to the Topic Quantum Information and Quantum Computing, 2nd Volume)

24 pages, 5251 KiB  
Article
Artificial Intelligence-Based Sensorless Control of Induction Motors with Dual-Field Orientation
by Eniko Szoke, Csaba Szabo and Lucian-Nicolae Pintilie
Appl. Sci. 2025, 15(16), 8919; https://doi.org/10.3390/app15168919 - 13 Aug 2025
Viewed by 146
Abstract
This paper introduces a speed-sensorless dual-field-oriented control (DFOC) strategy for induction motors (IMs). DFOC combines the advantages of rotor- and stator-field orientation to significantly reduce the parameter sensitivity of the control regarding the generation of the converter control variable. A simplified structure is also proposed, using only two regulators for the flux and speed control, eliminating the two current regulators. Related to sensorless control, the classical adaptation mechanism within an MRAS (model reference adaptive system) observer is replaced with artificial intelligence (AI)-based approaches. Specifically, artificial neural networks (ANNs) and recurrent neural networks (RNNs) are employed for rotor speed estimation. They offer significant advantages in managing complex and nonlinear systems, providing enhanced flexibility and adaptability compared to traditional MRAS methods. The effectiveness of the proposed sensorless control scheme is validated through both simulation and real-time implementation. The paper focuses on the ANN and RNN architectures, as deep learning models, in terms of the reliability and accuracy of rotor speed estimation under various operating conditions. Full article
(This article belongs to the Special Issue New Trends in Sustainable Energy Technology)

16 pages, 22555 KiB  
Technical Note
A Hybrid RNN-CNN Approach with TPI for High-Precision DEM Reconstruction
by Ruizhe Cao, Chunjing Yao, Hongchao Ma, Bin Guo, Jie Wang and Junhao Xu
Remote Sens. 2025, 17(16), 2770; https://doi.org/10.3390/rs17162770 - 9 Aug 2025
Viewed by 373
Abstract
Digital elevation models (DEMs), as fundamental representations of terrain morphology, are crucial for understanding surface processes and for land use planning. However, automated classification faces challenges due to inefficient terrain feature extraction from raw LiDAR point clouds and the limitations of traditional methods in capturing fine-scale topographic variations. To address this, we propose a novel hybrid RNN-CNN framework that integrates multi-scale Topographic Position Index (TPI) features to enhance DEM generation. Our approach first models voxelated LiDAR point clouds as spatially ordered sequences, using Recurrent Neural Networks (RNNs) to encode vertical elevation dependencies and Convolutional Neural Networks (CNNs) to extract planar spatial features. By incorporating TPI as a semantic constraint, the model learns to distinguish terrain structures at multiple scales. Residual connections refine feature representations to preserve micro-topographic details during DEM reconstruction. Extensive experiments in the complex terrains of Jiuzhaigou, China, demonstrate that our lightweight hybrid framework not only achieves excellent DEM reconstruction accuracy in complex terrains, but also improves computational efficiency by more than 20% on average compared to traditional interpolation methods, making it highly suitable for resource-constrained applications. Full article
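The Topographic Position Index is conventionally defined as a cell's elevation minus the mean elevation of its neighborhood; computing it at several radii yields the multi-scale feature vector the framework uses as a semantic constraint. A minimal sketch on a raster grid (boundary handling and grid layout are assumptions, not the paper's exact formulation):

```python
def tpi(grid, row, col, radius):
    """Topographic Position Index at one cell: elevation minus the mean
    elevation of the square neighborhood of the given radius (center
    excluded). Positive values indicate ridges, negative values valleys."""
    rows, cols = len(grid), len(grid[0])
    total, count = 0.0, 0
    for r in range(max(0, row - radius), min(rows, row + radius + 1)):
        for c in range(max(0, col - radius), min(cols, col + radius + 1)):
            if (r, c) != (row, col):
                total += grid[r][c]
                count += 1
    return grid[row][col] - total / count

def multiscale_tpi(grid, row, col, radii):
    """TPI at several neighborhood radii, stacked as a feature vector."""
    return [tpi(grid, row, col, r) for r in radii]
```

A peak cell surrounded by lower terrain gets a positive TPI at every scale, which is exactly the kind of structure signal the hybrid network can condition on.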

23 pages, 6646 KiB  
Article
Short-Period Characteristics Analysis of On-Orbit Solar Arrays
by Huan Liu, Chenjie Kong, Yuan Shen, Baojun Lin, Xueliang Wang and Qiang Zhang
Aerospace 2025, 12(8), 706; https://doi.org/10.3390/aerospace12080706 - 9 Aug 2025
Viewed by 233
Abstract
Based on the analysis of solar array current data from a certain MEO-orbiting satellite, this paper reveals its short-period fluctuation characteristics and underlying mechanisms. The study finds that when solar panels face the sun during the light period, the output current exhibits significant short-period fluctuations in addition to being influenced by long-period factors such as sun–earth distance, incident light intensity changes, and space irradiation attenuation. Through theoretical analysis, we first confirm that the root cause of these short-period variations is the temperature change in the shunt circuit caused by load fluctuations, which in turn affects the output current characteristics. Unlike traditional methods that use static characteristic factors such as incident angles, this paper innovatively proposes using load current as a key characteristic factor. For asymmetric solar panel fault scenarios, load current, time phase, and fault-wing output current are used as characteristic factors to adaptively predict the current of normal wings. Meanwhile, feedforward neural network (FNN), Recurrent Neural Network (RNN), and long short-term memory (LSTM) are used for output current prediction. The experimental results show that these methods can accurately capture the short-period fluctuations caused by load mutations and adapt to the fluctuation trend of the normal wing during the prediction of current changes in the faulty wing. It is worth noting that, limited by the short-period fluctuation prediction scenario, the inherent advantage of LSTM in long-sequence prediction is not fully reflected. Full article
(This article belongs to the Section Astronautics & Space Science)

19 pages, 821 KiB  
Article
Multimodal Multisource Neural Machine Translation: Building Resources for Image Caption Translation from European Languages into Arabic
by Roweida Mohammed, Inad Aljarrah, Mahmoud Al-Ayyoub and Ali Fadel
Computation 2025, 13(8), 194; https://doi.org/10.3390/computation13080194 - 8 Aug 2025
Viewed by 209
Abstract
Neural machine translation (NMT) models combining textual and visual inputs generate more accurate translations compared with unimodal models. Moreover, translation models with an under-resourced target language benefit from multisource inputs (source sentences are provided in different languages). Building MultiModal MultiSource NMT (M3S-NMT) systems requires significant effort to curate datasets suitable for such a multifaceted task. This work uses image caption translation as an example of multimodal translation and presents a novel public dataset for translating captions from multiple European languages (viz., English, German, French, and Czech) into the distant and under-resourced Arabic language. Moreover, it presents multitask learning models trained and tested on this dataset to serve as solid baselines to help further research in this area. These models involve two parts: one for learning the visual representations of the input images, and the other for translating the textual input based on these representations. The translations are produced from a framework of attention-based encoder–decoder architectures. The visual features are learned from a pretrained convolutional neural network (CNN). These features are then integrated with textual features learned through the very basic yet well-known recurrent neural networks (RNNs) with GloVe or BERT word embeddings. Despite the challenges associated with the task at hand, the results of these systems are very promising, reaching 34.57 and 42.52 METEOR scores. Full article
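The attention-based encoder–decoder framework mentioned above weights encoder states by their relevance to the current decoding step. As an illustrative sketch only (the paper's exact attention variant is not specified), here is scaled dot-product attention in pure Python:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention: weight each value by the softmaxed
    similarity of its key to the query, then return the weighted sum."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]
    return context, weights
```

In a caption translator, the query would be the decoder state and the keys/values the RNN-encoded source tokens (or CNN image features), so the context vector focuses on whichever inputs best match the word being generated.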
(This article belongs to the Section Computational Social Science)

27 pages, 1523 KiB  
Article
Reinforcement Learning-Based Agricultural Fertilization and Irrigation Considering N2O Emissions and Uncertain Climate Variability
by Zhaoan Wang, Shaoping Xiao, Jun Wang, Ashwin Parab and Shivam Patel
AgriEngineering 2025, 7(8), 252; https://doi.org/10.3390/agriengineering7080252 - 7 Aug 2025
Viewed by 324
Abstract
Nitrous oxide (N2O) emissions from agriculture are rising due to increased fertilizer use and intensive farming, posing a major challenge for climate mitigation. This study introduces a novel reinforcement learning (RL) framework to optimize farm management strategies that balance crop productivity with environmental impact, particularly N2O emissions. By modeling agricultural decision-making as a partially observable Markov decision process (POMDP), the framework accounts for uncertainties in environmental conditions and observational data. The approach integrates deep Q-learning with recurrent neural networks (RNNs) to train adaptive agents within a simulated farming environment. A Probabilistic Deep Learning (PDL) model was developed to estimate N2O emissions, achieving a high Prediction Interval Coverage Probability (PICP) of 0.937 within a 95% confidence interval on the available dataset. While the PDL model’s generalizability is currently constrained by the limited observational data, the RL framework itself is designed for broad applicability, capable of extending to diverse agricultural practices and environmental conditions. Results demonstrate that RL agents reduce N2O emissions without compromising yields, even under climatic variability. The framework’s flexibility allows for future integration of expanded datasets or alternative emission models, ensuring scalability as more field data becomes available. This work highlights the potential of artificial intelligence to advance climate-smart agriculture by simultaneously addressing productivity and sustainability goals in dynamic real-world settings. Full article
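The PICP metric reported for the emissions model has a simple definition: the fraction of observed values that fall inside their predicted intervals. A minimal sketch:

```python
def picp(y_true, lower, upper):
    """Prediction Interval Coverage Probability: fraction of observations
    that land inside their predicted [lower, upper] intervals. A value
    close to the nominal level (e.g. 0.95) indicates well-calibrated
    intervals."""
    inside = sum(1 for y, lo, hi in zip(y_true, lower, upper) if lo <= y <= hi)
    return inside / len(y_true)
```

The paper's reported PICP of 0.937 against a 95% nominal interval means roughly 94% of held-out N2O observations fell inside the predicted bands.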
(This article belongs to the Special Issue Implementation of Artificial Intelligence in Agriculture)

30 pages, 1142 KiB  
Review
Beyond the Backbone: A Quantitative Review of Deep-Learning Architectures for Tropical Cyclone Track Forecasting
by He Huang, Difei Deng, Liang Hu, Yawen Chen and Nan Sun
Remote Sens. 2025, 17(15), 2675; https://doi.org/10.3390/rs17152675 - 2 Aug 2025
Viewed by 351
Abstract
Accurate forecasting of tropical cyclone (TC) tracks is critical for disaster preparedness and risk mitigation. While traditional numerical weather prediction (NWP) systems have long served as the backbone of operational forecasting, they face limitations in computational cost and sensitivity to initial conditions. In recent years, deep learning (DL) has emerged as a promising alternative, offering data-driven modeling capabilities for capturing nonlinear spatiotemporal patterns. This paper presents a comprehensive review of DL-based approaches for TC track forecasting. We categorize all DL-based TC tracking models according to the architecture, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), Transformers, graph neural networks (GNNs), generative models, and Fourier-based operators. To enable rigorous performance comparison, we introduce a Unified Geodesic Distance Error (UGDE) metric that standardizes evaluation across diverse studies and lead times. Based on this metric, we conduct a critical comparison of state-of-the-art models and identify key insights into their relative strengths, limitations, and suitable application scenarios. Building on this framework, we conduct a critical cross-model analysis that reveals key trends, performance disparities, and architectural tradeoffs. Our analysis also highlights several persistent challenges, such as long-term forecast degradation, limited physical integration, and generalization to extreme events, pointing toward future directions for developing more robust and operationally viable DL models for TC track forecasting. To support reproducibility and facilitate standardized evaluation, we release an open-source UGDE conversion tool on GitHub. Full article
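A geodesic distance error of the kind the UGDE metric standardizes is typically computed with the haversine formula between forecast and observed storm positions. A sketch under the usual spherical-Earth assumption (the review's exact UGDE conversion may differ):

```python
import math

def geodesic_error_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle (haversine) distance between a forecast and an
    observed TC position, in kilometres, assuming a spherical Earth."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))
```

Averaging this distance over forecasts at a fixed lead time gives the per-model track error that makes cross-study comparison possible.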
(This article belongs to the Section AI Remote Sensing)

25 pages, 2515 KiB  
Article
Solar Agro Savior: Smart Agricultural Monitoring Using Drones and Deep Learning Techniques
by Manu Mundappat Ramachandran, Bisni Fahad Mon, Mohammad Hayajneh, Najah Abu Ali and Elarbi Badidi
Agriculture 2025, 15(15), 1656; https://doi.org/10.3390/agriculture15151656 - 1 Aug 2025
Viewed by 466
Abstract
The Solar Agro Savior (SAS) is an innovative drone-assisted solution for sustainable water utilization and plant disease observation in the agriculture sector. The system integrates an alerting mechanism for the humidity, moisture, and temperature variations that affect plant health, and optimizes water utilization to enhance crop yield. A key feature is efficient monitoring of large regions through the drones’ high-resolution cameras, which enables real-time responses and alerts the authorities to environmental fluctuations. Machine learning, particularly recurrent neural networks, is incorporated into the intelligent monitoring system for agriculture and pest control. The proposed system uses a specialized form of recurrent neural network, Long Short-Term Memory (LSTM), which effectively addresses the vanishing gradient problem, together with an attention-based mechanism that enables the model to assign meaningful weights to the most important parts of the data sequence. This approach not only enhances water utilization efficiency but also boosts plant yield and strengthens pest control. The system further supports sustainability through the re-utilization of water and by powering the built-in irrigation system with solar panels instead of grid electricity. A comparative analysis against other machine learning approaches in the agriculture sector is also presented; the proposed system achieved 99% accuracy, 97.8% precision, 98.4% recall, and a 98.4% F1 score. By combining solar-powered irrigation with artificial intelligence-driven analysis, the proposed Solar Agro Savior establishes a sustainable framework for modern agriculture that protects the environment and the community. Full article
(This article belongs to the Section Agricultural Technology)

21 pages, 4147 KiB  
Article
OLTEM: Lumped Thermal and Deep Neural Model for PMSM Temperature
by Yuzhong Sheng, Xin Liu, Qi Chen, Zhenghao Zhu, Chuangxin Huang and Qiuliang Wang
AI 2025, 6(8), 173; https://doi.org/10.3390/ai6080173 - 31 Jul 2025
Viewed by 407
Abstract
Background and Objective: Temperature management is key for reliable operation of permanent magnet synchronous motors (PMSMs). The lumped-parameter thermal network (LPTN) is fast and interpretable but struggles with nonlinear behavior under high power density. We propose OLTEM, a physics-informed deep model that combines LPTN with a thermal neural network (TNN) to improve prediction accuracy while keeping physical meaning. Methods: OLTEM embeds LPTN into a recurrent state-space formulation and learns three parameter sets: thermal conductance, inverse thermal capacitance, and power loss. Two additions are introduced: (i) a state-conditioned squeeze-and-excitation (SC-SE) attention that adapts feature weights using the current temperature state, and (ii) an enhanced power-loss sub-network that uses a deep MLP with SC-SE and non-negativity constraints. The model is trained and evaluated on the public Electric Motor Temperature dataset (Paderborn University/Kaggle). Performance is measured by mean squared error (MSE) and maximum absolute error across permanent-magnet, stator-yoke, stator-tooth, and stator-winding temperatures. Results: OLTEM tracks fast thermal transients and yields lower MSE than both the baseline TNN and a CNN–RNN model for all four components. On a held-out generalization set, MSE remains below 4.0 °C² and the maximum absolute error is about 4.3–8.2 °C. Ablation shows that removing either SC-SE or the enhanced power-loss module degrades accuracy, confirming their complementary roles. Conclusions: By combining physics with learned attention and loss modeling, OLTEM improves PMSM temperature prediction while preserving interpretability. This approach can support motor thermal design and control; future work will study transfer to other machines and further reduce short-term errors during abrupt operating changes. Full article
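The LPTN that OLTEM embeds in its recurrent state-space formulation boils down to a first-order heat-balance update per node, driven by conductances, inverse capacitances, and power losses. A minimal explicit-Euler sketch of that physics (OLTEM learns these parameter sets rather than fixing them; values here are hypothetical):

```python
def lptn_step(T, G, C_inv, P, dt):
    """One explicit-Euler update of a lumped-parameter thermal network:
    dT_i/dt = C_inv[i] * (sum_j G[i][j] * (T[j] - T[i]) + P[i]),
    where G is the conductance matrix, C_inv the inverse thermal
    capacitances, and P the power losses injected at each node."""
    n = len(T)
    return [
        T[i] + dt * C_inv[i] * (
            sum(G[i][j] * (T[j] - T[i]) for j in range(n)) + P[i]
        )
        for i in range(n)
    ]
```

Unrolling this update over time is exactly a recurrent state-space model, which is why a TNN/attention module can be wrapped around it without discarding the physical structure.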

20 pages, 1346 KiB  
Article
Integrated Smart Farm System Using RNN-Based Supply Scheduling and UAV Path Planning
by Dongwoo You, Yukai Chen and Donkyu Baek
Drones 2025, 9(8), 531; https://doi.org/10.3390/drones9080531 - 28 Jul 2025
Viewed by 505
Abstract
Smart farming has emerged as a promising solution to address challenges such as climate change, population growth, and limited agricultural infrastructure. To enhance the operational efficiency of smart farms, this paper proposes an integrated system that combines Recurrent Neural Networks (RNNs) and Unmanned Aerial Vehicles (UAVs). The proposed framework forecasts future resource shortages using an RNN model and recent environmental data collected from the field. Based on these forecasts, the system schedules a resource supply plan and determines the UAV path by considering both dynamic energy consumption and priority levels, aiming to maximize the efficiency of the resource supply. Experimental results show that the proposed integrated smart farm framework achieves an average reduction of 81.08% in the supply miss rate. This paper demonstrates the potential of an integrated AI- and UAV-based smart farm management system in achieving both environmental responsiveness and operational optimization. Full article
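The paper does not spell out how its supply miss rate is computed; one plausible definition, sketched here purely for illustration, counts the scheduled events where the delivered amount failed to cover the forecast demand:

```python
def supply_miss_rate(demand, delivered):
    """Hypothetical supply miss rate: the fraction of supply events where
    the delivered amount fell short of the forecast resource demand."""
    misses = sum(1 for d, s in zip(demand, delivered) if s < d)
    return misses / len(demand)
```

Under such a definition, the reported 81.08% average reduction would mean the RNN-scheduled UAV plan leaves far fewer demand windows under-supplied than the baseline.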
(This article belongs to the Section Drones in Agriculture and Forestry)

24 pages, 2815 KiB  
Article
Blockchain-Powered LSTM-Attention Hybrid Model for Device Situation Awareness and On-Chain Anomaly Detection
by Qiang Zhang, Caiqing Yue, Xingzhe Dong, Guoyu Du and Dongyu Wang
Sensors 2025, 25(15), 4663; https://doi.org/10.3390/s25154663 - 28 Jul 2025
Viewed by 344
Abstract
With the increasing scale of industrial devices and the growing complexity of multi-source heterogeneous sensor data, traditional methods struggle to address challenges in fault detection, data security, and trustworthiness. Ensuring tamper-proof data storage and improving prediction accuracy for imbalanced anomaly detection for potential deployment in the Industrial Internet of Things (IIoT) remain critical issues. This study proposes a blockchain-powered Long Short-Term Memory Network (LSTM)–Attention hybrid model: an LSTM-based Encoder–Attention–Decoder (LEAD) for industrial device anomaly detection. The model utilizes an encoder–attention–decoder architecture for processing multivariate time series data generated by industrial sensors and smart contracts for automated on-chain data verification and tampering alerts. Experiments on real-world datasets demonstrate that the LEAD achieves an F0.1 score of 0.96, outperforming baseline models (Recurrent Neural Network (RNN): 0.90; LSTM: 0.94; Bi-directional LSTM (Bi-LSTM): 0.94). We simulate the system using a private FISCO-BCOS network with a multi-node setup to demonstrate contract execution, anomaly data upload, and tamper alert triggering. The blockchain system successfully detects unauthorized access and data tampering, offering a scalable solution for device monitoring. Full article
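The F0.1 score used above is the F-beta measure with beta = 0.1, which weights precision far more heavily than recall, a common choice when false alarms are costlier than misses in imbalanced anomaly detection. A minimal sketch:

```python
def f_beta(tp, fp, fn, beta):
    """F-beta score from raw counts. beta < 1 (e.g. 0.1) emphasizes
    precision; beta = 1 recovers the familiar F1 harmonic mean."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

With beta = 0.1, a detector with perfect precision but 50% recall scores about 0.99, while one with perfect recall but 50% precision scores only about 0.50, which is why F0.1 suits alert-style deployments.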
(This article belongs to the Section Internet of Things)

26 pages, 5325 KiB  
Article
Spatiotemporal Dengue Forecasting for Sustainable Public Health in Bandung, Indonesia: A Comparative Study of Classical, Machine Learning, and Bayesian Models
by I Gede Nyoman Mindra Jaya, Yudhie Andriyana, Bertho Tantular, Sinta Septi Pangastuti and Farah Kristiani
Sustainability 2025, 17(15), 6777; https://doi.org/10.3390/su17156777 - 25 Jul 2025
Viewed by 498
Abstract
Accurate dengue forecasting is essential for sustainable public health planning, especially in tropical regions where the disease remains a persistent threat. This study evaluates the predictive performance of seven modeling approaches—Seasonal Autoregressive Integrated Moving Average (SARIMA), Extreme Gradient Boosting (XGBoost), Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Bidirectional LSTM (BiLSTM), Convolutional LSTM (CNN–LSTM), and a Bayesian spatiotemporal model—using monthly dengue incidence data from 2009 to 2023 in Bandung City, Indonesia. Model performance was assessed using MAE, sMAPE, RMSE, and Pearson’s correlation (R). Among all models, the Bayesian spatiotemporal model achieved the best performance, with the lowest MAE (5.543), sMAPE (62.137), and RMSE (7.482), and the highest R (0.723). While SARIMA and XGBoost showed signs of overfitting, the Bayesian model not only delivered more accurate forecasts but also produced spatial risk estimates and identified high-risk hotspots via exceedance probabilities. These features make it particularly valuable for developing early warning systems and guiding targeted public health interventions, supporting the broader goals of sustainable disease management. Full article
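The three error metrics used to rank the seven models have standard closed forms; here is a sketch of MAE, RMSE, and one common sMAPE variant (the study's exact sMAPE convention is not stated, so the form below is an assumption):

```python
import math

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean squared error; penalizes large misses more than MAE."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def smape(y, yhat):
    """Symmetric MAPE in percent, using the mean-of-magnitudes
    denominator; assumes no pair has both values equal to zero."""
    return 100.0 / len(y) * sum(
        abs(a - b) / ((abs(a) + abs(b)) / 2) for a, b in zip(y, yhat)
    )
```

Reporting all three together, as the study does, guards against a model that looks good on average (low MAE) while occasionally missing outbreak peaks badly (high RMSE).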
(This article belongs to the Section Health, Well-Being and Sustainability)

30 pages, 9222 KiB  
Article
Using Deep Learning in Forecasting the Production of Electricity from Photovoltaic and Wind Farms
by Michał Pikus, Jarosław Wąs and Agata Kozina
Energies 2025, 18(15), 3913; https://doi.org/10.3390/en18153913 - 23 Jul 2025
Viewed by 369
Abstract
Accurate forecasting of electricity production is crucial for the stability of the entire energy sector. However, predicting future renewable energy production and its value is difficult due to the complex processes that affect production using renewable energy sources. In this article, we examine the performance of basic deep learning models for electricity forecasting. We designed deep learning models including recurrent neural networks (RNNs), mainly based on long short-term memory (LSTM) networks; gated recurrent units (GRUs); convolutional neural networks (CNNs); temporal fusion transformers (TFTs); and combined architectures. To achieve this goal, we created our own benchmarks and used tools that automatically select network architectures and parameters. Data were obtained as part of the NCBR grant (the National Center for Research and Development, Poland). These data contain daily records of all the recorded parameters from individual solar and wind farms over the past three years. The experimental results indicate that the LSTM models significantly outperformed the other models in terms of forecasting. In this paper, multilayer deep neural network (DNN) architectures are described, and the results are provided for all the methods. This publication is based on the results obtained within the framework of the research and development project “POIR.01.01.01-00-0506/21”, realized in the years 2022–2023. The project was co-financed by the European Union under the Smart Growth Operational Programme 2014–2020. Full article
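Before any of the sequence models above can be trained on daily farm records, the series must be split into fixed-length input windows and forecast targets. A minimal sketch of that preprocessing step (window sizes here are arbitrary examples, not the paper's settings):

```python
def make_windows(series, lookback, horizon):
    """Split a time series into (input window, target window) pairs for
    training sequence models such as LSTMs or GRUs: each input holds
    `lookback` consecutive values, each target the next `horizon` values."""
    pairs = []
    for i in range(len(series) - lookback - horizon + 1):
        x = series[i:i + lookback]
        y = series[i + lookback:i + lookback + horizon]
        pairs.append((x, y))
    return pairs
```

The same windowing feeds every architecture in a benchmark like this one, which keeps the model comparison fair: only the network differs, not the data it sees.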
