Search Results (1,114)

Search Parameters:
Keywords = power safety data

29 pages, 1289 KiB  
Article
An Analysis of Hybrid Management Strategies for Addressing Passenger Injuries and Equipment Failures in the Taipei Metro System: Enhancing Operational Quality and Resilience
by Sung-Neng Peng, Chien-Yi Huang, Hwa-Dong Liu and Ping-Jui Lin
Mathematics 2025, 13(15), 2470; https://doi.org/10.3390/math13152470 - 31 Jul 2025
Viewed by 195
Abstract
This study is the first to systematically integrate supervised machine learning (decision tree) and association rule mining techniques to analyze accident data from the Taipei Metro system, conducting a large-scale data-driven investigation into both passenger injury and train malfunction events. The research demonstrates strong novelty and practical contributions. In the passenger injury analysis, a dataset of 3331 cases was examined, from which two highly explanatory rules were extracted: (i) elderly passengers (aged > 61) involved in station incidents are more likely to suffer moderate to severe injuries; and (ii) younger passengers (aged ≤ 61) involved in escalator incidents during off-peak hours are also at higher risk of severe injury. This is the first study to quantitatively reveal the interactive effect of age and time of use on injury severity. In the train malfunction analysis, 1157 incidents with delays exceeding five minutes were analyzed. The study identified high-risk condition combinations—such as those involving rolling stock, power supply, communication, and signaling systems—associated with specific seasons and time periods (e.g., a lift value of 4.0 for power system failures during clear mornings from 06:00–12:00, and 3.27 for communication failures during summer evenings from 18:00–24:00). These findings were further cross-validated with maintenance records to uncover underlying causes, including brake system failures, cable aging, and automatic train operation (ATO) module malfunctions. Targeted preventive maintenance recommendations were proposed. Additionally, the study highlighted existing gaps in the completeness and consistency of maintenance records, recommending improvements in documentation standards and data auditing mechanisms. Overall, this research presents a new paradigm for intelligent metro system maintenance and safety prediction, offering substantial potential for broader adoption and practical application. Full article
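The lift values quoted in this abstract (e.g. 4.0 for power-system failures on clear mornings) come from the standard association-rule statistic lift(A -> B) = P(A and B) / (P(A) P(B)). A minimal sketch in Python, using hypothetical counts chosen only to illustrate the arithmetic, not the paper's data:

```python
def lift(n_total, n_a, n_b, n_ab):
    """Lift of rule A -> B: P(A and B) / (P(A) * P(B)).

    Computed on raw counts as (n_ab * n_total) / (n_a * n_b),
    which is algebraically the same ratio."""
    return (n_ab * n_total) / (n_a * n_b)

# Hypothetical counts: of 1000 incidents, 100 fall on clear mornings (A),
# 50 are power-supply failures (B), and 20 are both.
print(lift(1000, 100, 50, 20))  # -> 4.0
```

A lift above 1 means the antecedent and consequent co-occur more often than independence would predict, so a lift of 4.0 marks a strongly associated condition combination.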

13 pages, 3360 KiB  
Review
Technological Advances in Pre-Operative Planning
by Mikolaj R. Kowal, Mohammed Ibrahim, André L. Mihaljević, Philipp Kron and Peter Lodge
J. Clin. Med. 2025, 14(15), 5385; https://doi.org/10.3390/jcm14155385 - 30 Jul 2025
Viewed by 206
Abstract
Surgery remains a healthcare intervention with significant risks for patients. Novel technologies can now enhance the peri-operative workflow, with artificial intelligence (AI) and extended reality (XR) to assist with pre-operative planning. This review focuses on innovation in AI, XR and imaging for hepato-biliary surgery planning. The clinical challenges in hepato-biliary surgery arise from heterogeneity of clinical presentations, the need for multiple imaging modalities and highly variable local anatomy. AI-based models have been developed for risk prediction and multi-disciplinary tumor (MDT) board meetings. The future could involve an on-demand and highly accurate AI-powered decision tool for hepato-biliary surgery, assisting the surgeon to make the most informed decision on the treatment plan, conferring the best possible outcome for individual patients. Advances in AI can also be used to automate image interpretation and 3D modelling, enabling fast and accurate 3D reconstructions of patient anatomy. Surgical navigation systems utilizing XR are already in development, showing an early signal towards improved patient outcomes when used for hepato-biliary surgery. Live visualization of hepato-biliary anatomy in the operating theatre is likely to improve operative safety and performance. The technological advances in AI and XR provide new applications in pre-operative planning with potential for patient benefit. Their use in surgical simulation could accelerate learning curves for surgeons in training. Future research must focus on standardization of AI and XR study reporting, robust databases that are ethically and data protection-compliant, and development of inter-disciplinary tools for various healthcare applications and systems. Full article
(This article belongs to the Special Issue Surgical Precision: The Impact of AI and Robotics in General Surgery)

21 pages, 964 KiB  
Article
A Data-Driven Strategy Assisted by Effective Parameter Optimization for Cable Fault Diagnosis in the Secondary Circuit of a Substation
by Dongbin Yu, Yanjing Zhang, Sijin Luo, Wei Zou, Junting Liu, Zhiyong Ran and Wei Liu
Processes 2025, 13(8), 2407; https://doi.org/10.3390/pr13082407 - 29 Jul 2025
Viewed by 194
Abstract
As power systems evolve rapidly, cables, essential for electric power transmission, demand accurate and timely fault diagnosis to ensure grid safety and stability. However, current cable fault diagnosis technologies often struggle with incomplete feature extraction from complex fault signals and inefficient parameter tuning in diagnostic models, hindering efficient and precise fault detection in modern power systems. To address these issues, this paper proposes a data-driven strategy for cable fault diagnosis in substation secondary circuits, enhanced by effective parameter optimization. Initially, wavelet packet decomposition is employed to finely divide collected cable fault current signals into multiple levels and frequency bands, effectively extracting fault feature vectors. To tackle the challenge of selecting penalty and kernel parameters in Support Vector Machine (SVM) models, an improved Golden Jackal Optimization (GJO) algorithm is introduced. This algorithm simulates the predatory behavior of golden jackals in nature, enabling efficient global optimization of SVM parameters and significantly improving the classification accuracy and generalization capability of the fault diagnosis model. Simulation verification using real cable fault cases confirms that the proposed method outperforms traditional techniques in fault recognition accuracy, diagnostic speed, and robustness, proving its effectiveness and feasibility. This study offers a novel and efficient solution for cable fault diagnosis. Full article
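The feature-extraction step described here, wavelet packet decomposition into levels and bands followed by a per-signal feature vector, can be sketched with a hand-rolled Haar filter. The Haar wavelet, the signal, and the two-level depth are illustrative assumptions (the abstract does not name a mother wavelet), and the GJO-tuned SVM stage is omitted:

```python
import numpy as np

def haar_split(x):
    # One Haar analysis step: half-length approximation and detail bands.
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def wp_band_energies(x, levels=2):
    # Wavelet-packet tree: split every band at every level, then use the
    # normalized per-band energies as the fault feature vector.
    bands = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        bands = [half for b in bands for half in haar_split(b)]
    e = np.array([np.sum(b ** 2) for b in bands])
    return e / e.sum()

rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.normal(size=64)
feats = wp_band_energies(sig, levels=2)   # 2**2 = 4 band-energy features
```

The resulting normalized energy vector is the kind of compact feature representation an SVM classifier can then be trained on.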

14 pages, 884 KiB  
Article
Evaluating the Safety and Cost-Effectiveness of Shoulder Rumble Strips and Road Lighting on Freeways in Saudi Arabia
by Saif Alarifi and Khalid Alkahtani
Sustainability 2025, 17(15), 6868; https://doi.org/10.3390/su17156868 - 29 Jul 2025
Viewed by 236
Abstract
This study examines the safety and cost-effectiveness of implementing shoulder rumble strips (SRS) and road lighting on Saudi Arabian freeways, providing insights into their roles in fostering sustainable transport systems. By leveraging the Highway Safety Manual (HSM) framework, this research develops localized Crash Modification Factors (CMFs) for these interventions, ensuring evidence-based and context-specific evaluations. Data were collected for two periods—pre-pandemic (2017–2019) and post-pandemic (2021–2022). For each period, we obtained traffic crash records from the Saudi Highway Patrol database, traffic volume data from the Ministry of Transport and Logistic Services’ automated count stations, and roadway characteristics and pavement-condition metrics from the National Road Safety Center. The findings reveal that SRS reduces fatal and injury run-off-road crashes by 52.7% (CMF = 0.473) with a benefit–cost ratio of 14.12, highlighting their high cost-effectiveness. Road lighting, focused on nighttime crash reduction, decreases such crashes by 24% (CMF = 0.760), with a benefit–cost ratio of 1.25, although the adoption of solar-powered lighting systems offers potential for greater sustainability gains and a higher benefit–cost ratio. These interventions align with global sustainability goals by enhancing road safety, reducing the socio-economic burden of crashes, and promoting the integration of green technologies. This study not only provides actionable insights for achieving KSA Vision 2030’s target of improved road safety but also demonstrates how engineering solutions can be harmonized with sustainability objectives to advance equitable, efficient, and environmentally responsible transportation systems. Full article
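The CMF and percentage figures reported above are related by simple arithmetic: a Crash Modification Factor is the ratio of expected crashes with the countermeasure to without it, so the implied reduction is (1 - CMF) x 100. A small sketch reproducing the two headline reductions; the benefit-cost inputs in the last line are hypothetical placeholders, not the paper's cost data:

```python
def crash_reduction_pct(cmf):
    """Percent crash reduction implied by a Crash Modification Factor."""
    return (1 - cmf) * 100

def benefit_cost_ratio(crashes_prevented, cost_per_crash, program_cost):
    """Simple B/C ratio: monetized crash savings over countermeasure cost."""
    return crashes_prevented * cost_per_crash / program_cost

print(round(crash_reduction_pct(0.473), 1))  # -> 52.7 (shoulder rumble strips)
print(round(crash_reduction_pct(0.760), 1))  # -> 24.0 (road lighting)
print(benefit_cost_ratio(10, 7, 5))          # hypothetical inputs -> 14.0
```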

13 pages, 3887 KiB  
Article
Exploring 3D Roadway Modeling Techniques Using CAD and Unity3D
by Yingbing Yang, Yunchuan Sun and Yuhong Wang
Processes 2025, 13(8), 2399; https://doi.org/10.3390/pr13082399 - 28 Jul 2025
Viewed by 200
Abstract
To tackle the inefficiencies in 3D mine tunnel modeling and the tedious task of drawing centerlines, this study introduces a faster method for generating centerlines using CAD secondary development. Starting with the tunnel centerline, the research then dives into techniques for creating detailed 3D tunnel models. The team first broke down the steps and logic behind tunnel modeling, designing a 3D tunnel framework and its data structure—complete with key geometric components like traverse points, junctions, nodes, and centerlines. By refining older centerline drawing techniques, they built a CAD-powered tool that slashes time and effort. The study also harnessed advanced algorithms, such as surface fitting and curve lofting, to swiftly model tricky tunnel sections like curves and crossings. This method fixes common problems like warped or incomplete surfaces in linked tunnel models, delivering precise and lifelike 3D scenes for VR-based mining safety drills and simulations. Full article

19 pages, 3636 KiB  
Article
Research on Wellbore Trajectory Prediction Based on a Pi-GRU Model
by Hanlin Liu, Yule Hu and Zhenkun Wu
Appl. Sci. 2025, 15(15), 8317; https://doi.org/10.3390/app15158317 - 26 Jul 2025
Viewed by 196
Abstract
Accurate wellbore trajectory prediction is of great significance for enhancing the efficiency and safety of directional drilling in coal mines. However, traditional mechanical analysis methods have high computational complexity, and existing data-driven models cannot fully integrate non-sequential features such as stratum lithology. To solve these problems, this study proposes a parallel input gated recurrent unit (Pi-GRU) model based on the TensorFlow framework. The GRU network captures the temporal dependencies of sequence data (such as dip angle and azimuth angle), while the BP neural network extracts deep correlations from non-sequential features (such as stratum lithology), thereby achieving multi-source data fusion modeling. Orthogonal experimental design was adopted to optimize the model hyperparameters, and an ablation experiment confirmed the necessity of the parallel architecture. Experimental results based on data from a coal mine in Shanxi Province show that the mean square errors (MSE) of the azimuth and dip angles predicted by the Pi-GRU model are 0.06° and 0.01°, respectively; compared with the emerging CNN-BiLSTM model, these are reductions of 66.67% and 76.92%, respectively. To evaluate the generalization performance of the model, we conducted cross-scenario validation on the dataset of the Dehong Coal Mine. The results showed that even under unknown geological conditions, the Pi-GRU model could still maintain high-precision predictions. The Pi-GRU model not only outperforms existing methods in prediction accuracy, with an inference delay of only 0.21 milliseconds, but also requires far less computing power for training and inference than the maximum capacity of the Jetson TX2 hardware, demonstrating good practicability and deployability in the engineering field. It provides a new approach for real-time wellbore trajectory correction in intelligent drilling systems and shows strong application potential. Full article
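The parallel architecture described, a GRU branch for the sequential survey data fused with a BP/MLP branch for non-sequential features, can be sketched as a single untrained forward pass in NumPy. All shapes, the one-hot lithology encoding, and the 8-unit hidden size are illustrative assumptions; the paper's model is built and trained in TensorFlow:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (NumPy) for the sequential branch."""
    def __init__(self, n_in, n_h):
        s = 0.1
        self.Wz = rng.normal(0, s, (n_h, n_in + n_h)); self.bz = np.zeros(n_h)
        self.Wr = rng.normal(0, s, (n_h, n_in + n_h)); self.br = np.zeros(n_h)
        self.Wh = rng.normal(0, s, (n_h, n_in + n_h)); self.bh = np.zeros(n_h)
        self.n_h = n_h

    def run(self, seq):
        h = np.zeros(self.n_h)
        for x in seq:
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh + self.bz)      # update gate
            r = sigmoid(self.Wr @ xh + self.br)      # reset gate
            hh = np.tanh(self.Wh @ np.concatenate([x, r * h]) + self.bh)
            h = (1 - z) * h + z * hh
        return h

# Parallel-input fusion: GRU summary of the survey sequence (dip, azimuth)
# concatenated with an MLP embedding of non-sequential features
# (here a hypothetical one-hot lithology class).
gru = GRUCell(n_in=2, n_h=8)
W_mlp = rng.normal(0, 0.1, (4, 3))       # lithology -> 4-dim embedding
W_out = rng.normal(0, 0.1, (2, 8 + 4))   # head predicting (dip, azimuth)

seq = rng.normal(0, 1, (10, 2))          # 10 survey stations
litho = np.array([0.0, 1.0, 0.0])        # one-hot lithology class
fused = np.concatenate([gru.run(seq), np.tanh(W_mlp @ litho)])
pred = W_out @ fused                      # untrained forward pass
```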
(This article belongs to the Section Computing and Artificial Intelligence)

17 pages, 1192 KiB  
Article
A Power Monitor System Cybersecurity Alarm-Tracing Method Based on Knowledge Graph and GCNN
by Tianhao Ma, Juan Yu, Binquan Wang, Maosheng Gao, Zhifang Yang, Yajie Li and Mao Fan
Appl. Sci. 2025, 15(15), 8188; https://doi.org/10.3390/app15158188 - 23 Jul 2025
Viewed by 151
Abstract
Ensuring cybersecurity in power monitoring systems is of paramount importance to maintain the operational safety and stability of modern power grids. With the rapid expansion of grid infrastructure and the increasing sophistication of cyber threats, existing manual alarm-tracing methods face significant challenges in handling the massive volume of security alerts, leading to delayed responses and potential system vulnerabilities. Current approaches often lack the capability to effectively model complex relationships among alerts and are hindered by imbalanced data distributions, which degrade tracing accuracy. To this end, this paper proposes a power monitor system cybersecurity alarm-tracing method based on a knowledge graph (KG) and graph convolutional neural networks (GCNN). Specifically, a cybersecurity KG is constructed from historical alerts, accurately representing the entities and relationships in massive alert data. Then, a GCNN with attention mechanisms is applied to extract the topological features of alarms in the KG, enabling precise and efficient tracing at scale. Most importantly, to mitigate the influence of imbalanced alarm distributions on tracing, a specialized data-processing and model-ensemble strategy based on adaptively weighted imbalanced samples is proposed. Finally, on 70,000 alarm records from a regional power grid, the proposed method achieved an alarm-traceability accuracy of 96.59%; moreover, compared with the traditional manual method, traceability efficiency improved by more than 80%. Full article
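The graph-convolution step at the heart of such a pipeline is, in its plainest form, symmetric-normalized neighborhood averaging followed by a learned projection. A minimal single-layer sketch; the toy adjacency matrix, feature sizes, and random weights are illustrative, and the paper's network additionally uses attention and model ensembling:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1 / np.sqrt(d))
    return np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# Toy alarm graph: 4 alert nodes, edges for alerts linked in the KG.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], float)
H = np.random.default_rng(1).normal(size=(4, 5))   # 5 features per alert
W = np.random.default_rng(2).normal(size=(5, 3))   # learned projection
H1 = gcn_layer(A, H, W)                            # (4, 3) node embeddings
```

Stacking such layers lets each alert's embedding absorb information from its KG neighborhood, which is what enables tracing by relationship rather than by isolated alert content.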
(This article belongs to the Special Issue Design, Optimization and Control Strategy of Smart Grids)

26 pages, 2875 KiB  
Article
Sustainable THz SWIPT via RIS-Enabled Sensing and Adaptive Power Focusing: Toward Green 6G IoT
by Sunday Enahoro, Sunday Cookey Ekpo, Mfonobong Uko, Fanuel Elias, Rahul Unnikrishnan, Stephen Alabi and Nurudeen Kolawole Olasunkanmi
Sensors 2025, 25(15), 4549; https://doi.org/10.3390/s25154549 - 23 Jul 2025
Viewed by 326
Abstract
Terahertz (THz) communications and simultaneous wireless information and power transfer (SWIPT) hold the potential to energize battery-less Internet-of-Things (IoT) devices while enabling multi-gigabit data transmission. However, severe path loss, blockages, and rectifier nonlinearity significantly hinder both throughput and harvested energy. Additionally, high-power THz beams pose safety concerns by potentially exceeding specific absorption rate (SAR) limits. We propose a sensing-adaptive power-focusing (APF) framework in which a reconfigurable intelligent surface (RIS) embeds low-rate THz sensors. Real-time backscatter measurements construct a spatial map used for the joint optimisation of (i) RIS phase configurations, (ii) multi-tone SWIPT waveforms, and (iii) nonlinear power-splitting ratios. A weighted MMSE inner loop maximizes the data rate, while an outer alternating optimisation applies semidefinite relaxation to enforce passive-element constraints and SAR compliance. Full-stack simulations at 0.3 THz with 20 GHz bandwidth and up to 256 RIS elements show that APF (i) improves the rate–energy Pareto frontier by 30–75% over recent adaptive baselines; (ii) achieves a 150% gain in harvested energy and a 440 Mbps peak per-user rate; (iii) reduces energy-efficiency variance by half while maintaining a Jain fairness index of 0.999; and (iv) caps SAR at 1.6 W/kg, which is 20% below the IEEE C95.1 safety threshold. The algorithm converges in seven iterations and executes within <3 ms on a Cortex-A78 processor, ensuring compliance with real-time 6G control budgets. The proposed architecture supports sustainable THz-powered networks for smart factories, digital-twin logistics, wire-free extended reality (XR), and low-maintenance structural health monitors, combining high-capacity communication, safe wireless power transfer, and carbon-aware operation for future 6G cyber–physical systems. Full article
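The rate-energy trade-off that APF optimizes can be illustrated with the textbook power-splitting SWIPT receiver model: a fraction rho of the received power is rectified for harvesting while 1 - rho feeds the information decoder. A linear rectifier efficiency eta stands in here for the nonlinear rectifier model the paper actually optimizes, and all numbers are illustrative:

```python
import numpy as np

def swipt_point(p_rx_w, rho, snr_lin, bw_hz, eta=0.3):
    """Rate-energy point for a power-splitting SWIPT receiver."""
    harvested_w = eta * rho * p_rx_w                      # rectified share
    rate_bps = bw_hz * np.log2(1 + (1 - rho) * snr_lin)   # decoded share
    return harvested_w, rate_bps

# Sweeping the splitting ratio traces a rate-energy frontier for one user.
frontier = [swipt_point(1e-6, rho, snr_lin=100.0, bw_hz=20e9)
            for rho in (0.2, 0.5, 0.8)]
```

Raising rho monotonically trades data rate for harvested energy; joint optimization over RIS phases and waveforms, as in the paper, shifts the whole frontier outward.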

22 pages, 3091 KiB  
Article
Assessment of the Risk of Failure in Electric Power Supply Systems for Railway Traffic Control Devices
by Tomasz Ciszewski, Jerzy Wojciechowski, Mieczysław Kornaszewski, Grzegorz Krawczyk, Beata Kuźmińska-Sołśnia and Artur Hermanowicz
Sensors 2025, 25(14), 4501; https://doi.org/10.3390/s25144501 - 19 Jul 2025
Viewed by 370
Abstract
This paper provides a reliability analysis of selected components in the electrical power supply systems used for railway traffic control equipment. It includes rectifiers, controllers, inverters, generators, batteries, sensors, and switching elements. The study used failure data from power supply system elements on selected railway lines. The analysis was performed using a mathematical model based on Markov processes. Based on the findings, recommendations were made to improve safety levels. The results presented in the paper could serve as a valuable source of information for operators of power supply systems in railway traffic control, helping them optimize maintenance processes and increase equipment reliability. Full article
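For a single repairable component, the simplest Markov model behind such an analysis has two states (up/down), failure rate lambda = 1/MTBF, and repair rate mu = 1/MTTR, giving steady-state availability A = mu / (lambda + mu). A sketch with hypothetical rectifier figures; the paper's multi-state model and field data are richer:

```python
def steady_state_availability(mtbf_h, mttr_h):
    """Two-state Markov model (up/down): failure rate lambda = 1/MTBF,
    repair rate mu = 1/MTTR, steady-state availability mu/(lambda + mu)."""
    lam, mu = 1 / mtbf_h, 1 / mttr_h
    return mu / (lam + mu)

# Hypothetical rectifier: fails about once a year (8760 h), repaired in 4 h.
print(round(steady_state_availability(8760, 4), 6))  # -> 0.999544
```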
(This article belongs to the Special Issue Diagnosis and Risk Analysis of Electrical Systems)

33 pages, 2299 KiB  
Review
Edge Intelligence in Urban Landscapes: Reviewing TinyML Applications for Connected and Sustainable Smart Cities
by Athanasios Trigkas, Dimitrios Piromalis and Panagiotis Papageorgas
Electronics 2025, 14(14), 2890; https://doi.org/10.3390/electronics14142890 - 19 Jul 2025
Viewed by 464
Abstract
Tiny Machine Learning (TinyML) extends edge AI capabilities to resource-constrained devices, offering a promising solution for real-time, low-power intelligence in smart cities. This review systematically analyzes 66 peer-reviewed studies from 2019 to 2024, covering applications across urban mobility, environmental monitoring, public safety, waste management, and infrastructure health. We examine hardware platforms and machine learning models, with particular attention to power-efficient deployment and data privacy. We review the approaches employed in published studies for deploying machine learning models on resource-constrained hardware, emphasizing the most commonly used communication technologies—while noting the limited uptake of low-power options such as Low Power Wide Area Networks (LPWANs). We also discuss hardware–software co-design strategies that enable sustainable operation. Furthermore, we evaluate the alignment of these deployments with the United Nations Sustainable Development Goals (SDGs), highlighting both their contributions and existing gaps in current practices. This review identifies recurring technical patterns, methodological challenges, and underexplored opportunities, particularly in the areas of hardware provisioning, usage of inherent privacy benefits in relevant applications, communication technologies, and dataset practices, offering a roadmap for future TinyML research and deployment in smart urban systems. Among the 66 studies examined, 29 focused on mobility and transportation, 17 on public safety, 10 on environmental sensing, 6 on waste management, and 4 on infrastructure monitoring. TinyML was deployed on constrained microcontrollers in 32 studies, while 36 used optimized models for resource-limited environments. Energy harvesting, primarily solar, was featured in 6 studies, and low-power communication networks were used in 5. Public datasets were used in 27 studies, custom datasets in 24, and the remainder relied on hybrid or simulated data. Only one study explicitly referenced SDGs, and 13 studies considered privacy in their system design. Full article
(This article belongs to the Special Issue New Advances in Embedded Software and Applications)

16 pages, 391 KiB  
Systematic Review
High-Protein Dietary Interventions in Heart Failure: A Systematic Review of Clinical and Functional Outcomes
by Lorraine S. Evangelista, Rebecca Meraz, Kelly L. Wierenga, Angelina P. Nguyen, Alona D. Angosta and Jennifer Kawi
Nutrients 2025, 17(14), 2361; https://doi.org/10.3390/nu17142361 - 18 Jul 2025
Viewed by 464
Abstract
Background: Heart failure (HF) is frequently associated with skeletal muscle wasting, reduced functional capacity, and malnutrition. High-protein diets offer a promising nutritional intervention to improve these outcomes in individuals with HF. Objective: This systematic review evaluated randomized controlled trials of high-protein dietary interventions in HF populations, with emphasis on intervention characteristics, quantitative benefits, and risk of bias. Methods: We conducted a comprehensive search in PubMed, MEDLINE, Embase, and Cochrane CENTRAL from inception to June 2025. Eligible studies enrolled adults (≥18 years) with HF, implemented high-protein regimens (≥1.1 g/kg/day or ~25–30% of energy), and reported on functional capacity, body composition, muscle strength, clinical outcomes, or biochemical markers. Two reviewers independently screened, extracted data, and assessed bias (Cochrane RoB 2). Heterogeneity in dosing, duration, and outcomes precluded meta-analysis; we therefore provide a narrative synthesis. Results: Ten trials (nine randomized controlled trials, one pilot) involving 1080 patients (median n = 38; range 21–652) were included. High-protein interventions yielded mean improvements in six-minute walk distance of +32 ± 14 m, lean body mass gain of +1.6 ± 0.9 kg, and 9 ± 4% enhancement in quality-of-life scores; muscle strength effects varied from −2% to +11%. Two studies reported an 18% reduction in HF readmissions (p < 0.05). The risk-of-bias assessment identified two low-risk, three moderate-risk, and one high-risk study. Key limitations include small sample sizes, varied protein dosing (1.1–1.5 g/kg/day), short follow-up (2–6 months), and outcome heterogeneity. Conclusions: High-protein dietary strategies appear to confer modest, clinically relevant gains in functional capacity, nutritional status, and HF readmission risk. Larger, well-powered trials with standardized dosing and longer follow-up are necessary to establish optimal protein targets, long-term efficacy, and safety. Full article

33 pages, 534 KiB  
Review
Local AI Governance: Addressing Model Safety and Policy Challenges Posed by Decentralized AI
by Bahrad A. Sokhansanj
AI 2025, 6(7), 159; https://doi.org/10.3390/ai6070159 - 17 Jul 2025
Viewed by 1236
Abstract
Policies and technical safeguards for artificial intelligence (AI) governance have implicitly assumed that AI systems will continue to operate via massive power-hungry data centers operated by large companies like Google and OpenAI. However, the present cloud-based AI paradigm is being challenged by rapidly advancing software and hardware technologies. Open-source AI models now run on personal computers and devices, invisible to regulators and stripped of safety constraints. The capabilities of local-scale AI models now lag just months behind those of state-of-the-art proprietary models. Wider adoption of local AI promises significant benefits, such as ensuring privacy and autonomy. However, adopting local AI also threatens to undermine the current approach to AI safety. In this paper, we review how technical safeguards fail when users control the code, and regulatory frameworks cannot address decentralized systems as deployment becomes invisible. We further propose ways to harness local AI’s democratizing potential while managing its risks, aimed at guiding responsible technical development and informing community-led policy: (1) adapting technical safeguards for local AI, including content provenance tracking, configurable safe computing environments, and distributed open-source oversight; and (2) shaping AI policy for a decentralized ecosystem, including polycentric governance mechanisms, integrating community participation, and tailored safe harbors for liability. Full article
(This article belongs to the Section AI Systems: Theory and Applications)

22 pages, 1906 KiB  
Article
Explainable and Optuna-Optimized Machine Learning for Battery Thermal Runaway Prediction Under Class Imbalance Conditions
by Abir El Abed, Ghalia Nassreddine, Obada Al-Khatib, Mohamad Nassereddine and Ali Hellany
Thermo 2025, 5(3), 23; https://doi.org/10.3390/thermo5030023 - 15 Jul 2025
Viewed by 349
Abstract
Modern energy storage systems for both power and transportation rely heavily on lithium-ion batteries (LIBs). However, their safety is threatened by a potentially hazardous failure mode known as thermal runaway (TR). Predicting and classifying the causes of TR can greatly enhance the safety of power and transportation systems. This paper presents an advanced machine learning method for forecasting and classifying the causes of TR. A generative model for synthetic data generation was used to handle class imbalance in the dataset. Hyperparameter optimization was conducted using Optuna for four classifiers: Support Vector Machine (SVM), Multi-Layer Perceptron (MLP), tabular network (TabNet), and Extreme Gradient Boosting (XGBoost). A three-fold cross-validation approach was used to guarantee a robust evaluation. An open-source database of LIB failure events was used for model training and testing. The XGBoost model outperforms the other models across all TR categories, achieving 100% accuracy and a high recall (1.00). Model results were interpreted using SHapley Additive exPlanations (SHAP) analysis to investigate the most significant factors among TR predictors. The findings show that important TR indicators include energy adjusted for heat and weight loss, heater power, average cell temperature upon activation, and heater duration. These findings guide the design of safer battery systems and preventive monitoring systems for real applications. They can help experts develop more efficient battery management systems, thereby improving the performance and longevity of battery-operated devices. By enhancing the predictive knowledge of temperature-driven failure mechanisms in LIBs, the study directly advances the thermal analysis and energy storage safety domains. Full article
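Class-imbalance handling of the kind described, making rare thermal-runaway classes visible to the classifier, can be sketched with plain random oversampling. This is a deliberately simpler stand-in for the paper's generative synthetic-data model, which learns new samples rather than duplicating old ones; the array shapes and class counts below are illustrative:

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Balance classes by resampling minority-class rows with replacement."""
    rng = rng or np.random.default_rng(0)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    Xs, ys = [], []
    for c in classes:
        idx = np.where(y == c)[0]
        pick = rng.choice(idx, size=n_max, replace=True)
        Xs.append(X[pick])
        ys.append(y[pick])
    return np.vstack(Xs), np.concatenate(ys)

# Hypothetical imbalance: 90 "safe" events vs. 10 thermal-runaway events.
X = np.random.default_rng(1).normal(size=(100, 4))
y = np.array([0] * 90 + [1] * 10)
Xb, yb = random_oversample(X, y)      # 90 rows of each class
```

Balancing the training set this way (before, never inside, the cross-validation test folds) keeps accuracy and recall meaningful for the rare classes.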
27 pages, 9802 KiB  
Article
Flight-Safe Inference: SVD-Compressed LSTM Acceleration for Real-Time UAV Engine Monitoring Using Custom FPGA Hardware Architecture
by Sreevalliputhuru Siri Priya, Penneru Shaswathi Sanjana, Rama Muni Reddy Yanamala, Rayappa David Amar Raj, Archana Pallakonda, Christian Napoli and Cristian Randieri
Drones 2025, 9(7), 494; https://doi.org/10.3390/drones9070494 - 14 Jul 2025
Viewed by 457
Abstract
Predictive maintenance (PdM) is a proactive strategy that enhances safety, minimizes unplanned downtime, and optimizes operational costs by forecasting equipment failures before they occur. This study presents a novel Field Programmable Gate Array (FPGA)-accelerated predictive maintenance framework for UAV engines using a Singular Value Decomposition (SVD)-optimized Long Short-Term Memory (LSTM) model. The model performs binary classification to predict the likelihood of imminent engine failure by processing normalized multi-sensor data, including temperature, pressure, and vibration measurements. To enable real-time deployment on resource-constrained UAV platforms, the LSTM's weight matrices are compressed using SVD, significantly reducing computational complexity while preserving predictive accuracy. The compressed model is executed on a Xilinx ZCU-104 FPGA and uses a pipelined, AXI-based hardware accelerator with efficient memory mapping and parallelized gate calculations tailored for low-power onboard systems. Unlike prior works, this study uniquely integrates a tailored SVD compression strategy with a custom hardware accelerator co-designed for real-time, flight-safe inference in UAV systems. Experimental results demonstrate a 98% classification accuracy, a 24% reduction in latency, and substantial FPGA resource savings—specifically, a 26% decrease in BRAM usage and a 37% reduction in DSP consumption—compared to a 32-bit floating-point FPGA implementation of the same SVD-compressed model (rather than a CPU or GPU baseline). These findings confirm the proposed system as an efficient and scalable solution for real-time UAV engine health monitoring, thereby enhancing in-flight safety through timely fault prediction and enabling autonomous engine monitoring without reliance on ground communication. Full article
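The core compression idea here (replace a dense weight matrix W with a truncated SVD factorization so each matrix-vector product becomes two thin ones) can be shown in NumPy. This is a hedged sketch with made-up dimensions, not the paper's implementation; the toy matrix is constructed as low-rank plus noise, since trained LSTM weights often have a decaying singular-value spectrum, whereas a purely random matrix would compress poorly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for one LSTM gate weight matrix (hidden=128, input=64):
# an approximately rank-16 matrix plus small noise.
W = (rng.standard_normal((128, 16)) @ rng.standard_normal((16, 64))
     + 0.05 * rng.standard_normal((128, 64)))

# Truncated SVD: keep only the top-r singular triplets.
r = 16
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]        # shape (128, r)
B = Vt[:r, :]               # shape (r, 64)

x = rng.standard_normal(64)
y_full = W @ x
y_low = A @ (B @ x)         # two thin matvecs instead of one dense one

orig_params = W.size                 # 128 * 64 = 8192
comp_params = A.size + B.size        # 128*r + r*64 = 3072 for r=16
rel_err = np.linalg.norm(y_full - y_low) / np.linalg.norm(y_full)
print(f"params: {orig_params} -> {comp_params}, relative error {rel_err:.3f}")
```

On hardware, the same factorization shrinks both the weight storage (fewer BRAM blocks) and the multiply-accumulate work per gate (fewer DSP slices), which is the mechanism behind the resource savings the abstract reports.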
(This article belongs to the Special Issue Advances in Perception, Communications, and Control for Drones)
17 pages, 1795 KiB  
Article
Anomaly Detection in Nuclear Power Production Based on Neural Normal Stochastic Process
by Linyu Liu, Shiqiao Liu, Shuan He, Kui Xu, Yang Lan and Huajian Fang
Sensors 2025, 25(14), 4358; https://doi.org/10.3390/s25144358 - 12 Jul 2025
Viewed by 302
Abstract
To ensure the safety of nuclear power production, nuclear power plants deploy numerous sensors to monitor various physical indicators during production, enabling the early detection of anomalies. Efficient anomaly detection relies on complete sensor data. However, compared to conventional energy sources, the extreme physical environment of nuclear power plants is more likely to negatively impact the normal operation of sensors, compromising the integrity of the collected data. To address this issue, we propose an anomaly detection method for nuclear power data: Neural Normal Stochastic Process (NNSP). This method does not require imputing missing sensor data. Instead, it directly reads incomplete monitoring data through a sequentialization structure and encodes it as continuous latent representations in a neural network. This approach avoids additional “processing” of the raw data. Moreover, the continuity of these representations allows the decoder to specify supervisory signals at time points where data is missing or at future time points, thereby training the model to learn latent anomaly patterns in incomplete nuclear power monitoring data. Experimental results demonstrate that our model outperforms five mainstream baseline methods—ARMA, Isolation Forest, LSTM-AD, VAE, and NeutraL AD—in anomaly detection tasks on incomplete time series. On the Power Generation System (PGS) dataset with a 15% missing rate, our model achieves an F1 score of 83.72%, surpassing all baseline methods and maintaining strong performance across multiple industrial subsystems. Full article
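The key architectural idea (encode only the observed (time, value) pairs into a continuous latent representation, then decode at arbitrary time points, including missing or future ones, so no imputation is needed) can be sketched in the style of a neural process. This is an illustrative data-flow sketch only: the weights below are random and untrained, the network shapes are invented, and the paper's actual NNSP sequentialization structure and training objective are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

def mlp(dims):
    """Random (untrained) MLP weights; enough to show the data flow."""
    return [rng.standard_normal((a, b)) / np.sqrt(a)
            for a, b in zip(dims[:-1], dims[1:])]

def forward(layers, h):
    for W in layers[:-1]:
        h = np.tanh(h @ W)
    return h @ layers[-1]

enc = mlp([2, 32, 32])   # (time, value) pair -> per-point representation
dec = mlp([33, 32, 1])   # latent + query time -> predicted value

# Incomplete sensor series: only the observed (t, x) pairs are fed in,
# so the missing points never need to be imputed.
t_obs = np.array([0.0, 0.1, 0.3, 0.7, 0.9])   # gaps around 0.2-0.6
x_obs = np.sin(2 * np.pi * t_obs)

# Encode observed pairs, then mean-aggregate into one continuous latent.
z = forward(enc, np.stack([t_obs, x_obs], axis=1)).mean(axis=0)

# Decode at arbitrary query times, including missing and future points;
# during training these queries carry the supervisory signal.
t_query = np.array([0.2, 0.5, 1.1])
pred = forward(dec, np.column_stack([np.tile(z, (len(t_query), 1)),
                                     t_query[:, None]]))
print(pred.ravel())
```

Because the latent z is a function of whatever points happen to be observed, the same model handles a 15% missing rate or an irregular sampling grid without any preprocessing step, which is the property the abstract credits for the method's robustness.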
(This article belongs to the Section Fault Diagnosis & Sensors)