Search Results (141)

Search Parameters:
Keywords = intelligent digital twin networks

30 pages, 9435 KiB  
Article
Intelligent Fault Warning Method for Wind Turbine Gear Transmission System Driven by Digital Twin and Multi-Source Data Fusion
by Tiantian Xu, Xuedong Zhang and Wenlei Sun
Appl. Sci. 2025, 15(15), 8655; https://doi.org/10.3390/app15158655 - 5 Aug 2025
Viewed by 1
Abstract
To meet the demands for real-time and accurate fault warning of wind turbine gear transmission systems, this study proposes an innovative intelligent warning method based on the integration of digital twin and multi-source data fusion. A digital twin system architecture is developed, comprising a high-precision geometric model and a dynamic mechanism model, enabling real-time interaction and data fusion between the physical transmission system and its virtual model. At the algorithmic level, a CNN-LSTM-Attention fault prediction model is proposed, which innovatively integrates the spatial feature extraction capabilities of a convolutional neural network (CNN), the temporal modeling advantages of long short-term memory (LSTM), and the key information-focusing characteristics of an attention mechanism. Experimental validation shows that this model outperforms traditional methods in prediction accuracy. Specifically, it achieves average improvements of 0.3945, 0.546 and 0.061 in Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), and R-squared (R2) metrics, respectively. Building on the above findings, a monitoring and early warning platform for the wind turbine transmission system was developed, integrating digital twin visualization with intelligent prediction functions. This platform enables a fully intelligent process from data acquisition and status evaluation to fault warning, providing an innovative solution for the predictive maintenance of wind turbines. Full article
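The abstract reports gains in RMSE, MAPE, and R². For readers unfamiliar with these metrics, they can be computed for any predictor with a short NumPy helper; the sketch below uses hypothetical data and is not the paper's implementation.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return (RMSE, MAPE, R^2) for a 1-D regression task."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    # MAPE as a fraction; multiply by 100 for a percentage
    mape = np.mean(np.abs((y_true - y_pred) / y_true))
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return rmse, mape, r2

# Hypothetical vibration-amplitude targets and model outputs
y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
rmse, mape, r2 = regression_metrics(y_true, y_pred)
```

Lower RMSE and MAPE and higher R² indicate a better fit, which is the sense in which the abstract's "average improvements" should be read.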

22 pages, 4426 KiB  
Article
A Digital Twin Platform for Real-Time Intersection Traffic Monitoring, Performance Evaluation, and Calibration
by Abolfazl Afshari, Joyoung Lee and Dejan Besenski
Infrastructures 2025, 10(8), 204; https://doi.org/10.3390/infrastructures10080204 - 4 Aug 2025
Viewed by 177
Abstract
Emerging transportation challenges necessitate cutting-edge technologies for real-time infrastructure and traffic monitoring. To create a dynamic digital twin for intersection monitoring, data gathering, performance assessment, and calibration of microsimulation software, this study presents a state-of-the-art platform that combines high-resolution LiDAR sensor data with VISSIM simulation software. Intending to track traffic flow and evaluate important factors, including congestion, delays, and lane configurations, the platform gathers and analyzes real-time data. The technology allows proactive actions to improve safety and reduce interruptions by utilizing the comprehensive information that LiDAR provides, such as vehicle trajectories, speed profiles, and lane changes. The digital twin technique offers unparalleled precision in traffic and infrastructure state monitoring by fusing real data streams with simulation-based performance analysis. The results show how the platform can transform real-time monitoring and open the door to data-driven decision-making, safer intersections, and more intelligent traffic data collection methods. Using the proposed platform, this study calibrated a VISSIM simulation network to optimize the driving behavior parameters in the software. This study addresses current issues in urban traffic management with real-time solutions, demonstrating the revolutionary impact of emerging technology in intelligent infrastructure monitoring. Full article

23 pages, 2888 KiB  
Review
Machine Learning in Flocculant Research and Application: Toward Smart and Sustainable Water Treatment
by Caichang Ding, Ling Shen, Qiyang Liang and Lixin Li
Separations 2025, 12(8), 203; https://doi.org/10.3390/separations12080203 - 1 Aug 2025
Viewed by 215
Abstract
Flocculants are indispensable in water and wastewater treatment, enabling the aggregation and removal of suspended particles, colloids, and emulsions. However, the conventional development and application of flocculants rely heavily on empirical methods, which are time-consuming, resource-intensive, and environmentally problematic due to issues such as sludge production and chemical residues. Recent advances in machine learning (ML) have opened transformative avenues for the design, optimization, and intelligent application of flocculants. This review systematically examines the integration of ML into flocculant research, covering algorithmic approaches, data-driven structure–property modeling, high-throughput formulation screening, and smart process control. ML models—including random forests, neural networks, and Gaussian processes—have successfully predicted flocculation performance, guided synthesis optimization, and enabled real-time dosing control. Applications extend to both synthetic and bioflocculants, with ML facilitating strain engineering, fermentation yield prediction, and polymer degradability assessments. Furthermore, the convergence of ML with IoT, digital twins, and life cycle assessment tools has accelerated the transition toward sustainable, adaptive, and low-impact treatment technologies. Despite its potential, challenges remain in data standardization, model interpretability, and real-world implementation. This review concludes by outlining strategic pathways for future research, including the development of open datasets, hybrid physics–ML frameworks, and interdisciplinary collaborations. By leveraging ML, the next generation of flocculant systems can be more effective, environmentally benign, and intelligently controlled, contributing to global water sustainability goals. Full article
(This article belongs to the Section Environmental Separations)

24 pages, 2325 KiB  
Review
Personalization of AI-Based Digital Twins to Optimize Adaptation in Industrial Design and Manufacturing—Review
by Izabela Rojek, Dariusz Mikołajewski, Ewa Dostatni, Jan Cybulski and Mirosław Kozielski
Appl. Sci. 2025, 15(15), 8525; https://doi.org/10.3390/app15158525 - 31 Jul 2025
Viewed by 172
Abstract
The growing scale of big data and artificial intelligence (AI)-based models has heightened the urgency of developing real-time digital twins (DTs), particularly those capable of simulating personalized behavior in dynamic environments. In this study, we examine the personalization of AI-based digital twins (DTs), with a focus on overcoming computational latencies that hinder real-time responses—especially in complex, large-scale systems and networks. We use bibliometric analysis to map current trends, prevailing themes, and technical challenges in this field. The key findings highlight the growing emphasis on scalable model architectures, multimodal data integration, and the use of high-performance computing platforms. While existing research has focused on model decomposition, structural optimization, and algorithmic integration, there remains a need for fast DT platforms that support diverse user requirements. This review synthesizes these insights to outline new directions for accelerating adaptation and enhancing personalization. By providing a structured overview of the current research landscape, this study contributes to a better understanding of how AI and edge computing can drive the development of the next generation of real-time personalized DTs. Full article
(This article belongs to the Section Computing and Artificial Intelligence)

28 pages, 5699 KiB  
Article
Multi-Modal Excavator Activity Recognition Using Two-Stream CNN-LSTM with RGB and Point Cloud Inputs
by Hyuk Soo Cho, Kamran Latif, Abubakar Sharafat and Jongwon Seo
Appl. Sci. 2025, 15(15), 8505; https://doi.org/10.3390/app15158505 - 31 Jul 2025
Viewed by 148
Abstract
Recently, deep learning algorithms have been increasingly applied in construction for activity recognition, particularly for excavators, to automate processes and enhance safety and productivity through continuous monitoring of earthmoving activities. These deep learning algorithms analyze construction videos to classify excavator activities for earthmoving purposes. However, previous studies have solely focused on single-source external videos, which limits the activity recognition capabilities of the deep learning algorithm. This paper introduces a novel multi-modal deep learning-based methodology for recognizing excavator activities, utilizing multi-stream input data. It processes point clouds and RGB images using the two-stream long short-term memory convolutional neural network (CNN-LSTM) method to extract spatiotemporal features, enabling the recognition of excavator activities. A comprehensive dataset comprising 495,000 video frames of synchronized RGB and point cloud data was collected across multiple construction sites under varying conditions. The dataset encompasses five key excavator activities: Approach, Digging, Dumping, Idle, and Leveling. To assess the effectiveness of the proposed method, the performance of the two-stream CNN-LSTM architecture is compared with that of single-stream CNN-LSTM models on the same RGB and point cloud datasets, separately. The results demonstrate that the proposed multi-stream approach achieved an accuracy of 94.67%, outperforming existing state-of-the-art single-stream models, which achieved 90.67% accuracy for the RGB-based model and 92.00% for the point cloud-based model. These findings underscore the potential of the proposed activity recognition method, making it highly effective for automatic real-time monitoring of excavator activities, thereby laying the groundwork for future integration into digital twin systems for proactive maintenance and intelligent equipment management. Full article
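The paper's exact fusion of the RGB and point-cloud streams is not detailed in the abstract; a common late-fusion variant, shown as an assumed sketch below, averages each stream's per-class probabilities before taking the argmax over the five activity classes.

```python
import numpy as np

ACTIVITIES = ["Approach", "Digging", "Dumping", "Idle", "Leveling"]

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def late_fusion(rgb_logits, pcd_logits, w_rgb=0.5):
    """Weighted average of per-stream class probabilities (assumed scheme,
    not necessarily the paper's)."""
    p = w_rgb * softmax(rgb_logits) + (1.0 - w_rgb) * softmax(pcd_logits)
    return ACTIVITIES[int(np.argmax(p))], p

# Hypothetical per-stream logits for one video clip
rgb = np.array([0.2, 2.1, 0.1, 0.0, 0.3])   # RGB stream favors "Digging"
pcd = np.array([0.1, 1.8, 0.9, 0.2, 0.1])   # point-cloud stream agrees
label, probs = late_fusion(rgb, pcd)
```

When the two streams disagree, the weight `w_rgb` controls which modality dominates, which is one knob a two-stream design exposes that single-stream models lack.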
(This article belongs to the Special Issue AI-Based Machinery Health Monitoring)

31 pages, 2007 KiB  
Review
Artificial Intelligence-Driven Strategies for Targeted Delivery and Enhanced Stability of RNA-Based Lipid Nanoparticle Cancer Vaccines
by Ripesh Bhujel, Viktoria Enkmann, Hannes Burgstaller and Ravi Maharjan
Pharmaceutics 2025, 17(8), 992; https://doi.org/10.3390/pharmaceutics17080992 - 30 Jul 2025
Cited by 1 | Viewed by 692
Abstract
The convergence of artificial intelligence (AI) and nanomedicine has transformed cancer vaccine development, particularly in optimizing RNA-loaded lipid nanoparticles (LNPs). Stability and targeted delivery are major obstacles to the clinical translation of promising RNA-LNP vaccines for cancer immunotherapy. This systematic review analyzes the AI’s impact on LNP engineering through machine learning-driven predictive models, generative adversarial networks (GANs) for novel lipid design, and neural network-enhanced biodistribution prediction. AI reduces the therapeutic development timeline through accelerated virtual screening of millions of lipid combinations, compared to conventional high-throughput screening. Furthermore, AI-optimized LNPs demonstrate improved tumor targeting. GAN-generated lipids show structural novelty while maintaining higher encapsulation efficiency; graph neural networks predict RNA-LNP binding affinity with high accuracy vs. experimental data; digital twins reduce lyophilization optimization from years to months; and federated learning models enable multi-institutional data sharing. We propose a framework to address key technical challenges: training data quality (min. 15,000 lipid structures), model interpretability (SHAP > 0.65), and regulatory compliance (21CFR Part 11). AI integration reduces manufacturing costs and makes personalized cancer vaccine affordable. Future directions need to prioritize quantum machine learning for stability prediction and edge computing for real-time formulation modifications. Full article

52 pages, 3733 KiB  
Article
A Hybrid Deep Reinforcement Learning and Metaheuristic Framework for Heritage Tourism Route Optimization in Warin Chamrap’s Old Town
by Rapeepan Pitakaso, Thanatkij Srichok, Surajet Khonjun, Natthapong Nanthasamroeng, Arunrat Sawettham, Paweena Khampukka, Sairoong Dinkoksung, Kanya Jungvimut, Ganokgarn Jirasirilerd, Chawapot Supasarn, Pornpimol Mongkhonngam and Yong Boonarree
Heritage 2025, 8(8), 301; https://doi.org/10.3390/heritage8080301 - 28 Jul 2025
Viewed by 712
Abstract
Designing optimal heritage tourism routes in secondary cities involves complex trade-offs between cultural richness, travel time, carbon emissions, spatial coherence, and group satisfaction. This study addresses the Personalized Group Trip Design Problem (PGTDP) under real-world constraints by proposing DRL–IMVO–GAN—a hybrid multi-objective optimization framework that integrates Deep Reinforcement Learning (DRL) for policy-guided initialization, an Improved Multiverse Optimizer (IMVO) for global search, and a Generative Adversarial Network (GAN) for local refinement and solution diversity. The model operates within a digital twin of Warin Chamrap’s old town, leveraging 92 POIs, congestion heatmaps, and behaviorally clustered tourist profiles. The proposed method was benchmarked against seven state-of-the-art techniques, including PSO + DRL, Genetic Algorithm with Multi-Neighborhood Search (Genetic + MNS), Dual-ACO, ALNS-ASP, and others. Results demonstrate that DRL–IMVO–GAN consistently dominates across key metrics. Under equal-objective weighting, it attained the highest heritage score (74.2), shortest travel time (21.3 min), and top satisfaction score (17.5 out of 18), along with the highest hypervolume (0.85) and Pareto Coverage Ratio (0.95). Beyond performance, the framework exhibits strong generalization in zero- and few-shot scenarios, adapting to unseen POIs, modified constraints, and new user profiles without retraining. These findings underscore the method’s robustness, behavioral coherence, and interpretability—positioning it as a scalable, intelligent decision-support tool for sustainable and user-centered cultural tourism planning in secondary cities. Full article
(This article belongs to the Special Issue AI and the Future of Cultural Heritage)

25 pages, 1343 KiB  
Article
Low-Latency Edge-Enabled Digital Twin System for Multi-Robot Collision Avoidance and Remote Control
by Daniel Poul Mtowe, Lika Long and Dong Min Kim
Sensors 2025, 25(15), 4666; https://doi.org/10.3390/s25154666 - 28 Jul 2025
Viewed by 382
Abstract
This paper proposes a low-latency and scalable architecture for Edge-Enabled Digital Twin networked control systems (E-DTNCS) aimed at multi-robot collision avoidance and remote control in dynamic and latency-sensitive environments. Traditional approaches, which rely on centralized cloud processing or direct sensor-to-controller communication, are inherently limited by excessive network latency, bandwidth bottlenecks, and a lack of predictive decision-making, thus constraining their effectiveness in real-time multi-agent systems. To overcome these limitations, we propose a novel framework that seamlessly integrates edge computing with digital twin (DT) technology. By performing localized preprocessing at the edge, the system extracts semantically rich features from raw sensor data streams, reducing the transmission overhead of the original data. This shift from raw data to feature-based communication significantly alleviates network congestion and enhances system responsiveness. The DT layer leverages these extracted features to maintain high-fidelity synchronization with physical robots and to execute predictive models for proactive collision avoidance. To empirically validate the framework, a real-world testbed was developed, and extensive experiments were conducted with multiple mobile robots. The results revealed a substantial reduction in collision rates when DT was deployed, and further improvements were observed with E-DTNCS integration due to significantly reduced latency. These findings confirm the system’s enhanced responsiveness and its effectiveness in handling real-time control tasks. The proposed framework demonstrates the potential of combining edge intelligence with DT-driven control in advancing the reliability, scalability, and real-time performance of multi-robot systems for industrial automation and mission-critical cyber-physical applications. Full article
(This article belongs to the Section Internet of Things)

28 pages, 2918 KiB  
Article
Machine Learning-Powered KPI Framework for Real-Time, Sustainable Ship Performance Management
by Christos Spandonidis, Vasileios Iliopoulos and Iason Athanasopoulos
J. Mar. Sci. Eng. 2025, 13(8), 1440; https://doi.org/10.3390/jmse13081440 - 28 Jul 2025
Viewed by 365
Abstract
The maritime sector faces escalating demands to minimize emissions and optimize operational efficiency under tightening environmental regulations. Although technologies such as the Internet of Things (IoT), Artificial Intelligence (AI), and Digital Twins (DT) offer substantial potential, their deployment in real-time ship performance analytics is at an emerging state. This paper proposes a machine learning-driven framework for real-time ship performance management. The framework starts with data collected from onboard sensors and culminates in a decision support system that is easily interpretable, even by non-experts. It also provides a method to forecast vessel performance by extrapolating Key Performance Indicator (KPI) values. Furthermore, it offers a flexible methodology for defining KPIs for every crucial component or aspect of vessel performance, illustrated through a use case focusing on fuel oil consumption. Leveraging Artificial Neural Networks (ANNs), hybrid multivariate data fusion, and high-frequency sensor streams, the system facilitates continuous diagnostics, early fault detection, and data-driven decision-making. Unlike conventional static performance models, the framework employs dynamic KPIs that evolve with the vessel’s operational state, enabling advanced trend analysis, predictive maintenance scheduling, and compliance assurance. Experimental comparison against classical KPI models highlights superior predictive fidelity, robustness, and temporal consistency. Furthermore, the paper delineates AI and ML applications across core maritime operations and introduces a scalable, modular system architecture applicable to both commercial and naval platforms. This approach bridges advanced simulation ecosystems with in situ operational data, laying a robust foundation for digital transformation and sustainability in maritime domains. Full article
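The abstract's fuel-oil-consumption use case can be pictured with a rolling KPI that compares model-predicted consumption against what the vessel actually burned. The definition below is purely illustrative; the paper's KPI formula and the data are assumptions.

```python
import numpy as np

def fuel_kpi(observed, expected, window=3):
    """Rolling KPI: expected / observed fuel use over a sliding window.
    KPI > 1 means the vessel burns less fuel than the model predicts for
    the same operating state (illustrative definition, not the paper's)."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    kpi = []
    for i in range(window - 1, len(observed)):
        sl = slice(i - window + 1, i + 1)
        kpi.append(expected[sl].sum() / observed[sl].sum())
    return np.array(kpi)

observed = [10.0, 11.0, 10.5, 12.0]   # measured tonnes/day (hypothetical)
expected = [10.2, 10.8, 10.6, 11.0]   # ANN prediction for the same conditions
kpi = fuel_kpi(observed, expected)
```

A KPI that drifts below 1 over successive windows would flag degrading performance (e.g., hull fouling) and trigger the kind of predictive-maintenance scheduling the framework describes.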
(This article belongs to the Section Ocean Engineering)

37 pages, 1895 KiB  
Review
A Review of Artificial Intelligence and Deep Learning Approaches for Resource Management in Smart Buildings
by Bibars Amangeldy, Timur Imankulov, Nurdaulet Tasmurzayev, Gulmira Dikhanbayeva and Yedil Nurakhov
Buildings 2025, 15(15), 2631; https://doi.org/10.3390/buildings15152631 - 25 Jul 2025
Viewed by 594
Abstract
This comprehensive review maps the fast-evolving landscape in which artificial intelligence (AI) and deep-learning (DL) techniques converge with the Internet of Things (IoT) to manage energy, comfort, and sustainability across smart environments. A PRISMA-guided search of four databases retrieved 1358 records; after applying inclusion criteria, 143 peer-reviewed studies published between January 2019 and April 2025 were analyzed. This review shows that AI-driven controllers—especially deep-reinforcement-learning agents—deliver median energy savings of 18–35% for HVAC and other major loads, consistently outperforming rule-based and model-predictive baselines. The evidence further reveals a rapid diversification of methods: graph-neural-network models now capture spatial interdependencies in dense sensor grids, federated-learning pilots address data-privacy constraints, and early integrations of large language models hint at natural-language analytics and control interfaces for heterogeneous IoT devices. Yet large-scale deployment remains hindered by fragmented and proprietary datasets, unresolved privacy and cybersecurity risks associated with continuous IoT telemetry, the growing carbon and compute footprints of ever-larger models, and poor interoperability among legacy equipment and modern edge nodes. The reviewed literature therefore converges on several priorities: open, high-fidelity benchmarks that marry multivariate IoT sensor data with standardized metadata and occupant feedback; energy-aware, edge-optimized architectures that lower latency and power draw; privacy-centric learning frameworks that satisfy tightening regulations; hybrid physics-informed and explainable models that shorten commissioning time; and digital-twin platforms enriched by language-model reasoning to translate raw telemetry into actionable insights for facility managers and end users.
Addressing these gaps will be pivotal to transforming isolated pilots into ubiquitous, trustworthy, and human-centered IoT ecosystems capable of delivering measurable gains in efficiency, resilience, and occupant wellbeing at scale. Full article
(This article belongs to the Section Building Energy, Physics, Environment, and Systems)

26 pages, 2875 KiB  
Article
Sustainable THz SWIPT via RIS-Enabled Sensing and Adaptive Power Focusing: Toward Green 6G IoT
by Sunday Enahoro, Sunday Cookey Ekpo, Mfonobong Uko, Fanuel Elias, Rahul Unnikrishnan, Stephen Alabi and Nurudeen Kolawole Olasunkanmi
Sensors 2025, 25(15), 4549; https://doi.org/10.3390/s25154549 - 23 Jul 2025
Viewed by 351
Abstract
Terahertz (THz) communications and simultaneous wireless information and power transfer (SWIPT) hold the potential to energize battery-less Internet-of-Things (IoT) devices while enabling multi-gigabit data transmission. However, severe path loss, blockages, and rectifier nonlinearity significantly hinder both throughput and harvested energy. Additionally, high-power THz beams pose safety concerns by potentially exceeding specific absorption rate (SAR) limits. We propose a sensing-adaptive power-focusing (APF) framework in which a reconfigurable intelligent surface (RIS) embeds low-rate THz sensors. Real-time backscatter measurements construct a spatial map used for the joint optimisation of (i) RIS phase configurations, (ii) multi-tone SWIPT waveforms, and (iii) nonlinear power-splitting ratios. A weighted MMSE inner loop maximizes the data rate, while an outer alternating optimisation applies semidefinite relaxation to enforce passive-element constraints and SAR compliance. Full-stack simulations at 0.3 THz with 20 GHz bandwidth and up to 256 RIS elements show that APF (i) improves the rate–energy Pareto frontier by 30–75% over recent adaptive baselines; (ii) achieves a 150% gain in harvested energy and a 440 Mbps peak per-user rate; (iii) reduces energy-efficiency variance by half while maintaining a Jain fairness index of 0.999; and (iv) caps SAR at 1.6 W/kg, which is 20% below the IEEE C95.1 safety threshold. The algorithm converges in seven iterations and executes within <3 ms on a Cortex-A78 processor, ensuring compliance with real-time 6G control budgets. The proposed architecture supports sustainable THz-powered networks for smart factories, digital-twin logistics, wire-free extended reality (XR), and low-maintenance structural health monitors, combining high-capacity communication, safe wireless power transfer, and carbon-aware operation for future 6G cyber–physical systems. Full article
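The rate–energy trade-off behind the power-splitting ratio can be illustrated with a textbook SWIPT model: a fraction ρ of the received signal feeds the information decoder and 1 − ρ feeds the energy harvester. This sketch uses a linear harvester and arbitrary constants, unlike the paper's nonlinear rectifier model.

```python
import math

def swipt_point(rho, snr=100.0, p_rx=1e-6, eta=0.3):
    """Rate (bits/s/Hz) and harvested power (W) for power-splitting ratio rho.
    Textbook linear-harvester model with assumed constants:
      snr  - received SNR when all power goes to the decoder
      p_rx - received RF power (W)
      eta  - RF-to-DC conversion efficiency"""
    rate = math.log2(1.0 + rho * snr)
    energy = eta * (1.0 - rho) * p_rx
    return rate, energy

# Sweep rho from 0 to 1 to trace the rate-energy frontier
frontier = [swipt_point(rho / 10) for rho in range(11)]
```

Sweeping ρ traces the frontier the abstract calls the "rate–energy Pareto frontier": ρ = 0 maximizes harvested energy at zero rate, ρ = 1 the reverse, and the optimizer's job is to pick the operating point per user.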

27 pages, 1889 KiB  
Article
Advancing Smart City Sustainability Through Artificial Intelligence, Digital Twin and Blockchain Solutions
by Ivica Lukić, Mirko Köhler, Zdravko Krpić and Miljenko Švarcmajer
Technologies 2025, 13(7), 300; https://doi.org/10.3390/technologies13070300 - 11 Jul 2025
Cited by 1 | Viewed by 660
Abstract
This paper presents an integrated Smart City platform that combines digital twin technology, advanced machine learning, and a private blockchain network to enhance data-driven decision making and operational efficiency in both public enterprises and small and medium-sized enterprises (SMEs). The proposed cloud-based business intelligence model automates Extract, Transform, Load (ETL) processes, enables real-time analytics, and secures data integrity and transparency through blockchain-enabled audit trails. By implementing the proposed solution, Smart City and public service providers can significantly improve operational efficiency, including a 15% reduction in costs and a 12% decrease in fuel consumption for waste management, as well as increased citizen engagement and transparency in Smart City governance. The digital twin component facilitated scenario simulations and proactive resource management, while the participatory governance module empowered citizens through transparent, immutable records of proposals and voting. This study also discusses technical, organizational, and regulatory challenges, such as data integration, scalability, and privacy compliance. The results indicate that the proposed approach offers a scalable and sustainable model for Smart City transformation, fostering citizen trust, regulatory compliance, and measurable environmental and social benefits. Full article
(This article belongs to the Section Information and Communication Technologies)

40 pages, 2250 KiB  
Review
Comprehensive Comparative Analysis of Lower Limb Exoskeleton Research: Control, Design, and Application
by Sk Hasan and Nafizul Alam
Actuators 2025, 14(7), 342; https://doi.org/10.3390/act14070342 - 9 Jul 2025
Viewed by 662
Abstract
This review provides a comprehensive analysis of recent advancements in lower limb exoskeleton systems, focusing on applications, control strategies, hardware architecture, sensing modalities, human-robot interaction, evaluation methods, and technical innovations. The study spans systems developed for gait rehabilitation, mobility assistance, terrain adaptation, pediatric use, and industrial support. Applications range from sit-to-stand transitions and post-stroke therapy to balance support and real-world navigation. Control approaches vary from traditional impedance and fuzzy logic models to advanced data-driven frameworks, including reinforcement learning, recurrent neural networks, and digital twin-based optimization. These controllers support personalized and adaptive interaction, enabling real-time intent recognition, torque modulation, and gait phase synchronization across different users and tasks. Hardware platforms include powered multi-degree-of-freedom exoskeletons, passive assistive devices, compliant joint systems, and pediatric-specific configurations. Innovations in actuator design, modular architecture, and lightweight materials support increased usability and energy efficiency. Sensor systems integrate EMG, EEG, IMU, vision, and force feedback, supporting multimodal perception for motion prediction, terrain classification, and user monitoring. Human–robot interaction strategies emphasize safe, intuitive, and cooperative engagement. Controllers are increasingly user-specific, leveraging biosignals and gait metrics to tailor assistance. Evaluation methodologies include simulation, phantom testing, and human–subject trials across clinical and real-world environments, with performance measured through joint tracking accuracy, stability indices, and functional mobility scores. 
Overall, the review highlights the field’s evolution toward intelligent, adaptable, and user-centered systems, offering promising solutions for rehabilitation, mobility enhancement, and assistive autonomy in diverse populations. Following a detailed review of current developments, strategic recommendations are made to enhance and evolve existing exoskeleton technologies. Full article
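Among the control approaches the review surveys, joint-level impedance control is the most classical. As an illustrative sketch only (not taken from any reviewed system), a spring-damper law maps angle and velocity error to an assistive torque; the gain values and setpoints below are hypothetical.

```python
# Illustrative sketch only: a minimal joint-level spring-damper (impedance)
# law of the kind the review classes as "traditional impedance models".
# Gains and setpoints below are hypothetical, not from any reviewed device.

def impedance_torque(theta, dtheta, theta_des, dtheta_des=0.0,
                     stiffness=30.0, damping=2.0):
    """Assistive joint torque (N*m): tau = K*(angle error) + B*(velocity error)."""
    return stiffness * (theta_des - theta) + damping * (dtheta_des - dtheta)

# No angle or velocity error -> no commanded torque.
print(impedance_torque(0.5, 0.0, 0.5))  # 0.0
```

In practice, exoskeleton controllers schedule such gains over the gait phase; this sketch shows only the core spring-damper relation.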
(This article belongs to the Section Actuators for Robotics)
22 pages, 1381 KiB  
Review
Artificial Intelligence and ECG: A New Frontier in Cardiac Diagnostics and Prevention
by Dorota Bartusik-Aebisher, Kacper Rogóż and David Aebisher
Biomedicines 2025, 13(7), 1685; https://doi.org/10.3390/biomedicines13071685 - 9 Jul 2025
Viewed by 1336
Abstract
Objectives: With the growing importance of mobile technology and artificial intelligence (AI) in healthcare, the development of automated cardiac diagnostic systems has gained strategic significance. This review aims to summarize the current state of knowledge on the use of AI in the analysis of electrocardiographic (ECG) signals obtained from wearable devices, particularly smartwatches, and to outline perspectives for future clinical applications. Methods: A narrative literature review was conducted using PubMed, Web of Science, and Scopus databases. The search focused on combinations of keywords related to AI, ECG, and wearable technologies. After screening and applying inclusion criteria, 152 publications were selected for final analysis. Conclusions: Modern AI algorithms—especially deep neural networks—show promise in detecting arrhythmias, heart failure, long QT syndrome, and other cardiovascular conditions. Smartwatches without ECG sensors, using photoplethysmography (PPG) and machine learning, show potential as supportive tools for preliminary atrial fibrillation (AF) screening at the population level, although further validation in diverse real-world settings is needed. This article explores innovation trends such as genetic data integration, digital twins, federated learning, and local signal processing. Regulatory, technical, and ethical challenges are also discussed, along with the issue of limited clinical evidence. Artificial intelligence enables a significant enhancement of personalized, mobile, and preventive cardiology. Its integration into smartwatch ECG analysis opens a path toward early detection of cardiac disorders and the implementation of population-scale screening approaches. Full article
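As a hedged illustration of the kind of lightweight, on-device signal processing the abstract alludes to for preliminary AF screening, the sketch below flags rhythm irregularity from inter-beat (RR) intervals using the RMSSD statistic. The function names and the 0.1 s threshold are assumptions for illustration, not a validated clinical cutoff or any reviewed system's method.

```python
# Hedged sketch of the kind of lightweight, on-device signal processing the
# abstract alludes to: flagging rhythm irregularity from inter-beat (RR)
# intervals. The RMSSD statistic is standard, but the 0.1 s cutoff and the
# function names are illustrative assumptions, not a validated criterion.

def rmssd(rr_intervals):
    """Root mean square of successive differences of RR intervals (seconds)."""
    diffs = [b - a for a, b in zip(rr_intervals, rr_intervals[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def irregular_rhythm(rr_intervals, threshold=0.1):
    """Crude pre-screening flag: True if beat-to-beat variability is high."""
    return rmssd(rr_intervals) > threshold

regular = [0.80, 0.81, 0.79, 0.80, 0.82]   # steady, sinus-like RR series
erratic = [0.62, 1.05, 0.71, 1.20, 0.55]   # AF-like irregular RR series
print(irregular_rhythm(regular), irregular_rhythm(erratic))  # False True
```

A real screening pipeline would feed such interval features, or the raw PPG waveform, into a trained classifier rather than a fixed threshold.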
(This article belongs to the Special Issue Feature Reviews in Cardiovascular Diseases)
17 pages, 6262 KiB  
Article
An Intelligent Thermal Management Strategy for a Data Center Prototype Based on Digital Twin Technology
by Hang Yuan, Zeyu Zhang, Duobing Yang, Tianyou Xue, Dongsheng Wen and Guice Yao
Appl. Sci. 2025, 15(14), 7675; https://doi.org/10.3390/app15147675 - 9 Jul 2025
Viewed by 334
Abstract
Data centers contribute roughly 1% of global energy consumption and 0.3% of worldwide carbon dioxide emissions, with cooling alone accounting for a substantial 50% of a data center's total energy use. Lowering the Power Usage Effectiveness (PUE) of data center cooling systems from 2.2 to 1.4, or even below, is one of the critical issues in thermal management. In this work, a digital twin system of an Intelligent Data Center (IDC) prototype is designed to monitor the temperature distribution in real time. Moreover, aiming to lower PUE, a Deep Q-Network (DQN) is established to make thermal-management optimization decisions when cooling down local hotspots. The entire thermal-management process for the IDC can be visualized in real time in Unity, forming the virtual entity of the data center prototype and providing an intelligent solution for sustainable data center operation. Full article
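The article couples its digital twin with a DQN for cooling decisions. As a simplified stand-in for that decision loop, the sketch below uses tabular Q-learning (no neural network) over an invented toy environment; the temperature states, fan actions, and reward shaping are hypothetical, not taken from the paper.

```python
import random

# Hedged illustration only: the article uses a Deep Q-Network; this stand-in
# uses tabular Q-learning over an invented toy environment. The temperature
# states, fan actions, and reward shaping are hypothetical assumptions.
random.seed(0)
STATES = ["cool", "warm", "hot"]     # discretized hotspot temperature bands
ACTIONS = ["fan_low", "fan_high"]    # cooling actions
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def step(state, action):
    """Toy dynamics: high fan always cools the hotspot but draws extra power."""
    if action == "fan_high":
        return "cool", (1.0 if state == "hot" else -0.2)  # power penalty if not hot
    nxt = "warm" if state == "cool" else "hot"            # low fan lets heat build
    return nxt, (-1.0 if state == "hot" else 0.5)

alpha, gamma, eps = 0.5, 0.9, 0.1    # learning rate, discount, exploration rate
state = "warm"
for _ in range(500):
    if random.random() < eps:                         # epsilon-greedy exploration
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, reward = step(state, action)
    best_next = max(Q[(nxt, a)] for a in ACTIONS)     # Bellman backup target
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
    state = nxt

# The learned policy should select the high fan setting at the hotspot.
print(max(ACTIONS, key=lambda a: Q[("hot", a)]))
```

A DQN replaces the Q table with a neural network so the controller can act on continuous sensor readings from the twin rather than a handful of discrete states.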
(This article belongs to the Special Issue Multiscale Heat and Mass Transfer and Artificial Intelligence)
