Search Results (426)

Search Parameters:
Keywords = energy data privacy

24 pages, 650 KiB  
Article
Investigating Users’ Acceptance of Autonomous Buses by Examining Their Willingness to Use and Willingness to Pay: The Case of the City of Trikala, Greece
by Spyros Niavis, Nikolaos Gavanas, Konstantina Anastasiadou and Paschalis Arvanitidis
Urban Sci. 2025, 9(8), 298; https://doi.org/10.3390/urbansci9080298 - 1 Aug 2025
Abstract
Autonomous vehicles (AVs) have emerged as a promising sustainable urban mobility solution, expected to lead to enhanced road safety, smoother traffic flows, less traffic congestion, improved accessibility, better energy utilization and environmental performance, as well as more efficient passenger and freight transportation, in terms of time and cost, due to better fleet management and platooning. However, challenges also arise, mostly related to data privacy, security and cyber-security, high acquisition and infrastructure costs, accident liability, and even possible increases in traffic congestion and air pollution due to induced travel demand. This paper presents the results of a survey conducted among 654 residents who experienced an autonomous bus (AB) service in the city of Trikala, Greece, in order to assess their willingness to use (WTU) and willingness to pay (WTP) for ABs, by testing a range of factors drawn from a literature review. The results offer useful guidance to policy-makers: the intention to use ABs was shaped mostly by psychological factors (e.g., users' perceptions of usefulness and safety, and trust in the service provider), and WTU appeared to be positively affected by previous experience with ABs. In contrast, sociodemographic factors had very little effect on the intention to use ABs, and beyond personal utility, users' perceptions of how autonomous driving would improve overall living standards in the study area also mattered. Full article

24 pages, 845 KiB  
Article
Towards Tamper-Proof Trust Evaluation of Internet of Things Nodes Leveraging IOTA Ledger
by Assiya Akli and Khalid Chougdali 
Sensors 2025, 25(15), 4697; https://doi.org/10.3390/s25154697 - 30 Jul 2025
Viewed by 177
Abstract
Trust evaluation has become a major challenge in the quickly developing Internet of Things (IoT) environment because of the vulnerabilities and security hazards associated with networked devices. To overcome these obstacles, this study offers a novel approach for evaluating trust that uses IOTA Tangle technology. By decentralizing the trust evaluation process, our approach reduces the risks related to centralized solutions, including privacy violations and single points of failure. To offer a thorough and reliable trust evaluation, this study combines direct and indirect trust measures. Moreover, we incorporate IOTA-based trust metrics to evaluate a node’s trust based on its activity in creating and validating IOTA transactions. The proposed framework ensures data integrity and secrecy by implementing immutable, secure storage for trust scores on IOTA. This ensures that no node transmits a wrong trust score for itself. The results show that the proposed scheme is efficient compared to recent literature, achieving up to +3.5% higher malicious node detection accuracy, up to 93% improvement in throughput, 40% reduction in energy consumption, and up to 24% lower end-to-end delay across various network sizes and adversarial conditions. Our contributions improve the scalability, security, and dependability of trust assessment processes in Internet of Things networks, providing a strong solution to the prevailing issues in current centralized trust models. Full article
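The direct-plus-indirect trust combination this abstract describes can be sketched as follows; the weight `alpha` and the plain averaging of recommendations are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch: blend a node's directly observed trust with the mean of
# third-party (indirect) recommendations. The weight alpha and the
# averaging rule are assumptions, not the paper's exact model.
def combined_trust(direct, indirect_scores, alpha=0.6):
    indirect = sum(indirect_scores) / len(indirect_scores) if indirect_scores else 0.0
    return alpha * direct + (1 - alpha) * indirect

# A node with a strong direct history and decent recommendations:
score = combined_trust(0.9, [0.8, 0.7, 0.75])  # 0.6*0.9 + 0.4*0.75 = 0.84
```

In the paper's framework the resulting scores are then written immutably to the IOTA Tangle, so a node cannot later overwrite its own score.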

37 pages, 1895 KiB  
Review
A Review of Artificial Intelligence and Deep Learning Approaches for Resource Management in Smart Buildings
by Bibars Amangeldy, Timur Imankulov, Nurdaulet Tasmurzayev, Gulmira Dikhanbayeva and Yedil Nurakhov
Buildings 2025, 15(15), 2631; https://doi.org/10.3390/buildings15152631 - 25 Jul 2025
Viewed by 455
Abstract
This comprehensive review maps the fast-evolving landscape in which artificial intelligence (AI) and deep-learning (DL) techniques converge with the Internet of Things (IoT) to manage energy, comfort, and sustainability across smart environments. A PRISMA-guided search of four databases retrieved 1358 records; after applying inclusion criteria, 143 peer-reviewed studies published between January 2019 and April 2025 were analyzed. This review shows that AI-driven controllers, especially deep-reinforcement-learning agents, deliver median energy savings of 18–35% for HVAC and other major loads, consistently outperforming rule-based and model-predictive baselines. The evidence further reveals a rapid diversification of methods: graph-neural-network models now capture spatial interdependencies in dense sensor grids, federated-learning pilots address data-privacy constraints, and early integrations of large language models hint at natural-language analytics and control interfaces for heterogeneous IoT devices. Yet large-scale deployment remains hindered by fragmented and proprietary datasets, unresolved privacy and cybersecurity risks associated with continuous IoT telemetry, the growing carbon and compute footprints of ever-larger models, and poor interoperability among legacy equipment and modern edge nodes. The literature therefore converges on several priorities: open, high-fidelity benchmarks that marry multivariate IoT sensor data with standardized metadata and occupant feedback; energy-aware, edge-optimized architectures that lower latency and power draw; privacy-centric learning frameworks that satisfy tightening regulations; hybrid physics-informed and explainable models that shorten commissioning time; and digital-twin platforms enriched by language-model reasoning to translate raw telemetry into actionable insights for facility managers and end users. Addressing these gaps will be pivotal to transforming isolated pilots into ubiquitous, trustworthy, and human-centered IoT ecosystems capable of delivering measurable gains in efficiency, resilience, and occupant wellbeing at scale. Full article
(This article belongs to the Section Building Energy, Physics, Environment, and Systems)

32 pages, 5164 KiB  
Article
Decentralized Distributed Sequential Neural Networks Inference on Low-Power Microcontrollers in Wireless Sensor Networks: A Predictive Maintenance Case Study
by Yernazar Bolat, Iain Murray, Yifei Ren and Nasim Ferdosian
Sensors 2025, 25(15), 4595; https://doi.org/10.3390/s25154595 - 24 Jul 2025
Viewed by 346
Abstract
The growing adoption of IoT applications has led to increased use of low-power microcontroller units (MCUs) for energy-efficient, local data processing. However, deploying deep neural networks (DNNs) on these constrained devices is challenging due to limitations in memory, computational power, and energy. Traditional methods like cloud-based inference and model compression often incur bandwidth, privacy, and accuracy trade-offs. This paper introduces a novel Decentralized Distributed Sequential Neural Network (DDSNN) designed for low-power MCUs in Tiny Machine Learning (TinyML) applications. Unlike existing methods that rely on centralized cluster-based approaches, DDSNN partitions a pre-trained LeNet across multiple MCUs, enabling fully decentralized inference in wireless sensor networks (WSNs). We validate DDSNN in a real-world predictive maintenance scenario, where vibration data from an industrial pump is analyzed in real time. The experimental results show that DDSNN achieves 99.01% accuracy, matching the non-distributed baseline model while reducing inference latency by approximately 50%, a significant improvement over traditional, non-distributed approaches that demonstrates its practical feasibility under realistic operating conditions. Full article
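The partitioning idea behind DDSNN can be illustrated with a toy sketch: split a sequential layer pipeline into contiguous slices, one per node, and pass the intermediate activation from node to node. The toy layers and the equal-split rule below are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): sequential DNN inference
# split across sensor nodes, each holding a contiguous slice of the layer
# pipeline; the toy "layers" below stand in for real LeNet layers.
def partition(layers, n_nodes):
    """Split a layer list into n_nodes contiguous, near-equal parts."""
    k, r = divmod(len(layers), n_nodes)
    parts, i = [], 0
    for n in range(n_nodes):
        size = k + (1 if n < r else 0)
        parts.append(layers[i:i + size])
        i += size
    return parts

def node_forward(x, part):
    for layer in part:
        x = layer(x)
    return x

layers = [lambda x: x * 2, lambda x: x + 1, lambda x: x ** 2]
x = 3
for part in partition(layers, 2):  # each hop models a wireless hand-off
    x = node_forward(x, part)
# x is now ((3 * 2) + 1) ** 2 == 49
```

Because each node only applies its own slice, the composed result is identical to running the full pipeline on one device, which is why the distributed model can match the baseline's accuracy.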

23 pages, 906 KiB  
Article
Detection Model for 5G Core PFCP DDoS Attacks Based on Sin-Cos-bIAVOA
by Zheng Ma, Rui Zhang and Lang Gao
Algorithms 2025, 18(7), 449; https://doi.org/10.3390/a18070449 - 21 Jul 2025
Viewed by 258
Abstract
The development of 5G environments has several advantages, including accelerated data transfer speeds, reduced latency, and improved energy efficiency. Nevertheless, it also increases the risk of severe cybersecurity issues, including a complex and enlarged attack surface, privacy concerns, and security threats to 5G core network functions. A 5G core network DDoS attack detection model is proposed that utilizes a binary improved African Vultures Optimization Algorithm (Sin-Cos-bIAVOA), originally designed for IoT DDoS detection, to select effective features for DDoS attacks. This approach employs a novel composite transfer function (Sin-Cos) to enhance exploration. The proposed method's performance is compared with classical algorithms on the 5G Core PFCP DDoS attacks dataset. After rigorous testing across a spectrum of attack scenarios, the proposed detection model achieves higher detection accuracy than traditional DDoS detection algorithms, making it better equipped to identify and mitigate DDoS attacks. This is particularly noteworthy in the context of 5G core networks, as it offers a novel solution to DDoS attack detection for this critical infrastructure. Full article
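A composite sin-cos transfer function maps a continuous optimizer position to a probability of flipping a feature-selection bit to 1. The exact formula below is a plausible stand-in, not the paper's published definition, and is heavily hedged as such.

```python
import math
import random

# Hedged sketch: one plausible composite sin-cos transfer function for
# binarizing a continuous optimizer position. The paper's exact formula
# is not reproduced here; this only illustrates the mechanism.
def sin_cos_transfer(x):
    # |sin(x)*cos(x)| = |sin(2x)|/2, so the value lies in [0, 0.5]
    return abs(math.sin(x) * math.cos(x))

def binarize(x, rng):
    p = 2 * sin_cos_transfer(x)  # rescaled to a probability in [0, 1]
    return 1 if rng.random() < p else 0

rng = random.Random(0)
bits = [binarize(x, rng) for x in [0.1, 0.8, 1.5, 2.0]]  # feature mask
```

Each resulting bit marks whether the corresponding traffic feature is kept for the DDoS classifier in that iteration.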

31 pages, 4668 KiB  
Article
BLE Signal Processing and Machine Learning for Indoor Behavior Classification
by Yi-Shiun Lee, Yong-Yi Fanjiang, Chi-Huang Hung and Yung-Shiang Huang
Sensors 2025, 25(14), 4496; https://doi.org/10.3390/s25144496 - 19 Jul 2025
Viewed by 295
Abstract
Smart home technology enhances the quality of life, particularly with respect to in-home care and health monitoring. While video-based methods provide accurate behavior analysis, privacy concerns drive interest in non-visual alternatives. This study proposes a Bluetooth Low Energy (BLE)-enabled indoor positioning and behavior recognition system, integrating machine learning techniques to support sustainable and privacy-preserving health monitoring. Key optimizations include: (1) a vertically mounted Data Collection Unit (DCU) for improved height positioning, (2) synchronized data collection to reduce discrepancies, (3) Kalman filtering to smooth RSSI signals, and (4) AI-based RSSI analysis for enhanced behavior recognition. Experiments in a real home environment used a smart wristband to assess BLE signal variations across different activities (standing, sitting, lying down). The results show that the proposed system reliably tracks user locations and identifies behavior patterns. This research supports elderly care, remote health monitoring, and non-invasive behavior analysis, providing a privacy-preserving solution for smart healthcare applications. Full article
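The Kalman-filtering step in optimization (3) is a standard scalar filter over a noisy RSSI stream. A minimal sketch, assuming a constant-state model with process variance `q` and measurement variance `r` (the paper's tuned values are not given here):

```python
# Minimal scalar Kalman filter for smoothing noisy RSSI readings.
# q (process variance) and r (measurement variance) are assumed values;
# tuning them trades responsiveness against smoothness.
def kalman_smooth(rssi, q=0.01, r=4.0):
    x, p = rssi[0], 1.0          # initial state estimate and variance
    out = [x]
    for z in rssi[1:]:
        p += q                   # predict: state assumed constant
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update toward measurement z
        p *= (1 - k)
        out.append(x)
    return out

smoothed = kalman_smooth([-60, -72, -58, -65, -61])  # dBm samples
```

Each update is a convex combination of the previous estimate and the new reading, so the smoothed trace stays within the range of the raw samples while suppressing the jitter that degrades positioning.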

33 pages, 2299 KiB  
Review
Edge Intelligence in Urban Landscapes: Reviewing TinyML Applications for Connected and Sustainable Smart Cities
by Athanasios Trigkas, Dimitrios Piromalis and Panagiotis Papageorgas
Electronics 2025, 14(14), 2890; https://doi.org/10.3390/electronics14142890 - 19 Jul 2025
Viewed by 448
Abstract
Tiny Machine Learning (TinyML) extends edge AI capabilities to resource-constrained devices, offering a promising solution for real-time, low-power intelligence in smart cities. This review systematically analyzes 66 peer-reviewed studies from 2019 to 2024, covering applications across urban mobility, environmental monitoring, public safety, waste management, and infrastructure health. We examine hardware platforms and machine learning models, with particular attention to power-efficient deployment and data privacy. We review the approaches employed in published studies for deploying machine learning models on resource-constrained hardware, emphasizing the most commonly used communication technologies, while noting the limited uptake of low-power options such as Low Power Wide Area Networks (LPWANs). We also discuss hardware–software co-design strategies that enable sustainable operation. Furthermore, we evaluate the alignment of these deployments with the United Nations Sustainable Development Goals (SDGs), highlighting both their contributions and existing gaps in current practices. This review identifies recurring technical patterns, methodological challenges, and underexplored opportunities, particularly in hardware provisioning, the use of inherent privacy benefits in relevant applications, communication technologies, and dataset practices, offering a roadmap for future TinyML research and deployment in smart urban systems. Among the 66 studies examined, 29 focused on mobility and transportation, 17 on public safety, 10 on environmental sensing, 6 on waste management, and 4 on infrastructure monitoring. TinyML was deployed on constrained microcontrollers in 32 studies, while 36 used optimized models for resource-limited environments. Energy harvesting, primarily solar, was featured in 6 studies, and low-power communication networks were used in 5. Public datasets were used in 27 studies, custom datasets in 24, and the remainder relied on hybrid or simulated data. Only one study explicitly referenced SDGs, and 13 studies considered privacy in their system design. Full article
(This article belongs to the Special Issue New Advances in Embedded Software and Applications)

25 pages, 732 KiB  
Article
Accuracy-Aware MLLM Task Offloading and Resource Allocation in UAV-Assisted Satellite Edge Computing
by Huabing Yan, Hualong Huang, Zijia Zhao, Zhi Wang and Zitian Zhao
Drones 2025, 9(7), 500; https://doi.org/10.3390/drones9070500 - 16 Jul 2025
Viewed by 332
Abstract
This paper presents a novel framework for optimizing multimodal large language model (MLLM) inference through task offloading and resource allocation in UAV-assisted satellite edge computing (SEC) networks. MLLMs leverage transformer architectures to integrate heterogeneous data modalities for IoT applications, particularly real-time monitoring in remote areas. However, cloud computing dependency introduces latency, bandwidth, and privacy challenges, while the limitations of IoT devices (IoTDs) require efficient distributed computing solutions. SEC, utilizing low-earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs), extends mobile edge computing to provide ubiquitous computational resources for remote IoTDs. We formulate the joint optimization of MLLM task offloading and resource allocation as a mixed-integer nonlinear programming (MINLP) problem, minimizing latency and energy consumption while optimizing offloading decisions, power allocation, and UAV trajectories. To address the dynamic SEC environment characterized by satellite mobility, we propose an action-decoupled soft actor–critic (AD-SAC) algorithm with discrete–continuous hybrid action spaces. The simulation results demonstrate that our approach significantly outperforms conventional deep reinforcement learning baselines in convergence and system cost reduction. Full article

35 pages, 11934 KiB  
Article
A Data-Driven Approach for Generating Synthetic Load Profiles with GANs
by Tsvetelina Kaneva, Irena Valova, Katerina Gabrovska-Evstatieva and Boris Evstatiev
Appl. Sci. 2025, 15(14), 7835; https://doi.org/10.3390/app15147835 - 13 Jul 2025
Viewed by 332
Abstract
The generation of realistic electrical load profiles is essential for advancing smart grid analytics, demand forecasting, and privacy-preserving data sharing. Traditional approaches often rely on large, high-resolution datasets and complex recurrent neural architectures, which can be unstable or ineffective when training data are limited. This paper proposes a data-driven framework based on a lightweight 1D Convolutional Wasserstein GAN with Gradient Penalty (Conv1D-WGAN-GP) for generating high-fidelity synthetic 24 h load profiles. The model is specifically designed to operate on small- to medium-sized datasets, where recurrent models often fail due to overfitting or training instability. The approach leverages the ability of Conv1D layers to capture localized temporal patterns while remaining compact and stable during training. We benchmark the proposed model against vanilla GAN, WGAN-GP, and Conv1D-GAN across four datasets with varying consumption patterns and sizes, including industrial, agricultural, and residential domains. Quantitative evaluations using statistical divergence measures, Real-vs-Synthetic Distinguishability Score, and visual similarity confirm that Conv1D-WGAN-GP consistently outperforms baselines, particularly in low-data scenarios. This demonstrates its robustness, generalization capability, and suitability for privacy-sensitive energy modeling applications where access to large datasets is constrained. Full article
(This article belongs to the Special Issue Innovations in Artificial Neural Network Applications)
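One of the quantitative checks mentioned above, a statistical divergence between real and synthetic load distributions, can be sketched with a histogram-based Jensen–Shannon divergence. The binning scheme and the toy load values are illustrative assumptions, not the paper's data or exact metric.

```python
import math
from collections import Counter

# Hedged sketch: Jensen-Shannon divergence between histograms of real and
# synthetic (normalized) load values. Bin count, range, and sample data
# are illustrative assumptions.
def histogram(values, bins, lo, hi):
    width = (hi - lo) / bins
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    return [counts.get(i, 0) / len(values) for i in range(bins)]

def js_divergence(p, q, eps=1e-12):
    def kl(a, b):
        return sum(x * math.log((x + eps) / (y + eps)) for x, y in zip(a, b))
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

real = [0.2, 0.3, 0.25, 0.7, 0.8]          # toy normalized load samples
synthetic = [0.1, 0.15, 0.3, 0.6, 0.9]     # toy generator output
jsd = js_divergence(histogram(real, 4, 0, 1), histogram(synthetic, 4, 0, 1))
```

Lower values indicate that the generator's output distribution is closer to the real one; with natural logarithms the divergence is bounded by ln 2.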

34 pages, 924 KiB  
Systematic Review
Smart Microgrid Management and Optimization: A Systematic Review Towards the Proposal of Smart Management Models
by Paul Arévalo, Dario Benavides, Danny Ochoa-Correa, Alberto Ríos, David Torres and Carlos W. Villanueva-Machado
Algorithms 2025, 18(7), 429; https://doi.org/10.3390/a18070429 - 11 Jul 2025
Cited by 1 | Viewed by 542
Abstract
The increasing integration of renewable energy sources (RES) in power systems presents challenges related to variability, stability, and efficiency, particularly in smart microgrids. This systematic review, following the PRISMA 2020 methodology, analyzed 66 studies focused on advanced energy storage systems, intelligent control strategies, and optimization techniques. Hybrid storage solutions combining battery systems, hydrogen technologies, and pumped hydro storage were identified as effective approaches to mitigate RES intermittency and balance short- and long-term energy demands. The transition from centralized to distributed control architectures, supported by predictive analytics, digital twins, and AI-based forecasting, has improved operational planning and system monitoring. However, challenges remain regarding interoperability, data privacy, cybersecurity, and the limited availability of high-quality data for AI model training. Economic analyses show that while initial investments are high, long-term operational savings and improved resilience justify the adoption of advanced microgrid solutions when supported by appropriate policies and financial mechanisms. Future research should address the standardization of communication protocols, development of explainable AI models, and creation of sustainable business models to enhance resilience, efficiency, and scalability. These efforts are necessary to accelerate the deployment of decentralized, low-carbon energy systems capable of meeting future energy demands under increasingly complex operational conditions. Full article
(This article belongs to the Special Issue Algorithms for Smart Cities (2nd Edition))

21 pages, 2751 KiB  
Review
Artificial Intelligence in Construction Project Management: A Structured Literature Review of Its Evolution in Application and Future Trends
by Yetunde Adebayo, Paul Udoh, Xebiso Blessing Kamudyariwa and Oluyomi Abayomi Osobajo
Digital 2025, 5(3), 26; https://doi.org/10.3390/digital5030026 - 9 Jul 2025
Viewed by 1441
Abstract
The integration of Artificial Intelligence (AI) in construction project management is revolutionising the industry, offering innovative solutions to enhance efficiency, reduce costs, and improve decision making. This structured literature review explored the current applications, benefits, challenges, and future trends of AI in construction project management. This study synthesised findings from 135 peer-reviewed articles published between 1985 and 2024, representing Industry 3.0 (3IR), Industry 4.0 (4IR), and post-COVID-19 Industry 4.0 (4IR PC). Analysis showed that the Planning and the Monitoring and Control project phases see the greatest application of AI, while decision making, prediction, optimisation, and performance improvement are the most common purposes of AI use in the construction industry. The drivers of AI adoption within the construction industry include technology availability, project outcome and performance improvement, competitive advantage, and a focus on sustainability. Despite these advancements, the review revealed several barriers to AI adoption, including data integration issues, the high cost of AI implementation, resistance to change among stakeholders, and ethical concerns surrounding data privacy, amongst others. This review also identified ongoing and future applications of AI in the construction industry, such as sustainability and energy efficiency, digital twins, advanced robotics and autonomous construction, and optimisation. By providing a comprehensive analysis of the evolution of practices and the future direction of AI application, this study serves as a resource for researchers, practitioners, and policymakers seeking to understand the evolving landscape of AI in construction project management. Full article
(This article belongs to the Special Issue AI-Driven Innovations in Ubiquitous Computing and Smart Environments)

40 pages, 886 KiB  
Article
Machine Learning in Smart Buildings: A Review of Methods, Challenges, and Future Trends
by Fatema El Husseini, Hassan N. Noura, Ola Salman and Khaled Chahine
Appl. Sci. 2025, 15(14), 7682; https://doi.org/10.3390/app15147682 - 9 Jul 2025
Viewed by 537
Abstract
Machine learning (ML) has emerged as a transformative force in smart building management due to its ability to significantly enhance energy efficiency and promote sustainability within the built environment. This review examines the pivotal role of ML in optimizing building operations through the application of predictive analytics and sophisticated automated control systems. It explores the diverse applications of ML techniques in critical areas such as energy forecasting, non-intrusive load monitoring (NILM), and predictive maintenance. A thorough analysis then identifies key challenges that impede widespread adoption, including issues related to data quality, privacy concerns, system integration complexities, and scalability limitations. Conversely, the review highlights promising emerging opportunities in advanced analytics, the seamless integration of renewable energy sources, and the convergence with the Internet of Things (IoT). Illustrative case studies underscore the tangible benefits of ML implementation, demonstrating substantial energy savings ranging from 15% to 40%. Future trends indicate a clear trajectory towards the development of highly autonomous building management systems and the widespread adoption of occupant-centric designs. Full article

31 pages, 9063 KiB  
Article
Client Selection in Federated Learning on Resource-Constrained Devices: A Game Theory Approach
by Zohra Dakhia and Massimo Merenda
Appl. Sci. 2025, 15(13), 7556; https://doi.org/10.3390/app15137556 - 5 Jul 2025
Viewed by 414
Abstract
Federated Learning (FL), a key paradigm in privacy-preserving and distributed machine learning (ML), enables collaborative model training across decentralized data sources without requiring raw data exchange. However, selecting appropriate clients remains a major challenge, especially in heterogeneous environments with diverse battery levels, privacy needs, and learning capacities. In this work, a centralized reward-based payoff strategy (RBPS) with cooperative intent is proposed for client selection. In RBPS, each client evaluates participation based on its locally measured battery level, privacy requirement, and the model's accuracy in the current round, computing a payoff from these factors and electing to participate if the payoff exceeds a predefined threshold. Participating clients then receive the updated global model. By jointly optimizing model accuracy, privacy preservation, and battery-level constraints, RBPS realizes a multi-objective selection mechanism. Under realistic simulations of client heterogeneity, RBPS yields more robust and efficient training than existing methods, confirming its suitability for deployment in resource-constrained FL settings. Experimental analysis demonstrates that RBPS offers significant advantages over state-of-the-art (SOA) client selection methods, particularly those relying on a single selection criterion such as accuracy, battery, or privacy alone. These one-dimensional approaches often lead to trade-offs where improvements in one aspect come at the cost of another. In contrast, RBPS leverages client heterogeneity not as a limitation but as a strategic asset to maintain and balance all critical characteristics simultaneously. Rather than optimizing performance for a single device type or constraint, RBPS benefits from the diversity of heterogeneous clients, enabling improved accuracy, energy preservation, and privacy protection all at once. This is achieved by dynamically adapting the selection strategy to the strengths of different client profiles. Unlike homogeneous environments, where only one capability tends to dominate, RBPS ensures that no key property is sacrificed. RBPS thus aligns more closely with real-world FL deployments, where mixed-device participation is common and balanced optimization is essential. Full article
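The threshold-based payoff selection described in this abstract can be sketched as follows; the weights, threshold, and client values are hypothetical, chosen only to show the mechanism.

```python
# Hypothetical sketch of the RBPS idea: each client computes a weighted
# payoff from battery level, privacy fit, and current-round accuracy, and
# participates only if the payoff clears a threshold. Weights, threshold,
# and client data are assumptions for illustration.
def payoff(battery, privacy_fit, accuracy, w=(0.4, 0.3, 0.3)):
    return w[0] * battery + w[1] * privacy_fit + w[2] * accuracy

def select_clients(clients, threshold=0.6):
    return [cid for cid, (b, p, a) in clients.items()
            if payoff(b, p, a) >= threshold]

clients = {
    "c1": (0.9, 0.8, 0.7),   # healthy battery, good privacy fit
    "c2": (0.2, 0.5, 0.6),   # low battery, likely excluded
}
chosen = select_clients(clients)  # ["c1"]
```

Because the payoff mixes all three criteria, a client weak on one axis can still participate if the others compensate, which is the multi-objective balance the abstract emphasizes.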

21 pages, 4241 KiB  
Article
Federated Learning-Driven Cybersecurity Framework for IoT Networks with Privacy Preserving and Real-Time Threat Detection Capabilities
by Milad Rahmati and Antonino Pagano
Informatics 2025, 12(3), 62; https://doi.org/10.3390/informatics12030062 - 4 Jul 2025
Cited by 1 | Viewed by 727
Abstract
The rapid expansion of the Internet of Things (IoT) ecosystem has transformed industries but also exposed significant cybersecurity vulnerabilities. Traditional centralized methods for securing IoT networks struggle to balance privacy preservation with real-time threat detection. This study presents a Federated Learning-Driven Cybersecurity Framework designed for IoT environments, enabling decentralized data processing through local model training on edge devices to ensure data privacy. Secure aggregation using homomorphic encryption supports collaborative learning without exposing sensitive information. The framework employs GRU-based recurrent neural networks (RNNs) for anomaly detection, optimized for resource-constrained IoT networks. Experimental results demonstrate over 98% accuracy in detecting threats such as distributed denial-of-service (DDoS) attacks, with a 20% reduction in energy consumption and a 30% reduction in communication overhead, showcasing the framework’s efficiency over traditional centralized approaches. This work addresses critical gaps in IoT cybersecurity by integrating federated learning with advanced threat detection techniques. It offers a scalable, privacy-preserving solution for diverse IoT applications, with future directions including blockchain integration for model aggregation traceability and quantum-resistant cryptography to enhance security. Full article
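The aggregation step underlying frameworks like this one is federated averaging: the server combines client model weights, typically weighted by local dataset size. A minimal sketch of plain averaging (the paper additionally protects this step with homomorphic encryption, which is omitted here):

```python
# Minimal federated-averaging sketch: average client weight vectors,
# weighted by each client's local data size. The encryption layer the
# paper adds on top is omitted; values below are toy data.
def fed_avg(client_weights, client_sizes):
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(dim)]

# Two clients; the second holds 3x more data and so pulls the average:
global_w = fed_avg([[1.0, 2.0], [3.0, 4.0]], [1, 3])  # [2.5, 3.5]
```

Each round, clients train locally (here, a GRU-based anomaly detector), send only weights, and receive the averaged global model back, so raw telemetry never leaves the device.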

25 pages, 1524 KiB  
Article
Detecting Emerging DGA Malware in Federated Environments via Variational Autoencoder-Based Clustering and Resource-Aware Client Selection
by Ma Viet Duc, Pham Minh Dang, Tran Thu Phuong, Truong Duc Truong, Vu Hai and Nguyen Huu Thanh
Future Internet 2025, 17(7), 299; https://doi.org/10.3390/fi17070299 - 3 Jul 2025
Viewed by 368
Abstract
Domain Generation Algorithms (DGAs) remain a persistent technique used by modern malware to establish stealthy command-and-control (C&C) channels, thereby evading traditional blacklist-based defenses. Detecting such evolving threats is especially challenging in decentralized environments where raw traffic data cannot be aggregated due to privacy or policy constraints. To address this, we present FedSAGE, a security-aware federated intrusion detection framework that combines Variational Autoencoder (VAE)-based latent representation learning with unsupervised clustering and resource-efficient client selection. Each client encodes its local domain traffic into a semantic latent space using a shared, pre-trained VAE trained solely on benign domains. These embeddings are clustered via affinity propagation to group clients with similar data distributions and identify outliers indicative of novel threats, without requiring any labeled DGA samples. Within each cluster, FedSAGE selects only the fastest clients for training, balancing computational constraints with threat visibility. Experimental results from the multi-zone DGA dataset show that FedSAGE improves detection accuracy by up to 11.6% and reduces energy consumption by up to 93.8% compared to standard FedAvg under non-IID conditions. Notably, the latent clustering perfectly recovers ground-truth DGA family zones, enabling effective anomaly detection in a fully unsupervised manner while remaining privacy-preserving. These findings demonstrate that FedSAGE is a practical and lightweight approach for decentralized detection of evasive malware, offering a viable solution for secure and adaptive defense in resource-constrained edge environments. Full article
(This article belongs to the Special Issue Security of Computer System and Network)
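FedSAGE's per-cluster selection step, keeping only the fastest clients within each cluster of similar data distributions, can be sketched as below. The cluster assignments, round timings, and `keep` parameter are made-up example values, not the paper's data.

```python
# Illustrative sketch of per-cluster client selection: within each cluster
# of distribution-similar clients, keep only the `keep` fastest for this
# training round. Clusters and timings are hypothetical example data.
def fastest_per_cluster(clusters, round_times, keep=1):
    chosen = []
    for members in clusters.values():
        ranked = sorted(members, key=lambda c: round_times[c])
        chosen.extend(ranked[:keep])
    return chosen

clusters = {"zone_a": ["c1", "c2"], "zone_b": ["c3", "c4", "c5"]}
round_times = {"c1": 2.1, "c2": 0.9, "c3": 1.5, "c4": 3.0, "c5": 1.1}
selected = fastest_per_cluster(clusters, round_times)  # ["c2", "c5"]
```

Because every cluster still contributes at least one client, the round keeps visibility into each data distribution (and hence each potential DGA family) while avoiding stragglers, which is where the reported energy savings come from.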
