Search Results (119)

Search Parameters:
Keywords = fog/edge security

45 pages, 7613 KB  
Article
BrainTwin-AI: A Multimodal MRI-EEG-Based Cognitive Digital Twin for Real-Time Brain Health Intelligence
by Himadri Nath Saha, Utsho Banerjee, Rajarshi Karmakar, Saptarshi Banerjee and Jon Turdiev
Brain Sci. 2026, 16(4), 411; https://doi.org/10.3390/brainsci16040411 - 13 Apr 2026
Viewed by 343
Abstract
Background/Objectives: Brain health monitoring is increasingly essential as modern cognitive load, stress, and lifestyle pressures contribute to widespread neural instability. The paper presents BrainTwin, a next-generation cognitive digital twin: a patient-specific, constantly updating computational model that combines state-of-the-art MRI analytics for neuro-oncological assessment (the detection, progression tracking, and monitoring of tumors affecting the central nervous system) with real-time EEG-based brain health intelligence. Methods: Structural analysis is driven by an Enhanced Vision Transformer (ViT++), which improves spatial representation and boundary localization, achieving more accurate tumor prediction than conventional models. The extracted tumor volume forms the baseline for short-horizon tumor progression modeling. In parallel with MRI analysis, continuous EEG signals are captured through an in-house wearable skullcap and preprocessed using Edge AI on a Hailo Toolkit-enabled Raspberry Pi 5 for low-latency denoising and secure cloud transmission. Pre-processed EEG packets are authenticated at the fog layer, ensuring secure and reliable cloud transfer and significantly reducing the load on the edge and cloud nodes. In the digital twin, EEG characteristics offer real-time functional monitoring through dynamic brainwave analysis, while a BiLSTM classifier distinguishes relaxed, stress, and fatigue states: probabilistically inferred cognitive conditions derived from EEG spectral patterns. Unlike static MRI imaging, EEG provides real-time brain health monitoring. BrainTwin performs EEG–MRI fusion, correlating functional EEG metrics with ViT++ structural embeddings to produce a single risk score that clinicians can interpret to assess brain vulnerability to future disease.
Explainable artificial intelligence (XAI) provides clinical interpretability through gradient-weighted class activation mapping (Grad-CAM) heatmaps, which are used to interpret ViT++ decisions and are visualized on a 3D interactive brain model to allow more in-depth inspection of spatial details. Results: The evaluation metrics demonstrate a BiLSTM macro-F1 of 0.94 (Precision/Recall/F1: Relaxed 0.96, Stress 0.93, Fatigue 0.92) and a ViT++ MRI accuracy of 96%, outperforming baseline architectures. Conclusions: These results demonstrate BrainTwin’s reliability, interpretability, and clinical utility as an integrated digital companion for tumor assessment and real-time functional brain monitoring.
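The EEG–MRI fusion into a single clinician-facing risk score can be sketched as a simple convex combination. This is a minimal illustration assuming both inputs are already normalized to [0, 1]; the weight `w` and the function itself are hypothetical stand-ins, not the paper's actual fusion of ViT++ embeddings with EEG metrics:

```python
def fused_risk_score(structural: float, functional: float, w: float = 0.6) -> float:
    """Combine a structural (MRI-derived) and a functional (EEG-derived)
    risk estimate, both normalized to [0, 1], into one score.
    The weight w is a hypothetical tuning parameter, not from the paper."""
    if not (0.0 <= structural <= 1.0 and 0.0 <= functional <= 1.0):
        raise ValueError("scores must be normalized to [0, 1]")
    return w * structural + (1.0 - w) * functional

# Example: a moderately abnormal MRI finding with low EEG stress markers.
score = fused_risk_score(0.7, 0.2)
```

In a real system, the two inputs would themselves come from calibrated models rather than raw scores.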

45 pages, 3695 KB  
Article
Towards a Reference Architecture for Machine Learning Operations
by Miguel Ángel Mateo-Casalí, Andrés Boza and Francisco Fraile
Computers 2026, 15(4), 218; https://doi.org/10.3390/computers15040218 - 1 Apr 2026
Viewed by 521
Abstract
Industrial organisations increasingly rely on machine learning (ML) to improve quality, maintenance, and planning in Industry 4.0/5.0 ecosystems. However, turning experimental models into reliable services on the production floor remains complex due to the heterogeneity of operational technologies (OTs) and information technologies (ITs), including implementation constraints, latency in edge-fog-cloud scenarios, governance requirements, and continuous performance degradation caused by data drift. Although Machine Learning Operations (MLOps) provides lifecycle practices for deployment, monitoring, and retraining, the evidence is fragmented across tool-centric descriptions, case-specific pipelines, and conceptual architectures, offering limited guidance on which industrial constraints should inform architectural decisions and how to evaluate solutions. This work addresses that gap through a PRISMA-guided systematic review of 49 studies on industrial MLOps (with the search and screening primarily targeting Industry 4.0/IIoT operationalisation contexts, as reflected in the search strategy and corpus) and an evidence-based synthesis of principles, challenges, lifecycle practices, and enabling technologies. From this synthesis, industrial requirements are derived that encompass OT/IT integration, edge-fog-cloud orchestration, security and traceability, and observability-based lifecycle control. On this basis, a reference architecture is proposed that maps these requirements to functional layers, data and control flows, and verifiable responsibilities. To support reproducibility and practical inspectability, the article also presents an open-source architectural instantiation aligned with the proposed decomposition. 
Finally, the evaluation is illustrated through a predictive maintenance use case (tool breakage) in a single CNC machining cell, where the objective is to demonstrate end-to-end feasibility under realistic operational constraints rather than cross-scenario superiority or broad industrial generalisability.
(This article belongs to the Special Issue Machine Learning: Innovation, Implementation, and Impact)

27 pages, 2849 KB  
Systematic Review
Intrusion Detection in Fog Computing: A Systematic Review of Security Advances and Challenges
by Nyashadzashe Tamuka, Topside Ehleketani Mathonsi, Thomas Otieno Olwal, Solly Maswikaneng, Tonderai Muchenje and Tshimangadzo Mavin Tshilongamulenzhe
Computers 2026, 15(3), 169; https://doi.org/10.3390/computers15030169 - 5 Mar 2026
Viewed by 676
Abstract
Fog computing extends cloud services to the network edge to support low-latency IoT applications. However, since fog environments are distributed and resource-constrained, intrusion detection systems must be adapted to defend against cyberattacks while keeping computation and communication overhead minimal. This systematic review presents research on intrusion detection systems (IDSs) for fog computing and synthesizes advances and research gaps. The study was guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. Scopus and Web of Science were searched in the title field using TITLE/TI = (“intrusion detection” AND “fog computing”) for 2021–2025. The inclusion criteria were (i) 2021–2025 publications, (ii) journal or conference papers, (iii) English language, and (iv) open access availability; duplicates were removed programmatically using a DOI-first key with a title, year, and author fallback. The search identified 8560 records, of which 4905 were unique and were included for qualitative grouping and bibliometric synthesis. Metadata (year, venue, authors, affiliations, keywords, and citations) were extracted and analyzed in Python to compute trends and collaboration patterns. Intrusion detection systems in fog networks were categorized into traditional/signature-based, machine learning, deep learning, and hybrid/ensemble. Hybrid and DL approaches reported accuracy ranging from 95 to 99% on benchmark datasets (such as NSL-KDD, UNSW-NB15, CIC-IDS2017, KDD99, BoT-IoT). Notable bottlenecks included computational load relative to real-time latency on resource-constrained nodes, elevated false-positive rates for anomaly detection under concept drift, limited generalization to unseen attacks, privacy risks from centralizing data, and limited real-world validation.
Bibliometric analyses highlighted the field’s concentration in fast-turnaround, open-access journals such as IEEE Access and Sensors, as well as a small number of highly collaborative author clusters, alongside dominant terms such as “learning,” “federated,” “ensemble,” “lightweight,” and “explainability.” Emerging directions include federated and distributed training to preserve privacy, as well as online/continual learning adaptation. Future work should focus on real-world evaluation in fog networks, ultra-lightweight yet adaptive hybrid IDSs, self-learning, and secure cooperative frameworks. These insights help researchers select appropriate IDS models for fog networks.
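The review's DOI-first deduplication key can be sketched as follows. This is a minimal sketch assuming records are plain dicts; the field names (`doi`, `title`, `year`, `authors`) are illustrative, not the review's actual schema:

```python
def dedup_key(record: dict) -> tuple:
    """Prefer the DOI as the deduplication key; fall back to a
    normalized (title, year, first author) tuple when the DOI is missing."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return ("doi", doi)
    title = " ".join((record.get("title") or "").lower().split())
    first_author = (record.get("authors") or [""])[0].strip().lower()
    return ("meta", title, record.get("year"), first_author)

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each key."""
    seen, unique = set(), []
    for rec in records:
        key = dedup_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

Normalizing case and whitespace before comparison is what lets the fallback key catch near-duplicate metadata exports.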

36 pages, 721 KB  
Article
A Survey on IoT-Based Smart Electrical Systems: An Analysis of Standards, Security, and Applications
by Chiara Matta, Sara Pinna, Samoel Ortu, Francesco Parodo, Daniele Giusto and Matteo Anedda
Energies 2026, 19(4), 965; https://doi.org/10.3390/en19040965 - 12 Feb 2026
Viewed by 786
Abstract
The rapid integration of Internet of Things (IoT) technologies is transforming electrical power systems into intelligent, interconnected, and data-driven infrastructures, enabling advanced monitoring, control, and optimization of energy generation, distribution, and consumption across the entire energy value chain, while also introducing new challenges related to interoperability, security, scalability, and data management. Despite the growing body of literature, existing surveys typically address these challenges in isolation, focusing on individual technological or operational aspects and thus failing to capture their strong cross-dependencies in real-world deployments. This paper delivers a comprehensive survey that systematically analyzes and interrelates nine key dimensions that prior literature largely examines in separate silos: architectural models, communication protocols, reference standards, cybersecurity and privacy mechanisms, data processing paradigms (edge, fog, and cloud), interoperability solutions, energy management strategies, application scenarios, and future research directions. Unlike conventional reviews confined to single-layer or domain-specific perspectives, this survey adopts a holistic, cross-layer approach, explicitly linking architectural choices, protocol stacks, interoperability frameworks, and security mechanisms with application and energy management requirements.
(This article belongs to the Section F5: Artificial Intelligence and Smart Energy)

26 pages, 461 KB  
Systematic Review
A Systematic Review of Federated and Cloud Computing Approaches for Predicting Mental Health Risks
by Iram Fiaz, Nadia Kanwal and Amro Al-Said Ahmad
Sensors 2026, 26(1), 229; https://doi.org/10.3390/s26010229 - 30 Dec 2025
Viewed by 1047
Abstract
Mental health disorders affect large numbers of people worldwide and are a major cause of long-term disability. Digital health technologies such as mobile apps and wearable devices now generate rich behavioural data that could support earlier detection and more personalised care. However, these data are highly sensitive and distributed across devices and platforms, which makes privacy protection and scalable analysis challenging. Federated learning (FL) offers a way to train models across devices while keeping raw data local, and, when combined with edge, fog, or cloud computing, can support near-real-time mental health analysis. This review screened 1104 records, assessed 31 full-text articles using a five-question quality checklist, and retained 17 empirical studies that achieved a score of at least 7/10 for synthesis. The included studies were compared in terms of their FL and edge/cloud architectures, data sources, privacy and security techniques, and evidence for operation in real-world settings. The synthesis highlights innovative but fragmented progress, with limited work on comorbidity modelling, deployment evaluation, and common benchmarks, and identifies priorities for the development of scalable, practical, and ethically robust FL systems for digital mental health.
(This article belongs to the Special Issue Secure AI for Biomedical Sensing and Imaging Applications)
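Federated learning of the kind surveyed here is often realized with FedAvg-style aggregation, which can be sketched as a sample-weighted average of client parameters. Plain lists stand in for real model weights; the setup is illustrative and not drawn from any of the reviewed studies:

```python
def fed_avg(client_params: list[list[float]], client_sizes: list[int]) -> list[float]:
    """Weighted average of client parameter vectors, weighted by the
    number of local samples each client trained on (FedAvg aggregation).
    Raw data never leaves the clients; only parameters are shared."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [
        sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients: the one with 3x the data pulls the average toward its weights.
global_params = fed_avg([[1.0, 0.0], [0.0, 1.0]], [3, 1])
```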

20 pages, 3345 KB  
Article
Secure Fog Computing for Remote Health Monitoring with Data Prioritisation and AI-Based Anomaly Detection
by Kiran Fahd, Sazia Parvin, Antony Di Serio and Sitalakshmi Venkatraman
Sensors 2025, 25(23), 7329; https://doi.org/10.3390/s25237329 - 2 Dec 2025
Viewed by 875
Abstract
Smart remote health monitoring requires time-critical medical data of patients from IoT-enabled cyber–physical systems (CPSs) to be securely transmitted and analysed in real time for early interventions and personalised patient care. Existing cloud architectures are insufficient for smart health systems due to their inherent issues with latency, bandwidth, and privacy. Fog architectures, which store data closer to edge devices, introduce challenges in data management, security, and privacy for effective monitoring of a patient’s sensitive and critical health data. These gaps found in the literature form the main research focus of this study. As an initial step to advance this research, we propose an innovative fog-based framework, the first of its kind to integrate secure communication and intelligent data prioritisation (IDP) with an AI-based enhanced Random Forest anomaly and threat detection model. Our experimental study to validate the model involves a simulated smart healthcare scenario with synthesised health data streams from distributed wearable devices. Features such as heart rate, SpO2, and breathing rate are dynamically prioritised using AI strategies and rule-based thresholds so that urgent health anomalies are transmitted securely in real time to support clinicians and medical experts in personalised early interventions. We establish a successful proof-of-concept implementation of our framework, achieving 93.5% accuracy, 90.8% precision, 88.7% recall, and an 89.7% F1-score.
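The rule-based threshold side of the intelligent data prioritisation (IDP) can be sketched as follows. The thresholds below are invented for illustration and are not clinically validated, nor are they the paper's actual rules or its AI strategy:

```python
def triage_priority(heart_rate: float, spo2: float, breathing_rate: float) -> str:
    """Classify a vital-sign sample as 'urgent' or 'routine' using
    illustrative thresholds (made-up values, not clinical guidance)."""
    if spo2 < 90 or heart_rate > 130 or heart_rate < 40 or breathing_rate > 30:
        return "urgent"
    return "routine"

# An 'urgent' sample would be transmitted immediately over the secure channel;
# 'routine' samples can be batched to save fog-to-cloud bandwidth.
```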

33 pages, 5209 KB  
Review
Integrated Photonics for IoT, RoF, and Distributed Fog–Cloud Computing: A Comprehensive Review
by Gerardo Antonio Castañón Ávila, Walter Cerroni and Ana Maria Sarmiento-Moncada
Appl. Sci. 2025, 15(13), 7494; https://doi.org/10.3390/app15137494 - 3 Jul 2025
Cited by 5 | Viewed by 5910
Abstract
Integrated photonics is a transformative technology for enhancing communication and computation in Cloud and Fog computing networks. Photonic integrated circuits (PICs) enable significant improvements in data-processing speed, energy efficiency, scalability, and latency. In Cloud infrastructures, PICs support high-speed optical interconnects, energy-efficient switching, and compact wavelength division multiplexing (WDM), addressing growing data demands. Fog computing, with its edge-focused processing and analytics, benefits from the compactness and low latency of integrated photonics for real-time signal processing, sensing, and secure data transmission near IoT devices. PICs also facilitate the low-loss, high-speed modulation, transmission, and detection of RF signals in scalable Radio-over-Fiber (RoF) links, enabling seamless IoT integration with Cloud and Fog networks. This results in centralized processing, reduced latency, and efficient bandwidth use across distributed infrastructures. Overall, integrating photonic technologies into RoF, Fog, and Cloud computing networks paves the way for ultra-efficient, flexible, and scalable next-generation network architectures capable of supporting diverse real-time and high-bandwidth applications. This paper provides a comprehensive review of the current state and emerging trends in integrated photonics for IoT sensors, RoF, Fog, and Cloud computing systems. It also outlines open research opportunities in photonic devices and system-level integration, aimed at advancing performance, energy efficiency, and scalability in next-generation distributed computing networks.
(This article belongs to the Special Issue New Trends in Next-Generation Optical Networks)

30 pages, 4009 KB  
Article
Secure Data Transmission Using GS3 in an Armed Surveillance System
by Francisco Alcaraz-Velasco, José M. Palomares, Fernando León-García and Joaquín Olivares
Information 2025, 16(7), 527; https://doi.org/10.3390/info16070527 - 23 Jun 2025
Viewed by 1025
Abstract
Nowadays, the evolution and growth of machine learning (ML) algorithms and the Internet of Things (IoT) are enabling new applications; smart weapon and people detection systems are examples. Firstly, this work takes advantage of an efficient, scalable, and distributed system, named SmartFog, which identifies people with weapons by leveraging the edge, fog, and cloud computing paradigms. Nevertheless, security vulnerabilities during data transmission are not addressed by that system. Thus, this work bridges the gap by proposing a secure data transmission system integrating a lightweight security scheme named GS3. The main novelty is therefore the evaluation of the GS3 proposal in a real environment. In the first fog sublayer, GS3 leads to a 14% increase in execution time with respect to no secure data transmission, whereas AES results in a 34.5% longer execution time. GS3 achieves a 70% reduction in decipher time and a 55% reduction in cipher time compared to the AES algorithm. Furthermore, an energy consumption analysis shows that GS3 consumes 31% less power than AES. The security analysis confirms that GS3 detects tampering, replaying, forwarding, and forgery attacks. Moreover, GS3 has a key space of 2^544 permutations, slightly larger than those of Chacha20 and Salsa20, while being faster than these methods. In addition, GS3 exhibits strength against differential cryptanalysis. This mechanism is a compelling choice for energy-constrained environments and for securing event data transmissions with a short validity period. Moreover, GS3 maintains full architectural transparency with the underlying armed detection system.

41 pages, 4206 KB  
Systematic Review
A Systematic Literature Review on Load-Balancing Techniques in Fog Computing: Architectures, Strategies, and Emerging Trends
by Danah Aldossary, Ezaz Aldahasi, Taghreed Balharith and Tarek Helmy
Computers 2025, 14(6), 217; https://doi.org/10.3390/computers14060217 - 2 Jun 2025
Cited by 5 | Viewed by 3954
Abstract
Fog computing has emerged as a promising paradigm to extend cloud services toward the edge of the network, enabling low-latency processing and real-time responsiveness for Internet of Things (IoT) applications. However, the distributed, heterogeneous, and resource-constrained nature of fog environments introduces significant challenges in balancing workloads efficiently. This study presents a systematic literature review (SLR) of 113 peer-reviewed articles published between 2020 and 2024, aiming to provide a comprehensive overview of load-balancing strategies in fog computing. This review categorizes fog computing architectures, load-balancing algorithms, scheduling and offloading techniques, fault-tolerance mechanisms, security models, and evaluation metrics. The analysis reveals that three-layer (IoT–Fog–Cloud) architectures remain predominant, with dynamic clustering and virtualization commonly employed to enhance adaptability. Heuristic and hybrid load-balancing approaches are most widely adopted due to their scalability and flexibility. Evaluation frequently centers on latency, energy consumption, and resource utilization, while simulation is primarily conducted using tools such as iFogSim and YAFS. Despite considerable progress, key challenges persist, including workload diversity, security enforcement, and real-time decision-making under dynamic conditions. Emerging trends highlight the growing use of artificial intelligence, software-defined networking, and blockchain to support intelligent, secure, and autonomous load balancing. This review synthesizes current research directions, identifies critical gaps, and offers recommendations for designing efficient and resilient fog-based load-balancing systems.
(This article belongs to the Special Issue Edge and Fog Computing for Internet of Things Systems (2nd Edition))
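As a toy illustration of the heuristic strategies this review categorizes, a greedy least-loaded assignment can be sketched as follows. Node names and integer cost units are invented for the example:

```python
def assign_tasks(nodes: dict[str, int], task_costs: list[int]) -> dict[str, list[int]]:
    """Greedy least-loaded heuristic: route each task to the fog node with
    the lowest current load, then add the task's cost to that node's load.
    `nodes` maps a node name to its current load estimate (cost units)."""
    load = dict(nodes)
    plan: dict[str, list[int]] = {name: [] for name in nodes}
    for cost in task_costs:
        target = min(load, key=load.get)  # least-loaded node (ties: first seen)
        plan[target].append(cost)
        load[target] += cost
    return plan

plan = assign_tasks({"fog-a": 2, "fog-b": 5}, [4, 1, 3])
```

Real fog schedulers add the dimensions the review surveys (latency, energy, fault tolerance); this sketch only shows the greedy core.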

18 pages, 4079 KB  
Article
A Scalable Hybrid Autoencoder–Extreme Learning Machine Framework for Adaptive Intrusion Detection in High-Dimensional Networks
by Anubhav Kumar, Rajamani Radhakrishnan, Mani Sumithra, Prabu Kaliyaperumal, Balamurugan Balusamy and Francesco Benedetto
Future Internet 2025, 17(5), 221; https://doi.org/10.3390/fi17050221 - 15 May 2025
Cited by 5 | Viewed by 2171
Abstract
The rapid expansion of network environments has introduced significant cybersecurity challenges, particularly in handling high-dimensional traffic and detecting sophisticated threats. This study presents a novel, scalable Hybrid Autoencoder–Extreme Learning Machine (AE–ELM) framework for Intrusion Detection Systems (IDS), specifically designed to operate effectively in dynamic, cloud-supported IoT environments. The scientific novelty lies in the integration of an Autoencoder for deep feature compression with an Extreme Learning Machine for rapid and accurate classification, enhanced through adaptive thresholding techniques. Evaluated on the CSE-CIC-IDS2018 dataset, the proposed method demonstrates a high detection accuracy of 98.52%, outperforming conventional models in terms of precision, recall, and scalability. Additionally, the framework exhibits strong adaptability to emerging threats and reduced computational overhead, making it a practical solution for real-time, scalable IDS in next-generation network infrastructures.
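The adaptive thresholding mentioned above can be illustrated with a standard mean-plus-k-sigma rule over reconstruction errors. This is a minimal sketch, not the paper's actual procedure; `k` is an invented sensitivity parameter:

```python
import statistics

def adaptive_threshold(errors: list[float], k: float = 3.0) -> float:
    """Anomaly cutoff: mean + k standard deviations of the autoencoder
    reconstruction errors observed on recent benign traffic. Recomputing
    it on a sliding window lets the cutoff adapt as traffic drifts."""
    return statistics.mean(errors) + k * statistics.pstdev(errors)

benign_errors = [0.10, 0.12, 0.11, 0.09, 0.13]
cutoff = adaptive_threshold(benign_errors)
is_anomalous = 0.45 > cutoff  # a large reconstruction error exceeds the cutoff
```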

34 pages, 8526 KB  
Article
Zero-Trust Mechanisms for Securing Distributed Edge and Fog Computing in 6G Networks
by Abdulrahman K. Alnaim and Ahmed M. Alwakeel
Mathematics 2025, 13(8), 1239; https://doi.org/10.3390/math13081239 - 9 Apr 2025
Cited by 5 | Viewed by 2994
Abstract
The rapid advancement in 6G networks, driven by the proliferation of distributed edge and fog computing, has introduced unprecedented challenges in securing these decentralized architectures. Traditional security paradigms are inadequate for protecting the dynamic and heterogeneous environments of 6G-enabled systems. In this context, we propose ZTF-6G (Zero-Trust Framework for 6G Networks), a novel model that integrates Zero-Trust principles to secure distributed edge and fog computing environments. ZTF-6G ensures robust security by adopting a “never trust, always verify” approach, comprising adaptive authentication, continuous verification, and fine-grained access control for all entities within the network. The proposed framework makes use of Zero-Trust-based multi-layering that extends to AI-driven anomaly detection and blockchain-based identity management for the authentication and real-time monitoring of network interactions. Simulation results indicate that ZTF-6G is able to reduce latency by 77.6% (to 2.8 ms, compared to the standard models’ 12.5 ms), improve throughput by 70%, and improve resource utilization by 41.5% (to 90% utilization). Additionally, the trust score accuracy increased from 95% to 98%, energy efficiency improved by 22.2% (from 88% to 110% efficiency), and threat detection accuracy increased to 98%. Finally, the framework mitigated insider threats by 85% and enforced dynamic policy within 1.8 ms. ZTF-6G maintained low latency while providing greater resilience to insider threats, unauthorized access, and data breaches, as required by 6G networks. This research aims to lay a foundation for deploying Zero-Trust as an integral part of next-generation networks, which will face the security challenges of distributed systems driven by 6G.
(This article belongs to the Special Issue Advanced Computational Intelligence in Cloud/Edge Computing)
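The reported latency figure can be sanity-checked with simple arithmetic: a 12.5 ms baseline reduced to 2.8 ms is a 77.6% relative reduction, matching the abstract's claim. A minimal sketch:

```python
def percent_reduction(baseline: float, improved: float) -> float:
    """Relative reduction of `improved` versus `baseline`, in percent."""
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    return (baseline - improved) / baseline * 100.0

latency_drop = percent_reduction(12.5, 2.8)  # matches the reported 77.6%
```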

24 pages, 3496 KB  
Article
What Is the Best Solution for Smart Buildings? A Case Study of Fog, Edge Computing and Smart IoT Devices
by Mauro Chiozzotto and Miguel Arjona Ramírez
Appl. Sci. 2025, 15(7), 3805; https://doi.org/10.3390/app15073805 - 31 Mar 2025
Cited by 2 | Viewed by 3239
Abstract
This paper presents a case study of Fog Computing, Edge Computing (EC) and Intelligent EC applied to Smart Buildings, focusing on the deployment of innovative services and smart IoT devices and discussing new architectures such as Software-Defined Networking (SDN). Specifically, a comprehensive solution for a Smart Building case is proposed to validate the main statements, and conclusions are drawn, providing a general guideline for choosing between Edge and Fog Computing and the appropriate category of IoT devices. The methodology employed in this study is based on field research conducted in buildings within the metropolitan region of São Paulo, Brazil, that aim to become Smart Buildings (SBs). Moreover, principles of Electronic Systems Engineering and Cloud Computing, such as reliability, scalability and security, are applied. In this way, the study integrates advanced multimedia technical services to enhance security and communication within the SBs through centralized control. The method focuses on identifying and analyzing the most common problems observed in field research within SBs in early stages of development, prior to the intensive implementation of state-of-the-art IoT devices and Fog or Edge Computing technologies. The research adopts a comparative approach, investigating the best solutions for each application category. The results are consolidated in a main table within the article, correlating solutions to the four main problems identified in the field research: impairments in voice over IP and video communication using IoT devices; latency and delays in communication between SBs and the Cloud center; access security issues; and the Quality of Experience of video over IP communication, both in live transmissions and recordings between SBs. Regarding applications, this study considers the use of specific IoT devices and Cloud Computing architectures, such as Fog or IEC. Furthermore, it explores the implementation of new open network and communication models, such as SDN and NFV, to optimize communication between the various SBs and each SB’s connection to the control center of a Smart City.
(This article belongs to the Section Electrical, Electronics and Communications Engineering)

38 pages, 3173 KB  
Review
SDN-Enabled IoT Security Frameworks—A Review of Existing Challenges
by Sandipan Rakeshkumar Mishra, Bharanidharan Shanmugam, Kheng Cher Yeo and Suresh Thennadil
Technologies 2025, 13(3), 121; https://doi.org/10.3390/technologies13030121 - 18 Mar 2025
Cited by 11 | Viewed by 9714
Abstract
This comprehensive systematic review examines the integration of software-defined networking (SDN) with IoT security frameworks, analyzing recent advancements in encryption, authentication, access control techniques, and intrusion detection systems. Our analysis reveals that while SDN demonstrates promising capabilities in enhancing IoT security through centralized control and dynamic policy enforcement, several critical limitations persist, particularly in scalability and real-world validation. As intrusion detection represents an integral security requirement for robust IoT frameworks, we conduct an in-depth evaluation of Machine Learning (ML) and Deep Learning (DL) techniques that have emerged as predominant approaches for threat detection in SDN-enabled IoT environments. The review categorizes and analyzes these ML/DL implementations across various architectural paradigms, identifying patterns in their effectiveness for different security contexts. Furthermore, recognizing that the performance of these ML/DL models critically depends on training data quality, we evaluate existing IoT security datasets, identifying significant gaps in representing contemporary attack vectors and realistic IoT environments. A key finding indicates that hybrid architectures integrating cloud–edge–fog computing demonstrate superior performance in distributing security workloads compared to single-tier implementations. Based on this systematic analysis, we propose key future research directions, including adaptive zero-trust architectures, federated machine learning for distributed security, and comprehensive dataset creation methodologies that address current limitations in IoT security research. Full article
(This article belongs to the Special Issue IoT-Enabling Technologies and Applications)
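The centralized control and dynamic policy enforcement that such SDN frameworks provide can be sketched in a few lines; the controller class, device identifiers, and anomaly-score threshold below are hypothetical illustrations, not an API from any surveyed work:

```python
# Minimal sketch of SDN-style dynamic policy enforcement for IoT security:
# a central controller installs drop rules for devices whose anomaly score
# (e.g., from an ML/DL intrusion detector) exceeds a threshold.

class SdnController:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.flow_table: dict[str, str] = {}   # device ID -> action

    def report_score(self, device: str, anomaly_score: float) -> str:
        """Called by the IDS per device; updates the flow table dynamically."""
        action = "drop" if anomaly_score >= self.threshold else "forward"
        self.flow_table[device] = action
        return action

ctrl = SdnController()
ctrl.report_score("aa:bb:cc:01", 0.95)   # flagged device is isolated
ctrl.report_score("aa:bb:cc:02", 0.10)   # benign device keeps forwarding
print(ctrl.flow_table)
```

The point of the sketch is the separation the review emphasizes: detection (the score) can run at edge or fog nodes, while enforcement stays centralized in the controller's flow table.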

16 pages, 4047 KB  
Article
A High-Performance and Lightweight Maritime Target Detection Algorithm
by Shidan Sun, Zhiping Xu, Xiaochun Cao, Jiachun Zheng, Jiawen Yang and Ni Jin
Remote Sens. 2025, 17(6), 1012; https://doi.org/10.3390/rs17061012 - 13 Mar 2025
Cited by 7 | Viewed by 2153
Abstract
Maritime surveillance video (MSV) target detection systems are important for maritime security and the ocean economy. Hindered by many complex factors, existing MSV target detection systems suffer from low detection accuracy. These factors include target distance, potential occlusion by rain and fog, and the limited computing power of edge devices. To overcome them, a high-performance and lightweight maritime target detection algorithm (HPMTD) is proposed in this paper. HPMTD consists of three modules: feature extraction, shallow feature progressive fusion (SFPF), and a multi-scale sensing head. In the feature extraction module, a global coordinate attention-optimized offset regression module is proposed for deformable convolution, enhancing the ability to handle low visibility and target occlusion. In the SFPF module, ghost dynamic convolution combined with low-cost adaptive spatial feature fusion is proposed, realizing a lightweight design while increasing multi-scale target detection capacity. Furthermore, a multi-scale sensing head is incorporated to learn and fuse scale features more effectively, improving localization accuracy. To evaluate the proposed algorithm, the Singapore Maritime Dataset is adopted in our experiments. The experimental results show that the proposed algorithm achieves a nearly 10 percent improvement in mean average precision with nearly half the model size compared with its counterparts. Furthermore, the proposed algorithm runs three times faster with only half the computation resources and maintains nearly the same accuracy on the sea surface under low visibility. These results demonstrate that HPMTD achieves lightweight and high-precision detection of maritime targets. Full article
(This article belongs to the Section Ocean Remote Sensing)
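The adaptive spatial feature fusion used in the SFPF module can be illustrated, in a greatly simplified pure-Python form, as a softmax-weighted combination of per-scale features; this sketch omits the paper's ghost dynamic convolution and learned weights entirely and is not the authors' implementation:

```python
import math

# Simplified illustration of adaptive spatial feature fusion: values from
# several scales are combined with softmax-normalized weights, so the fused
# value is a convex combination of the inputs. In the real module, the
# weight logits are learned per spatial location by lightweight convolutions.

def softmax(xs):
    m = max(xs)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def fuse(features, logits):
    """Fuse per-scale feature values with softmax weights."""
    w = softmax(logits)
    return sum(wi * fi for wi, fi in zip(w, features))

# Equal logits give a plain average; a large logit lets one scale dominate.
print(fuse([1.0, 2.0, 3.0], [0.0, 0.0, 0.0]))  # 2.0 (average)
```

Because the weights sum to one, fusion never amplifies feature magnitudes, which is one reason this style of multi-scale fusion stays cheap and stable on edge hardware.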

31 pages, 616 KB  
Review
Fog Service Placement Optimization: A Survey of State-of-the-Art Strategies and Techniques
by Hemant Kumar Apat, Veena Goswami, Bibhudatta Sahoo, Rabindra K. Barik and Manob Jyoti Saikia
Computers 2025, 14(3), 99; https://doi.org/10.3390/computers14030099 - 11 Mar 2025
Cited by 11 | Viewed by 4153
Abstract
The rapid proliferation of Internet of Things (IoT) devices in various smart city applications, such as healthcare, traffic management systems, environment sensing systems, and public safety systems, produces large volumes of data. Processing these data requires substantial computing and storage resources for smooth implementation and execution. While centralized cloud computing offers scalability, flexibility, and resource sharing, it faces significant limitations in IoT-based applications, especially in terms of latency, bandwidth, security, and cost. The fog computing paradigm complements existing cloud computing services at the edge of the network, facilitating various services without sending the data to a centralized cloud server. By processing the data in the fog, the delay requirements of various time-sensitive IoT services can be satisfied. However, many resource-intensive IoT systems exist that require substantial computing resources for their processing. In such scenarios, finding the optimal computing node for processing and executing a service is a challenge. The optimal placement of various IoT application services in heterogeneous fog computing environments is a well-known NP-complete problem. To solve it, various authors have proposed different algorithms, such as randomized, heuristic, metaheuristic, machine learning, and graph-based algorithms, for finding the optimal placement. In the present survey, we first describe the fundamental and mathematical aspects of the three-layer IoT–fog–cloud computing model. Then, we classify the IoT application model based on different attributes that help to find the optimal computing node. Furthermore, we discuss the complexity analysis of the service placement problem in detail. Finally, we provide a comprehensive evaluation of both single-objective and multi-objective IoT service placement strategies in fog computing.
Additionally, we highlight new challenges and identify promising directions for future research, specifically in the context of multi-objective IoT service optimization. Full article
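The flavor of the heuristic placement strategies such surveys classify can be conveyed with a toy greedy placer; the node latencies, capacities, service names, and deadlines below are invented for illustration, and real formulations of this NP-complete problem are far richer:

```python
# Toy greedy heuristic for the IoT service placement problem: place each
# service (most latency-sensitive first) on the first tier that meets its
# deadline and still has capacity. All numbers are illustrative only.

NODES = [  # (name, round-trip latency in ms, remaining capacity in CPU units)
    {"name": "fog-1", "latency": 10, "capacity": 4},
    {"name": "cloud", "latency": 120, "capacity": 100},
]

def place(services):
    """services: list of (name, deadline_ms, cpu_demand); returns a mapping."""
    placement = {}
    for name, deadline, cpu in sorted(services, key=lambda s: s[1]):
        for node in NODES:
            if node["latency"] <= deadline and node["capacity"] >= cpu:
                node["capacity"] -= cpu      # reserve the resources
                placement[name] = node["name"]
                break
        else:
            placement[name] = None           # infeasible with current resources
    return placement

result = place([("ecg-alarm", 20, 2), ("traffic-cam", 200, 3), ("backup", 500, 8)])
print(result)  # only the tight-deadline service lands on the small fog node
```

Sorting by deadline is one simple priority rule; the surveyed metaheuristic and ML approaches effectively search over far larger spaces of such orderings and assignments under multiple objectives.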
