Future Internet, Volume 17, Issue 5 (May 2025) – 40 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
13 pages, 2276 KiB  
Article
Trajectory Optimization for UAV-Aided IoT Secure Communication Against Multiple Eavesdroppers
by Lingfeng Shen, Jiangtao Nie, Ming Li, Guanghui Wang, Qiankun Zhang and Xin He
Future Internet 2025, 17(5), 225; https://doi.org/10.3390/fi17050225 - 19 May 2025
Abstract
This study concentrates on physical layer security (PLS) in UAV-aided Internet of Things (IoT) networks and proposes an innovative approach to enhance security by optimizing the trajectory of unmanned aerial vehicles (UAVs). In an IoT system with multiple eavesdroppers, formulating the optimal UAV trajectory poses a non-convex and non-differentiable optimization challenge. The paper utilizes the successive convex approximation (SCA) method in conjunction with hypograph theory to address this challenge. First, a set of trajectory increment variables is introduced to replace the original UAV trajectory coordinates, thereby converting the original non-convex problem into a sequence of convex subproblems. Subsequently, hypograph theory is employed to convert these non-differentiable subproblems into standard convex forms, which can be solved using the CVX toolbox. Simulation results demonstrate the UAV’s trajectory fluctuations under different parameters, affirming that trajectory optimization significantly improves PLS performance in IoT systems.
(This article belongs to the Section Internet of Things)

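As a hedged illustration of the successive convex approximation (SCA) step named in the abstract above, the sketch below applies the same linearize-and-solve pattern to a toy difference-of-convex problem rather than the paper's secrecy-rate and trajectory model; cvxpy stands in for the CVX toolbox, and the matrix A, vector b, and weight lam are invented.

```python
# Minimal SCA sketch on a toy difference-of-convex problem (NOT the paper's model):
# minimize ||Ax - b||^2 - lam*||x||^2 by linearizing the concave term at x_k.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((8, 4)), rng.standard_normal(8), 0.5

x_k = np.zeros(4)                      # initial feasible point
for _ in range(20):
    x = cp.Variable(4)
    # Convex surrogate: keep ||Ax-b||^2, replace -lam*||x||^2 by its linearization at x_k
    surrogate = cp.sum_squares(A @ x - b) - 2 * lam * x_k @ x + lam * x_k @ x_k
    prob = cp.Problem(cp.Minimize(surrogate), [cp.norm(x, "inf") <= 1.0])
    prob.solve()
    if np.linalg.norm(x.value - x_k) < 1e-6:   # stop when iterates stabilize
        break
    x_k = x.value

print("SCA solution:", np.round(x_k, 3))
```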
27 pages, 3548 KiB  
Article
Research on Advancing Radio Wave Source Localization Technology Through UAV Path Optimization
by Tomoroh Takahashi and Gia Khanh Tran
Future Internet 2025, 17(5), 224; https://doi.org/10.3390/fi17050224 - 16 May 2025
Abstract
With an increasing number of illegal radio stations, connected cars, and IoT devices, high-accuracy radio source localization techniques are in demand. Traditional methods such as GPS positioning and triangulation suffer from accuracy degradation in NLOS (non-line-of-sight) environments due to obstructions. In contrast, the fingerprinting method builds a database of pre-collected radio information and estimates the source location via pattern matching, maintaining relatively high accuracy in NLOS environments. This study aims to improve the accuracy of fingerprinting-based localization by optimizing UAV flight paths. Previous research mainly relied on RSSI-based localization, but we introduce an AOA model considering AOA (angle of arrival) and EOA (elevation of arrival), as well as a HYBRID model that integrates multiple radio features with weighting. Using Wireless InSite, we conducted ray-tracing simulations based on the Institute of Science Tokyo’s Ookayama campus and optimized UAV flight paths with PSO (Particle Swarm Optimization). Results show that the HYBRID model achieved the highest accuracy, limiting the maximum error to 20 m. Sequential estimation improved accuracy for high-error sources, particularly when RSSI was used first, followed by AOA or HYBRID. Future work includes estimating unknown frequency sources, refining sequential estimation, and implementing cooperative localization.

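The abstract above relies on Particle Swarm Optimization (PSO); the minimal sketch below shows the generic PSO update loop on an invented 2-D objective, which a path-planning use would replace with a localization-error metric. The swarm size, inertia, and acceleration constants are illustrative assumptions.

```python
# Generic Particle Swarm Optimization (PSO) loop on a toy 2-D objective.
import numpy as np

def objective(p):                       # toy objective: squared distance to (3, -2)
    return np.sum((p - np.array([3.0, -2.0])) ** 2, axis=-1)

rng = np.random.default_rng(1)
n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5
pos = rng.uniform(-10, 10, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best waypoint found:", np.round(gbest, 3))
```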
21 pages, 9744 KiB  
Article
Real-Time Identification of Look-Alike Medical Vials Using Mixed Reality-Enabled Deep Learning
by Bahar Uddin Mahmud, Guanyue Hong, Virinchi Ravindrakumar Lalwani, Nicholas Brown and Zachary D. Asher
Future Internet 2025, 17(5), 223; https://doi.org/10.3390/fi17050223 - 16 May 2025
Abstract
The accurate identification of look-alike medical vials is essential for patient safety, particularly when similar vials contain different substances, volumes, or concentrations. Traditional methods, such as manual selection or barcode-based identification, are prone to human error or face reliability issues under varying lighting conditions. This study addresses these challenges by introducing a real-time deep learning-based vial identification system, leveraging a Lightweight YOLOv4 model optimized for edge devices. The system is integrated into a Mixed Reality (MR) environment, enabling the real-time detection and annotation of vials with immediate operator feedback. Compared to standard barcode-based methods and the baseline YOLOv4-Tiny model, the proposed approach improves identification accuracy while maintaining low computational overhead. The experimental evaluations demonstrate a mean average precision (mAP) of 98.76 percent, with an inference speed of 68 milliseconds per frame on HoloLens 2, achieving real-time performance. The results highlight the model’s robustness in diverse lighting conditions and its ability to mitigate misclassifications of visually similar vials. By combining deep learning with MR, this system offers a more reliable and efficient alternative for pharmaceutical and medical applications, paving the way for AI-driven MR-assisted workflows in critical healthcare environments.
(This article belongs to the Special Issue Smart Technology: Artificial Intelligence, Robotics and Algorithms)

16 pages, 10369 KiB  
Article
A Portable Non-Motorized Smart IoT Weather Station Platform for Urban Thermal Comfort Studies
by Raju Sethupatu Bala, Salaheddin Hosseinzadeh, Farhad Sadeghineko, Craig Scott Thomson and Rohinton Emmanuel
Future Internet 2025, 17(5), 222; https://doi.org/10.3390/fi17050222 - 15 May 2025
Abstract
Smart cities are widely regarded as a promising solution to urbanization challenges; however, environmental aspects such as outdoor thermal comfort and urban heat island are often less addressed than social and economic dimensions of sustainability. To address this gap, we developed and evaluated an affordable, scalable weather station platform, consisting of a centralized server and portable edge devices to facilitate urban heat island and outdoor thermal comfort studies. This edge device is designed in accordance with the ISO 7726 (1998) standards and further enhanced with a positioning system. The device can regularly log parameters such as air temperature, relative humidity, globe temperature, wind speed, and geographical coordinates. Strategic selection of components allowed for a low-cost device that can perform data manipulation and pre-processing, store data, and exchange data with a centralized server via the internet. The centralized server facilitates scalability, processing, storage, and live monitoring of data acquisition processes. The edge devices’ electrical and shielding design was evaluated against a commercial weather station, showing Mean Absolute Error and Root Mean Square Error values of 0.1 and 0.33, respectively, for air temperature. Further, empirical test campaigns were conducted under two scenarios: “stop-and-go” and “on-the-move”. These tests provided insight into the transition and response times required for urban heat island and thermal comfort studies, and evaluated the platform’s overall performance, validating it for nuanced human-scale thermal comfort, urban heat island, and bio-meteorological studies.
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems)

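A brief sketch of how the reported MAE and RMSE against a commercial reference station could be computed; the temperature arrays below are placeholders, not the authors' measurements.

```python
# MAE and RMSE of the edge device's air-temperature readings vs. a co-located reference.
import numpy as np

edge_temp = np.array([21.3, 21.5, 21.9, 22.4, 22.8])   # device readings (degC), illustrative
reference = np.array([21.4, 21.6, 21.8, 22.3, 23.2])   # commercial station (degC), illustrative

err = edge_temp - reference
mae = np.mean(np.abs(err))
rmse = np.sqrt(np.mean(err ** 2))
print(f"MAE = {mae:.2f} degC, RMSE = {rmse:.2f} degC")
```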
18 pages, 4079 KiB  
Article
A Scalable Hybrid Autoencoder–Extreme Learning Machine Framework for Adaptive Intrusion Detection in High-Dimensional Networks
by Anubhav Kumar, Rajamani Radhakrishnan, Mani Sumithra, Prabu Kaliyaperumal, Balamurugan Balusamy and Francesco Benedetto
Future Internet 2025, 17(5), 221; https://doi.org/10.3390/fi17050221 - 15 May 2025
Abstract
The rapid expansion of network environments has introduced significant cybersecurity challenges, particularly in handling high-dimensional traffic and detecting sophisticated threats. This study presents a novel, scalable Hybrid Autoencoder–Extreme Learning Machine (AE–ELM) framework for Intrusion Detection Systems (IDS), specifically designed to operate effectively in dynamic, cloud-supported IoT environments. The scientific novelty lies in the integration of an Autoencoder for deep feature compression with an Extreme Learning Machine for rapid and accurate classification, enhanced through adaptive thresholding techniques. Evaluated on the CSE-CIC-IDS2018 dataset, the proposed method demonstrates a high detection accuracy of 98.52%, outperforming conventional models in terms of precision, recall, and scalability. Additionally, the framework exhibits strong adaptability to emerging threats and reduced computational overhead, making it a practical solution for real-time, scalable IDS in next-generation network infrastructures.

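To make the AE-ELM split concrete, the sketch below implements only the Extreme Learning Machine stage (random hidden layer plus a closed-form ridge readout) on synthetic features that are assumed to be the autoencoder's compressed codes; it is an illustration, not the paper's implementation.

```python
# Extreme Learning Machine (ELM) stage: random hidden layer, closed-form ridge readout.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 16))                 # stand-in for AE-compressed features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # toy binary labels

n_hidden, reg = 128, 1e-2
W = rng.standard_normal((X.shape[1], n_hidden))    # random hidden weights, never trained
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)                             # hidden activations

T = np.eye(2)[y]                                   # one-hot targets
# Ridge-regularized least squares for the output weights (the only "training" step)
beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)

pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
print("training accuracy:", np.mean(pred == y))
```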
22 pages, 1034 KiB  
Article
A Novel Crowdsourcing-Assisted 5G Wireless Signal Ranging Technique in MEC Architecture
by Rui Lu, Lei Shi, Yinlong Liu and Zhongkai Dang
Future Internet 2025, 17(5), 220; https://doi.org/10.3390/fi17050220 - 14 May 2025
Abstract
In complex indoor and outdoor scenarios, traditional GPS-based ranging technology faces limitations in availability due to signal occlusion and user privacy issues. Wireless signal ranging technology based on 5G base stations has emerged as a potential alternative. However, existing methods are limited by low efficiency in constructing static signal databases, poor environmental adaptability, and high resource overhead, restricting their practical application. This paper proposes a 5G wireless signal ranging framework that integrates mobile edge computing (MEC) and crowdsourced intelligence to systematically address the aforementioned issues. This study designs a progressive solution by (1) building a crowdsourced data collection network, using mobile terminals equipped with GPS technology to automatically collect device signal features, replacing inefficient manual drive tests; (2) developing a progressive signal update algorithm that integrates real-time crowdsourced data and historical signals to optimize the signal fingerprint database in dynamic environments; (3) establishing an edge service architecture to offload signal matching and trajectory estimation tasks to MEC nodes, using lightweight computing engines to reduce the load on the core network. Experimental results demonstrate a mean positioning error of 5 m, with 95% of devices achieving errors within 10 m, as well as building and floor prediction error rates of 0.5% and 1%, respectively. The proposed framework outperforms traditional static methods by 3× in ranging accuracy while maintaining computational efficiency, achieving significant improvements in environmental adaptability and service scalability.

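One plausible reading of the "progressive signal update" idea is an exponential blend of fresh crowdsourced samples with stored values, sketched below; the keying by cell and grid position and the weight alpha are assumptions, not the paper's algorithm.

```python
# Blend newly crowdsourced RSRP samples for a grid cell with the stored value (EMA).
from collections import defaultdict

fingerprints = defaultdict(lambda: None)   # (cell_id, grid_xy) -> stored RSRP (dBm)

def update_fingerprint(key, new_samples, alpha=0.3):
    """Blend the mean of fresh crowdsourced samples into the stored historical value."""
    fresh = sum(new_samples) / len(new_samples)
    old = fingerprints[key]
    fingerprints[key] = fresh if old is None else (1 - alpha) * old + alpha * fresh
    return fingerprints[key]

update_fingerprint(("gNB-12", (40, 7)), [-92.0, -90.5, -93.1])
print(update_fingerprint(("gNB-12", (40, 7)), [-88.0, -89.5]))
```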
2 pages, 130 KiB  
Correction
Correction: Kalodanis et al. High-Risk AI Systems—Lie Detection Application. Future Internet 2025, 17, 26
by Konstantinos Kalodanis, Panagiotis Rizomiliotis, Georgios Feretzakis, Charalampos Papapavlou and Dimosthenis Anagnostopoulos
Future Internet 2025, 17(5), 219; https://doi.org/10.3390/fi17050219 - 14 May 2025
Abstract
In the original publication [...]
23 pages, 1875 KiB  
Article
U-SCAD: An Unsupervised Method of System Call-Driven Anomaly Detection for Containerized Edge Clouds
by Jiawei Ye, Ming Yan, Shenglin Wu, Jingxuan Tan and Jie Wu
Future Internet 2025, 17(5), 218; https://doi.org/10.3390/fi17050218 - 14 May 2025
Abstract
Container technology is currently one of the mainstream technologies in the field of cloud computing, yet its adoption in resource-constrained, latency-sensitive edge environments introduces unique security challenges. While existing system call-based anomaly-detection methods partially address these issues, they suffer from high false positive rates and excessive computational overhead. To achieve security and observability in edge-native containerized environments and lower the cost of computing resources, we propose an unsupervised anomaly-detection method based on system calls. This method filters out unnecessary system call data through automatic rule generation and an unsupervised classification model. To increase the accuracy of anomaly detection and reduce false positive rates, this method embeds system calls into sequences using the proposed Syscall2vec and processes the remaining sequences to facilitate the anomaly-detection model’s analysis. We conduct experiments using our method with a background based on modern containerized cloud microservices. The results show that the detection part of our method improves the F1 score by 23.88% and 41.31%, respectively, as compared to HIDS and LSTM-VAE. Moreover, our method can effectively reduce the original processing data to 13%, which means that it significantly lowers the cost of computing resources.

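A hedged sketch of the "Syscall2vec" idea: learn skip-gram embeddings over system-call traces and represent a window of calls by the mean of its vectors. The traces below are invented, and gensim's Word2Vec stands in for the paper's embedding model.

```python
# Skip-gram embeddings over system-call traces, mean-pooled into sequence vectors.
import numpy as np
from gensim.models import Word2Vec

traces = [
    ["openat", "read", "close", "futex", "epoll_wait"],
    ["socket", "connect", "sendto", "recvfrom", "close"],
    ["openat", "mmap", "read", "read", "close"],
]

model = Word2Vec(sentences=traces, vector_size=16, window=3, min_count=1, sg=1, seed=0)

def embed_sequence(seq):
    """Mean-pool per-call vectors into one fixed-size sequence embedding."""
    return np.mean([model.wv[c] for c in seq], axis=0)

window = ["openat", "read", "close"]
print("sequence embedding shape:", embed_sequence(window).shape)
```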
22 pages, 5507 KiB  
Article
A Web-Based Application for Smart City Data Analysis and Visualization
by Panagiotis Karampakakis, Despoina Ioakeimidou, Periklis Chatzimisios and Konstantinos A. Tsintotas
Future Internet 2025, 17(5), 217; https://doi.org/10.3390/fi17050217 - 13 May 2025
Abstract
Smart cities are urban areas that use contemporary technology to improve citizens’ overall quality of life. These modern digital civil hubs aim to manage environmental conditions, traffic flow, and infrastructure through interconnected and data-driven decision-making systems. Today, many applications employ intelligent sensors for real-time data acquisition, leveraging visualization to derive actionable insights. However, despite the proliferation of such platforms, challenges like high data volume, noise, and incompleteness continue to hinder practical visual analysis. As missing data is a frequent issue when visualizing such urban sensing systems, our approach prioritizes its correction as a fundamental step. We deploy a hybrid imputation strategy combining SARIMAX, k-nearest neighbors, and random forest regression to address this. Building on this foundation, we propose an interactive web-based pipeline that processes, analyzes, and presents the sensor data provided by Basel’s “Smarte Strasse”. Our platform receives and projects environmental measurements, i.e., NO2, O3, PM2.5, and traffic noise, as well as mobility indicators such as vehicle speed and type, parking occupancy, and electric vehicle charging behavior. By resolving gaps in the data, we provide a solid foundation for high-fidelity, high-quality visual analytics. Built on the Flask web framework, the platform incorporates performance optimizations through Flask-Caching. The user dashboard supports interactive exploration via dynamic charts and spatial maps. This way, we demonstrate how future internet technologies make complex urban sensor data accessible for research, planning, and public engagement. Lastly, our open-source web-based application supports reproducible, privacy-aware urban analytics.
(This article belongs to the Section Smart System Infrastructure and Applications)

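A hedged sketch of a hybrid gap-filling step in the spirit of the SARIMAX / k-nearest-neighbors / random-forest strategy described above; the column names, gap location, and equal-weight blend are illustrative and not taken from the Basel "Smarte Strasse" schema.

```python
# Fill a short NO2 gap three ways (SARIMAX forecast, KNN imputation, RF regression) and blend.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.impute import KNNImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=240, freq="h")
no2 = 30 + 10 * np.sin(np.arange(240) * 2 * np.pi / 24) + rng.normal(0, 2, 240)
df = pd.DataFrame({"no2": no2, "noise_db": 55 + 0.3 * no2 + rng.normal(0, 1, 240)}, index=idx)
df.loc[df.index[100:106], "no2"] = np.nan            # inject a short gap

# 1) Seasonal SARIMAX forecast over the gap, fitted on the data before the gap
fit = SARIMAX(df["no2"].iloc[:100], order=(1, 0, 1), seasonal_order=(1, 0, 1, 24)).fit(disp=False)
sarimax_fill = fit.forecast(steps=6)

# 2) KNN imputation exploiting the correlated noise channel
knn_fill = pd.DataFrame(KNNImputer(n_neighbors=5).fit_transform(df),
                        columns=df.columns, index=df.index)["no2"].iloc[100:106]

# 3) Random-forest regression from the non-missing rows
mask = df["no2"].notna()
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(df.loc[mask, ["noise_db"]], df.loc[mask, "no2"])
rf_fill = rf.predict(df.loc[~mask, ["noise_db"]])

df.loc[~mask, "no2"] = (sarimax_fill.values + knn_fill.values + rf_fill) / 3   # simple blend
print(df["no2"].iloc[100:106].round(2))
```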
20 pages, 1198 KiB  
Article
Mitigating Class Imbalance in Network Intrusion Detection with Feature-Regularized GANs
by Jing Li, Wei Zong, Yang-Wai Chow and Willy Susilo
Future Internet 2025, 17(5), 216; https://doi.org/10.3390/fi17050216 - 13 May 2025
Abstract
Network Intrusion Detection Systems (NIDS) often suffer from severe class imbalance, where minority attack types are underrepresented, leading to degraded detection performance. To address this challenge, we propose a novel augmentation framework that integrates Soft Nearest Neighbor Loss (SNNL) into Generative Adversarial Networks (GANs), including WGAN, CWGAN, and WGAN-GP. Unlike traditional oversampling methods (e.g., SMOTE, ADASYN), our approach improves feature-space alignment between real and synthetic samples, enhancing classifier generalization on rare classes. Experiments on NSL-KDD, CSE-CIC-IDS2017, and CSE-CIC-IDS2018 show that SNNL-augmented GANs consistently improve minority-class F1-scores without degrading overall accuracy or majority-class performance. UMAP visualizations confirm that SNNL produces more compact and class-consistent sample distributions. We also evaluate the computational overhead, finding the added cost moderate. These results demonstrate the effectiveness and practicality of SNNL as a general enhancement for GAN-based data augmentation in imbalanced NIDS tasks.

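For readers unfamiliar with the Soft Nearest Neighbor Loss, the sketch below computes it in NumPy on synthetic features: for each sample, the negative log ratio of same-class affinity to total affinity at temperature T. How the loss is wired into the GAN objectives is specific to the paper and omitted here.

```python
# Soft Nearest Neighbor Loss (SNNL) on a batch of features with class labels.
import numpy as np

def snnl(features, labels, T=1.0):
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    affinity = np.exp(-d2 / T)
    np.fill_diagonal(affinity, 0.0)                    # exclude self-pairs
    same = (labels[:, None] == labels[None, :]).astype(float)
    num = (affinity * same).sum(axis=1)                # same-class affinity per sample
    den = affinity.sum(axis=1)                         # total affinity per sample
    return -np.mean(np.log(num / den + 1e-12))

rng = np.random.default_rng(0)
feats = rng.standard_normal((64, 8))
labs = rng.integers(0, 3, 64)
print("SNNL:", snnl(feats, labs))
```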
36 pages, 2702 KiB  
Article
Multi-Criteria Genetic Algorithm for Optimizing Distributed Computing Systems in Neural Network Synthesis
by Valeriya V. Tynchenko, Ivan Malashin, Sergei O. Kurashkin, Vadim Tynchenko, Andrei Gantimurov, Vladimir Nelyub and Aleksei Borodulin
Future Internet 2025, 17(5), 215; https://doi.org/10.3390/fi17050215 - 13 May 2025
Abstract
Artificial neural networks (ANNs) are increasingly effective in addressing complex scientific and technological challenges. However, challenges persist in synthesizing neural network models and defining their structural parameters. This study investigates the use of parallel evolutionary algorithms on distributed computing systems (DCSs) to optimize energy consumption and computational time. New mathematical models for DCS performance and reliability are proposed, based on a mass service system framework, along with a multi-criteria optimization model designed for resource-intensive computational problems. This model employs a multi-criteria genetic algorithm (GA) to generate a diverse set of Pareto-optimal solutions. Additionally, a decision-support system (DSS) is developed, incorporating the multi-criteria GA, allowing for customization of the GA and the construction of specialized ANNs for specific problem domains. The application of the DSS demonstrated performance of 1220.745 TFLOPS and an availability factor of 99.03%. These findings highlight the potential of the proposed DCS framework to enhance computational efficiency in relevant applications.
(This article belongs to the Special Issue Parallel and Distributed Systems)

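The multi-criteria GA above maintains Pareto-optimal solutions; the sketch below shows the basic non-dominated filtering step on invented (energy, time) pairs, both minimized.

```python
# Keep only configurations not dominated on (energy, time); both criteria are minimized.
def dominates(a, b):
    """a dominates b if a is no worse in every criterion and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

candidates = [(120.0, 35.0), (100.0, 50.0), (130.0, 30.0), (135.0, 40.0), (115.0, 52.0)]
print(pareto_front(candidates))   # -> [(120.0, 35.0), (100.0, 50.0), (130.0, 30.0)]
```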
16 pages, 2616 KiB  
Article
Low-Complexity Microclimate Classification in Smart Greenhouses: A Fuzzy-Neural Approach
by Cristian Bua, Francesco Fiorini, Michele Pagano, Davide Adami and Stefano Giordano
Future Internet 2025, 17(5), 214; https://doi.org/10.3390/fi17050214 - 13 May 2025
Abstract
Maintaining optimal microclimatic conditions within greenhouses represents a significant challenge in modern agricultural contexts, where prediction systems play a crucial role in regulating temperature and humidity, thereby enabling timely interventions to prevent plant diseases or adverse growth conditions. In this work, we propose a novel approach which integrates a cascaded Feed-Forward Neural Network (FFNN) with the Granular Computing paradigm to achieve accurate microclimate forecasting and reduced computational complexity. The experimental results demonstrate that our approach matches the accuracy of the FFNN-based approach while reducing complexity, making this solution particularly well suited for deployment on edge devices with limited computational capabilities. Our innovative approach has been validated using a real-world dataset collected from four greenhouses and integrated into a distributed network architecture. This setup supports the execution of predictive models both on sensors deployed within the greenhouse and at the network edge, where more computationally intensive models can be utilized to enhance decision-making accuracy.
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems)

3 pages, 144 KiB  
Editorial
Industrial Internet of Things (IIoT): Trends and Technologies
by Zhihao Liu, Franco Davoli and Davide Borsatti
Future Internet 2025, 17(5), 213; https://doi.org/10.3390/fi17050213 - 13 May 2025
Abstract
The Industrial Internet of Things (IIoT) integrates sensors, machines, and data processing in industrial facilities to enable real-time monitoring, predictive insights, and autonomous control of equipment [...]
(This article belongs to the Special Issue Industrial Internet of Things (IIoT): Trends and Technologies)
33 pages, 10361 KiB  
Article
Enhanced Propaganda Detection in Public Social Media Discussions Using a Fine-Tuned Deep Learning Model: A Diffusion of Innovation Perspective
by Pir Noman Ahmad, Adnan Muhammad Shah and KangYoon Lee
Future Internet 2025, 17(5), 212; https://doi.org/10.3390/fi17050212 - 12 May 2025
Abstract
During the COVID-19 pandemic, social media platforms emerged as both vital information sources and conduits for the rapid spread of propaganda and misinformation. However, existing studies often rely on single-label classification, lack contextual sensitivity, or use models that struggle to effectively capture nuanced propaganda cues across multiple categories. These limitations hinder the development of robust, generalizable detection systems in dynamic online environments. In this study, we propose a novel deep learning (DL) framework grounded in fine-tuning the RoBERTa model for a multi-label, multi-class (ML-MC) classification task, selecting RoBERTa due to its strong contextual representation capabilities and demonstrated superiority in complex NLP tasks. Our approach is rigorously benchmarked against traditional and neural methods, including TF-IDF with n-grams, Conditional Random Fields (CRFs), and long short-term memory (LSTM) networks. While LSTM models show strong performance in capturing sequential patterns, our RoBERTa-based model achieves the highest overall accuracy at 88%, outperforming state-of-the-art baselines. Framed within the diffusion of innovations theory, the proposed model offers clear relative advantages—including accuracy, scalability, and contextual adaptability—that support its early adoption by Information Systems researchers and practitioners. This study not only contributes a high-performing detection model but also delivers methodological and theoretical insights for combating propaganda in digital discourse, enhancing resilience in online information ecosystems.

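A minimal sketch of the multi-label RoBERTa setup the abstract describes, using Hugging Face Transformers with a BCE-with-logits head; the label names, example post, and targets are invented placeholders rather than the study's dataset or hyperparameters.

```python
# Multi-label sequence classification with RoBERTa (BCE-with-logits loss).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

labels = ["loaded_language", "flag_waving", "doubt", "appeal_to_fear"]   # illustrative labels
tok = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=len(labels),
    problem_type="multi_label_classification")

texts = ["They will stop at nothing to destroy our way of life."]        # invented example
y = torch.tensor([[1.0, 1.0, 0.0, 1.0]])                                 # multi-hot targets

batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
out = model(**batch, labels=y)                 # loss is BCEWithLogitsLoss under the hood
out.loss.backward()                            # an optimizer step would follow in training
probs = torch.sigmoid(out.logits).detach()
print(dict(zip(labels, [round(p, 2) for p in probs[0].tolist()])))
```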
27 pages, 9653 KiB  
Article
DNS over HTTPS Tunneling Detection System Based on Selected Features via Ant Colony Optimization
by Hardi Sabah Talabani, Zrar Khalid Abdul and Hardi Mohammed Mohammed Saleh
Future Internet 2025, 17(5), 211; https://doi.org/10.3390/fi17050211 - 7 May 2025
Abstract
DNS over HTTPS (DoH) is an advanced version of the traditional DNS protocol that prevents eavesdropping and man-in-the-middle attacks by encrypting queries and responses. However, it introduces new challenges such as encrypted traffic communication, masking malicious activity, tunneling attacks, and complicating intrusion detection system (IDS) packet inspection. In contrast, unencrypted packets in the traditional Non-DoH version remain vulnerable to eavesdropping, privacy breaches, and spoofing. To address these challenges, an optimized dual-path feature selection approach is designed to select the most efficient packet features for binary class (DoH-Normal, DoH-Malicious) and multiclass (Non-DoH, DoH-Normal, DoH-Malicious) classification. Ant Colony Optimization (ACO) is integrated with machine learning algorithms such as XGBoost, K-Nearest Neighbors (KNN), Random Forest (RF), and Convolutional Neural Networks (CNNs) using CIRA-CIC-DoHBrw-2020 as the benchmark dataset. Experimental results show that the proposed model selects the most effective features for both scenarios, achieving the highest detection performance and outperforming previous IDS studies. The highest accuracy obtained for binary and multiclass classifications was 0.9999 and 0.9955, respectively. The optimized feature set contributed significantly to reducing computational costs and processing time across all utilized classifiers. The results provide a robust, fast, and accurate solution to challenges associated with encrypted DNS packets.

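A simplified, hedged stand-in for ACO-driven feature selection: each "ant" samples a feature subset from a pheromone vector, subsets are scored by cross-validated KNN accuracy, and pheromone evaporates and is reinforced on the best subset so far. The dataset, classifier choice, and update constants are illustrative, not the paper's dual-path design.

```python
# Binary ant-colony feature selection with a KNN evaluation loop.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

tau = np.full(X.shape[1], 0.5)                   # pheromone per feature
best_subset, best_score = None, -np.inf
for _ in range(15):                              # iterations
    for _ in range(10):                          # ants per iteration
        mask = rng.random(X.shape[1]) < tau      # sample a subset from the pheromone
        if not mask.any():
            continue
        score = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
        if score > best_score:
            best_score, best_subset = score, mask
    tau = 0.9 * tau                              # evaporation
    tau[best_subset] += 0.1                      # reinforce best-so-far subset
    tau = np.clip(tau, 0.05, 0.95)

print("selected features:", np.flatnonzero(best_subset), "CV accuracy:", round(best_score, 3))
```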
23 pages, 682 KiB  
Article
A Blockchain-Based Strategy for Certifying Timestamps in Distributed Healthcare Emergency Response Systems
by Daniele Marletta, Alessandro Midolo and Emiliano Tramontana
Future Internet 2025, 17(5), 210; https://doi.org/10.3390/fi17050210 - 7 May 2025
Abstract
A high level of data integrity is a strong requirement in systems where people’s lives depend on accurate and timely responses. In healthcare emergency response systems, a centralized authority that handles data related to occurring events is prone to challenges such as disputes over event timestamps and data authenticity. To address both the potential lack of trust among collaborating parties and the inability of an authority to clearly certify events by itself, this paper proposes a blockchain-based framework designed to provide proof of integrity and authenticity of data in healthcare emergency response systems. The proposed solution integrates blockchain technology to certify the accuracy of events throughout their incident lifecycle. Critical events are timestamped and hashed using SHA-256; then, such hashes are stored immutably on an EVM-compatible blockchain via smart contracts. The system combines blockchain technology with cloud storage to ensure scalability, security, and transparency. Blockchain technology eliminates the need for a trusted timestamping server, reducing costs by forgoing such a service. The experimental results, using publicly available incident data, demonstrated the feasibility and effectiveness of this approach. The system provides a cost-effective, scalable solution for managing incident data while maintaining proof of their integrity. The proposed blockchain-based framework offers a reliable, transparent mechanism for certifying incident-related data. This fosters trust among healthcare emergency response system actors.
(This article belongs to the Special Issue Security and Privacy in Blockchains and the IoT—3rd Edition)

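The hashing step of the certification flow described above, sketched with Python's standard library: an incident event is canonicalized, hashed with SHA-256, and only the digest would be anchored on-chain via a smart-contract call (omitted here). Field names are invented.

```python
# Canonicalize an incident event and compute the SHA-256 digest to anchor on-chain.
import hashlib
import json
import time

event = {
    "incident_id": "INC-2025-000123",          # illustrative identifiers
    "type": "ambulance_dispatch",
    "timestamp_utc": int(time.time()),
    "payload": {"priority": "P1", "unit": "A-42"},
}

canonical = json.dumps(event, sort_keys=True, separators=(",", ":")).encode("utf-8")
digest = hashlib.sha256(canonical).hexdigest()
print("SHA-256 digest to anchor on-chain:", digest)
```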
21 pages, 2595 KiB  
Article
Adversarial Training for Mitigating Insider-Driven XAI-Based Backdoor Attacks
by R. G. Gayathri, Atul Sajjanhar and Yong Xiang
Future Internet 2025, 17(5), 209; https://doi.org/10.3390/fi17050209 - 6 May 2025
Abstract
The study investigates how adversarial training techniques can be used to introduce backdoors into deep learning models by an insider with privileged access to training data. The research demonstrates an insider-driven poison-label backdoor approach in which triggers are introduced into the training dataset. These triggers cause poisoned inputs to be misclassified while standard classification is maintained on clean data. An adversary can improve the stealth and effectiveness of such attacks by utilizing XAI techniques, which makes the detection of such attacks more difficult. The study uses publicly available datasets to evaluate the robustness of the deep learning models in this situation. Our experiments show that adversarial training considerably reduces the effectiveness of backdoor attacks. These results are verified using various performance metrics, revealing model vulnerabilities and possible countermeasures. The findings demonstrate the importance of robust training techniques and effective adversarial defenses to improve the security of deep learning models against insider-driven backdoor attacks.
(This article belongs to the Special Issue Generative Artificial Intelligence (AI) for Cybersecurity)

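To make "adversarial training" concrete, the sketch below trains a plain logistic model on FGSM-perturbed inputs; the paper applies the same principle to deep networks and backdoored datasets, so this is an illustration of the mechanism rather than the study's setup.

```python
# Adversarial training of a logistic model: perturb inputs with FGSM, then update weights.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 10))
w_true = rng.standard_normal(10)
y = (X @ w_true + 0.1 * rng.standard_normal(400) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, lr, eps = np.zeros(10), 0.1, 0.2
for _ in range(200):
    # FGSM: perturb each input in the direction that increases its loss
    grad_x = (sigmoid(X @ w) - y)[:, None] * w[None, :]
    X_adv = X + eps * np.sign(grad_x)
    # Standard adversarial-training step: fit on the perturbed batch
    grad_w = X_adv.T @ (sigmoid(X_adv @ w) - y) / len(y)
    w -= lr * grad_w

acc = np.mean((sigmoid(X @ w) > 0.5) == y.astype(bool))
print("clean accuracy after adversarial training:", round(float(acc), 3))
```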
20 pages, 1664 KiB  
Article
A Network Traffic Characteristics Reconstruction Method for Mitigating the Impact of Packet Loss in Edge Computing Scenarios
by Jiawei Ye, Yanting Chen, Aierpanjiang Simayi, Yu Liu, Zhihui Lu and Jie Wu
Future Internet 2025, 17(5), 208; https://doi.org/10.3390/fi17050208 - 5 May 2025
Abstract
This paper presents TCReC, an innovative model designed for reconstructing network traffic characteristics in the presence of packet loss. With the rapid expansion of wireless networks driven by edge computing, IoT, and 5G technologies, challenges such as transmission instability, channel competition, and environmental interference have led to significant packet loss rates, adversely impacting deep learning-based network traffic analysis tasks. To address this issue, TCReC leverages masked autoencoder techniques to reconstruct missing traffic features, ensuring reliable input for downstream tasks in edge computing scenarios. Experimental results demonstrate that TCReC maintains detection model accuracy within 10% of the original data, even under packet loss rates as high as 70%. For instance, on the ISCX-VPN-2016 dataset, TCReC achieves a Reconstruction Ability Index (RAI) of 94.02%, while on the CIC-IDS-2017 dataset, it achieves an RAI of 94.99% when combined with LSTM, significantly outperforming other methods such as Transformer, KNN, and RNN. Additionally, TCReC exhibits robustness across various packet loss scenarios, consistently delivering high-quality feature reconstruction for both attack traffic and common Internet application data. TCReC provides a robust solution for network traffic analysis in high-loss edge computing scenarios, offering practical value for real-world deployment.

22 pages, 11622 KiB  
Article
Classification of Hacker’s Posts Based on Zero-Shot, Few-Shot, and Fine-Tuned LLMs in Environments with Constrained Resources
by Theodoros Giannilias, Andreas Papadakis, Nikolaos Nikolaou and Theodore Zahariadis
Future Internet 2025, 17(5), 207; https://doi.org/10.3390/fi17050207 - 5 May 2025
Abstract
This paper investigates, applies, and evaluates state-of-the-art Large Language Models (LLMs) for the classification of posts from a dark web hackers’ forum into four cyber-security categories. The LLMs applied included Mistral-7B-Instruct-v0.2, Gemma-1.1-7B, Llama-3-8B-Instruct, and Llama-2-7B, with zero-shot learning, few-shot learning, and fine-tuning. The four cyber-security categories consisted of “Access Control and Management”, “Availability Protection and Security by Design Mechanisms”, “Software and Firmware Flaws”, and “not relevant”. The hackers’ posts were also classified and labelled by a human cyber-security expert, allowing a detailed evaluation of the classification accuracy for each LLM and customization/learning method. We verified LLM fine-tuning as the most effective mechanism to enhance the accuracy and reliability of the classifications. The contributions include the methodology applied and the labelled hackers’ posts dataset.
(This article belongs to the Special Issue Generative Artificial Intelligence (AI) for Cybersecurity)

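One way the few-shot prompting could be assembled for the four categories listed above; the template and the example posts are assumptions, not the prompts used in the study.

```python
# Build a few-shot classification prompt for a hacker-forum post.
CATEGORIES = [
    "Access Control and Management",
    "Availability Protection and Security by Design Mechanisms",
    "Software and Firmware Flaws",
    "not relevant",
]

FEW_SHOT = [  # invented demonstration pairs
    ("Selling a privilege-escalation exploit for an outdated router firmware.",
     "Software and Firmware Flaws"),
    ("Anyone selling cheap game keys?", "not relevant"),
]

def build_prompt(post: str) -> str:
    lines = ["Classify the hacker-forum post into exactly one category:",
             *[f"- {c}" for c in CATEGORIES], ""]
    for text, label in FEW_SHOT:
        lines += [f"Post: {text}", f"Category: {label}", ""]
    lines += [f"Post: {post}", "Category:"]
    return "\n".join(lines)

print(build_prompt("Looking for stolen admin credentials for a hospital portal."))
```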
15 pages, 1473 KiB  
Article
XGBoost-Based Detection of DDoS Attacks in Named Data Networking
by Liang Liu, Weiqing Yu, Zhijun Wu and Silin Peng
Future Internet 2025, 17(5), 206; https://doi.org/10.3390/fi17050206 - 4 May 2025
Abstract
Named Data Networking (NDN) is highly susceptible to Distributed Denial of Service (DDoS) attacks, such as Interest Flooding Attack (IFA) and Cache Pollution Attack (CPA). These attacks exploit the inherent data retrieval and caching mechanisms of NDN, leading to severe disruptions in data availability and network efficiency, thereby undermining the overall performance and reliability of the system. In this paper, an attack detection method based on an improved XGBoost is proposed and applied to the hybrid attack pattern of IFA and CPA. Through experiments, the performance of the new attacks and the efficacy of the detection algorithm are analyzed. In comparison with other algorithms, the proposed classifier demonstrates superior performance, as confirmed by its AUC score.

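A generic XGBoost detection pipeline with AUC scoring in the spirit of the evaluation above; the synthetic, imbalanced features stand in for NDN traffic statistics, and the hyperparameters are illustrative.

```python
# Train an XGBoost classifier on imbalanced synthetic data and report AUC.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=12, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="logloss")
clf.fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, scores), 4))
```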
36 pages, 889 KiB  
Review
Securing Blockchain Systems: A Layer-Oriented Survey of Threats, Vulnerability Taxonomy, and Detection Methods
by Mohammad Jaminur Islam, Saminur Islam, Mahmud Hossain, Shahid Noor and S. M. Riazul Islam
Future Internet 2025, 17(5), 205; https://doi.org/10.3390/fi17050205 - 3 May 2025
Abstract
Blockchain technology is emerging as a pivotal framework to enhance the security of internet-based systems, especially as advancements in machine learning (ML), artificial intelligence (AI), and cyber–physical systems such as smart grids and IoT applications in healthcare continue to accelerate. Although these innovations promise significant improvements, security remains a critical challenge. Blockchain offers a secure foundation for integrating diverse technologies; however, vulnerabilities—including adversarial exploits—can undermine performance and compromise application reliability. To address these risks effectively, it is essential to comprehensively analyze the vulnerability landscape of blockchain systems. This paper contributes in two key ways. First, it presents a unique layer-based framework for analyzing and illustrating security attacks within blockchain architectures. Second, it introduces a novel taxonomy that classifies existing research on blockchain vulnerability detection. Our analysis reveals that while ML and deep learning offer promising approaches for detecting vulnerabilities, their effectiveness often depends on access to extensive and high-quality datasets. Additionally, the layer-based framework demonstrates that vulnerabilities span all layers of a blockchain system, with attacks frequently targeting the consensus process, network integrity, and smart contract code. Overall, this paper provides a comprehensive overview of blockchain security threats and detection methods, emphasizing the need for a multifaceted approach to safeguard these evolving systems.
(This article belongs to the Special Issue Security and Privacy in Blockchains and the IoT—3rd Edition)

27 pages, 3296 KiB  
Article
AI-Powered Stroke Diagnosis System: Methodological Framework and Implementation
by Marta Narigina, Agris Vindecs, Dušanka Bošković, Yuri Merkuryev and Andrejs Romanovs
Future Internet 2025, 17(5), 204; https://doi.org/10.3390/fi17050204 - 2 May 2025
Abstract
This study introduces an AI-based framework for stroke diagnosis that merges clinical data and curated imaging data. The system utilizes traditional machine learning and advanced deep learning techniques to tackle dataset imbalances and variability in stroke presentations. Our approach involves rigorous data preprocessing, feature engineering, and ensemble techniques to optimize the predictive performance. Comprehensive evaluations demonstrate that gradient-boosted models achieve the highest accuracy, while CNNs enhance stroke detection rates. Calibration and threshold optimization are utilized to align predictions with clinical requirements, ensuring diagnostic reliability. This multi-modal framework highlights the capacity of AI to accelerate stroke diagnosis and aid clinical decision making, ultimately enhancing patient outcomes in critical care.

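A sketch of the threshold-optimization step mentioned above: sweep the decision threshold over predicted probabilities and keep the one maximizing F1 (any clinical cost function could be substituted). The labels and probabilities are synthetic.

```python
# Pick the decision threshold that maximizes F1 on predicted probabilities.
import numpy as np
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 500)
y_prob = np.clip(0.6 * y_true + 0.4 * rng.random(500), 0, 1)   # imperfect classifier output

prec, rec, thr = precision_recall_curve(y_true, y_prob)
f1 = 2 * prec * rec / np.clip(prec + rec, 1e-12, None)
best = np.argmax(f1[:-1])                 # last PR point has no associated threshold
print(f"best threshold = {thr[best]:.3f}, F1 = {f1[best]:.3f}")
```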
18 pages, 935 KiB  
Article
C6EnPLS: A High-Performance Computing Job Dataset for the Analysis of Linear Solvers’ Power Consumption
by Marcello Artioli, Andrea Borghesi, Marta Chinnici, Anna Ciampolini, Michele Colonna, Davide De Chiara and Daniela Loreti
Future Internet 2025, 17(5), 203; https://doi.org/10.3390/fi17050203 - 30 Apr 2025
Abstract
In recent decades, driven by global efforts towards sustainability, the priorities of HPC facilities have changed to include maximising energy efficiency besides computing performance. In this regard, a crucial open question is how to accurately predict the contribution of each parallel job to the system’s energy consumption. Accurate estimations in this sense could offer an initial insight into the overall power requirements of the system, and provide meaningful information for, e.g., power-aware scheduling, load balancing, infrastructure design, etc. While ML-based attempts employing large training datasets of past executions may suffer from the high variability of HPC workloads, a more specific knowledge of the nature of the jobs can improve prediction accuracy. In this work, we restrict our attention to the rather pervasive task of linear system resolution. We propose a methodology to build a large dataset of runs (including the measurements coming from physical sensors deployed on a large HPC cluster), and we report a statistical analysis and preliminary evaluation of the efficacy of the obtained dataset when employed to train well-established ML methods aiming to predict the energy footprint of specific software.
(This article belongs to the Special Issue Distributed Machine Learning and Federated Edge Computing for IoT)

55 pages, 3433 KiB  
Review
SoK: Delegated Security in the Internet of Things
by Emiliia Geloczi, Felix Klement, Patrick Struck and Stefan Katzenbeisser
Future Internet 2025, 17(5), 202; https://doi.org/10.3390/fi17050202 - 30 Apr 2025
Abstract
The increased use of electronic devices in the Internet of Things (IoT) leads not only to an improved comfort of living but also to an increased risk of attacks. IoT security has thus become an important research field. However, due to limits on performance and bandwidth, IoT devices are often not powerful enough to execute, e.g., costly cryptographic algorithms or protocols. This limitation can be solved through a delegation concept. By delegating certain operations to devices with sufficient resources, it is possible to achieve a high level of security without overloading a device that needs protection. In this paper, we give an overview of current approaches for security delegation in the context of IoT, formalise security notions, discuss the security of existing approaches, and identify further research questions. Furthermore, a mathematical formalisation of the CIA triad (confidentiality, integrity, and availability) is proposed for the predefined application areas, in order to evaluate the different approaches.
(This article belongs to the Special Issue Cybersecurity in the IoT)

32 pages, 3449 KiB  
Article
Optimizing Internet of Things Services Placement in Fog Computing Using Hybrid Recommendation System
by Hanen Ben Rjeb, Layth Sliman, Hela Zorgati, Raoudha Ben Djemaa and Amine Dhraief
Future Internet 2025, 17(5), 201; https://doi.org/10.3390/fi17050201 - 30 Apr 2025
Abstract
Fog Computing extends Cloud computing capabilities by providing computational resources closer to end users. Fog Computing has gained considerable popularity in various domains such as drones, autonomous vehicles, and smart cities. In this context, the careful selection of suitable Fog resources and the optimal assignment of services to these resources (the service placement problem (SPP)) are essential. Numerous studies have attempted to tackle this issue. However, to the best of our knowledge, none of the previously proposed works took into consideration the dynamic context awareness and the user preferences for IoT service placement. To deal with this issue, we propose a hybrid recommendation system for service placement that combines two techniques: collaborative filtering and content-based recommendation. By considering user and service context, user preferences, service needs, and resource availability, the proposed recommendation system provides optimal placement suggestions for each IoT service. To assess the efficiency of the proposed system, a validation scenario based on Internet of Drones (IoD) was simulated and tested. The results show that the proposed approach leads to a considerable reduction in waiting time and a substantial improvement in resource utilization and the number of executed services.

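A hedged sketch of a hybrid placement score: a collaborative term (average rating of each fog node by similar services) blended with a content-based term (cosine similarity between service requirements and node capabilities). The matrices and the equal blend weights are illustrative assumptions.

```python
# Hybrid recommendation score: collaborative filtering term + content-based term.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Rows: services, columns: fog nodes; entries: past placement quality (0 = unknown)
ratings = np.array([[0.9, 0.0, 0.4],
                    [0.8, 0.2, 0.0],
                    [0.0, 0.7, 0.6]])
node_profiles = np.array([[8, 16, 1], [2, 4, 5], [4, 8, 3]])   # cpu, ram, latency class
new_service_req = np.array([6, 12, 2])                          # what the new IoT service needs
similar_services = [0, 1]                                       # neighbors from collaborative filtering

collab = ratings[similar_services].mean(axis=0)                 # average neighbor rating per node
content = np.array([cosine(new_service_req, p) for p in node_profiles])
hybrid = 0.5 * collab + 0.5 * content
print("recommended fog node:", int(np.argmax(hybrid)), "scores:", np.round(hybrid, 2))
```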
17 pages, 12426 KiB  
Article
Implementation and Performance Analysis of an Industrial Robot’s Vision System Based on Cloud Vision Services
by Ioana-Livia Stefan, Andrei Mateescu, Ionut Lentoiu, Silviu Raileanu, Florin Daniel Anton, Dragos Constantin Popescu and Ioan Stefan Sacala
Future Internet 2025, 17(5), 200; https://doi.org/10.3390/fi17050200 - 30 Apr 2025
Abstract
With its fast advancements, cloud computing opens many opportunities for research in various applications from the robotics field. In our paper, we further explore the prospect of integrating Cloud AI object recognition services into an industrial robotics sorting task. Starting from our previously implemented solution on a digital twin, we are now putting our proposed architecture to the test in the real world, on an industrial robot, where factors such as illumination, shadows, different colors, and textures of the materials influence the performance of the vision system. We compare the results of our suggested method with those from industrial machine vision software, indicating promising performance and opening additional application perspectives in the robotics field, simultaneously with the continuous improvement of Cloud and AI technology.
(This article belongs to the Special Issue Artificial Intelligence and Control Systems for Industry 4.0 and 5.0)

35 pages, 1503 KiB  
Systematic Review
Integrating AIoT Technologies in Aquaculture: A Systematic Review
by Fahmida Wazed Tina, Nasrin Afsarimanesh, Anindya Nag and Md Eshrat E. Alahi
Future Internet 2025, 17(5), 199; https://doi.org/10.3390/fi17050199 - 30 Apr 2025
Abstract
The increasing global demand for seafood underscores the necessity for sustainable aquaculture practices. However, several challenges, including rising operational costs, variable environmental conditions, and the threat of disease outbreaks, impede progress in this field. This review explores the transformative role of the Artificial Intelligence of Things (AIoT) in mitigating these challenges. We analyse current research on AIoT applications in aquaculture, with a strong emphasis on the use of IoT sensors for real-time data collection and AI algorithms for effective data analysis. Our focus areas include monitoring water quality, implementing smart feeding strategies, detecting diseases, analysing fish behaviour, and employing automated counting techniques. Nevertheless, several research gaps remain, particularly regarding the integration of AI in broodstock management, the development of multimodal AI systems, and challenges regarding model generalization. Future advancements in AIoT should prioritise real-time adaptability, cost-effectiveness, and sustainability while emphasizing the importance of multimodal systems, advanced biosensing capabilities, and digital twin technologies. In conclusion, while AIoT presents substantial opportunities for enhancing aquaculture practices, successful implementation will depend on overcoming challenges related to scalability, cost, and technical expertise, improving models’ adaptability, and ensuring environmental sustainability.
(This article belongs to the Special Issue Internet of Things (IoT) in Smart City)

30 pages, 18616 KiB  
Article
Leveraging Retrieval-Augmented Generation for Automated Smart Home Orchestration
by Negin Jahanbakhsh, Mario Vega-Barbas, Iván Pau, Lucas Elvira-Martín, Hirad Moosavi and Carolina García-Vázquez
Future Internet 2025, 17(5), 198; https://doi.org/10.3390/fi17050198 - 29 Apr 2025
Abstract
The rapid growth of smart home technologies, driven by the expansion of the Internet of Things (IoT), has introduced both opportunities and challenges in automating daily routines and orchestrating device interactions. Traditional rule-based automation systems often fall short in adapting to dynamic conditions, integrating heterogeneous devices, and responding to evolving user needs. To address these limitations, this study introduces a novel smart home orchestration framework that combines generative Artificial Intelligence (AI), Retrieval-Augmented Generation (RAG), and the modular OSGi framework. The proposed system allows users to express requirements in natural language, which are then interpreted and transformed into executable service bundles by large language models (LLMs) enhanced with contextual knowledge retrieved from vector databases. These AI-generated service bundles are dynamically deployed via OSGi, enabling real-time service adaptation without system downtime. Manufacturer-provided device capabilities are seamlessly integrated into the orchestration pipeline, ensuring compatibility and extensibility. The framework was validated through multiple use-case scenarios involving dynamic device discovery, on-demand code generation, and adaptive orchestration based on user preferences. Results highlight the system’s ability to enhance automation efficiency, personalization, and resilience. This work demonstrates the feasibility and advantages of AI-driven orchestration in realising intelligent, flexible, and scalable smart home environments.
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems)

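The core retrieval step of a RAG pipeline like the one described above: embed the user request, rank stored device-capability snippets by cosine similarity, and prepend the top matches to the LLM prompt. The hashing-based embedder keeps the sketch self-contained; a real deployment would use a sentence-embedding model and a vector database.

```python
# Rank device-capability snippets against a natural-language request and build a prompt.
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic embedding: hash each token into a fixed-size bag-of-words vector."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim] += 1.0
    return v / (np.linalg.norm(v) + 1e-12)

snippets = [  # invented manufacturer-provided capability descriptions
    "Living-room lamp: supports on/off and brightness via bundle light-control",
    "Thermostat: supports target temperature setpoint via bundle hvac-control",
    "Blinds: supports open/close position via bundle blinds-control",
]
query = "dim the living room lamp when a movie starts"

scores = [float(embed(query) @ embed(s)) for s in snippets]
top = sorted(zip(scores, snippets), reverse=True)[:2]
prompt = "Context:\n" + "\n".join(s for _, s in top) + f"\n\nUser request: {query}\n"
print(prompt)
```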
31 pages, 1997 KiB  
Article
Leveraging Blockchain Technology for Secure 5G Offloading Processes
by Cristina Regueiro, Santiago de Diego and Borja Urkizu
Future Internet 2025, 17(5), 197; https://doi.org/10.3390/fi17050197 - 29 Apr 2025
Abstract
This paper presents a secure 5G offloading mechanism leveraging Blockchain technology and Self-Sovereign Identity (SSI). The advent of 5G has significantly enhanced the capabilities of all sectors, enabling innovative applications and improving security and efficiency. However, challenges such as limited infrastructure, signal interference, and high upgrade costs persist. Offloading processes already address these issues but they require more transparency and security. This paper proposes a Blockchain-based marketplace using Hyperledger Fabric to optimize resource allocation and enhance security. This marketplace facilitates the exchange of services and resources among operators, promoting competition and flexibility. Additionally, the paper introduces an SSI-based authentication system to ensure privacy and security during the offloading process. The architecture and components of the marketplace and authentication system are detailed, along with their data models and operations. Performance evaluations indicate that the proposed solutions do not significantly degrade offloading times, making them suitable for everyday applications. As a result, the integration of Blockchain and SSI technologies enhances the security and efficiency of 5G offloading.
(This article belongs to the Special Issue 5G Security: Challenges, Opportunities, and the Road Ahead)

27 pages, 960 KiB  
Article
Ephemeral Node Identifiers for Enhanced Flow Privacy
by Gregor Tamati Haywood and Saleem Noel Bhatti
Future Internet 2025, 17(5), 196; https://doi.org/10.3390/fi17050196 - 28 Apr 2025
Abstract
The Internet Protocol (IP) uses numerical address values carried in IP packets at the network layer to allow correct forwarding of packets between source and destination. Those address values must be kept visible in all parts of the network. By definition, those addresses must carry enough information to identify the source and destination for the communication. This means that successive flows of IP packets can be correlated—it is possible for an observer of the flows to easily link them to an individual source and so, potentially, to an individual user. To alleviate this privacy concern, it is desirable to have ephemeral address values—values that have a limited lifespan and so make flow correlation more difficult for an attacker. However, the IP address is also used in the end-to-end communication state for transport layer flows so must remain consistent to allow correct operation at the transport layer. We present a solution to this tension in requirements by the use of ephemeral Node Identifier (eNID) values in IP packets as part of the address value. We have implemented our approach as an extension to IPv6 in the FreeBSD14 operating system kernel. We have evaluated the implementation with existing applications both over a testbed network in a controlled environment and with global IPv6 network connectivity. Our results show that eNIDs work with existing applications and over existing IPv6 networks. Our analysis shows that using eNIDs creates a disruption to the correlation of flows and so effectively perturbs linkability. As our approach is a network layer (layer 3) mechanism, it is usable by any transport layer (layer 4) protocol, improving privacy for all applications and all users.

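One illustrative way an ephemeral, per-epoch identifier could be derived (loosely in the spirit of RFC 7217 stable-privacy identifiers, with a time epoch mixed in); this construction is an assumption for exposition and not the scheme implemented in the paper.

```python
# Derive a 64-bit IPv6 interface identifier from a local secret, the prefix, and a time epoch.
import hashlib
import time

def ephemeral_iid(secret: bytes, prefix: str, epoch_len_s: int = 3600) -> str:
    epoch = int(time.time()) // epoch_len_s
    digest = hashlib.sha256(secret + prefix.encode() + epoch.to_bytes(8, "big")).digest()
    iid = digest[:8]                                   # take 64 bits of the digest
    groups = [iid[i:i + 2].hex() for i in range(0, 8, 2)]
    return ":".join(groups)

print("2001:db8:1:2:" + ephemeral_iid(b"node-local-secret", "2001:db8:1:2::/64"))
```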