AI-Enabled Next-Generation Computing and Its Applications

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (20 February 2026) | Viewed by 21244

Special Issue Editors


Guest Editor
School of Mathematics and Computing (Computational Science and Engineering), Yonsei University, Seoul 03722, Republic of Korea
Interests: machine learning; data mining; social network analysis; mobile computing
Guest Editor
Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, 44-100 Gliwice, Poland
Interests: machine learning (ML); Internet of Things (IoT); ML at the edge; cybersecurity of IoT; massive access

Special Issue Information

Dear Colleagues,

As artificial intelligence (AI) continues to advance at an unprecedented pace, its integration into computing systems is driving innovation. The convergence of AI with next-generation computing technologies, such as cloud computing, mobile computing, edge computing, secure and quantum-safe computing, and high-performance computing, is unlocking new frontiers in computational capabilities and enabling a wide range of applications across various domains. This Special Issue aims to provide a comprehensive overview of the current state of the art, identify challenges, and propose solutions for future research directions in AI-enabled next-generation computing. We welcome articles focused on the following relevant topics:

  • Internet of Things (IoT) and Computation of Things;
  • Cloud computing and big data applications;
  • Mobile network computing and multi-access edge computing;
  • Multimedia and communication systems;
  • AI techniques in high-performance computing;
  • Applications of AI in cybersecurity, healthcare, and IoT;
  • Secure and quantum-safe computing;
  • Future mobility applications and ICT convergence technology.

Submissions on other topics that are in accordance with the theme of this Special Issue are also welcome and may take the form of original research articles, reviews, and case studies.

Prof. Dr. Ilsun You
Prof. Dr. Won-Yong Shin
Prof. Dr. Hsing-Chung Chen
Dr. Mert Nakip
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • cloud computing
  • mobile computing
  • edge computing
  • secure and quantum-safe computing
  • high-performance computing
  • next-generation computing
  • IoT

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (10 papers)


Research

26 pages, 3721 KB  
Article
Column-Wise Autoencoder Representation Learning for Intrusion Detection in Multi-MEC Edge Networks
by Min-Gyu Kim and Jonghyun Kim
Appl. Sci. 2026, 16(6), 3055; https://doi.org/10.3390/app16063055 - 21 Mar 2026
Viewed by 284
Abstract
Mobile Edge Computing (MEC) is a key enabler of 5G/6G services, but multi-base-station deployment enlarges the attack surface and motivates edge-native intrusion detection systems (IDSs). Existing MEC-based IDSs are mainly single-node or centralized, which struggle with heterogeneous traffic across next-generation Node Bs (gNBs) and incur latency and network load due to data aggregation. To address these limitations, this paper proposes a Column-Wise Autoencoder Ensemble (CW-AE) distributed learning framework for multi-MEC environments. Each MEC node trains column-wise autoencoder encoders locally to extract compact latent features, and a master MEC trains a stacking-based meta-classifier using concatenated latent features, avoiding raw traffic transfer and parameter averaging. By preserving node-specific behavior while integrating heterogeneous features, CW-AE improves detection performance and reduces communication overhead. Using the real-world 5G-NIDD dataset collected from two physical 5G base stations, we compare local single-node, centralized, and CW-AE-based distributed learning. The results show that CW-AE achieves superior detection capability and network efficiency, making it suitable for scalable edge IDS deployments. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
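The distributed pipeline the abstract describes — per-node encoders producing compact latent features that a master MEC fuses for classification — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: SVD-based linear encoders stand in for the column-wise autoencoders, and a few gradient steps of logistic regression stand in for the stacking meta-classifier; all names and shapes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear_encoder(X, k):
    """Stand-in for a node-local autoencoder: the optimal linear
    autoencoder is spanned by the top-k principal directions."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                      # d x k projection

def encode(X, W):
    return (X - X.mean(axis=0)) @ W      # n x k latent features

# Three MEC nodes with heterogeneous local traffic (toy data).
nodes = [rng.normal(size=(200, 8)) for _ in range(3)]
labels = rng.integers(0, 2, size=200)    # toy per-flow labels

# 1) Each node trains its encoder locally -- raw traffic never leaves.
encoders = [fit_linear_encoder(X, k=3) for X in nodes]
latents = [encode(X, W) for X, W in zip(nodes, encoders)]

# 2) The master MEC concatenates latent features and trains a
#    meta-classifier on the fused representation.
Z = np.hstack(latents)                   # 200 x 9 fused features
w = np.zeros(Z.shape[1])
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-Z @ w))
    w -= 0.1 * Z.T @ (p - labels) / len(labels)

preds = (1.0 / (1.0 + np.exp(-Z @ w)) > 0.5).astype(int)
```

The key property mirrored here is that only the small latent matrices, never the raw traffic, cross the network to the master node.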

16 pages, 1397 KB  
Article
ODEL: An Experience-Augmented Self-Evolving Framework for Efficient Python-to-C++ Code Translation
by Kaiyuan Feng, Furong Peng and Jiayue Wu
Appl. Sci. 2026, 16(3), 1506; https://doi.org/10.3390/app16031506 - 2 Feb 2026
Viewed by 649
Abstract
Automated code translation plays an important role in improving software reusability and supporting system migration, particularly in scenarios where Python implementations need to be converted into efficient C++ programs. However, existing approaches often rely heavily on large external models or static inference pipelines, which limits their ability to improve translation quality over time. To address these challenges, this paper proposes ODEL, an On-Demand Experience-enhanced Learning framework for Python-to-C++ code translation. ODEL adopts a hybrid inference architecture in which a lightweight internal model performs routine translation, while a more capable external model is selectively invoked upon verification failure to conduct error analysis and generate structured experience records. These experience records are accumulated and reused across subsequent translation phases, enabling progressive improvement through a closed-loop workflow that integrates generation, verification, consideration, and experience refinement. Experiments on the HumanEval-X benchmark demonstrate that ODEL significantly improves translation accuracy compared with competitive baselines. Specifically, the framework increases Pass@1 from 71.82% to 81.10% and Pass@10 from 74.30% to 89.02%, and exhibits a consistent performance improvement across multiple translation phases. These results indicate that experience reuse within a continuous task stream can effectively enhance automated code translation without modifying model parameters. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
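The closed loop described above — generate with a cheap internal model, verify, and only on failure consult an expensive external model whose analysis is stored for reuse — can be sketched with stub models. Everything below is illustrative: the model and verifier stubs, the hint format, and the experience store are hypothetical stand-ins, not ODEL's actual components.

```python
experience_store = []                    # accumulated error analyses

def internal_translate(py_src, experiences):
    # Real system: a lightweight LLM conditioned on retrieved
    # experiences; here, hints are simply prepended as comments.
    hints = "".join(f"// hint: {e}\n" for e in experiences)
    return hints + f"// translated from: {py_src}"

def verify(cpp_src):
    # Real system: compile the C++ and run unit tests; toy check here.
    return "int main" in cpp_src

def external_analyze(py_src, cpp_src):
    # Real system: a larger LLM produces a structured error analysis.
    return "wrap the translation in an explicit int main entry point"

def translate(py_src, max_rounds=3):
    for _ in range(max_rounds):
        cpp = internal_translate(py_src, experience_store)
        if verify(cpp):
            return cpp, True
        # Verification failed: consult the external model once, store
        # the experience record, retry -- later tasks reuse it for free.
        experience_store.append(external_analyze(py_src, cpp))
    return cpp, False

out1, ok1 = translate("def f(): return 1")   # learns the int main hint
out2, ok2 = translate("def g(): return 2")   # reuses it, no new analysis
```

The second call succeeds on its first round because the experience record from the first call is already in the store — the mechanism that lets quality improve without changing model parameters.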

29 pages, 3225 KB  
Article
Towards 6G Roaming Security: Experimental Analysis of SUCI-Based DoS, Cost, and NF Stress
by Taeho Won, Hoseok Kwon, Yongho Ko, Jhury Kevin Lastre and Ilsun You
Appl. Sci. 2026, 16(1), 508; https://doi.org/10.3390/app16010508 - 4 Jan 2026
Cited by 1 | Viewed by 815
Abstract
This study investigates performance overheads and security threats in 6th Generation Mobile Communication (6G) roaming environments, which are expected to enable services such as autonomous driving, smart cities, and remote healthcare that demand ultra-low latency and high reliability. To bridge the gap between standardization and real-world deployment, we built a realistic roaming testbed by separating the home and visited public land mobile networks (H-PLMN and V-PLMN) and simulating user equipment (UE) interactions. In this environment, we defined and measured roaming cost by comparing non-roaming and roaming procedures, and reproduced two Subscription Concealed Identifier (SUCI)-based denial-of-service (DoS) attacks: random generation and replay. Our experiments showed that intermediary functions such as the Security Edge Protection Proxy (SEPP) and Service Communication Proxy (SCP) introduced CPU/memory overhead and latency, highlighting performance degradation unique to roaming. Moreover, random SUCI generation concentrated load on the Authentication Server Function (AUSF) in the H-PLMN, whereas replay attacks distributed it across both the H-PLMN and the V-PLMN, consistently identifying the AUSF as a bottleneck. These findings demonstrate that roaming enlarges the attack surface and exposes vulnerabilities not fully addressed in current standards. We conclude that secure and reliable 6G roaming requires multi-layered defense strategies with inter-operator cooperation, providing empirical evidence to guide standardization and operational practice. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
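The replay attack measured above succeeds because the AUSF re-processes previously seen SUCIs. One conceivable mitigation layer — sketched here as an illustration only, not a mechanism from the paper or the 3GPP standards — is a bounded freshness cache that drops exact SUCI replays before they reach authentication. Since a legitimate SUCI is freshly ECIES-encrypted per registration, byte-identical repeats are suspect.

```python
from collections import OrderedDict

class SuciReplayFilter:
    """Illustrative bounded LRU cache that drops exact SUCI replays
    before they reach the AUSF. A sketch of one possible defense
    layer, not a mechanism proposed by the paper."""

    def __init__(self, capacity=4096):
        self.capacity = capacity
        self.seen = OrderedDict()

    def admit(self, suci: bytes) -> bool:
        if suci in self.seen:
            self.seen.move_to_end(suci)    # refresh recency
            return False                   # exact replay: drop
        self.seen[suci] = True
        if len(self.seen) > self.capacity:
            self.seen.popitem(last=False)  # evict the oldest entry
        return True

f = SuciReplayFilter(capacity=1024)
first = f.admit(b"suci-ciphertext-A")    # fresh -> admitted
replay = f.admit(b"suci-ciphertext-A")   # replayed -> dropped
```

Note that such a cache addresses only the replay variant; randomly generated SUCIs are all distinct and would need a different layer (e.g., rate limiting), consistent with the paper's call for multi-layered defenses.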

26 pages, 498 KB  
Article
CoFT: A Fair and Transparent Compensation Framework for Hierarchical Federated Learning
by Siwan Noh, Sang Uk Shin and Kyung-Hyune Rhee
Appl. Sci. 2025, 15(23), 12568; https://doi.org/10.3390/app152312568 - 27 Nov 2025
Viewed by 662
Abstract
Hierarchical Federated Learning (HFL) requires a scalable and transparent incentive mechanism, yet existing on-chain approaches are too costly and centralized solutions lack trust. To address this, we propose CoFT, a hybrid framework that manages rewards off-chain using a hierarchy of state channels and a stablecoin while securing the system’s integrity with a single on-chain cryptographic anchor. This architecture dramatically reduces on-chain complexity from a prohibitive O(N×T) to a sustainable O(I+S+N), enabling cryptographic verifiability, automated trustless payouts, and fair burden-sharing. CoFT thus offers a robust and economically viable blueprint for the financial infrastructure of large-scale, trustworthy collaborative AI ecosystems. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
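The "single on-chain cryptographic anchor" idea can be illustrated with a Merkle tree: off-chain reward records are hashed into one root that goes on-chain, and any individual payout can later be verified against it with a short inclusion proof. This is a generic sketch of the anchoring pattern, not CoFT's actual protocol; the record format and helper names are hypothetical.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash sibling pairs level by level (duplicating the last node
    on odd levels) until one 32-byte root remains."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes from one leaf up to the root."""
    level, proof = [h(x) for x in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))   # (hash, is_left)
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sib, is_left in proof:
        node = h(sib + node) if is_left else h(node + sib)
    return node == root

# Off-chain reward records; only `root` would be posted on-chain.
records = [b"node1:+10", b"node2:+7", b"node3:+12"]
root = merkle_root(records)
proof = merkle_proof(records, 1)
```

The anchor is constant-size regardless of how many reward records exist off-chain, which is the property behind reducing on-chain cost from per-update to per-anchor.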

25 pages, 1573 KB  
Article
Lightweight Multi-Class Autoencoder Model for Malicious Traffic Detection in Private 5G Networks
by Jinha Kim, Seungjoon Na and Hwankuk Kim
Appl. Sci. 2025, 15(22), 12242; https://doi.org/10.3390/app152212242 - 18 Nov 2025
Viewed by 964
Abstract
This study proposes a lightweight autoencoder-based detection framework for the efficient detection of multi-class malicious traffic within a private 5G network slicing environment. Conventional deep learning-based detection approaches encounter difficulties in real-time processing and edge environment applications because of their significant computational complexity and resource demands. To address this issue, this study balances traffic data using slice-label-based hierarchical sampling and performs domain-specific feature grouping to reflect semantic similarity. Independent autoencoders are trained for each group, and the latent vectors from the encoder outputs are combined to be used as input for an SVM-based multi-class classifier. This structure reflects traffic differences between slices while also improving computational efficiency. Four sets of experiments were constructed to verify the model’s performance and evaluate its structural performance, resource usage efficiency, classifier generalization performance, and whether it met SLA constraints from various perspectives. As a result, the proposed Multi-AE model achieved an accuracy of 0.93, a balanced accuracy of 0.93, and an ECE of 0.03, demonstrating high stability and detection reliability. Regarding resource utilization efficiency, GPU utilization was under 7%, and the average memory usage was approximately 5.7 GB, demonstrating resource efficiency. In SLA verification, inference latency below 10 ms and a throughput of 564 samples/s were achieved based on URLLC. This study is significant in that it experimentally demonstrated a detection structure that achieves a balance of accuracy, lightweight design, and real-time performance in a 5G slicing environment. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
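The slice-label-based hierarchical sampling step mentioned above — balancing traffic so no slice or class dominates training — can be sketched as grouping records by (slice, label) and resampling each group to a common size. This is a hedged illustration of the general technique; the record schema, group size, and slice names are hypothetical, not taken from the paper.

```python
import random
from collections import defaultdict

def hierarchical_balance(records, per_group, seed=0):
    """Group records by (slice, label) and resample every group to
    the same size: down-sample large groups, up-sample small ones
    with replacement. Illustrative sketch only."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for r in records:
        groups[(r["slice"], r["label"])].append(r)
    balanced = []
    for _, members in sorted(groups.items()):
        if len(members) >= per_group:
            balanced.extend(rng.sample(members, per_group))
        else:
            balanced.extend(rng.choices(members, k=per_group))
    return balanced

# Toy imbalanced traffic: eMBB benign flows dwarf URLLC attacks.
traffic = (
    [{"slice": "eMBB", "label": "benign"}] * 900
    + [{"slice": "eMBB", "label": "ddos"}] * 40
    + [{"slice": "URLLC", "label": "benign"}] * 50
    + [{"slice": "URLLC", "label": "scan"}] * 10
)
balanced = hierarchical_balance(traffic, per_group=100)
```

After balancing, each of the four (slice, label) groups contributes exactly 100 records, so per-slice behavior is preserved while class imbalance is removed.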

16 pages, 462 KB  
Article
Exploring the Potential of Anomaly Detection Through Reasoning with Large Language Models
by Sungjune Park and Daeseon Choi
Appl. Sci. 2025, 15(19), 10384; https://doi.org/10.3390/app151910384 - 24 Sep 2025
Cited by 3 | Viewed by 2946
Abstract
In recent years, anomaly detection in digital environments has become a critical research area due to issues such as spam messages and fake news, which can lead to privacy breaches, social disruption, and undermined information reliability. Traditional anomaly detection models often require specific training for each task, resulting in significant time and resource consumption and limited flexibility. This study explores the use of Prompt Engineering with Transformer-based Large Language Models (LLMs) to address these challenges more efficiently. By comparing techniques such as Zero-shot, Few-shot, Chain-of-Thought (CoT), Self-Consistency (SC), and Tree-of-Thought (ToT) prompting, the study identifies CoT and SC as particularly effective, achieving up to 0.96 accuracy in spam detection without the need for task-specific training. However, ToT exhibited limitations due to biases and misinterpretation. The findings emphasize the importance of selecting appropriate prompting strategies to optimize LLM performance across various tasks, highlighting the potential of Prompt Engineering to reduce costs and improve the adaptability of anomaly detection systems. Future research is needed to explore the broader applicability and scalability of these methods. Additionally, this study includes a survey of Prompt Engineering techniques applicable to anomaly detection, examining strategies such as Self-Refine and Retrieval-Augmented Generation to further enhance detection accuracy and adaptability. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
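Of the prompting strategies compared, Self-Consistency is the easiest to see in miniature: sample several chain-of-thought completions for the same prompt and take the majority-vote answer. The sketch below uses a scripted stub in place of a real LLM call; the function names and the stub's replies are illustrative, not the paper's setup.

```python
from collections import Counter

def self_consistent_classify(prompt, sample_fn, n_samples=5):
    """Self-Consistency prompting in miniature: draw several
    chain-of-thought samples for one prompt and return the
    majority-vote answer. `sample_fn` stands in for an LLM call
    with nonzero temperature."""
    answers = [sample_fn(prompt) for _ in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]

# Stub "LLM": a noisy spam classifier that is right 3 times out of 5.
_replies = iter(["spam", "ham", "spam", "spam", "ham"])
verdict = self_consistent_classify(
    "Reason step by step, then answer spam/ham: 'WIN a FREE prize now!'",
    sample_fn=lambda p: next(_replies),
)
```

Majority voting recovers the correct label even though two of the five individual samples are wrong — the intuition behind SC's accuracy gains over single-sample CoT.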

21 pages, 2093 KB  
Article
Dual-Stream Time-Series Transformer-Based Encrypted Traffic Data Augmentation Framework
by Daeho Choi, Yeog Kim, Changhoon Lee and Kiwook Sohn
Appl. Sci. 2025, 15(18), 9879; https://doi.org/10.3390/app15189879 - 9 Sep 2025
Cited by 1 | Viewed by 1687
Abstract
We propose a Transformer-based data augmentation framework with a time-series dual-stream architecture to address performance degradation in encrypted network traffic classification caused by class imbalance between attack and benign traffic. The proposed framework independently processes the complete flow’s sequential packet information and statistical characteristics by extracting and normalizing a local channel (comprising packet size, inter-arrival time, and direction) and a set of six global flow-level statistical features. These are used to generate a fixed-length multivariate sequence and an auxiliary vector. The sequence and vector are then fed into an encoder-only Transformer that integrates learnable positional embeddings with a FiLM + context token-based injection mechanism, enabling complementary representation of sequential patterns and global statistical distributions. Large-scale experiments demonstrate that the proposed method reduces reconstruction RMSE and additional feature restoration MSE by over 50%, while improving accuracy, F1-Score, and AUC by 5–7%p compared to classification on the original imbalanced datasets. Furthermore, the augmentation process achieves practical levels of processing time and memory overhead. These results show that the proposed approach effectively mitigates class imbalance in encrypted traffic classification and offers a promising pathway to achieving more robust model generalization in real-world deployment scenarios. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
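The FiLM-based injection mechanism described above can be sketched in a few lines: the global flow-statistics vector is mapped to per-channel scale (gamma) and shift (beta) parameters that modulate every timestep of the local packet-sequence embedding. This is a generic FiLM illustration under assumed shapes, not the paper's architecture; in the real model the modulated sequence would feed an encoder-only Transformer.

```python
import numpy as np

rng = np.random.default_rng(1)

def film_inject(seq, stats, W_gamma, W_beta):
    """Feature-wise Linear Modulation: the flow-level statistics
    vector yields per-channel scale and shift applied uniformly
    across all timesteps of the sequence embedding."""
    gamma = stats @ W_gamma              # (d,) per-channel scale
    beta = stats @ W_beta                # (d,) per-channel shift
    return seq * (1.0 + gamma) + beta    # broadcasts over timesteps

T, d, s = 32, 16, 6                      # seq length, channels, #stats
seq = rng.normal(size=(T, d))            # local stream: per-packet features
stats = rng.normal(size=(s,))            # global stream: 6 flow statistics
W_gamma = rng.normal(scale=0.1, size=(s, d))
W_beta = rng.normal(scale=0.1, size=(s, d))

conditioned = film_inject(seq, stats, W_gamma, W_beta)
```

With a zero statistics vector the modulation reduces to the identity, which makes the conditioning easy to ablate — one reason FiLM is a popular injection mechanism for auxiliary features.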

25 pages, 2908 KB  
Article
Secure and Scalable File Encryption for Cloud Systems via Distributed Integration of Quantum and Classical Cryptography
by Changjong Kim, Seunghwan Kim, Kiwook Sohn, Yongseok Son, Manish Kumar and Sunggon Kim
Appl. Sci. 2025, 15(14), 7782; https://doi.org/10.3390/app15147782 - 11 Jul 2025
Cited by 3 | Viewed by 2945
Abstract
We propose a secure and scalable file-encryption scheme for cloud systems by integrating Post-Quantum Cryptography (PQC), Quantum Key Distribution (QKD), and Advanced Encryption Standard (AES) within a distributed architecture. While prior studies have primarily focused on secure key exchange or authentication protocols (e.g., layered PQC-QKD key distribution), our scheme extends beyond key management by implementing a distributed encryption architecture that protects large-scale files through integrated PQC, QKD, and AES. To support high-throughput encryption, our proposed scheme partitions the target file into fixed-size subsets and distributes them across slave nodes, each performing parallel AES encryption using a locally reconstructed key from a PQC ciphertext. Each slave node receives a PQC ciphertext that encapsulates the AES key, along with a PQC secret key masked using QKD based on the BB84 protocol, both of which are centrally generated and managed by the master node for secure coordination. In addition, an encryption and transmission pipeline is designed to overlap I/O, encryption, and communication, thereby reducing idle time and improving resource utilization. The master node performs centralized decryption by collecting encrypted subsets, recovering the AES key, and executing decryption in parallel. Our evaluation using a real-world medical dataset shows that the proposed scheme achieves up to 2.37× speedup in end-to-end runtime and up to 8.11× speedup in encryption time over AES (Original). In addition to performance gains, our proposed scheme maintains low communication cost, stable CPU utilization across distributed nodes, and negligible overhead from quantum key management. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
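The partition-and-parallel-encrypt pattern at the heart of the scheme can be sketched with the standard library alone. To be clear about assumptions: the real scheme uses AES with key material encapsulated via PQC and masked via QKD; since the Python stdlib has no AES, a toy SHA-256 counter-mode keystream stands in for the cipher here, and the key is a plain placeholder. Only the partitioning and worker-parallel structure is the point.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 16                          # fixed-size subsets (64 KiB)

def keystream_xor(key, index, data):
    """Toy SHA-256 counter-mode keystream standing in for AES-CTR;
    the paper's scheme uses real AES keys delivered via PQC + QKD."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + index.to_bytes(8, "big")
                              + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def encrypt_file(data, key, workers=4):
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    # Each worker plays the role of a slave node encrypting its
    # assigned subsets in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(
            lambda ic: keystream_xor(key, ic[0], ic[1]),
            enumerate(chunks)))

key = b"placeholder-key-from-pqc-ciphertext"
plain = bytes(range(256)) * 1024         # 256 KiB toy "file"
cipher_chunks = encrypt_file(plain, key)
# An XOR keystream is symmetric: decrypting is the same operation.
recovered = b"".join(encrypt_file(b"".join(cipher_chunks), key))
```

Because each subset is keyed by its index, workers need no coordination beyond the shared key, which is what makes the overlap of I/O, encryption, and transmission in the paper's pipeline possible.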

19 pages, 1107 KB  
Article
AV-Teller: Browser Fingerprinting for Client-Side Security Software Identification
by Hyeong-Seok Jang, Mohsen Ali Alawami and Ki-Woong Park
Appl. Sci. 2025, 15(9), 5059; https://doi.org/10.3390/app15095059 - 2 May 2025
Viewed by 2913
Abstract
The rapid proliferation of digitalization and the growing reliance on internet-based technologies by individuals and organizations have led to a significant escalation in the frequency and sophistication of cyberattacks. As attackers continuously refine their methods to evade conventional defense mechanisms, antivirus solutions, despite their widespread utilization as primary security tools, face increasing challenges in addressing these evolving threats. This study introduces AV-Teller, a novel framework designed for analyzing antivirus behavior through interactions with web browsers. AV-Teller reveals weaknesses in antivirus detection mechanisms by highlighting ways in which web browser interactions may inadvertently expose critical aspects of antivirus operations. The framework provides key insights into the vulnerabilities inherent to these detection processes and their implications for the interplay between antivirus systems and modern web technologies. To assess the efficacy of the AV-Teller in detecting antivirus via web browsers, the framework evaluates three detection scenarios: Document Object Model (DOM) Monitoring-Based Detection, Signature-Based Detection, and Phishing Page-Based Detection. The results revealed performance inconsistencies: 16 products (57%) failed to respond to any tested scenarios, exhibiting deficiencies in threat mitigation capabilities. Of the 12 products (43%) that successfully handled three scenarios, 9 (75%) inadvertently disclosed identifiable antivirus metadata during assessments, thereby enabling attackers to pinpoint specific antivirus solutions and exploit their vulnerabilities. These findings highlight critical gaps in the interaction between antivirus systems and web technologies, exposing systemic flaws in existing security mechanisms. The inadvertent exposure of sensitive antivirus data underscores the necessity for robust data handling protocols, necessitating collaboration between antivirus developers and web technology stakeholders to design secure frameworks. By exposing these risks, the AV-Teller framework elucidates the limitations of current defenses and establishes a foundation for the enhancement of antivirus technologies to address emerging cyber threats effectively. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)

24 pages, 1327 KB  
Article
ZigBeeNet: Decrypted Zigbee IoT Network Traffic Dataset in Smart Home Environment
by Nur Keleşoğlu and Łukasz Sobczak
Appl. Sci. 2024, 14(23), 10844; https://doi.org/10.3390/app142310844 - 23 Nov 2024
Cited by 7 | Viewed by 4608
Abstract
The number of smart homes is increasing steadily. One of the first technologies that comes to mind when talking about smart homes is Zigbee, which stands out for its low cost, low latency, low power consumption, and mesh networking capabilities. One of the key features of Zigbee is the encryption of payloads within its frames for security purposes. However, being able to decrypt this payload is crucial for fully understanding its operation and for purposes such as testing the network’s security. Therefore, in this paper, we present the decrypted Zigbee IoT Network Traffic dataset, ZigBeeNet. We captured packets using Wireshark in real time from a smart home with 15 Zigbee devices over 20 days and saved them in pcap files. Additionally, we used a key extraction method to obtain the network key, decrypt the payload data, and analyze the characteristic features of network traffic, which we present in this paper. ZigBeeNet supports a wider range of uses than existing datasets, including network security research, pattern analysis, network performance analysis, and Zigbee traffic generation. We believe that this open-source dataset will contribute significantly to a wide range of industrial and academic research applications. Full article
(This article belongs to the Special Issue AI-Enabled Next-Generation Computing and Its Applications)
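Since the dataset ships as pcap files, a consumer's first step is walking the capture format. The classic pcap layout Wireshark writes — a 24-byte global header whose magic number encodes byte order, then 16-byte per-record headers — can be parsed with the standard library alone. This sketch builds a tiny two-packet capture in memory rather than assuming any ZigBeeNet filename; the payload bytes are arbitrary placeholders, not real Zigbee frames.

```python
import struct

def iter_pcap(buf):
    """Walk a classic pcap byte buffer: a 24-byte global header,
    then a 16-byte header (ts_sec, ts_usec, incl_len, orig_len)
    before each captured packet. The magic number gives byte order."""
    magic = buf[:4]
    if magic == b"\xd4\xc3\xb2\xa1":
        endian = "<"                     # little-endian capture host
    elif magic == b"\xa1\xb2\xc3\xd4":
        endian = ">"                     # big-endian capture host
    else:
        raise ValueError("not a classic pcap file")
    off = 24                             # skip the global header
    while off + 16 <= len(buf):
        ts_sec, ts_usec, incl_len, _orig = struct.unpack_from(
            endian + "IIII", buf, off)
        off += 16
        yield ts_sec + ts_usec / 1e6, buf[off:off + incl_len]
        off += incl_len

# Build a tiny two-packet capture in memory (linktype 195 is
# IEEE 802.15.4, the layer Zigbee frames ride on).
hdr = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 195)
rec = lambda t, payload: struct.pack(
    "<IIII", t, 0, len(payload), len(payload)) + payload
cap = hdr + rec(1, b"\x61\x88") + rec(2, b"\x02\x00")
packets = list(iter_pcap(cap))
```

A real analysis would then decode each payload as an 802.15.4 frame and, with the extracted network key, the Zigbee layers above it — that decryption step is exactly what the dataset already provides.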
