Search Results (126)

Search Parameters:
Journal = Computers
Section = Internet of Things (IoT) and Industrial IoT

24 pages, 23907 KiB  
Article
Optimizing Data Pipelines for Green AI: A Comparative Analysis of Pandas, Polars, and PySpark for CO2 Emission Prediction
by Youssef Mekouar, Mohammed Lahmer and Mohammed Karim
Computers 2025, 14(8), 319; https://doi.org/10.3390/computers14080319 - 7 Aug 2025
Viewed by 107
Abstract
This study evaluates the performance and energy trade-offs of three popular data processing libraries—Pandas, PySpark, and Polars—applied to GreenNav, a CO2 emission prediction pipeline for urban traffic. GreenNav is an eco-friendly navigation app designed to predict CO2 emissions and determine low-carbon routes using a hybrid CNN-LSTM model integrated into a complete pipeline for the ingestion and processing of large, heterogeneous geospatial and road data. Our study quantifies the end-to-end execution time, cumulative CPU load, and maximum RAM consumption for each library when applied to the GreenNav pipeline; it then converts these metrics into energy consumption and CO2 equivalents. Experiments conducted on datasets ranging from 100 MB to 8 GB demonstrate that Polars in lazy mode offers substantial gains, reducing the processing time by a factor of more than twenty, memory consumption by about two-thirds, and energy consumption by about 60%, while maintaining the predictive accuracy of the model (R2 ≈ 0.91). These results clearly show that the careful selection of data processing libraries can reconcile high computing performance and environmental sustainability in large-scale machine learning applications.
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
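
The GreenNav pipeline itself is not reproduced here, but the eager-versus-lazy contrast the abstract measures can be sketched in a few lines. The CSV schema and column names below are invented for illustration; only the API usage (eager pandas.read_csv versus a Polars scan_csv/collect lazy plan) is the point.

```python
# Minimal sketch contrasting eager Pandas with Polars lazy mode on a
# hypothetical traffic CSV (columns are illustrative, not GreenNav's schema).
import pandas as pd
import polars as pl

# Eager Pandas: the whole file is loaded before any filtering happens.
def mean_speed_pandas(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    df = df[df["co2_gpkm"] > 0]
    return df.groupby("road_id", as_index=False)["speed_kmh"].mean()

# Polars lazy: scan_csv builds a query plan; predicate/projection pushdown
# means only the needed rows and columns are materialized at collect().
# (group_by in recent Polars; older releases spell it groupby.)
def mean_speed_polars(path: str) -> pl.DataFrame:
    return (
        pl.scan_csv(path)
        .filter(pl.col("co2_gpkm") > 0)
        .group_by("road_id")
        .agg(pl.col("speed_kmh").mean())
        .collect()
    )
```

Pushing the filter and column selection down into the scan is the kind of mechanism behind the time and memory savings reported above.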

25 pages, 2870 KiB  
Article
Performance Evaluation and QoS Optimization of Routing Protocols in Vehicular Communication Networks Under Delay-Sensitive Conditions
by Alaa Kamal Yousif Dafhalla, Hiba Mohanad Isam, Amira Elsir Tayfour Ahmed, Ikhlas Saad Ahmed, Lutfieh S. Alhomed, Amel Mohamed essaket Zahou, Fawzia Awad Elhassan Ali, Duria Mohammed Ibrahim Zayan, Mohamed Elshaikh Elobaid and Tijjani Adam
Computers 2025, 14(7), 285; https://doi.org/10.3390/computers14070285 - 17 Jul 2025
Viewed by 325
Abstract
Vehicular Communication Networks (VCNs) are essential to intelligent transportation systems, where real-time data exchange between vehicles and infrastructure supports safety, efficiency, and automation. However, achieving high Quality of Service (QoS)—especially under delay-sensitive conditions—remains a major challenge due to the high mobility and dynamic topology of vehicular environments. While some efforts have explored routing protocol optimization, few have systematically compared multiple optimization approaches tailored to distinct traffic and delay conditions. This study addresses this gap by evaluating and enhancing two widely used routing protocols, QOS-AODV and GPSR, through their improved versions, CM-QOS-AODV and CM-GPSR. Two distinct optimization models are proposed: the Traffic-Oriented Model (TOM), designed to handle variable and high-traffic conditions, and the Delay-Efficient Model (DEM), focused on reducing latency for time-critical scenarios. Performance was evaluated using key QoS metrics: throughput (rate of successful data delivery), packet delivery ratio (PDR) (percentage of successfully delivered packets), and end-to-end delay (latency between sender and receiver). Simulation results reveal that TOM-optimized protocols achieve up to 10% higher PDR, maintain throughput above 0.40 Mbps, and reduce delay to as low as 0.01 s, making them suitable for applications such as collision avoidance and emergency alerts. DEM-based variants offer balanced, moderate improvements, making them better suited for general-purpose VCN applications. These findings underscore the importance of traffic- and delay-aware protocol design in developing robust, QoS-compliant vehicular communication systems.
(This article belongs to the Special Issue Application of Deep Learning to Internet of Things Systems)
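
The three QoS metrics named above are standard, and a short sketch shows how they are typically computed from a per-packet simulation trace; the Packet record and the numbers below are illustrative, not taken from the paper's simulations.

```python
# Illustrative computation of PDR, throughput, and end-to-end delay from a
# hypothetical per-packet trace.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    sent_at: float                 # seconds
    received_at: Optional[float]   # None if the packet was dropped
    size_bits: int

def qos_metrics(trace: list[Packet], sim_duration_s: float):
    delivered = [p for p in trace if p.received_at is not None]
    pdr = len(delivered) / len(trace)                                        # packet delivery ratio
    throughput_mbps = sum(p.size_bits for p in delivered) / sim_duration_s / 1e6
    avg_delay_s = sum(p.received_at - p.sent_at for p in delivered) / len(delivered)
    return pdr, throughput_mbps, avg_delay_s

# Example: two delivered packets and one drop over a 1 s window.
trace = [Packet(0.00, 0.01, 8_000), Packet(0.10, 0.12, 8_000), Packet(0.20, None, 8_000)]
print(qos_metrics(trace, sim_duration_s=1.0))
```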

18 pages, 533 KiB  
Article
Comparative Analysis of Deep Learning Models for Intrusion Detection in IoT Networks
by Abdullah Waqas, Sultan Daud Khan, Zaib Ullah, Mohib Ullah and Habib Ullah
Computers 2025, 14(7), 283; https://doi.org/10.3390/computers14070283 - 17 Jul 2025
Viewed by 327
Abstract
The Internet of Things (IoT) holds transformative potential in fields such as power grid optimization, defense networks, and healthcare. However, the constrained processing capacities and resource limitations of IoT networks make them especially susceptible to cyber threats. This study addresses the problem of detecting intrusions in IoT environments by evaluating the performance of deep learning (DL) models under different data and algorithmic conditions. We conducted a comparative analysis of three widely used DL models—Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Bidirectional LSTM (biLSTM)—across four benchmark IoT intrusion detection datasets: BoTIoT, CiCIoT, ToNIoT, and WUSTL-IIoT-2021. Each model was assessed under balanced and imbalanced dataset configurations and evaluated using three loss functions (cross-entropy, focal loss, and dual focal loss). By analyzing model efficacy across these datasets, we highlight the importance of generalizability and adaptability to varied data characteristics that are essential for real-world applications. The results demonstrate that the CNN trained using the cross-entropy loss function consistently outperforms the other models, particularly on balanced datasets. On the other hand, LSTM and biLSTM show strong potential in temporal modeling, but their performance is highly dependent on the characteristics of the dataset. By analyzing the performance of multiple DL models under diverse datasets, this research provides actionable insights for developing secure, interpretable IoT systems that can meet real-world security challenges.
(This article belongs to the Special Issue Application of Deep Learning to Internet of Things Systems)
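
To make the loss-function comparison concrete, here is a hedged sketch of a small 1D-CNN intrusion classifier evaluated with plain cross-entropy and with a standard multi-class focal loss. The feature count, class count, and layer sizes are placeholders, not the configurations used on the benchmark datasets.

```python
# Sketch: 1D-CNN over flow features plus a focal loss built on cross-entropy.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FlowCNN(nn.Module):
    def __init__(self, n_features: int = 40, n_classes: int = 5):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, n_features)
        z = self.conv(x.unsqueeze(1)).squeeze(-1)
        return self.fc(z)                      # raw logits

def focal_loss(logits, targets, gamma: float = 2.0, alpha: float = 0.25):
    # Standard multi-class focal loss: down-weights easy examples via (1 - p_t)^gamma.
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)                        # probability assigned to the true class
    return (alpha * (1.0 - pt) ** gamma * ce).mean()

model = FlowCNN()
x, y = torch.randn(8, 40), torch.randint(0, 5, (8,))
print(F.cross_entropy(model(x), y).item(), focal_loss(model(x), y).item())
```

The (1 − p_t)^γ factor is what makes focal-style losses attractive on imbalanced traffic, which is the trade-off the study examines.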

21 pages, 2170 KiB  
Article
IoT-Driven Intelligent Energy Management: Leveraging Smart Monitoring Applications and Artificial Neural Networks (ANN) for Sustainable Practices
by Azza Mohamed, Ibrahim Ismail and Mohammed AlDaraawi
Computers 2025, 14(7), 269; https://doi.org/10.3390/computers14070269 - 9 Jul 2025
Cited by 1 | Viewed by 451
Abstract
The growing mismanagement of energy resources is a pressing issue that poses significant risks to both individuals and the environment. As energy consumption continues to rise, the ramifications become increasingly severe, necessitating urgent action. In response, the rapid expansion of Internet of Things (IoT) devices offers a promising and innovative solution due to their adaptability, low power consumption, and transformative potential in energy management. This study describes a novel strategy that integrates IoT and Artificial Neural Networks (ANNs) in a smart monitoring mobile application intended to optimize energy usage and promote sustainability in residential settings. While both IoT and ANN technologies have been investigated separately in previous research, the uniqueness of this work is the practical integration of both technologies into a real-time, user-adaptive framework. The application allows for continuous energy monitoring via modern IoT devices and wireless sensor networks, while ANN-based prediction models evaluate consumption data to dynamically optimize energy use and reduce environmental impact. The system’s key features include simulated consumption scenarios and adaptive user profiles, which account for differences in household behaviors and occupancy patterns, allowing for tailored recommendations and energy control techniques. The architecture allows for remote device control, real-time feedback, and scenario-based simulations, making the system suitable for a wide range of home contexts. The suggested system’s feasibility and effectiveness are demonstrated through detailed simulations, highlighting its potential to increase energy efficiency and encourage sustainable habits. This study contributes to the rapidly evolving field of intelligent energy management by providing a scalable, integrated, and user-centric solution that bridges the gap between theoretical models and practical implementation.
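
The abstract describes the ANN component only at a high level; as a rough illustration of that role, the sketch below fits a small MLP regressor to invented sensor-style features and predicts household energy use. The feature names, data, and network size are assumptions, not the authors' configuration.

```python
# Illustrative-only ANN sketch: an MLP regressor predicting household energy use.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical features: [hour of day, outdoor temp (°C), occupants, appliances on]
X = rng.uniform([0, -5, 0, 0], [23, 35, 5, 10], size=(500, 4))
y = 0.2 * X[:, 1] + 0.5 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.2, 500)  # kWh

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X, y)
print("Predicted kWh:", model.predict([[18, 30, 3, 6]])[0])
```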

22 pages, 7580 KiB  
Article
Fuzzy-Based Multi-Modal Query-Forwarding in Mini-Datacenters
by Sami J. Habib and Paulvanna Nayaki Marimuthu
Computers 2025, 14(7), 261; https://doi.org/10.3390/computers14070261 - 1 Jul 2025
Viewed by 314
Abstract
The rapid growth of Internet of Things (IoT)-enabled devices in industrial environments and the associated increase in data generation are paving the way for the development of localized, distributed datacenters. In this paper, we have proposed a novel mini-datacenter in the form of wireless sensor networks to efficiently handle query-based data collection from Industrial IoT (IIoT) devices. The mini-datacenter comprises a command center, gateways, and IoT sensors, designed to manage stochastic query-response traffic flow. We have developed a duplication/aggregation query flow model, tailored to emphasize reliable transmission. We have also developed a dataflow management framework that employs a multi-modal query forwarding approach to forward queries from the command center to gateways under varying environments. The query forwarding includes coarse-grain and fine-grain strategies, where the coarse-grain strategy uses a direct data flow through a single gateway at the expense of reliability, while the fine-grain approach uses redundant gateways to enhance reliability. A fuzzy-logic-based intelligence system is integrated into the framework to dynamically select the appropriate granularity of the forwarding strategy based on resource availability and network conditions, aided by a buffer watching algorithm that tracks real-time buffer status. We carried out several experiments with gateway nodes varying from 10 to 100 to evaluate the framework’s scalability and robustness in handling the query flow under complex environments. The experimental results demonstrate that the framework provides a flexible and adaptive solution that balances buffer usage while maintaining over 95% reliability in most queries.
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
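
The paper's fuzzy rule base is not given in the abstract, so the following is only a toy illustration of how a fuzzy inference over buffer occupancy and link quality could pick between coarse-grain and fine-grain forwarding; the membership shapes, thresholds, and rule are invented.

```python
# Toy fuzzy selection of the forwarding granularity (not the authors' rule base).
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def forwarding_mode(buffer_occupancy: float, link_quality: float) -> str:
    # Degrees of membership for "buffer is congested" and "link is poor".
    congested = tri(buffer_occupancy, 0.4, 0.8, 1.01)
    poor_link = tri(link_quality, -0.01, 0.2, 0.6)
    # Mamdani-style rule: go fine-grain (redundant gateways) if either degree is high.
    fine_grain_strength = max(congested, poor_link)
    return "fine-grain" if fine_grain_strength > 0.5 else "coarse-grain"

print(forwarding_mode(buffer_occupancy=0.9, link_quality=0.7))   # fine-grain
print(forwarding_mode(buffer_occupancy=0.3, link_quality=0.9))   # coarse-grain
```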

26 pages, 3334 KiB  
Review
Simulation-Based Development of Internet of Cyber-Things Using DEVS
by Laurent Capocchi, Bernard P. Zeigler and Jean-Francois Santucci
Computers 2025, 14(7), 258; https://doi.org/10.3390/computers14070258 - 30 Jun 2025
Viewed by 460
Abstract
Simulation-based development is a structured approach that uses formal models to design and test system behavior before building the actual system. The Internet of Things (IoT) connects physical devices equipped with sensors and software to collect and exchange data. Cyber-Physical Systems (CPSs) integrate computing directly into physical processes to enable real-time control. This paper reviews the Discrete-Event System Specification (DEVS) formalism and explores how it can serve as a unified framework for designing, simulating, and implementing systems that combine IoT and CPS—referred to as the Internet of Cyber-Things (IoCT). Through case studies that include home automation, solar energy monitoring, conflict management, and swarm robotics, the paper reviews how DEVS enables construction of modular, scalable, and reusable models. The role of the System Entity Structure (SES) is also discussed, highlighting its contribution to organizing models and generating alternative system configurations. With this background as a basis, the paper evaluates whether DEVS provides the necessary modeling power and continuity across stages to support the development of complex IoCT systems. The paper concludes that DEVS offers a robust and flexible foundation for developing IoCT systems, supporting both expressiveness and seamless transition from design to real-world deployment.
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
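
For readers unfamiliar with the formalism, a toy atomic model in plain Python makes the four DEVS functions (time advance, external and internal transitions, output) concrete, in the home-automation spirit of the case studies. Real DEVS tooling supplies this machinery; the class below is illustrative only.

```python
# Toy DEVS atomic model: a light controller that stays ON for 5 s after motion.
import math

class LightController:
    def __init__(self):
        self.state = "OFF"

    def ta(self) -> float:                   # time advance: how long to stay in this state
        return 5.0 if self.state == "ON" else math.inf

    def delta_ext(self, event: str):         # external transition: an input arrived
        if event == "motion":
            self.state = "ON"

    def delta_int(self):                     # internal transition: the timeout expired
        self.state = "OFF"

    def lambda_out(self) -> str:             # output function, fired just before delta_int
        return "switch_off"

m = LightController()
m.delta_ext("motion")
print(m.state, m.ta())          # ON 5.0
print(m.lambda_out()); m.delta_int()
print(m.state, m.ta())          # OFF inf
```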

39 pages, 1839 KiB  
Review
The Integration of the Internet of Things (IoT) Applications into 5G Networks: A Review and Analysis
by Aymen I. Zreikat, Zakwan AlArnaout, Ahmad Abadleh, Ersin Elbasi and Nour Mostafa
Computers 2025, 14(7), 250; https://doi.org/10.3390/computers14070250 - 25 Jun 2025
Cited by 1 | Viewed by 1818
Abstract
The incorporation of Internet of Things (IoT) applications into 5G networks marks a significant step towards realizing the full potential of connected systems. 5G networks, with their ultra-low latency, high data speeds, and massive interconnection, provide a perfect foundation for IoT ecosystems to thrive. This connectivity offers a diverse set of applications, including smart cities, self-driving cars, industrial automation, healthcare monitoring, and agricultural solutions. IoT devices can improve their reliability, real-time communication, and scalability by exploiting 5G’s advanced capabilities such as network slicing, edge computing, and enhanced mobile broadband. Furthermore, the convergence of IoT with 5G fosters interoperability, allowing for smooth communication across diverse devices and networks. This study examines the fundamental technical applications, obstacles, and future perspectives for integrating IoT applications with 5G networks, emphasizing the potential benefits while also addressing essential concerns such as security, energy efficiency, and network management. The results of this review and analysis will act as a valuable resource for researchers, industry experts, and policymakers involved in the progression of 5G technologies and their incorporation with IoT solutions.

27 pages, 3100 KiB  
Article
Reducing Delivery Times by Utilising On-Site Wire Arc Additive Manufacturing with Digital-Twin Methods
by Stefanie Sell, Kevin Villani and Marc Stautner
Computers 2025, 14(6), 221; https://doi.org/10.3390/computers14060221 - 6 Jun 2025
Viewed by 464
Abstract
The increasing demand for smaller batch sizes and mass customisation in production poses considerable challenges to logistics and manufacturing efficiency. Conventional methodologies are unable to address the need for expeditious, cost-effective distribution of premium-quality products tailored to individual specifications. Additionally, the reliability and resilience of global logistics chains are increasingly under pressure. Additive manufacturing is regarded as a potentially viable solution to these problems, as it enables on-demand, on-site production with reduced resource usage. Nevertheless, there are still significant challenges to be addressed, including the assurance of product quality and the optimisation of production processes with respect to time and resource efficiency. This article examines the potential of integrating digital twin methodologies to establish a fully digital and efficient process chain for on-site additive manufacturing. This study focuses on wire arc additive manufacturing (WAAM), a technology that has been successfully implemented in the on-site production of naval ship propellers and excavator parts. The proposed approach aims to enhance process planning efficiency, reduce material and energy consumption, and minimise the expertise required for operational deployment by leveraging digital twin methodologies. The present paper details the current state of research in this domain and outlines a vision for a fully virtualised process chain, highlighting the transformative potential of digital twin technologies in advancing on-site additive manufacturing. In this context, various aspects and components of a digital twin framework for wire arc additive manufacturing are examined regarding their necessity and applicability. The overarching objective of this paper is to conduct a preliminary investigation for the implementation and further development of a comprehensive digital twin framework for WAAM. Utilising a real-world sample, currently available process steps are validated and missing technical solutions are identified.
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)

41 pages, 4206 KiB  
Systematic Review
A Systematic Literature Review on Load-Balancing Techniques in Fog Computing: Architectures, Strategies, and Emerging Trends
by Danah Aldossary, Ezaz Aldahasi, Taghreed Balharith and Tarek Helmy
Computers 2025, 14(6), 217; https://doi.org/10.3390/computers14060217 - 2 Jun 2025
Viewed by 724
Abstract
Fog computing has emerged as a promising paradigm to extend cloud services toward the edge of the network, enabling low-latency processing and real-time responsiveness for Internet of Things (IoT) applications. However, the distributed, heterogeneous, and resource-constrained nature of fog environments introduces significant challenges in balancing workloads efficiently. This study presents a systematic literature review (SLR) of 113 peer-reviewed articles published between 2020 and 2024, aiming to provide a comprehensive overview of load-balancing strategies in fog computing. This review categorizes fog computing architectures, load-balancing algorithms, scheduling and offloading techniques, fault-tolerance mechanisms, security models, and evaluation metrics. The analysis reveals that three-layer (IoT–Fog–Cloud) architectures remain predominant, with dynamic clustering and virtualization commonly employed to enhance adaptability. Heuristic and hybrid load-balancing approaches are most widely adopted due to their scalability and flexibility. Evaluation frequently centers on latency, energy consumption, and resource utilization, while simulation is primarily conducted using tools such as iFogSim and YAFS. Despite considerable progress, key challenges persist, including workload diversity, security enforcement, and real-time decision-making under dynamic conditions. Emerging trends highlight the growing use of artificial intelligence, software-defined networking, and blockchain to support intelligent, secure, and autonomous load balancing. This review synthesizes current research directions, identifies critical gaps, and offers recommendations for designing efficient and resilient fog-based load-balancing systems.
(This article belongs to the Special Issue Edge and Fog Computing for Internet of Things Systems (2nd Edition))

21 pages, 1337 KiB  
Article
Applications of Multi-Criteria Decision Making in Information Systems for Strategic and Operational Decisions
by Mitra Madanchian and Hamed Taherdoost
Computers 2025, 14(6), 208; https://doi.org/10.3390/computers14060208 - 26 May 2025
Viewed by 1204
Abstract
Business problems today are complex, involving numerous dimensions that must be weighed against one another and opposing goals that must be traded off to find the best solution. Multi-Criteria Decision Making (MCDM) plays an essential role in such situations. MCDM techniques analyze, score, and select among options evaluated against multiple conflicting criteria. This systematic review investigates applications of MCDM methods within Management Information Systems (MIS) based on evidence from 40 peer-reviewed articles selected from the Scopus database. Key methods discussed are the Analytic Hierarchy Process (AHP), TOPSIS, fuzzy logic-based methods, and the Analytic Network Process (ANP). These methods were applied across MIS strategic planning, resource assignment, risk assessment, and technology selection. The review contributes further by categorizing MCDM applications into thematic decision domains, evaluating methodological directions, and mapping the strengths of each method against specific MIS problems. Theoretical guidelines are suggested to align the type of decision with an appropriate MCDM strategy. The study demonstrates how the addition of MCDM enhances MIS capabilities with data-driven, transparent decision-making. Implications and directions for future research are presented to guide scholars and practitioners.
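
As a concrete example of one method named above, here is a compact TOPSIS implementation applied to an invented technology-selection problem; the alternatives, criteria, and weights are illustrative only.

```python
# TOPSIS sketch: rank alternatives by closeness to the ideal solution.
import numpy as np

def topsis(matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Return closeness scores (higher = better) for each alternative (row)."""
    norm = matrix / np.linalg.norm(matrix, axis=0)           # vector normalization per criterion
    v = norm * weights                                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # positive ideal solution
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Three candidate systems scored on cost (lower better), performance, usability.
scores = topsis(
    matrix=np.array([[250.0, 7.0, 8.0], [180.0, 6.0, 6.0], [300.0, 9.0, 7.0]]),
    weights=np.array([0.4, 0.35, 0.25]),
    benefit=np.array([False, True, True]),
)
print(scores.argsort()[::-1])   # alternatives ranked best-first
```

In practice, AHP or a fuzzy weighting scheme would typically supply the weight vector that is hard-coded here.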

24 pages, 4739 KiB  
Article
Secured Audio Framework Based on Chaotic-Steganography Algorithm for Internet of Things Systems
by Mai Helmy and Hanaa Torkey
Computers 2025, 14(6), 207; https://doi.org/10.3390/computers14060207 - 26 May 2025
Viewed by 478
Abstract
The exponential growth of interconnected devices in the Internet of Things (IoT) has raised significant concerns about data security, especially when transmitting sensitive information over wireless channels. Traditional encryption techniques often fail to meet the energy and processing constraints of resource-limited IoT devices. This paper proposes a novel hybrid security framework that integrates chaotic encryption and steganography to enhance confidentiality, integrity, and resilience in audio communication. Chaotic systems generate unpredictable keys for strong encryption, while steganography conceals the existence of sensitive data within audio signals, adding a covert layer of protection. The proposed approach is evaluated within an Orthogonal Frequency Division Multiplexing (OFDM)-based wireless communication system, widely recognized for its robustness against interference and channel impairments. By combining secure encryption with a practical transmission scheme, this work demonstrates the effectiveness of the proposed hybrid method in realistic IoT environments, achieving high performance in terms of signal integrity, security, and resistance to noise. Simulation results indicate that the OFDM system incorporating chaotic algorithm modes alongside steganography outperforms the chaotic algorithm alone, particularly at higher Eb/No values. Notably, with DCT-OFDM, the steganography-based chaotic-CFB algorithm achieves a performance gain of approximately 30 dB compared to FFT-OFDM and DWT-based systems at Eb/No = 8 dB. These findings suggest that steganography plays a crucial role in enhancing secure transmission, offering greater signal deviation, reduced correlation, a more uniform histogram, and increased resistance to noise, especially in high-BER scenarios. This highlights the potential of hybrid cryptographic-steganographic methods in safeguarding sensitive audio information within IoT networks and provides a foundation for future advancements in secure IoT communication systems.
(This article belongs to the Special Issue Edge and Fog Computing for Internet of Things Systems (2nd Edition))
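
The two building blocks named in the abstract can be sketched independently of the OFDM chain: a logistic-map keystream for chaotic encryption and least-significant-bit embedding of the ciphertext into a cover audio signal. The map parameters, the 16-bit PCM stand-in, and the framing below are assumptions, not the paper's exact scheme (which also covers CFB-style chaining and the OFDM variants).

```python
# Illustrative chaotic keystream + LSB audio steganography round trip.
import numpy as np

def logistic_keystream(n_bytes: int, x0: float = 0.61, r: float = 3.99) -> np.ndarray:
    x, out = x0, np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        x = r * x * (1.0 - x)              # logistic map iteration
        out[i] = int(x * 256) & 0xFF       # quantize the chaotic value to a key byte
    return out

def embed_lsb(cover: np.ndarray, payload_bits: np.ndarray) -> np.ndarray:
    stego = cover.copy()
    n = payload_bits.size
    stego[:n] = (stego[:n] & 0xFFFE) | payload_bits   # clear LSB of 16-bit samples, write payload bit
    return stego

secret = np.frombuffer(b"IoT sensor reading", dtype=np.uint8)
cipher = secret ^ logistic_keystream(secret.size)           # XOR stream cipher
bits = np.unpackbits(cipher)
cover_audio = np.random.randint(0, 2**16, size=4096, dtype=np.uint16)  # 16-bit PCM stand-in
stego_audio = embed_lsb(cover_audio, bits.astype(np.uint16))

# Recover: extract LSBs, repack into bytes, XOR with the same keystream.
recovered = np.packbits((stego_audio[:bits.size] & 1).astype(np.uint8)) ^ logistic_keystream(secret.size)
print(recovered.tobytes())                                   # b'IoT sensor reading'
```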

32 pages, 4255 KiB  
Article
Improving Real-Time Economic Decisions Through Edge Computing: Implications for Financial Contagion Risk Management
by Ștefan Ionescu, Camelia Delcea and Ionuț Nica
Computers 2025, 14(5), 196; https://doi.org/10.3390/computers14050196 - 18 May 2025
Viewed by 873
Abstract
In the face of accelerating digitalization and growing systemic vulnerabilities, the ability to make accurate, real-time economic decisions has become a critical capability for financial and institutional stability. This study investigates how edge computing infrastructures influence decision-making accuracy, responsiveness, and risk containment in economic systems, particularly under the threat of financial contagion. A synthetic dataset simulating the interaction between economic indicators and edge performance metrics was constructed to emulate real-time decision environments. Composite indicators were developed to quantify key dynamics, and a range of machine learning models, including XGBoost, Random Forest, and Neural Networks, were applied to classify economic decision outcomes. The results indicate that low latency, efficient resource use, and balanced workload distribution are significantly associated with higher decision quality. XGBoost outperformed all other models, achieving 97% accuracy and a ROC-AUC of 0.997. The findings suggest that edge computing performance metrics can act as predictive signals for systemic fragility and may be integrated into early warning systems for financial risk management. This study contributes to the literature by offering a novel framework for modeling the economic implications of edge intelligence and provides policy insights for designing resilient, real-time financial infrastructures.
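
A minimal sketch of the classification setup the abstract reports: gradient-boosted trees predicting a binary decision-quality label from edge-performance indicators, scored with ROC-AUC. The synthetic features and labeling rule below are placeholders, not the paper's dataset or feature engineering.

```python
# Sketch: XGBoost classifier on synthetic edge/economic indicators.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
# Hypothetical features: latency (ms), CPU utilisation, workload imbalance, volatility index
X = rng.uniform([1, 0.1, 0.0, 0.0], [200, 1.0, 1.0, 1.0], size=(2000, 4))
y = ((X[:, 0] < 60) & (X[:, 2] < 0.5)).astype(int)   # low latency + balanced load -> good outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="logloss")
clf.fit(X_tr, y_tr)
print("ROC-AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```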

20 pages, 3977 KiB  
Article
Investigation of Multiple Hybrid Deep Learning Models for Accurate and Optimized Network Slicing
by Ahmed Raoof Nasser and Omar Younis Alani
Computers 2025, 14(5), 174; https://doi.org/10.3390/computers14050174 - 2 May 2025
Viewed by 653
Abstract
In 5G wireless communication, network slicing is considered one of the key network elements; it aims to provide services with high availability, low latency, maximized data throughput, and ultra-reliability while conserving network resources. Due to the exponential growth in the number of cellular network users and new applications, delivering the desired Quality of Service (QoS) requires an accurate and fast network slicing mechanism. In this paper, hybrid deep learning (DL) approaches are investigated using convolutional neural networks (CNNs), Long Short-Term Memory (LSTM), recurrent neural networks (RNNs), and Gated Recurrent Units (GRUs) to provide an accurate network slicing model. The proposed hybrid approaches are CNN-LSTM, CNN-RNN, and CNN-GRU, where a CNN is initially used for effective feature extraction and then LSTM, RNN, and GRU layers are used to achieve accurate network slice classification. To optimize the model performance in terms of accuracy and model complexity, the hyperparameters of each algorithm are selected using the Bayesian optimization algorithm. The obtained results illustrate that the optimized hybrid CNN-GRU algorithm provides the best performance in terms of slicing accuracy (99.31%) and low model complexity.
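
A minimal Keras rendering of the best-performing hybrid shape (Conv1D layers for feature extraction feeding a GRU, with a softmax slice classifier); the layer sizes, input shape, and class count are illustrative, and the Bayesian hyperparameter search described above is omitted.

```python
# Sketch: hybrid CNN-GRU slice classifier with placeholder dimensions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

n_features, n_slices = 16, 3          # placeholder feature count and slice classes (e.g. eMBB/URLLC/mMTC)

model = models.Sequential([
    layers.Input(shape=(n_features, 1)),
    layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
    layers.GRU(64),                    # recurrent layer on top of the CNN feature sequence
    layers.Dense(n_slices, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Tiny random batch just to show the expected tensor shapes.
X = np.random.rand(32, n_features, 1).astype("float32")
y = np.random.randint(0, n_slices, size=32)
model.fit(X, y, epochs=1, verbose=0)
print(model.predict(X[:2], verbose=0).shape)   # (2, 3) class probabilities
```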

28 pages, 2200 KiB  
Article
Fine-Tuning Network Slicing in 5G: Unveiling Mathematical Equations for Precision Classification
by Nikola Anđelić, Sandi Baressi Šegota and Vedran Mrzljak
Computers 2025, 14(5), 159; https://doi.org/10.3390/computers14050159 - 25 Apr 2025
Viewed by 582
Abstract
Modern 5G network slicing centers on the precise design of virtual, independent networks operating over a shared physical infrastructure, each configured to meet specific service requirements. This approach plays a vital role in enabling highly customized and flexible service delivery within the 5G ecosystem. In this study, we present the application of a genetic programming symbolic classifier to a dedicated network slicing dataset, resulting in the generation of accurate symbolic expressions for classifying different network slice types. To address the issue of class imbalance, we employ oversampling strategies that produce balanced variations of the dataset. Furthermore, a random search strategy is used to explore the hyperparameter space comprehensively in pursuit of optimal classification performance. The derived symbolic models, refined through threshold tuning based on prediction correctness, are subsequently evaluated on the original imbalanced dataset. The proposed method demonstrates outstanding performance, achieving a perfect classification accuracy of 1.0.
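
Assuming a setup along the lines the abstract describes, the sketch below oversamples an imbalanced dataset and fits gplearn's genetic-programming symbolic classifier, then reads back the evolved expression. gplearn's SymbolicClassifier is binary, so this toy example uses two classes; the data, hyperparameters, and the choice of SMOTE as the oversampler are assumptions, not the authors' configuration.

```python
# Sketch: oversample, fit a GP symbolic classifier, inspect the evolved formula.
import numpy as np
from imblearn.over_sampling import SMOTE
from gplearn.genetic import SymbolicClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 1.2).astype(int)          # rare positive class -> imbalance

X_bal, y_bal = SMOTE(random_state=1).fit_resample(X, y)  # balance the classes
gp = SymbolicClassifier(population_size=500, generations=10,
                        parsimony_coefficient=0.01, random_state=1)
gp.fit(X_bal, y_bal)

print("accuracy on the original imbalanced data:", accuracy_score(y, gp.predict(X)))
print("evolved expression:", gp._program)                # the symbolic formula found by GP
```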

16 pages, 1226 KiB  
Article
Advanced Digital System for International Collaboration on Biosample-Oriented Research: A Multicriteria Query Tool for Real-Time Biosample and Patient Cohort Searches
by Alexandros Fridas, Anna Bourouliti, Loukia Touramanidou, Desislava Ivanova, Kostantinos Votis and Panagiotis Katsaounis
Computers 2025, 14(5), 157; https://doi.org/10.3390/computers14050157 - 23 Apr 2025
Viewed by 465
Abstract
The advancement of biomedical research depends on efficient data sharing, integration, and annotation to ensure reproducibility, accessibility, and cross-disciplinary collaboration. International collaborative research is crucial for advancing biomedical science and innovation but often faces significant barriers, such as data sharing limitations, inefficient sample management, and scalability challenges. Existing infrastructures for biosample and data repositories face challenges limiting large-scale research efforts. This study presents a novel platform designed to address these issues, enabling researchers to conduct high-quality research more efficiently and at reduced costs. The platform employs a modular, distributed architecture that ensures high availability, redundancy, and interoperability among diverse stakeholders, and integrates advanced features, including secure access management, comprehensive query functionalities, real-time availability reporting, and robust data mining capabilities. In addition, this platform supports dynamic, multi-criteria searches tailored to disease-specific patient profiles and biosample-related data across pre-analytical, post-analytical, and cryo-storage processes. By evaluating the platform’s modular architecture and pilot testing outcomes, this study demonstrates its potential to enhance interdisciplinary collaboration, streamline research workflows, and foster transformative advancements in biomedical research. A key innovation is the real-time dynamic e-consent (DRT e-consent) system, which allows donors to update their consent status in real time, ensuring compliance with ethical and regulatory frameworks such as GDPR and HIPAA. The system also supports multi-modal data integration, including genomic sequences, electronic health records (EHRs), and imaging data, enabling researchers to perform complex queries and generate comprehensive insights.
(This article belongs to the Special Issue Future Systems Based on Healthcare 5.0 for Pandemic Preparedness 2024)
