Search Results (729)

Search Parameters:
Keywords = radio management

24 pages, 5968 KiB  
Article
Life Cycle Assessment of a Digital Tool for Reducing Environmental Burdens in the European Milk Supply Chain
by Yuan Zhang, Junzhang Wu, Haida Wasim, Doris Yicun Wu, Filippo Zuliani and Alessandro Manzardo
Appl. Sci. 2025, 15(15), 8506; https://doi.org/10.3390/app15158506 - 31 Jul 2025
Abstract
Food loss and waste from the European Union’s dairy supply chain, particularly in the management of fresh milk, imposes significant environmental burdens. This study demonstrates that implementing Radio Frequency Identification (RFID)-enabled digital decision-support tools can substantially reduce these impacts across the region. A cradle-to-grave life cycle assessment (LCA) was used to quantify both the additional environmental burdens from RFID (tag production, usage, and disposal) and the avoided burdens due to reduced milk losses in the farm, processing, and distribution stages. Within the EU’s fresh milk supply chain, the implementation of digital tools could result in annual net reductions of up to 80,000 tonnes of CO2-equivalent greenhouse gas emissions, 81,083 tonnes of PM2.5-equivalent particulate matter, 84,326 tonnes of land use–related carbon deficit, and 80,000 cubic meters of freshwater-equivalent consumption. Spatial analysis indicates that regions with historically high spoilage rates, particularly in Southern and Eastern Europe, see the greatest benefits from RFID-enabled digital decision-support tools. These environmental savings are most pronounced during the peak months of milk production. Overall, the study demonstrates that despite the environmental footprint of RFID systems, their integration into the EU’s dairy supply chain enhances transparency, reduces waste, and improves resource efficiency, supporting their strategic value.
(This article belongs to the Special Issue Artificial Intelligence and Numerical Simulation in Food Engineering)
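
The accounting behind these net figures is simple: for each impact category, the burden avoided through reduced milk losses minus the burden added by the RFID system itself. A minimal sketch of that arithmetic; the category names and values below are hypothetical placeholders, not the study's inventory data.

```python
# Net environmental benefit per impact category:
#   net = burden avoided via reduced milk losses - burden added by RFID.
# All figures are hypothetical placeholders, not the study's inventory data.

added = {    # burdens introduced by the RFID system (tag production, use, disposal)
    "GWP (t CO2-eq)": 5_000.0,
    "PM (t PM2.5-eq)": 900.0,
}
avoided = {  # burdens avoided through reduced loss and waste of fresh milk
    "GWP (t CO2-eq)": 85_000.0,
    "PM (t PM2.5-eq)": 82_000.0,
}

for category in avoided:
    net = avoided[category] - added[category]
    print(f"{category}: net annual reduction of {net:,.0f}")
```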

19 pages, 1072 KiB  
Article
Efficient and Reliable Identification of Probabilistic Cloning Attacks in Large-Scale RFID Systems
by Chu Chu, Rui Wang, Nanbing Deng and Gang Li
Micromachines 2025, 16(8), 894; https://doi.org/10.3390/mi16080894 - 31 Jul 2025
Abstract
Radio Frequency Identification (RFID) technology is widely applied in various scenarios, including logistics tracking, supply chain management, and target monitoring. In these contexts, the malicious cloning of legitimate tag information can lead to sensitive data leakage and disrupt the normal acquisition of tag information by readers, thereby threatening personal privacy and corporate security and incurring significant economic losses. Although some efforts have been made to detect cloning attacks, the presence of missing tags in RFID systems can obscure cloned ones, resulting in a significant reduction in identification efficiency and accuracy. To address these problems, we propose the block-based cloned tag identification (BCTI) protocol for identifying cloning attacks in the presence of missing tags. First, we introduce a block indicator to sort all tags systematically and design a block mechanism that enables tags to respond repeatedly within a block with minimal time overhead. Then, we design a superposition strategy to further reduce the number of verifications, thereby decreasing the execution overhead. Through an in-depth analysis of potential tag response patterns, we develop a precise method to identify cloning attacks and mitigate interference from missing tags in probabilistic cloning attack scenarios. Moreover, we perform parameter optimization of the BCTI protocol and validate its performance across diverse operational scenarios. Extensive simulation results demonstrate that the BCTI protocol meets the required identification reliability threshold and achieves an average improvement of 24.01% in identification efficiency compared to state-of-the-art solutions.
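
The core intuition behind collision-based clone detection, which protocols like BCTI refine, can be sketched in a few lines: tags hash their IDs to reply slots, so a collision in a slot where exactly one reply is expected hints at a clone, while silence hints at a missing tag. The toy below is a generic single-round illustration of that idea, not the BCTI protocol itself, which uses repeated block responses to separate the two cases reliably.

```python
import hashlib
from collections import Counter

def slot_of(tag_id: str, seed: int, frame_size: int) -> int:
    # Tags hash their ID with a shared seed to pick a reply slot.
    h = hashlib.sha256(f"{tag_id}-{seed}".encode()).hexdigest()
    return int(h, 16) % frame_size

def classify(known_ids, present_tags, seed=7, frame_size=256):
    """known_ids: inventory the reader expects; present_tags: physical
    tags in the field (a cloned ID appears more than once)."""
    expected = {tid: slot_of(tid, seed, frame_size) for tid in known_ids}
    expected_count = Counter(expected.values())
    observed = Counter(slot_of(tid, seed, frame_size) for tid in present_tags)
    clones, missing = set(), set()
    for tid, slot in expected.items():
        if expected_count[slot] != 1:
            continue                  # skip slots shared by known IDs this round
        if observed[slot] > 1:
            clones.add(tid)           # collision where a singleton was expected
        elif observed[slot] == 0:
            missing.add(tid)          # silence where a reply was expected
    return clones, missing

ids = [f"T{i}" for i in range(10)]
field = [t for t in ids if t != "T9"] + ["T3"]   # T9 absent, T3 cloned
print(classify(ids, field))
```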

22 pages, 6452 KiB  
Article
A Blockchain and IoT-Enabled Framework for Ethical and Secure Coffee Supply Chains
by John Byrd, Kritagya Upadhyay, Samir Poudel, Himanshu Sharma and Yi Gu
Future Internet 2025, 17(8), 334; https://doi.org/10.3390/fi17080334 - 27 Jul 2025
Abstract
The global coffee supply chain is a complex multi-stakeholder ecosystem plagued by fragmented records, unverifiable origin claims, and limited real-time visibility. These limitations pose risks to ethical sourcing, product quality, and consumer trust. To address these issues, this paper proposes a blockchain and IoT-enabled framework for secure and transparent coffee supply chain management. The system integrates simulated IoT sensor data such as Radio-Frequency Identification (RFID) identity tags, Global Positioning System (GPS) logs, weight measurements, environmental readings, and mobile validations with Ethereum smart contracts to establish traceability and automate supply chain logic. A Solidity-based Ethereum smart contract is developed and deployed on the Sepolia testnet to register users, log batches, and handle ownership transfers. The Internet of Things (IoT) data stream is simulated using structured datasets to mimic real-world device behavior, ensuring that the system is tested under realistic conditions. Our performance evaluation on 1000 transactions shows that the model incurs low transaction costs and demonstrates predictable smart-contract efficiency under decentralized conditions. Over 95% of the 1000 simulated transactions incurred a gas fee of less than ETH 0.001. The proposed architecture is also scalable and modular, providing a foundation for future deployment with live IoT integrations and off-chain data storage. Overall, the results highlight the system’s ability to improve transparency and auditability, automate enforcement, and enhance consumer confidence in the origin and handling of coffee products.
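
The gas-fee claim reduces to per-transaction arithmetic: fee = gas used × gas price, converted from wei to ETH. A minimal sketch, with made-up receipt tuples standing in for the simulated transactions.

```python
# fee = gas_used * gas_price, converted from wei to ETH.
# tx_receipts is a hypothetical stand-in for receipts collected from the
# Sepolia testnet (e.g., via web3.py); the values below are made up.
WEI_PER_ETH = 10**18

tx_receipts = [(52_341, 12 * 10**9), (118_902, 15 * 10**9), (47_220, 9 * 10**9)]

fees_eth = [gas * price / WEI_PER_ETH for gas, price in tx_receipts]
share_cheap = sum(fee < 0.001 for fee in fees_eth) / len(fees_eth)
print(f"max fee: {max(fees_eth):.6f} ETH")
print(f"share under 0.001 ETH: {share_cheap:.0%}")
```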

19 pages, 1887 KiB  
Review
Comparative Analysis of Beamforming Techniques and Beam Management in 5G Communication Systems
by Cristina Maria Andras, Gordana Barb and Marius Otesteanu
Sensors 2025, 25(15), 4619; https://doi.org/10.3390/s25154619 - 25 Jul 2025
Abstract
The advance of 5G technology marks a significant evolution in wireless communications, characterized by ultra-high data rates, low latency, and massive connectivity across varied areas. A fundamental enabler of these capabilities is beamforming, an advanced signal processing technique that focuses radio energy toward a specific user equipment (UE), thereby enhancing signal quality, which is crucial for maximizing spectral efficiency. The work presents a classification of beamforming techniques, categorized according to their implementation within 5G New Radio (NR) architectures. Furthermore, the paper investigates beam management (BM) procedures, which are essential Layer 1 and Layer 2 mechanisms responsible for the dynamic configuration, monitoring, and maintenance of optimal beam pair links between gNodeBs and UEs. The article examines spectrograms of Synchronization Signal Blocks (SSBs) generated under various deployment scenarios, illustrating how parameters such as subcarrier spacing (SCS), frequency band, and the number of SSBs influence spectral occupancy and synchronization performance. These insights provide a technical foundation for optimizing initial access and beam tracking in high-frequency 5G deployments, particularly within Frequency Range 2 (FR2). The spectrogram analysis across deployment scenarios also demonstrates the versatility of 5G’s time-frequency structure, showing how different configurations affect the temporal and spectral occupancy of synchronization signals, which directly affects initial access, cell identification, and energy efficiency.
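
How SCS drives SSB spectral occupancy follows directly from the NR numerology: an SSB spans 240 subcarriers (20 resource blocks) over 4 OFDM symbols, so its bandwidth is 240 × SCS while its duration shrinks as the numerology index grows. A back-of-the-envelope sketch, with the cyclic prefix ignored for simplicity.

```python
# Approximate time-frequency footprint of an NR SSB: 240 subcarriers
# (20 resource blocks) over 4 OFDM symbols. Bandwidth scales with the
# subcarrier spacing; symbol duration shrinks with numerology index mu.
# Cyclic prefix is ignored for simplicity.

SSB_SUBCARRIERS, SSB_SYMBOLS = 240, 4
SSB_NUMEROLOGY = {15: 0, 30: 1, 120: 3, 240: 4}   # valid SSB SCS values (kHz)

for scs_khz, mu in SSB_NUMEROLOGY.items():
    bandwidth_mhz = SSB_SUBCARRIERS * scs_khz / 1000
    symbol_us = (1000 / 2**mu) / 14               # 14 symbols per slot of 1 ms / 2^mu
    print(f"SCS {scs_khz:>3} kHz: {bandwidth_mhz:6.2f} MHz "
          f"x {SSB_SYMBOLS * symbol_us:6.2f} us")
```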

18 pages, 4263 KiB  
Article
Clinical Characteristics, Diagnosis, and Management of Primary Malignant Lung Tumors in Children: A Single-Center Analysis
by Mihail Basa, Nemanja Mitrovic, Dragana Aleksic, Gordana Samardzija, Mila Stajevic, Ivan Dizdarevic, Marija Dencic Fekete, Tijana Grba and Aleksandar Sovtic
Biomedicines 2025, 13(8), 1824; https://doi.org/10.3390/biomedicines13081824 - 25 Jul 2025
Abstract
Background/Objectives: Primary malignant lung tumors in children are rare and diagnostically challenging. This study presents a single-center experience in the diagnosis and treatment of these tumors, emphasizing the role of histopathological and genetic profiling in informing individualized therapeutic strategies. Methods: We retrospectively reviewed records of seven pediatric patients (ages 2–18) treated from 2015 to 2025. Diagnostics included laboratory tests, chest CT, bronchoscopy, and histopathological/immunohistochemical analysis. Treatment primarily involved surgical resection, complemented by chemo-, radio-, or targeted therapies when indicated. Results: Inflammatory myofibroblastic tumor (IMT) represented the most commonly diagnosed entity (3/7 cases). The tumors presented with nonspecific symptoms, most frequently dry cough. Tumor type distribution was age-dependent, with aggressive forms such as pleuropulmonary blastoma predominantly affecting younger children, whereas IMT and carcinoid tumors were more common in older patients. Surgical resection remained the mainstay of treatment in the majority of cases. Bronchoscopy served as a valuable adjunct in the initial management of tumors exhibiting intraluminal growth, allowing for direct visualization, tissue sampling, and partial debulking to alleviate airway obstruction. In patients with an initially unresectable IMT harboring specific gene fusion rearrangement (e.g., TFG::ROS1), neoadjuvant targeted therapy with crizotinib enabled adequate tumor shrinkage to allow for subsequent surgical resection. Two patients in the study cohort died as a result of disease progression. Conclusions: A multidisciplinary diagnostic approach—integrating radiologic, bronchoscopic, histopathological, and genetic evaluations—ensures high diagnostic accuracy. While conventional treatments remain curative in many cases, targeted therapies directed at specific molecular alterations may offer essential therapeutic options for selected patients.
(This article belongs to the Section Cancer Biology and Oncology)

19 pages, 43909 KiB  
Article
DualBranch-AMR: A Semi-Supervised AMR Method Based on Dual-Student Consistency Regularization with Dynamic Stability Evaluation
by Jiankun Ma, Zhenxi Zhang, Linrun Zhang, Yu Li, Haoyue Tan, Xiaoran Shi and Feng Zhou
Sensors 2025, 25(15), 4553; https://doi.org/10.3390/s25154553 - 23 Jul 2025
Abstract
Modulation recognition, as one of the key technologies in the field of wireless communications, holds significant importance in applications such as spectrum resource management, interference suppression, and cognitive radio. While deep learning has substantially improved the performance of Automatic Modulation Recognition (AMR), it heavily relies on large amounts of labeled data. Given the high annotation costs and privacy concerns, researching semi-supervised AMR methods that leverage readily available unlabeled data for training is of great significance. This study constructs a semi-supervised AMR method based on a dual-student framework. Specifically, we first adopt a dual-branch co-training architecture to fully exploit unlabeled data and effectively learn deep feature representations. Then, we develop a dynamic stability evaluation module using strong and weak augmentation strategies to improve the accuracy of generated pseudo-labels. Finally, based on the dual-student semi-supervised framework and pseudo-label stability evaluation, we propose a stability-guided consistency regularization constraint method and conduct semi-supervised AMR model training. The experimental results demonstrate that the proposed DualBranch-AMR method significantly outperforms traditional supervised baseline approaches on benchmark datasets. With only 5% labeled data, it achieves a recognition accuracy of 55.84%, reaching over 90% of the performance of fully supervised training. This validates the superiority of the proposed method under semi-supervised conditions.
(This article belongs to the Section Communications)
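
A generic skeleton of the dual-student idea described above: two networks supervised on labeled data, a stability-filtered pseudo-label term, and a mutual consistency penalty between weakly and strongly augmented views. This is a hedged sketch of the general technique under assumed augmentations (additive noise), not the authors' DualBranch-AMR code.

```python
import torch
import torch.nn.functional as F

def weak_aug(x):   return x + 0.01 * torch.randn_like(x)   # mild perturbation
def strong_aug(x): return x + 0.10 * torch.randn_like(x)   # heavier perturbation

def semi_supervised_step(s1, s2, xl, yl, xu, tau=0.95):
    """One loss computation for two student models s1 and s2.
    xl, yl: labeled batch; xu: unlabeled batch; tau: stability threshold."""
    # Supervised term on the labeled batch, for both students.
    sup = F.cross_entropy(s1(xl), yl) + F.cross_entropy(s2(xl), yl)

    # Predictions on weak/strong views of the unlabeled batch.
    p1_weak = F.softmax(s1(weak_aug(xu)), dim=1)
    p2_strong_logits = s2(strong_aug(xu))

    # Keep pseudo-labels only where the weak-view prediction is stable.
    conf, pseudo = p1_weak.max(dim=1)
    stable = conf > tau
    pseudo_term = torch.tensor(0.0)
    if stable.any():
        pseudo_term = F.cross_entropy(p2_strong_logits[stable], pseudo[stable])

    # Mutual consistency between the two students' output distributions.
    consistency = F.mse_loss(p1_weak, F.softmax(p2_strong_logits, dim=1))
    return sup + pseudo_term + consistency
```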

14 pages, 4648 KiB  
Article
Cyber-Physical System and 3D Visualization for a SCADA-Based Drinking Water Supply: A Case Study in the Lerma Basin, Mexico City
by Gabriel Sepúlveda-Cervantes, Eduardo Vega-Alvarado, Edgar Alfredo Portilla-Flores and Eduardo Vivanco-Rodríguez
Future Internet 2025, 17(7), 306; https://doi.org/10.3390/fi17070306 - 17 Jul 2025
Abstract
Cyber-physical systems such as Supervisory Control and Data Acquisition (SCADA) have been applied in industrial automation and infrastructure management for decades. They are hybrid tools for administration, monitoring, and continuous control of real physical systems through their computational representation. SCADA systems have evolved along with computing technology, from their beginnings with low-performance computers, monochrome monitors, and communication networks with a range of a few hundred meters, to high-performance systems with advanced 3D graphics and wired and wireless computer networks. This article presents a methodology for the design of a SCADA system with 3D visualization for drinking water supply, and its implementation in the Lerma Basin System of Mexico City as a case study. Monitoring of water consumption from the wells is presented, along with pressure levels throughout the system. The 3D visualization is generated from GIS information, and communication is carried out over a hybrid transmission system of radio frequency, satellite, and telephone network. The pumps that extract water from each well are teleoperated and monitored in real time. The developed system can be scaled into a simulator of the water behavior of the Lerma Basin System for contingency planning.

17 pages, 2769 KiB  
Article
Service-Based Architecture for 6G RAN: A Cloud Native Platform That Provides Everything as a Service
by Guangyi Liu, Na Li, Chunjing Yuan, Siqi Chen and Xuan Liu
Sensors 2025, 25(14), 4428; https://doi.org/10.3390/s25144428 - 16 Jul 2025
Abstract
The 5G network’s commercialization has revealed challenges in providing customized and personalized deployment and services for diverse vertical industrial use cases, leading to high cost, low resource and management efficiency, and long time to market. Although the 5G core network (CN) has adopted a service-based architecture (SBA) to enhance agility and elasticity, the radio access network (RAN) retains a traditional, integrated, and rigid architecture that makes its functions and capabilities difficult to customize and personalize. Open RAN attempted to introduce cloudification, openness, and intelligence to the RAN but faced limitations due to 5G RAN specifications. To address this, this paper analyzes the experience and insights from 5G SBA and conducts a systematic study on the service-based RAN, covering service definition, interface protocol stacks, impact analysis on the air interface, radio capability exposure, and joint optimization with the CN. Performance verification shows that the service-based user-plane design significantly improves resource utilization and scalability.
(This article belongs to the Special Issue Future Horizons in Networking: Exploring the Potential of 6G)

32 pages, 1277 KiB  
Article
Distributed Prediction-Enhanced Beamforming Using LR/SVR Fusion and MUSIC Refinement in 5G O-RAN Systems
by Mustafa Mayyahi, Jordi Mongay Batalla, Jerzy Żurek and Piotr Krawiec
Appl. Sci. 2025, 15(13), 7428; https://doi.org/10.3390/app15137428 - 2 Jul 2025
Abstract
Low-latency and robust beamforming are vital for sustaining signal quality and spectral efficiency in emerging high-mobility 5G and future 6G wireless networks. Conventional beam management approaches, which rely on periodic Channel State Information feedback and static codebooks, as outlined in 3GPP standards, are insufficient in rapidly varying propagation environments. In this work, we propose a Dominance-Enforced Adaptive Clustered Sliding Window Regression (DE-ACSW-R) framework for predictive beamforming in O-RAN Split 7-2x architectures. DE-ACSW-R leverages a sliding window of recent angle of arrival (AoA) estimates, applying in-window change-point detection to segment user trajectories and performing both Linear Regression (LR) and curvature-adaptive Support Vector Regression (SVR) for short-term and non-linear prediction. A confidence-weighted fusion mechanism adaptively blends LR and SVR outputs, incorporating robust outlier detection and a dominance-enforced selection regime to address strong disagreements. The Open Radio Unit (O-RU) autonomously triggers localised MUSIC scans when prediction confidence degrades, minimising unnecessary full-spectrum searches and reducing delay. Simulation results demonstrate that the proposed DE-ACSW-R approach significantly enhances AoA tracking accuracy, beamforming gain, and adaptability under realistic high-mobility conditions, surpassing conventional LR/SVR baselines. This AI-native modular pipeline aligns with O-RAN architectural principles, enabling scalable and real-time beam management for next-generation wireless deployments.
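
The LR/SVR fusion step can be illustrated compactly: fit both regressors on the sliding window of AoA samples, then weight each one-step prediction by the inverse of its in-window residual error. A sketch with scikit-learn; the hyperparameters and the confidence weighting below are illustrative assumptions, not the paper's DE-ACSW-R rules.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

def predict_aoa(window_t, window_aoa, t_next):
    """Fuse linear and SVR one-step AoA predictions, weighting each model
    by the inverse of its in-window residual error (a confidence proxy)."""
    T = np.asarray(window_t, dtype=float).reshape(-1, 1)
    y = np.asarray(window_aoa, dtype=float)

    preds, weights = [], []
    for model in (LinearRegression(), SVR(kernel="rbf", C=10.0, epsilon=0.1)):
        model.fit(T, y)
        resid = np.mean((model.predict(T) - y) ** 2)    # in-window fit error
        preds.append(model.predict([[t_next]])[0])
        weights.append(1.0 / (resid + 1e-9))
    return float(np.average(preds, weights=weights))

# Ten AoA samples (degrees) along a gently curving trajectory.
t = np.arange(10)
aoa = 30 + 0.8 * t + 0.05 * t**2
print(predict_aoa(t, aoa, t_next=10))
```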

21 pages, 2236 KiB  
Article
Behavioral Responses of Migratory Fish to Environmental Cues: Evidence from the Heishui River
by Jiawei Xu, Yilin Jiao, Shan-e-hyder Soomro, Xiaozhang Hu, Dongqing Li, Jianping Wang, Bingjun Liu, Chenyu Lin, Senfan Ke, Yujiao Wu and Xiaotao Shi
Fishes 2025, 10(7), 310; https://doi.org/10.3390/fishes10070310 - 30 Jun 2025
Abstract
Hydropower infrastructure has profoundly altered riverine connectivity, posing challenges to the migratory behavior of aquatic species. This study examined the post-passage migration efficiency of Schizothorax wangchiachii in a regulated river system, focusing on upstream and downstream reaches of the Songxin Hydropower Station on the Heishui River, a tributary of the Jinsha River. We used radio-frequency identification (RFID) tagging to track individuals after fishway passage and coupled this with environmental monitoring data. A Cox proportional hazards model was applied to identify key abiotic drivers of migration success and to develop a predictive framework. The upstream success rate was notably low (15.6%), with a mean passage time of 438 h, while downstream success reached 81.1%, with an average of 142 h. Fish exhibited distinct diel migration patterns; upstream movements were largely nocturnal, whereas downstream migration mainly occurred during daylight. Water temperature (HR = 0.535, p = 0.028), discharge (HR = 0.801, p = 0.050), water level (HR = 0.922, p = 0.040), and diel timing (HR = 0.445, p = 0.088) emerged as significant factors shaping the upstream movement. Our findings highlight that fishways alone may not ensure functional connectivity restoration. Instead, coordinated habitat interventions in upstream tributaries, alongside improved passage infrastructure, are crucial. A combined telemetry and modeling approach offers valuable insights for river management in fragmented systems.
(This article belongs to the Special Issue Behavioral Ecology of Fishes)
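
The modeling step pairs naturally with the lifelines library: a Cox proportional hazards fit returns the hazard ratios (exp(coef)) of the kind quoted in the abstract. A sketch on toy per-fish records; the column names and values are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy per-fish records: time to successful upstream passage (hours),
# event indicator (1 = passed, 0 = censored), and abiotic covariates.
# Column names and values are hypothetical.
df = pd.DataFrame({
    "hours":     [120, 438, 300, 96, 500, 210, 155, 420],
    "passed":    [1, 0, 1, 1, 0, 1, 1, 0],
    "temp_c":    [12.1, 9.8, 11.5, 13.0, 9.2, 12.6, 11.9, 9.5],
    "discharge": [45, 80, 60, 40, 95, 50, 55, 90],
    "night":     [1, 0, 0, 1, 0, 1, 1, 1],
})

# A small penalizer keeps the toy fit stable on so few rows.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="hours", event_col="passed")
cph.print_summary()   # hazard ratios = exp(coef), as quoted in the abstract
```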

24 pages, 649 KiB  
Systematic Review
Algorithms for Load Balancing in Next-Generation Mobile Networks: A Systematic Literature Review
by Juan Ochoa-Aldeán, Carlos Silva-Cárdenas, Renato Torres, Jorge Ivan Gonzalez and Sergio Fortes
Future Internet 2025, 17(7), 290; https://doi.org/10.3390/fi17070290 - 28 Jun 2025
Abstract
Background: Machine learning methods are increasingly being used in mobile network optimization systems, especially in next-generation mobile networks. The need for enhanced radio resource allocation schemes, improved user mobility, and increased throughput, driven by a rising demand for data, has necessitated the development of diverse algorithms that optimize output values based on varied input parameters. In this context, we identify the main topics related to cellular networks and machine learning algorithms in order to pinpoint areas where the optimization of parameters is crucial. Furthermore, the wide range of available algorithms often leads to confusion and disorder during classification. It is also worth noting that next-generation networks are expected to require reduced latency, especially for sensitive applications such as Industry 4.0. Research Question: An analysis of the existing literature on mobile network load balancing methods was conducted to identify systems that operate using semi-automatic, automatic, and hybrid algorithms. Our research question is as follows: What are the automatic, semi-automatic, and hybrid load balancing algorithms that can be applied to next-generation mobile networks? Contribution: This paper presents a comprehensive analysis and classification of the algorithms used in this area of study. To identify those most suitable for load balancing optimization in next-generation mobile networks, we organize the classification into three categories (automatic, semi-automatic, and hybrid), giving a clear and concise picture of both theoretical and field studies that relate these three types of algorithms to next-generation networks. Figures and tables illustrate the number of algorithms classified by type. In addition, the most important articles related to this topic from five scientific databases are summarized. Methodology: We employed the PRISMA method to conduct a systematic literature review of the aforementioned study areas. Findings: The results show that, despite the scarce literature on the subject, the use of load balancing algorithms significantly influences the deployment and performance of next-generation mobile networks. This study highlights the critical role that algorithm selection should play in 5G network optimization, in particular to address latency reduction, dynamic resource allocation, and scalability in dense user environments, key challenges for applications such as industrial automation and real-time communications. Our classification framework provides a basis for operators to evaluate algorithmic trade-offs in scenarios such as network fragmentation or edge computing. To fill existing gaps, we propose further research on AI-driven hybrid models that integrate real-time data analytics with predictive algorithms, enabling proactive load management in ultra-reliable 5G/6G architectures, and on the effects of the technologies used for load balancing optimization.
(This article belongs to the Section Smart System Infrastructure and Applications)

21 pages, 2973 KiB  
Article
Machine Learning Approach for Ground-Level Estimation of Electromagnetic Radiation in the Near Field of 5G Base Stations
by Oluwole John Famoriji and Thokozani Shongwe
Appl. Sci. 2025, 15(13), 7302; https://doi.org/10.3390/app15137302 - 28 Jun 2025
Abstract
Electromagnetic radiation measurement and management emerge as crucial factors in the economical deployment of fifth-generation (5G) infrastructure, as the new 5G network emerges as a network of services. By installing many strategically located base stations operating in the millimeter-wave range, 5G services can meet heavy bandwidth demands. To estimate ground-level electromagnetic radiation near 5G base stations, we propose a machine-learning-based approach. Trained on data obtained from numerous 5G base stations, the model can estimate the electric field strength at arbitrary points while a base station serves varying numbers of 5G terminals running in different service modes. The model takes several inputs, including the antenna’s transmit power, antenna gain, terminal service modes, the number of 5G terminals, the distance between the terminals and the base station, and environmental complexity. Experimental data show the estimation method to be both feasible and effective, with a mean absolute percentage error of about 5.89%. The approach is also less expensive than on-site measurements. The estimates can reduce test costs and offer useful guidelines for site selection, easing electromagnetic radiation management and radio wave coverage optimization for 5G base stations.
(This article belongs to the Special Issue Recent Advances in Antennas and Propagation)
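
The pipeline the abstract describes is a standard supervised regression evaluated with MAPE. A hedged sketch on synthetic data; the feature ranges and the gradient-boosting choice are assumptions for illustration, not the authors' model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n = 500
# Hypothetical features mirroring the abstract's inputs.
X = np.column_stack([
    rng.uniform(30, 43, n),    # transmit power (dBm)
    rng.uniform(10, 25, n),    # antenna gain (dBi)
    rng.integers(1, 5, n),     # terminal service mode
    rng.integers(1, 64, n),    # number of 5G terminals
    rng.uniform(5, 300, n),    # distance to the base station (m)
])
# Synthetic field strength: grows with power and gain, decays with distance.
y = 10 ** ((X[:, 0] + X[:, 1]) / 20) / X[:, 4] * (1 + 0.05 * X[:, 3])

model = GradientBoostingRegressor().fit(X[:400], y[:400])
mape = mean_absolute_percentage_error(y[400:], model.predict(X[400:]))
print(f"MAPE: {mape:.2%}")   # the paper reports about 5.89% on real data
```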

35 pages, 2102 KiB  
Article
Enhancing Spectrum Utilization in Cognitive Radio Networks Using Reinforcement Learning with Snake Optimizer: A Meta-Heuristic Approach
by Haider Farhi, Abderraouf Messai and Tarek Berghout
Electronics 2025, 14(13), 2525; https://doi.org/10.3390/electronics14132525 - 21 Jun 2025
Abstract
The rapid development of sixth-generation mobile communication systems has brought about significant advancements in both Quality of Service (QoS) and Quality of Experience (QoE) for users, largely due to extremely high data rates and a diverse range of service offerings. However, these advancements have also introduced challenges, especially concerning the growing demand for wireless spectrum and the limited availability of resources. Various research efforts have attempted to tackle this issue, such as the use of Cognitive Radio Networks (CRNs), which allow opportunistic spectrum access and intelligent resource management. This work demonstrates a new method for optimizing resource allocation in CRNs based on the Snake Optimizer (SO), an effective meta-heuristic algorithm that simulates the mating behavior of snakes, combined with reinforcement learning (RL). SO is tested over three different scenarios with varying numbers of secondary users (SUs), primary users (PUs), and available frequency bands. The obtained results reveal that the proposed approach largely satisfies the aforementioned requirements and ensures high spectrum utilization efficiency and low collision rates, which eventually lead to the maximum possible spectral capacity. The study also demonstrates that SO is versatile and resilient, indicating its capability to serve as an effective method for augmenting resource management in next-generation wireless communication systems.
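
The flavor of meta-heuristic channel assignment can be shown with a generic population-based loop: score an SU-to-channel assignment by rewarding distinct free channels and penalizing collisions, then keep and mutate the fittest candidates. This generic loop stands in for the SO/RL scheme; the actual Snake Optimizer update equations are not reproduced here.

```python
import random

N_SU, N_CH = 8, 4        # secondary users, frequency bands
PU_BUSY = {0}            # channels currently occupied by primary users

def fitness(assign):
    """Reward distinct free channels in use; penalize PU and SU collisions."""
    free = [c for c in assign if c not in PU_BUSY]
    collisions = (len(free) - len(set(free))) + sum(c in PU_BUSY for c in assign)
    return len(set(free)) - 2 * collisions

def mutate(assign):
    child = assign[:]
    child[random.randrange(N_SU)] = random.randrange(N_CH)
    return child

pop = [[random.randrange(N_CH) for _ in range(N_SU)] for _ in range(30)]
for _ in range(200):                       # keep elites, mutate to explore
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = max(pop, key=fitness)
print("assignment:", best, "fitness:", fitness(best))
```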

12 pages, 732 KiB  
Systematic Review
Gut-Microbiome Signatures Predicting Response to Neoadjuvant Chemoradiotherapy in Locally Advanced Rectal Cancer: A Systematic Review
by Ielmina Domilescu, Bogdan Miutescu, Florin George Horhat, Alina Popescu, Camelia Nica, Ana Maria Ghiuchici, Eyad Gadour, Ioan Sîrbu and Delia Hutanu
Metabolites 2025, 15(6), 412; https://doi.org/10.3390/metabo15060412 - 18 Jun 2025
Abstract
Background and Objectives: Rectal cancer management increasingly relies on watch-and-wait strategies after neoadjuvant chemoradiotherapy (nCRT). Accurate, non-invasive prediction of pathological complete response (pCR) remains elusive. Emerging evidence suggests that gut-microbiome composition modulates radio-chemosensitivity. We systematically reviewed primary studies that correlated baseline or on-treatment gut-microbiome features with nCRT response in locally advanced rectal cancer (LARC). Methods: MEDLINE, Embase, and PubMed were searched from inception to 30 April 2025. Eligibility required (i) prospective or retrospective human studies of LARC, (ii) faecal or mucosal microbiome profiling by 16S, metagenomics, or metatranscriptomics, and (iii) response assessment using tumour-regression grade or pCR. Narrative synthesis and random-effects proportion meta-analysis were performed where data were homogeneous. Results: Twelve studies (n = 1354 unique patients, median sample = 73, range 22–735) met inclusion criteria. Four independent machine-learning models achieved an area under the receiver operating characteristic curve (AUROC) ≥ 0.85 for pCR prediction. Consistently enriched taxa in responders included Lachnospiraceae bacterium, Blautia wexlerae, Roseburia spp., and Intestinimonas butyriciproducens. Non-responders showed over-representation of Fusobacterium nucleatum, Bacteroides fragilis, and Prevotella spp. Two studies linked butyrate-producing modules to radiosensitivity, whereas nucleotide-biosynthesis pathways conferred resistance. The pooled pCR rate in patients with a “butyrate-rich” baseline profile was 44% (95% CI 35–54) versus 21% (95% CI 15–29) in controls (I² = 18%). Conclusions: Despite heterogeneity, convergent functional and taxonomic signals underpin a microbiome-based radiosensitivity axis in LARC. Multi-centre validation cohorts and intervention trials manipulating these taxa, such as with prebiotics or live biotherapeutics, are warranted before clinical deployment.
(This article belongs to the Special Issue Advances in Gut Microbiome Metabolomics)
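
The pooled 44% versus 21% figures come from a random-effects proportion meta-analysis; the standard recipe pools logit-transformed proportions with a DerSimonian-Laird between-study variance. A self-contained sketch on hypothetical cohort counts:

```python
import math

def pooled_proportion(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions (logit scale)."""
    y = [math.log((e + 0.5) / (n - e + 0.5)) for e, n in zip(events, totals)]
    v = [1 / (e + 0.5) + 1 / (n - e + 0.5) for e, n in zip(events, totals)]
    w = [1 / vi for vi in v]                       # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    k = len(y)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)             # between-study variance
    wr = [1 / (vi + tau2) for vi in v]             # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se = math.sqrt(1 / sum(wr))
    inv_logit = lambda x: 1 / (1 + math.exp(-x))
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
    return inv_logit(mu), (inv_logit(mu - 1.96 * se), inv_logit(mu + 1.96 * se)), i2

# Hypothetical pCR counts from four "butyrate-rich" cohorts.
print(pooled_proportion([12, 20, 9, 31], [28, 45, 22, 70]))
```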

59 pages, 4517 KiB  
Review
Artificial Intelligence Empowering Dynamic Spectrum Access in Advanced Wireless Communications: A Comprehensive Overview
by Abiodun Gbenga-Ilori, Agbotiname Lucky Imoize, Kinzah Noor and Paul Oluwadara Adebolu-Ololade
AI 2025, 6(6), 126; https://doi.org/10.3390/ai6060126 - 13 Jun 2025
Abstract
This review paper examines the integration of artificial intelligence (AI) in wireless communication, focusing on cognitive radio (CR), spectrum sensing, and dynamic spectrum access (DSA). As the demand for spectrum continues to rise with the expansion of mobile users and connected devices, cognitive radio networks (CRNs), leveraging AI-driven spectrum sensing and dynamic access, provide a promising solution to improve spectrum utilization. The paper reviews various deep learning (DL)-based spectrum-sensing methods, highlighting their advantages and challenges. It also explores the use of multi-agent reinforcement learning (MARL) for distributed DSA networks, where agents autonomously optimize power allocation (PA) to minimize interference and enhance quality of service. Additionally, the paper discusses the role of machine learning (ML) in predicting spectrum requirements, which is crucial for efficient frequency management in fifth-generation (5G) networks and beyond. Case studies show how ML can help networks self-optimize, reducing energy consumption while improving performance. The review also introduces the potential of generative AI (GenAI) for demand planning and network optimization, enhancing spectrum efficiency and energy conservation in wireless networks (WNs). Finally, the paper highlights future research directions, including improving AI-driven network resilience, refining predictive models, and addressing ethical considerations. Overall, AI is poised to transform wireless communication, offering innovative solutions for spectrum management (SM), security, and network performance.
(This article belongs to the Special Issue Artificial Intelligence for Network Management)
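
As a taste of the RL-driven DSA the review surveys: independent epsilon-greedy learners can discover non-colliding channel choices from collision feedback alone. A toy sketch under that framing, not any specific scheme from the paper.

```python
import random

N_AGENTS, N_CH, EPS, LR = 4, 6, 0.1, 0.2
q = [[0.0] * N_CH for _ in range(N_AGENTS)]    # per-agent channel value estimates

for step in range(5000):
    picks = [
        random.randrange(N_CH) if random.random() < EPS            # explore
        else max(range(N_CH), key=q[a].__getitem__)                # exploit
        for a in range(N_AGENTS)
    ]
    for a, ch in enumerate(picks):
        reward = 1.0 if picks.count(ch) == 1 else -1.0             # collision feedback
        q[a][ch] += LR * (reward - q[a][ch])

# After training, agents typically settle on distinct channels.
print([max(range(N_CH), key=q[a].__getitem__) for a in range(N_AGENTS)])
```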
