Search Results (309)

Search Parameters:
Keywords = next-generation IoT

15 pages, 584 KB  
Article
A Scheme for Covert Communication with a Reconfigurable Intelligent Surface in Cognitive Radio Networks
by Yan Xu, Jin Qian and Pengcheng Zhu
Sensors 2025, 25(20), 6490; https://doi.org/10.3390/s25206490 - 21 Oct 2025
Abstract
This paper proposes a scheme for enhancing covert communication in cognitive radio networks (CRNs) using a reconfigurable intelligent surface (RIS), which ensures that transmissions by secondary users (SUs) remain statistically undetectable by adversaries (e.g., wardens like Willie). However, covert communication in CRNs faces stringent challenges due to the dual constraints of avoiding detection and preventing harmful interference to primary users (PUs). Leveraging the RIS’s ability to dynamically reconfigure the wireless propagation environment, our scheme jointly optimizes the SU’s transmit power, communication block length, and the RIS’s passive beamforming (phase shifts) to maximize the effective covert throughput (ECT) under rigorous covertness constraints, quantified by detection error probability or relative entropy, while strictly adhering to PU interference limits. Crucially, the RIS configuration is explicitly designed to simultaneously enhance signal quality at the legitimate SU receiver and degrade it at the warden, thereby relaxing the inherent trade-off between covertness and throughput imposed by the fundamental square-root law. Furthermore, we analyze the impact of unequal transmit prior probabilities (UTPPs), demonstrating their superiority over equal transmit prior probabilities (ETPPs) in flexibly balancing throughput and covertness, and extend the framework to practical scenarios with Poisson packet arrivals typical of IoT networks. Extensive results confirm that RIS assistance significantly boosts ECT compared to non-RIS baselines and establish the RIS as a key enabler for secure and spectrally efficient next-generation cognitive networks. Full article
(This article belongs to the Section Communications)
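The covertness constraint the abstract references (detection error probability or relative entropy) has a compact closed form in the classic Gaussian setting. The sketch below is a simplified illustration, not the paper's model: it checks a Pinsker-style covertness budget for a zero-mean Gaussian hypothesis pair, where the noise variance, received power, and budget ε are assumed quantities.

```python
import math

def kl_gaussians(var0: float, var1: float) -> float:
    """KL divergence D(P0 || P1) between zero-mean Gaussians with
    variances var0 (warden hears noise only) and var1 (noise + signal)."""
    return 0.5 * (math.log(var1 / var0) + var0 / var1 - 1.0)

def is_covert(noise_var: float, rx_power: float, epsilon: float) -> bool:
    """Pinsker-style check: requiring D(P0 || P1) <= 2*epsilon**2 bounds
    the warden's detection advantage over random guessing by epsilon."""
    return kl_gaussians(noise_var, noise_var + rx_power) <= 2.0 * epsilon ** 2
```

Shrinking the received power at the warden (exactly what the RIS phase design aims for) drives the divergence toward zero, which is why degrading the warden's channel relaxes the covertness-throughput trade-off.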

15 pages, 846 KB  
Article
Machine-Learning-Based Adaptive Wireless Network Selection for Terrestrial and Non-Terrestrial Networks in 5G and Beyond
by Ahmet Yazar
Telecom 2025, 6(4), 71; https://doi.org/10.3390/telecom6040071 - 30 Sep 2025
Abstract
Non-terrestrial networks (NTNs) have become increasingly crucial, particularly with the standardization of fifth-generation (5G) technology. In parallel, the rise of Internet of Things (IoT) technologies has amplified the need for human-centric solutions in 5G and beyond (5GB) systems. To address diverse communication requirements from a human-centric perspective, leveraging the advantages of both terrestrial networks (TNs) and NTNs has emerged as a key focus for 5GB communications. In this paper, a machine learning (ML)-based approach is proposed to facilitate decision making between TNs and NTNs within a multi-connectivity scenario, aiming to provide a human-centric solution. For this approach, a novel synthetic dataset is constructed using various sensing information, based on the assumption that numerous interconnected sensor systems will be available in smart city networks with sixth-generation (6G) technologies. The ML results are derived from this newly generated dataset. These simulation results demonstrate that the proposed approach, designed to meet the requirements of next-generation systems, can be effectively utilized with 6G. Full article

28 pages, 2116 KB  
Article
Interference- and Demand-Aware Full-Duplex MAC for Next-Generation IoT: A Dual-Phase Contention Framework with Dynamic Priority Scheduling
by Liwei Tian, Zijie Liu, Shuhan Qi and Qinglin Zhao
Electronics 2025, 14(19), 3901; https://doi.org/10.3390/electronics14193901 - 30 Sep 2025
Abstract
The continuous evolution of advanced wireless IoT systems necessitates novel network protocols capable of enhancing resource efficiency and performance to support increasingly demanding applications. Full-duplex (FD) communication emerges as a key advanced wireless technology to address these needs by doubling spectral efficiency. However, unlocking this potential is non-trivial, as it introduces complex interference scenarios and requires sophisticated management of heterogeneous Quality of Service (QoS) demands, presenting a significant challenge for existing MAC protocols. To overcome these limitations through protocol optimization, this paper proposes IDA-FDMAC, a novel MAC architecture tailored for FD-enabled IoT networks. At its core, IDA-FDMAC employs a dynamic priority scheduling mechanism that concurrently manages interference and provisions for diverse QoS requirements. A comprehensive theoretical model is developed and validated through extensive simulations, demonstrating that our proposed architecture significantly boosts system throughput and ensures QoS guarantees. This work thus contributes a robust, high-performance solution aligned with the development of next-generation wireless IoT systems. Full article
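To illustrate dynamic priority scheduling under heterogeneous QoS demands (a generic sketch, not the paper's IDA-FDMAC protocol), here is a minimal aging-based scheduler: a packet's effective priority improves the longer it waits, so low-priority IoT traffic is not starved by high-priority flows. The class names and aging rule are assumptions for illustration.

```python
import heapq
import itertools

class DynamicPriorityScheduler:
    """Toy scheduler with aging: effective priority = base - rate * wait,
    so packets that wait longer gradually outrank fresher, higher-class
    arrivals. Lower effective value is served first."""

    def __init__(self, aging_rate: float = 0.1):
        self.aging_rate = aging_rate
        self._heap = []
        self._counter = itertools.count()  # unique tie-breaker

    def enqueue(self, packet, base_priority: float, now: float):
        heapq.heappush(self._heap, (base_priority, next(self._counter), now, packet))

    def dequeue(self, now: float):
        if not self._heap:
            return None
        # Re-score every waiting packet with its current age.
        best = min(self._heap, key=lambda e: e[0] - self.aging_rate * (now - e[2]))
        self._heap.remove(best)
        heapq.heapify(self._heap)
        return best[3]
```

A real MAC would re-score incrementally rather than scanning the queue, but the aging idea is the same: priorities are dynamic, not fixed at enqueue time.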

36 pages, 5130 KB  
Article
SecureEdge-MedChain: A Post-Quantum Blockchain and Federated Learning Framework for Real-Time Predictive Diagnostics in IoMT
by Sivasubramanian Ravisankar and Rajagopal Maheswar
Sensors 2025, 25(19), 5988; https://doi.org/10.3390/s25195988 - 27 Sep 2025
Abstract
The burgeoning Internet of Medical Things (IoMT) offers unprecedented opportunities for real-time patient monitoring and predictive diagnostics, yet the current systems struggle with scalability, data confidentiality against quantum threats, and real-time privacy-preserving intelligence. This paper introduces Med-Q Ledger, a novel, multi-layered framework designed to overcome these critical limitations in the Medical IoT domain. Med-Q Ledger integrates a permissioned Hyperledger Fabric for transactional integrity with a scalable Holochain Distributed Hash Table for high-volume telemetry, achieving horizontal scalability and sub-second commit times. To fortify long-term data security, the framework incorporates post-quantum cryptography (PQC), specifically CRYSTALS-Dilithium signatures and Kyber key encapsulation mechanisms. Real-time, privacy-preserving intelligence is delivered through an edge-based federated learning (FL) model, utilizing lightweight autoencoders for anomaly detection on encrypted gradients. We validate Med-Q Ledger’s efficacy through a critical application: the prediction of intestinal complications like necrotizing enterocolitis (NEC) in preterm infants, a condition frequently necessitating emergency colostomy. By processing physiological data from maternal wearable sensors and infant intestinal images, our integrated Random Forest model demonstrates superior performance in predicting colostomy necessity. Experimental evaluations reveal a throughput of approximately 3400 transactions per second (TPS) with ~180 ms end-to-end latency, a >95% anomaly detection rate with <2% false positives, and an 11% computational overhead for PQC on resource-constrained devices. Furthermore, our results show a 0.90 F1-score for colostomy prediction, a 25% reduction in emergency surgeries, and 31% lower energy consumption compared to MQTT baselines. Med-Q Ledger sets a new benchmark for secure, high-performance, and privacy-preserving IoMT analytics, offering a robust blueprint for next-generation healthcare deployments. Full article
(This article belongs to the Section Internet of Things)
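The edge FL layer flags anomalies when an autoencoder's reconstruction error exceeds a threshold. Below is a minimal sketch of one common thresholding heuristic (mean + k·σ over errors observed on benign traffic); the rule and the k=3 default are illustrative assumptions, not the paper's calibration.

```python
import statistics

def fit_threshold(errors, k: float = 3.0) -> float:
    """Set the anomaly threshold at mean + k*stdev of reconstruction
    errors measured on benign traffic (k = 3 is a common heuristic)."""
    return statistics.fmean(errors) + k * statistics.stdev(errors)

def classify(errors, threshold: float):
    """Flag samples whose reconstruction error exceeds the threshold."""
    return [e > threshold for e in errors]
```

In a federated setting each edge node would fit its threshold locally and only share model updates, which is what keeps the raw physiological data private.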

23 pages, 3141 KB  
Article
Machine Learning-Assisted Cryptographic Security: A Novel ECC-ANN Framework for MQTT-Based IoT Device Communication
by Kalimu Karimunda, Jean de Dieu Marcel Ufitikirezi, Roman Bumbálek, Tomáš Zoubek, Petr Bartoš, Radim Kuneš, Sandra Nicole Umurungi, Anozie Chukwunyere, Mutagisha Norbelt and Gao Bo
Computation 2025, 13(10), 227; https://doi.org/10.3390/computation13100227 - 26 Sep 2025
Abstract
The Internet of Things (IoT) has emerged as a transformative technology, enabling ubiquitous connectivity between devices and reshaping everyday life through smart automation. As IoT systems proliferate, securing device-to-device communication and server–client data exchange has become crucial. This paper presents a novel security framework that integrates elliptic curve cryptography (ECC) with artificial neural networks (ANNs) to enhance the Message Queuing Telemetry Transport (MQTT) protocol. Our study evaluated multiple machine learning algorithms, with ANN demonstrating superior performance in anomaly detection and classification. The hybrid approach not only encrypts communications but also employs the optimized ANN model to detect and classify anomalous traffic patterns. The proposed model demonstrates robust security features, successfully identifying and categorizing various attack types with 90.38% accuracy while maintaining message confidentiality through ECC encryption. Notably, this framework retains the lightweight characteristics essential for IoT devices, making it especially relevant for environments where resources are constrained. To our knowledge, this represents the first implementation of an integrated ECC-ANN approach for securing MQTT-based IoT communications, offering a promising solution for next-generation IoT security requirements. Full article
(This article belongs to the Section Computational Engineering)
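The ECC side of the ECC-ANN framework rests on elliptic-curve key agreement, which can be illustrated with a toy Diffie-Hellman exchange over the small textbook curve y² = x³ + 2x + 2 (mod 17), whose group has prime order 19. This is purely pedagogical; a real MQTT deployment would use a standardized curve such as P-256 via an audited library.

```python
# Toy elliptic-curve Diffie-Hellman over y^2 = x^3 + 2x + 2 (mod 17).
# Educational only: the field is tiny and offers no security.
P, A, B = 17, 2, 2
G = (5, 1)  # generator: 1^2 = 1 and 5^3 + 2*5 + 2 = 137 = 1 (mod 17)

def point_add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # point at infinity
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P         # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Double-and-add scalar multiplication k * point."""
    result, addend = None, point
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result

# ECDH: each side combines its own secret with the peer's public point
# and both arrive at the same shared point.
alice_secret, bob_secret = 3, 5
alice_pub = scalar_mult(alice_secret, G)
bob_pub = scalar_mult(bob_secret, G)
shared_a = scalar_mult(alice_secret, bob_pub)
shared_b = scalar_mult(bob_secret, alice_pub)
```

The shared point (or a hash of its x-coordinate) would then seed a symmetric cipher for the MQTT payload.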

28 pages, 987 KB  
Article
Foundation Models for Cybersecurity: A Comprehensive Multi-Modal Evaluation of TabPFN and TabICL for Tabular Intrusion Detection
by Pablo García, J. de Curtò, I. de Zarzà, Juan Carlos Cano and Carlos T. Calafate
Electronics 2025, 14(19), 3792; https://doi.org/10.3390/electronics14193792 - 24 Sep 2025
Abstract
While traditional ensemble methods have dominated tabular intrusion detection systems (IDSs), recent advances in foundation models present new opportunities for enhanced cybersecurity applications. This paper presents a comprehensive multi-modal evaluation of foundation models—specifically TabPFN (Tabular Prior-Data Fitted Network), TabICL (Tabular In-Context Learning), and large language models—against traditional machine learning approaches across three cybersecurity datasets: CIC-IDS2017, N-BaIoT, and CIC-UNSW. Our rigorous experimental framework addresses critical methodological challenges through model-appropriate evaluation protocols and comprehensive assessment across multiple data variants. Results demonstrate that foundation models achieve superior and more consistent performance compared with traditional approaches, with TabPFN and TabICL establishing new state-of-the-art results across all datasets. Most significantly, these models uniquely achieve non-zero recall across all classes, including rare threats like Heartbleed and Infiltration, while traditional ensemble methods—despite achieving >99% overall accuracy—completely fail on several minority classes. TabICL demonstrates particularly strong performance on CIC-IDS2017 (99.59% accuracy), while TabPFN maintains consistent performance across all datasets, suggesting robust generalization capabilities. Both foundation models achieve these results using only fractions of the available training data and requiring no hyperparameter tuning, representing a paradigm shift toward training-light, hyperparameter-free adaptive IDS architectures, where TabPFN requires no task-specific fitting and TabICL leverages efficient in-context adaptation without retraining. Cross-dataset validation reveals that foundation models maintain performance advantages across diverse threat landscapes, while traditional methods exhibit significant dataset-specific variations. These findings challenge the cybersecurity community’s reliance on tree-based ensembles and demonstrate that foundation models offer superior capabilities for next-generation intrusion detection systems in IoT environments. Full article
(This article belongs to the Special Issue Wireless Sensor Network: Latest Advances and Prospects)
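The minority-class failure mode described above (>99% overall accuracy yet zero recall on rare threats) is easy to reproduce with a toy confusion computation; the class labels and counts below are hypothetical illustrations, not the paper's data.

```python
from collections import Counter

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def per_class_recall(y_true, y_pred):
    """Recall per class: fraction of each class's samples recovered."""
    hits, totals = Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        totals[t] += 1
        hits[t] += (t == p)
    return {c: hits[c] / totals[c] for c in totals}

# 990 benign flows and 10 rare attack flows: a predictor that always
# says "benign" scores 99% accuracy while detecting zero attacks.
y_true = ["benign"] * 990 + ["heartbleed"] * 10
y_pred = ["benign"] * 1000
```

This is why per-class recall (or macro-averaged metrics), rather than raw accuracy, is the right lens for comparing IDS models on imbalanced traffic.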

36 pages, 8706 KB  
Review
AI-Enabled Microfluidics for Respiratory Pathogen Detection
by Daoguangyao Zhang, Xuefei Lv, Hao Jiang, Yunlong Fan, Kexin Liu, Hao Wang and Yulin Deng
Sensors 2025, 25(18), 5791; https://doi.org/10.3390/s25185791 - 17 Sep 2025
Abstract
Respiratory infectious diseases, such as COVID-19, influenza, and tuberculosis, continue to impose a significant global health burden, underscoring the urgent demand for rapid, sensitive, and cost-effective diagnostic technologies. Integrated microfluidic platforms offer compelling advantages through miniaturization, automation, and high-throughput processing, enabling “sample-in, answer-out” workflows suitable for point-of-care applications. However, their clinical deployment faces challenges, including the complexity of sample matrices, low-abundance target detection, and the need for reliable multiplexing. The convergence of artificial intelligence (AI) with microfluidic systems has emerged as a transformative paradigm, addressing these limitations by optimizing chip design, automating sample pre-processing, enhancing signal interpretation, and enabling real-time feedback control. This critical review surveys AI-enabled strategies across each functional layer of respiratory pathogen diagnostics: from chip architecture and fluidic control to amplification analysis, signal prediction, and smartphone/IoT-linked decision support. We highlight key areas where AI offers measurable benefits over conventional methods. To transition from research prototypes to clinical tools, future systems must become more adaptive, data-efficient, and clinically insightful. Advances such as sensor-integrated chips, privacy-preserving machine learning, and multimodal data fusion will be essential to ensure robust performance and meaningful outputs across diverse scenarios. This review outlines recent progress, current limitations, and future directions. The rapid development of AI and microfluidics presents exciting opportunities for next-generation pathogen diagnostics, and we hope this work contributes to the advancement of intelligent, point-of-care testing (POCT) solutions. Full article
(This article belongs to the Special Issue Advances in Microfluidic Biosensing Technology)

15 pages, 4560 KB  
Article
Harmonic-Recycling Passive RF Energy Harvester with Integrated Power Management
by Ruijiao Li, Yuquan Hu, Hui Li, Haiyan Jin and Dan Liao
Micromachines 2025, 16(9), 1053; https://doi.org/10.3390/mi16091053 - 15 Sep 2025
Abstract
The rapid growth of low-power Internet of Things (IoT) applications has created an urgent demand for compact, battery-free power solutions. However, most existing RF energy harvesters rely on active rectifiers, multi-phase topologies, or complex tuning networks, which increase circuit complexity and static power overhead while struggling to maintain high efficiency under microwatt-level inputs. To address this challenge, this work proposes a harmonic-recycling, passive, RF-energy-harvesting system with integrated power management (HR-P-RFEH). The system adopts a planar microstrip architecture compatible with MEMS fabrication, integrating a dual-stage voltage multiplier rectifier (VMR) and a stub-based harmonic suppression–recycling network. The design was verified through combined electromagnetic/circuit co-simulations, PCB prototyping, and experimental measurements. Operating at 915 MHz under a 0 dBm input and a 2 kΩ load, the HR-P-RFEH achieves a stable 1.4 V DC output and a peak rectification efficiency of 70.7%. Compared with a conventional single-stage rectifier, it improves the output voltage by 22.5% and the efficiency by 16.4%. The rectified power is further regulated by a BQ25570-based unit to provide a stable 3.3 V supply buffered by a 47 mF supercapacitor, ensuring continuous operation under intermittent RF input. In comparison with the state of the art, the proposed fully passive, harmonic-recycling design achieves competitive efficiency without active bias or adaptive tuning while remaining MEMS- and LTCC-ready. These results highlight HR-P-RFEH as a scalable and fabrication-friendly building block for next-generation energy-autonomous IoT and MEMS systems. Full article
(This article belongs to the Special Issue Micro-Energy Harvesting Technologies and Self-Powered Sensing Systems)
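The reported operating point (0 dBm input, 1.4 V DC into a 2 kΩ load) involves standard unit conversions. The helpers below show one common end-to-end definition of RF-to-DC efficiency; note that the paper's 70.7% figure is a rectification-stage efficiency under its own measurement setup, so these generic formulas are illustrative rather than a reproduction of its results.

```python
def dbm_to_watts(p_dbm: float) -> float:
    """Convert power in dBm to watts: 0 dBm = 1 mW."""
    return 1e-3 * 10 ** (p_dbm / 10)

def dc_load_power(v_out: float, r_load: float) -> float:
    """DC power delivered to a resistive load: P = V^2 / R."""
    return v_out ** 2 / r_load

def rf_dc_efficiency(v_out: float, r_load: float, p_in_dbm: float) -> float:
    """One common end-to-end RF-to-DC efficiency definition; papers may
    instead measure efficiency at the rectifier stage before the PMU."""
    return dc_load_power(v_out, r_load) / dbm_to_watts(p_in_dbm)
```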

16 pages, 623 KB  
Review
A Digital Twin Architecture for Forest Restoration: Integrating AI, IoT, and Blockchain for Smart Ecosystem Management
by Nophea Sasaki and Issei Abe
Future Internet 2025, 17(9), 421; https://doi.org/10.3390/fi17090421 - 15 Sep 2025
Abstract
Meeting global forest restoration targets by 2030 requires a transition from labor-intensive and opaque practices to scalable, intelligent, and verifiable systems. This paper introduces a cyber–physical digital twin architecture for forest restoration, structured across four layers: (i) a Physical Layer with drones and IoT-enabled sensors for in situ environmental monitoring; (ii) a Data Layer for secure and structured transmission of spatiotemporal data; (iii) an Intelligence Layer applying AI-driven modeling, simulation, and predictive analytics to forecast biomass, biodiversity, and risk; and (iv) an Application Layer providing stakeholder dashboards, milestone-based smart contracts, and automated climate finance flows. Evidence from Dronecoria, Flash Forest, and AirSeed Technologies shows that digital twins can reduce per-tree planting costs from USD 2.00–3.75 to USD 0.11–1.08, while enhancing accuracy, scalability, and community participation. The paper further outlines policy directions for integrating digital MRV systems into the Enhanced Transparency Framework (ETF) and Article 5 of the Paris Agreement. By embedding simulation, automation, and participatory finance into a unified ecosystem, digital twins offer a resilient, interoperable, and climate-aligned pathway for next-generation forest restoration. Full article
(This article belongs to the Special Issue Advances in Smart Environments and Digital Twin Technologies)

28 pages, 1812 KB  
Article
An Integrated Hybrid Deep Learning Framework for Intrusion Detection in IoT and IIoT Networks Using CNN-LSTM-GRU Architecture
by Doaa Mohsin Abd Ali Afraji, Jaime Lloret and Lourdes Peñalver
Computation 2025, 13(9), 222; https://doi.org/10.3390/computation13090222 - 14 Sep 2025
Abstract
Intrusion detection systems (IDSs) are critical for securing modern networks, particularly in IoT and IIoT environments where traditional defenses such as firewalls and encryption are insufficient against evolving cyber threats. This paper proposes an enhanced hybrid deep learning model that integrates convolutional neural networks (CNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRU) in a multi-branch architecture designed to capture spatial and temporal dependencies while minimizing redundant computations. Unlike conventional hybrid approaches, the proposed parallel–sequential fusion framework leverages the strengths of each component independently before merging features, thereby improving detection granularity and learning efficiency. A rigorous preprocessing pipeline is employed to handle real-world data challenges: missing values are imputed using median filling, class imbalance is mitigated through SMOTE (Synthetic Minority Oversampling Technique), and feature scaling is performed with Min–Max normalization to ensure convergence consistency. The methodology is validated on the TON_IoT and CICIDS2017 datasets, chosen for their diversity and realism in IoT/IIoT attack scenarios. Three hybrid models—CNN-LSTM, CNN-GRU, and the proposed CNN-LSTM-GRU—are assessed for binary and multiclass intrusion detection. Experimental results demonstrate that the CNN-LSTM-GRU architecture achieves superior performance, attaining 100% accuracy in binary classification and 97% in multiclass detection, with balanced precision, recall, and F1-scores across all classes. Furthermore, evaluation on the CICIDS2017 dataset confirms the model’s generalization ability, achieving 99.49% accuracy with precision, recall, and F1-scores of 0.9954, 0.9943, and 0.9949, respectively, outperforming CNN-LSTM and CNN-GRU baselines. Compared to existing IDS models, our approach delivers higher robustness, scalability, and adaptability, making it a promising candidate for next-generation IoT/IIoT security. Full article
(This article belongs to the Section Computational Engineering)
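The preprocessing pipeline above (median imputation, SMOTE oversampling, Min–Max normalization) can be sketched with a few stdlib functions. This is a simplified illustration: the SMOTE step below interpolates between random minority pairs rather than running the full k-nearest-neighbour algorithm.

```python
import random
import statistics

def impute_median(column):
    """Replace None entries with the median of the observed values."""
    med = statistics.median(v for v in column if v is not None)
    return [med if v is None else v for v in column]

def min_max_scale(column):
    """Scale values to [0, 1]; constant columns map to all zeros."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(v - lo) / (hi - lo) for v in column]

def smote_like(minority_rows, n_new, rng=random.Random(0)):
    """SMOTE-style oversampling sketch: synthesize new minority points
    by linear interpolation between random pairs of real samples."""
    out = []
    for _ in range(n_new):
        a, b = rng.sample(minority_rows, 2)
        t = rng.random()
        out.append([x + t * (y - x) for x, y in zip(a, b)])
    return out
```

Each synthetic point lies on the segment between two real minority samples, which is the core idea that keeps oversampled data plausible rather than duplicated.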

23 pages, 3843 KB  
Article
Leveraging Reconfigurable Massive MIMO Antenna Arrays for Enhanced Wireless Connectivity in Biomedical IoT Applications
by Sunday Enahoro, Sunday Cookey Ekpo, Yasir Al-Yasir and Mfonobong Uko
Sensors 2025, 25(18), 5709; https://doi.org/10.3390/s25185709 - 12 Sep 2025
Abstract
The increasing demand for real-time, energy-efficient, and interference-resilient communication in smart healthcare environments has intensified interest in Biomedical Internet of Things (Bio-IoT) systems. However, ensuring reliable wireless connectivity for wearable and implantable biomedical sensors remains a challenge due to mobility, latency sensitivity, power constraints, and multi-user interference. This paper addresses these issues by proposing a reconfigurable massive multiple-input multiple-output (MIMO) antenna architecture, incorporating hybrid analog–digital beamforming and adaptive signal processing. The methodology combines conventional algorithms—such as Least Mean Square (LMS), Zero-Forcing (ZF), and Minimum Variance Distortionless Response (MVDR)—with a novel mobility-aware beamforming scheme. System-level simulations under realistic channel models (Rayleigh, Rician, 3GPP UMa) evaluate signal-to-interference-plus-noise ratio (SINR), bit error rate (BER), energy efficiency, outage probability, and fairness index across varying user loads and mobility scenarios. Results show that the proposed hybrid beamforming system consistently outperforms benchmarks, achieving up to 35% higher throughput, a 65% reduction in packet drop rate, and sub-10 ms latency even under high-mobility conditions. Beam pattern analysis confirms robust nulling of interference and dynamic lobe steering. This architecture is well-suited for next-generation Bio-IoT deployments in smart hospitals, enabling secure, adaptive, and power-aware connectivity for critical healthcare monitoring applications. Full article
(This article belongs to the Special Issue Challenges and Future Trends in Antenna Technology)
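Of the conventional baselines mentioned, Zero-Forcing (ZF) is the simplest to sketch: the receive filter inverts the channel matrix so that each user's stream arrives interference-free, at the cost of noise amplification when the channel is ill-conditioned. A 2×2 illustration with a made-up channel matrix:

```python
def zf_receiver_2x2(H):
    """Zero-forcing receive filter for a 2x2 MIMO channel H:
    W = H^{-1}, so W @ H = I and inter-stream interference is nulled."""
    (a, b), (c, d) = H
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular channel: ZF is undefined")
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul2(X, Y):
    """2x2 complex matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

MVDR and the mobility-aware scheme trade this exact nulling for better noise behavior, but verifying W·H = I is the quickest sanity check that a ZF implementation is correct.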

19 pages, 2548 KB  
Article
Random Access Preamble Design for 6G Satellite–Terrestrial Integrated Communication Systems
by Min Hua, Zhongqiu Wu, Cong Zhang, Zeyang Xu, Xiaoming Liu and Wen Zhou
Sensors 2025, 25(17), 5602; https://doi.org/10.3390/s25175602 - 8 Sep 2025
Abstract
Satellite–terrestrial integrated communication systems (STICSs) are envisioned to provide ubiquitous, seamless connectivity in next-generation (6G) wireless communication networks for massive-scale Internet of Things (IoT) deployments. This global coverage extends beyond densely populated areas to remote regions (e.g., polar zones, open oceans, deserts) and disaster-prone areas, supporting diverse IoT applications, including remote sensing, smart cities, intelligent agriculture/forestry, environmental monitoring, and emergency reporting. Random access signals, which constitute the initial transmission from IoT devices to the base station for unscheduled transmissions or network entry in terrestrial networks (TNs), encounter significant challenges in STICSs due to inherent satellite characteristics: wide coverage, large-scale access, substantial round-trip delay, and high carrier frequency offset (CFO). Consequently, conventional TN preamble designs based on Zadoff–Chu (ZC) sequences, as used in 4G LTE and 5G NR systems, are unsuitable for direct deployment in 6G STICSs. This paper first analyzes the challenges in adapting terrestrial designs to STICSs. It then proposes a CFO-resistant preamble design specifically tailored for STICSs and details its detection procedure. Furthermore, a dedicated root set selection algorithm for the proposed preambles is presented, generating an expanded pool of random access signals to meet the demands of increasing IoT device access. The developed analytical framework provides a foundation for performance analysis of random access signals in 6G STICSs. Full article
(This article belongs to the Special Issue 5G/6G Networks for Wireless Communication and IoT)
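The Zadoff–Chu sequences used in LTE/NR preambles have constant amplitude and ideal periodic autocorrelation (zero at every nonzero cyclic shift when the root is coprime to the length); it is exactly these correlation properties that high CFO degrades in satellite links. A minimal generator for odd-length ZC sequences:

```python
import cmath
import math

def zadoff_chu(root: int, length: int):
    """Zadoff-Chu sequence of odd length N with root u coprime to N:
    x[n] = exp(-j * pi * u * n * (n + 1) / N)."""
    return [cmath.exp(-1j * math.pi * root * n * (n + 1) / length)
            for n in range(length)]

def periodic_autocorr(x, shift: int) -> complex:
    """Normalized periodic (cyclic) autocorrelation at a given shift."""
    n = len(x)
    return sum(x[i] * x[(i + shift) % n].conjugate() for i in range(n)) / n
```

Different roots give low cross-correlation too, which is why root set selection (as in the paper) directly controls how many distinguishable preambles a cell can offer.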

29 pages, 5850 KB  
Article
Optimisation of Sensor and Sensor Node Positions for Shape Sensing with a Wireless Sensor Network—A Case Study Using the Modal Method and a Physics-Informed Neural Network
by Sören Meyer zu Westerhausen, Imed Hichri, Kevin Herrmann and Roland Lachmayer
Sensors 2025, 25(17), 5573; https://doi.org/10.3390/s25175573 - 6 Sep 2025
Abstract
Operational-condition data for structural components, acquired, e.g., in structural health monitoring (SHM), are of great interest for optimising products from one generation to the next, for example, by adapting them to the operational loads that occur. To acquire data of the desired quality for this purpose, optimal sensor placement for so-called shape and load sensing is required. In the case of large-scale structural components, wireless sensor networks (WSNs) can be used to process and transmit the acquired data for real-time monitoring, which furthermore requires an optimisation of sensor node positions. Since most publications focus either on optimal sensor placement or on the optimisation of sensor node positions, a methodology for both is implemented in a Python tool, and an optimised WSN is realised on a demonstration part loaded at a test bench. For this purpose, the modal method is applied for shape sensing, as well as a physics-informed neural network for solving inverse problems in shape sensing (iPINN). The WSN is realised with strain gauges, HX711 analogue-to-digital (A/D) converters, and Arduino Nano 33 IoT microcontrollers for data transmission to a Python Flask server, which allows real-time visualisation and data processing. The results demonstrate the applicability of the presented methodology and its implementation in the Python tool for achieving high-accuracy shape sensing with WSNs. Full article
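The modal method reconstructs the displacement field as a superposition of a few mode shapes, with the modal coordinates fitted to the measured strains in a least-squares sense. A two-mode sketch (the mode-shape and modal-strain matrices below are arbitrary illustrative numbers, not the demonstrator's):

```python
def solve_modal_coords(B, strains):
    """Least-squares fit of two modal coordinates q from measured
    strains: eps_j ~= q1 * B[0][j] + q2 * B[1][j], where B holds the
    modal strain (curvature) shapes sampled at the sensor positions.
    Solves the 2x2 normal equations directly."""
    a = sum(b * b for b in B[0])
    c = sum(b * b for b in B[1])
    m = sum(b0 * b1 for b0, b1 in zip(B[0], B[1]))
    r1 = sum(b * e for b, e in zip(B[0], strains))
    r2 = sum(b * e for b, e in zip(B[1], strains))
    det = a * c - m * m
    return ((c * r1 - m * r2) / det, (a * r2 - m * r1) / det)

def reconstruct(Phi, q):
    """Displacement as modal superposition u = q1*phi1 + q2*phi2."""
    return [q[0] * p1 + q[1] * p2 for p1, p2 in zip(Phi[0], Phi[1])]
```

Sensor placement then amounts to choosing sample positions that keep the normal-equation matrix well-conditioned, which is what the optimisation in the paper targets.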

15 pages, 37613 KB  
Article
Wideband Reconfigurable Reflective Metasurface with 1-Bit Phase Control Based on Polarization Rotation
by Zahid Iqbal, Xiuping Li, Zihang Qi, Wenyu Zhao, Zaid Akram and Muhammad Ishfaq
Telecom 2025, 6(3), 65; https://doi.org/10.3390/telecom6030065 - 3 Sep 2025
Abstract
The rapid expansion of broadband wireless communication systems, including 5G, satellite networks, and next-generation IoT platforms, has created a strong demand for antenna architectures capable of real-time beam control, compact integration, and broad frequency coverage. Traditional reflectarrays, while effective for narrowband applications, often face inherent limitations such as fixed beam direction, high insertion loss, and complex phase-shifting networks, making them less viable for modern adaptive and reconfigurable systems. Addressing these challenges, this work presents a novel wideband planar metasurface that operates as a polarization rotation reflective metasurface (PRRM), combining 90° polarization conversion with 1-bit reconfigurable phase modulation. The metasurface employs a mirror-symmetric unit cell structure, incorporating a cross-shaped patch with fan-shaped stub loading and integrated PIN diodes, connected through vertical interconnect accesses (VIAs). This design enables stable binary phase control with minimal loss across a significantly wide frequency range. Full-wave electromagnetic simulations confirm that the proposed unit cell maintains consistent cross-polarized reflection performance and phase switching from 3.83 GHz to 15.06 GHz, achieving a remarkable fractional bandwidth of 118.89%. To verify its applicability, the full-wave simulation analysis of a 16 × 16 array was conducted, demonstrating dynamic two-dimensional beam steering up to ±60° and maintaining a 3 dB gain bandwidth of 55.3%. These results establish the metasurface’s suitability for advanced beamforming, making it a strong candidate for compact, electronically reconfigurable antennas in high-speed wireless communication, radar imaging, and sensing systems. Full article
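The 1-bit phase control described above can be illustrated by quantizing the continuous phase profile needed for beam steering to the two available unit-cell states (0° or 180°). A hedged sketch, assuming a single 16-element row with half-wavelength spacing at 10 GHz (all parameters are illustrative, not taken from the paper):

```python
import numpy as np

# Illustrative 1-bit phase quantization for a reflective metasurface:
# each unit cell adds either 0 or pi to the reflected wave, so the ideal
# continuous steering profile is rounded to the nearer of the two states.
c = 3e8
f = 10e9                       # within the reported 3.83-15.06 GHz band
lam = c / f
k = 2 * np.pi / lam

n = 16                         # one row of the 16 x 16 array
d = lam / 2                    # assumed half-wavelength element spacing
theta = np.deg2rad(25)         # desired steering angle

x = np.arange(n) * d
ideal = np.mod(-k * x * np.sin(theta), 2 * np.pi)   # continuous profile
# 1-bit quantization: pick pi when it is the closer state, otherwise 0
quantized = np.where((ideal > np.pi / 2) & (ideal <= 3 * np.pi / 2),
                     np.pi, 0.0)

def array_factor(phases, ang):
    """Far-field sum of the element contributions at observation angle ang."""
    return np.sum(np.exp(1j * (phases + k * x * np.sin(ang))))

# The quantized profile still concentrates energy near the steering angle
# (1-bit control also produces a mirrored quantization lobe at -theta).
af_steer = abs(array_factor(quantized, theta))
af_broadside = abs(array_factor(quantized, 0.0))
assert af_steer > af_broadside
```

Compared with continuous phase control, the binary constraint costs roughly a 2/π amplitude factor at the main lobe and adds the mirrored quantization lobe, which is the usual trade-off accepted in exchange for the simple PIN-diode switching the paper exploits.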

21 pages, 1293 KB  
Article
Dynamic Resource Management in 5G-Enabled Smart Elderly Care Using Deep Reinforcement Learning
by Krishnapriya V. Shaji, Srilakshmi S. Rethy, Simi Surendran, Livya George, Namita Suresh and Hrishika Dayan
Future Internet 2025, 17(9), 402; https://doi.org/10.3390/fi17090402 - 2 Sep 2025
Abstract
The increasing elderly population presents major challenges to traditional healthcare due to the need for continuous care, a shortage of skilled professionals, and rising medical costs. To address this, smart elderly care homes, where multiple residents live with the support of caregivers and IoT-based assistive technologies, have emerged as a promising solution. For their effective operation, a reliable high-speed network such as 5G is essential, along with intelligent resource allocation to ensure efficient service delivery. This study proposes a deep reinforcement learning (DRL)-based resource management framework for smart elderly homes, formulated as a Markov decision process. The framework dynamically allocates computing and network resources in response to real-time application demands and system constraints. We implement and compare two DRL algorithms, emphasizing their strengths in optimizing edge utilization and throughput. System performance is evaluated across balanced, high-demand, and resource-constrained scenarios. The results demonstrate that the proposed DRL approach effectively learns adaptive resource management policies, making it a promising solution for next-generation intelligent elderly care environments. Full article
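The Markov decision process formulation above can be sketched with tabular Q-learning on a toy edge/cloud allocation problem. This is a deliberately simplified stand-in for the paper's DRL algorithms: the state (free edge slots), actions, rewards, and release dynamics are all illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

C = 3                       # assumed edge-server capacity (compute slots)
n_states = C + 1            # state: number of free edge slots
n_actions = 2               # 0 = serve request at edge, 1 = offload to cloud

q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1

def step(s, a, rng):
    # Serving at the edge gives low latency (high reward) but needs a slot.
    if a == 0:
        if s > 0:
            r, s2 = 1.0, s - 1
        else:
            r, s2 = -1.0, s        # rejected: no edge capacity left
    else:
        r, s2 = 0.2, s             # cloud: always feasible, higher latency
    # A running edge task finishes with probability 0.3, freeing a slot.
    if s2 < C and rng.random() < 0.3:
        s2 += 1
    return r, s2

s = C
for _ in range(20000):
    # epsilon-greedy action selection
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(q[s]))
    r, s2 = step(s, a, rng)
    # standard Q-learning temporal-difference update
    q[s, a] += alpha * (r + gamma * q[s2].max() - q[s, a])
    s = s2

# Learned policy: use the edge while capacity is free, otherwise offload.
assert all(q[s, 0] > q[s, 1] for s in range(1, n_states))
assert q[0, 1] > q[0, 0]
```

The paper's deep RL variants replace the Q-table with a neural network so the same idea scales to the much larger state spaces of a real 5G-enabled care home.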
