Search Results (5,746)

Search Parameters:
Keywords = wireless data

22 pages, 2236 KB  
Article
An AI-Driven System for Learning MQTT Communication Protocols with Python Programming
by Zihao Zhu, Nobuo Funabiki, Htoo Htoo Sandi Kyaw, I Nyoman Darma Kotama, Anak Agung Surya Pradhana, Alfiandi Aulia Rahmadani and Noprianto
Electronics 2025, 14(24), 4967; https://doi.org/10.3390/electronics14244967 - 18 Dec 2025
Abstract
With the rapid development of wireless communication and Internet of Things (IoT) technologies, an increasing number of devices and sensors are interconnected, generating massive amounts of data in real time. Among the underlying protocols, Message Queuing Telemetry Transport (MQTT) has become a widely adopted lightweight publish–subscribe standard due to its simplicity, minimal overhead, and scalability. Understanding such protocols is therefore essential for students and engineers engaged in IoT application system design. However, teaching and learning MQTT remains challenging. Its asynchronous architecture, hierarchical topic structure, and core concepts such as retained messages, Quality of Service (QoS) levels, and wildcard subscriptions are often difficult for beginners. Moreover, traditional learning resources emphasize theory and provide limited hands-on guidance, leading to a steep learning curve. To address these challenges, we propose an AI-assisted, exercise-based learning platform for MQTT. The platform provides interactive exercises with intelligent feedback to bridge the gap between theory and practice. To lower the barrier for learners, all code examples for executing MQTT communication are implemented in Python for readability, and Docker is used to ensure portable deployment of the MQTT broker and AI assistant. For evaluation, we conducted a usability study with two groups. The first group, which had no prior experience, focused on fundamental concepts with AI-guided exercises. The second group, which had a relevant background, engaged in advanced projects to apply and reinforce their knowledge. The results show that the proposed platform supports learners at different levels, reduces frustration, and improves both engagement and efficiency. Full article
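The wildcard subscriptions the abstract cites are among the concepts beginners find hard, yet the matching rules themselves are compact: per the MQTT specification, '+' matches exactly one topic level and '#' matches all remaining levels. A minimal Python sketch of topic-filter matching; this is illustrative, not code from the authors' platform:

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Check whether an MQTT topic matches a subscription filter.

    Implements the wildcard rules from the MQTT specification:
    '+' matches exactly one topic level, '#' matches any number of
    trailing levels and must be the last level of the filter.
    """
    f_levels = filter_.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":
            return True              # multi-level wildcard: match the rest
        if i >= len(t_levels):
            return False             # topic is shorter than the filter
        if f != "+" and f != t_levels[i]:
            return False             # literal level must match exactly
    return len(f_levels) == len(t_levels)
```

For example, the filter `sensors/+/temp` matches `sensors/room1/temp` but not `sensors/room1/humidity`, while `sensors/#` matches any topic under `sensors`.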

16 pages, 1514 KB  
Article
IoT-Controlled Upflow Filtration Achieves High Removal of Fine Particles and Phosphorus in Stormwater
by Kyungjin Han, Dongyoung Choi, Jeongdong Choi and Junho Lee
Water 2025, 17(24), 3580; https://doi.org/10.3390/w17243580 - 17 Dec 2025
Abstract
Urban stormwater runoff, particularly during first-flush events, carries high loads of fine suspended solids and phosphorus that are difficult to remove with conventional best management practices (BMPs). This study developed and evaluated a laboratory-scale high-efficiency up-flow filtration system with Internet of Things (IoT)-based autonomous control. The system employed 20 mm fiber-ball media in a modular dual-stage up-flow configuration with optimized coagulant dosing to target fine particles (<3 μm) and total phosphorus (TP). Real-time turbidity and pressure monitoring via sensor networks connected to a microcontroller enabled wireless data logging and automated backwash initiation when thresholds were exceeded. Under manual operation, the two-stage filter achieved removals of 96.6% turbidity, 98.8% suspended solids (SS), and 85.6% TP while maintaining head loss below 10 cm. In IoT-controlled single-stage runs with highly polluted influent (turbidity ~400 NTU, SS > 1000 mg/L, TP ~1.6 mg/L), the system maintained >90% SS and ~58% TP removal with stable head loss (~8 cm) and no manual intervention. Turbidity correlated strongly with SS (R² ≈ 0.94) and TP (R² ≈ 0.87), validating its use as a surrogate control parameter. Compared with conventional BMPs, the developed filter demonstrated superior solids capture, competitive phosphorus removal, and the novel capability of real-time autonomous operation, providing proof-of-concept for next-generation smart BMPs capable of meeting regulatory standards while reducing maintenance. Full article
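The automated backwash initiation described here reduces to a threshold comparison on the monitored turbidity and head-loss readings. A minimal sketch; the 10 cm head-loss limit comes from the abstract, while the turbidity limit is a hypothetical placeholder, not the authors' setpoint:

```python
class BackwashController:
    """Threshold-based backwash triggering, as described in the abstract:
    a backwash is initiated when real-time turbidity or head-loss readings
    exceed configured limits. The turbidity threshold is illustrative."""

    def __init__(self, turbidity_limit_ntu=50.0, head_loss_limit_cm=10.0):
        self.turbidity_limit = turbidity_limit_ntu
        self.head_loss_limit = head_loss_limit_cm

    def should_backwash(self, turbidity_ntu: float, head_loss_cm: float) -> bool:
        # Trigger if either monitored variable crosses its limit.
        return (turbidity_ntu > self.turbidity_limit
                or head_loss_cm > self.head_loss_limit)
```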
(This article belongs to the Section Urban Water Management)

13 pages, 6272 KB  
Article
A Design of 1.2–3.6 GHz Power Amplifier Based on Filters of Negative Feedback Network
by Zhenghao Yang, Chucai Cai, Zhengxian Meng, Zhiyong Ding, Quanbin Fu, Xiaogang Wang and Zhiqun Cheng
Electronics 2025, 14(24), 4944; https://doi.org/10.3390/electronics14244944 - 17 Dec 2025
Abstract
This work proposes a broadband, high-efficiency extended continuous class-F (ECCF) power amplifier (PA) with a negative-feedback network structure. Compared with the traditional direct cascade connection of a PA and a filter, the design introduces a novel negative-feedback filter structure. The transistor and filter synthesis network co-design method compensates for the gain and efficiency drop of the PA in both the high and low frequency bands, resulting in relatively flat gain and efficiency performance over a wide band. The proposed method is verified with a designed and fabricated 10 W GaN HEMT device. The measured data reveal that the designed PA achieves 100% relative bandwidth from 1.2 GHz to 3.6 GHz, with a drain efficiency (DE) of 59.5–67.4%, an output power of 38.8–41.8 dBm, and a large-signal gain of 8.8–11.8 dB. Full article
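The quoted 100% relative bandwidth follows directly from the standard fractional-bandwidth definition, which can be checked in one line:

```python
def fractional_bandwidth(f_low_ghz: float, f_high_ghz: float) -> float:
    """Relative (fractional) bandwidth: 2 * (fH - fL) / (fH + fL).
    For 1.2-3.6 GHz this evaluates to 1.0, i.e. 100%."""
    return 2.0 * (f_high_ghz - f_low_ghz) / (f_high_ghz + f_low_ghz)
```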
(This article belongs to the Section Microwave and Wireless Communications)

17 pages, 38027 KB  
Article
Model-Driven Wireless Planning for Farm Monitoring: A Mixed-Integer Optimization Approach
by Gerardo Cortez, Milton Ruiz, Edwin García and Alexander Aguila
Eng 2025, 6(12), 369; https://doi.org/10.3390/eng6120369 - 17 Dec 2025
Abstract
This study presents an optimization-driven design of a wireless communications network to continuously transmit environmental variables—temperature, humidity, weight, and water usage—in poultry farms. The reference site is a four-shed facility in Quito, Ecuador (each shed 120 m × 12 m) with a data center located 200 m from the sheds. Starting from a calibrated log-distance path-loss model, coverage is declared when the received power exceeds the receiver sensitivity of the selected technology. Gateway placement is cast as a mixed-integer optimization that minimizes deployment cost while meeting target coverage and per-gateway capacity; a capacity-aware greedy heuristic provides a robust fallback when exact solvers stall or instances become too large for interactive use. Sensing instruments are Tekon devices using the Tinymesh protocol (IEEE 802.15.4g), selected for low-power operation and suitability for elongated farm layouts. Model parameters and technology presets inform a pre-optimization sizing step—based on range and coverage probability—that seeds candidate gateway locations. The pipeline integrates MATLAB R2024b and LpSolve 5.5.2.0 for the optimization core, Radio Mobile for network-coverage simulations, and Wireshark for on-air packet analysis and verification. On the four-shed case, the algorithm identifies the number and positions of gateways that maximize coverage probability within capacity limits, reducing infrastructure while enabling continuous monitoring. The final layout derived from simulation was implemented onsite, and end-to-end tests confirmed correct operation and data delivery to the farm’s data center. By combining technology-aware modeling, optimization, and field validation, the work provides a practical blueprint to right-size wireless infrastructure for agricultural monitoring. Quantitatively, the optimization couples coverage with capacity and scales with the number of endpoints M and candidate sites N (M + N + MN binary variables). On the four-shed case, the planner serves 72 environmental endpoints and 41 physical-variable endpoints while keeping the gateway count fixed and reducing the required link ports from 16 to 4 and from 16 to 6, respectively, corresponding to optimization gains of up to 82% and 70% versus dense baseline plans. Definitions and a measurement plan for packet delivery ratio (PDR), one-way latency, throughput, and energy per delivered sample are included; detailed long-term numerical results for these metrics are left for future work, since the present implementation was validated through short-term acceptance tests. Full article
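The coverage rule described above (received power must exceed receiver sensitivity under a log-distance path-loss model) can be sketched as follows; the transmit power, reference path loss, exponent, and sensitivity values in the test are illustrative, not the paper's calibrated parameters:

```python
import math

def received_power_dbm(tx_power_dbm: float, pl0_db: float, n: float,
                       d_m: float, d0_m: float = 1.0) -> float:
    """Log-distance path-loss model: PL(d) = PL(d0) + 10*n*log10(d/d0).
    Returns the received power in dBm (antenna gains folded into tx power)."""
    path_loss_db = pl0_db + 10.0 * n * math.log10(d_m / d0_m)
    return tx_power_dbm - path_loss_db

def covered(tx_power_dbm: float, pl0_db: float, n: float,
            d_m: float, sensitivity_dbm: float) -> bool:
    """Coverage is declared when received power exceeds receiver sensitivity."""
    return received_power_dbm(tx_power_dbm, pl0_db, n, d_m) >= sensitivity_dbm
```

The mixed-integer placement step would then use this predicate to decide which candidate gateway sites cover which endpoints.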
(This article belongs to the Section Electrical and Electronic Engineering)

22 pages, 1380 KB  
Article
Selection of Optimal Cluster Head Using MOPSO and Decision Tree for Cluster-Oriented Wireless Sensor Networks
by Rahul Mishra, Sudhanshu Kumar Jha, Shiv Prakash and Rajkumar Singh Rathore
Future Internet 2025, 17(12), 577; https://doi.org/10.3390/fi17120577 - 15 Dec 2025
Abstract
Wireless sensor networks (WSNs) consist of distributed nodes that monitor various physical and environmental parameters. The sensor nodes (SNs) are usually constrained in resources such as power supply, communication, and computation capacity. In a WSN, energy consumption varies with the distance between sender and receiver SNs: communication between distant SNs requires significantly more energy, which negatively affects network longevity. To address this, WSNs are deployed using multi-hop routing, which reduces communication distance and cost, but finding an optimal cluster head (CH) and route remains an issue. An optimal CH reduces energy consumption and maintains reliable data transmission throughout the network. To improve the performance of multi-hop routing in WSNs, we propose a model that combines Multi-Objective Particle Swarm Optimization (MOPSO) and a Decision Tree for dynamic CH selection. The proposed model consists of two phases: an offline phase and an online phase. In the offline phase, various network scenarios with different node densities, initial energy levels, and base station (BS) positions are simulated, the required features are collected, and MOPSO is applied to generate a Pareto front of optimal CH nodes that balance energy efficiency, coverage, and load balancing. Each node is labeled by MOPSO as a selected CH or not, and the labeled dataset is then used to train a Decision Tree classifier, yielding a lightweight and interpretable model for CH prediction. In the online phase, the trained model is used in the deployed network to quickly and adaptively select CHs by classifying each node as a CH or non-CH from its features. The predicted CHs broadcast this information and manage intra-cluster communication, data aggregation, and routing to the base station. CH selection is re-initiated when residual energy drops below a threshold, the load saturates, or coverage degrades. The simulation results demonstrate that the proposed model outperforms protocols such as LEACH, HEED, and standard PSO in energy efficiency and network lifetime, making it highly suitable for applications in green computing, environmental monitoring, precision agriculture, healthcare, and industrial IoT. Full article
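The online-phase classification step amounts to running each node's features through the trained tree. A toy, hand-written stand-in for such a tree; the real split features and thresholds would be learned from the MOPSO-labeled dataset, and the values here are hypothetical:

```python
def classify_cluster_head(residual_energy: float, degree: int,
                          dist_to_bs: float,
                          energy_thresh: float = 0.6,
                          degree_thresh: int = 5,
                          dist_thresh: float = 80.0) -> bool:
    """Decision-tree-style CH classifier (illustrative splits): favour nodes
    with high residual energy, good local connectivity (node degree), and
    a moderate distance to the base station."""
    if residual_energy < energy_thresh:
        return False          # too little energy left to serve as CH
    if degree < degree_thresh:
        return False          # poorly connected node makes a bad hub
    return dist_to_bs <= dist_thresh
```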
(This article belongs to the Special Issue Clustered Federated Learning for Networks)

23 pages, 3582 KB  
Article
Compact Onboard Telemetry System for Real-Time Re-Entry Capsule Monitoring
by Nesrine Gaaliche, Christina Georgantopoulou, Ahmed M. Abdelrhman and Raouf Fathallah
Aerospace 2025, 12(12), 1105; https://doi.org/10.3390/aerospace12121105 - 14 Dec 2025
Abstract
This paper describes a compact, low-cost telemetry system featuring ready-made sensors and an ESP32-based acquisition unit that uses the LoRa/Wi-Fi wireless standards for communication, with autonomous fallback logging to guarantee data recovery during communication loss. Ensuring safe atmospheric re-entry requires reliable onboard monitoring of capsule conditions during descent. The system is intended for sub-orbital, low-cost educational capsules and experimental atmospheric descent missions rather than full orbital re-entry at hypersonic speeds, where the environmental loads and communication constraints differ significantly. The novelty of this work is the development of a fully self-contained telemetry system that ensures continuous monitoring and fallback logging without external infrastructure, bridging the gap in compact solutions for CubeSat-scale capsules. In contrast to existing approaches built around UAVs or radar, the proposed design is entirely self-contained, lightweight, and tailored to CubeSat-class and academic missions, where costs and infrastructure are limited. Ground test validation consisted of vertical drop tests, wind tunnel runs, and hardware-in-the-loop simulations. In addition, high-temperature thermal cycling tests were performed to assess system reliability under rapid temperature transitions between −20 °C and +110 °C, confirming stable operation and data integrity under thermal stress. Results showed over 95% real-time packet success with full data recovery in blackout events, while acceleration profiling confirmed resilience to peak decelerations of ~9 g. To complement the telemetry, the TeleCapsNet dataset was introduced, enabling CNN-based recognition of descent states with 87% mean Average Precision and an F1-score of 0.82, which attests to feasibility under constrained computational power. The contribution is twofold: reliable dual-path real-time telemetry with full post-mission recovery, and a scalable platform that explicitly addresses the lack of compact, infrastructure-independent solutions in the existing literature. The result is an independent, cost-effective system with reliable data integrity for small re-entry capsule experimenters, requiring no external infrastructure. Future work will explore deploying AI systems onboard to extend autonomy and to broaden the applicability of the presented approach to academic and low-resource re-entry investigations. Full article
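The fallback-logging behavior described (buffer packets during blackouts, recover them post-mission) can be sketched as a small wrapper around the radio link; the interface here is illustrative, not the authors' firmware API:

```python
class TelemetryLink:
    """Dual-path telemetry with autonomous fallback logging: packets that
    cannot be sent over the radio link are buffered locally and flushed
    later for post-mission recovery."""

    def __init__(self, radio_send):
        self.radio_send = radio_send   # callable returning True on success
        self.fallback_log = []

    def transmit(self, packet) -> str:
        if self.radio_send(packet):
            return "sent"
        self.fallback_log.append(packet)   # blackout: log for recovery
        return "logged"

    def recover(self):
        """Return and clear all packets logged during communication loss."""
        recovered, self.fallback_log = self.fallback_log, []
        return recovered
```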

19 pages, 968 KB  
Article
UAV-Assisted Cooperative Charging and Data Collection Strategy for Heterogeneous Wireless Sensor Networks
by Yuanxue Xin, Liang Li, Yue Ning, Yi Yang and Pengfei Shi
Drones 2025, 9(12), 859; https://doi.org/10.3390/drones9120859 - 13 Dec 2025
Abstract
Unmanned Aerial Vehicles (UAVs) are playing an increasingly crucial role in large-scale Wireless Sensor Networks (WSNs) due to their high mobility and flexible deployment capabilities. To enhance network sustainability and profitability, this paper proposes a coordinated charging and data-collection system that integrates a green energy base station, Wireless Charging Vehicles (WCVs), and UAVs, ensuring full coverage of all sensor nodes in the target region. In addition, the economic feasibility of charging strategies, an essential but usually neglected factor, is considered: we design a joint optimization algorithm to simultaneously maximize system profit and node survivability. To this end, we design a cylindrical-sector-based charging sequence for WCVs. In particular, we develop a dynamic cluster head selection algorithm that accounts for buffer size, residual energy, and inter-node distance. This scheme prevents cluster heads from running out of energy before the charging devices arrive, thereby ensuring reliable data transmission. Simulation results demonstrate that the proposed strategy not only maximizes overall profit but also significantly improves node survivability and enhances the sustainability of the wireless sensor network. Full article
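The dynamic cluster head selection over buffer size, residual energy, and inter-node distance can be illustrated as a weighted score over normalized features; the weights and field names below are hypothetical, not the paper's values:

```python
def select_cluster_head(nodes: list) -> str:
    """Pick the node with the best weighted score over the three factors
    named in the abstract. Each node dict carries normalized features in
    [0, 1]: 'energy' (higher is better), 'buffer_fill' and 'avg_dist_norm'
    (lower is better). Weights are illustrative."""
    def score(n):
        return (0.4 * n["energy"]
                + 0.3 * (1.0 - n["buffer_fill"])
                + 0.3 * (1.0 - n["avg_dist_norm"]))
    return max(nodes, key=score)["id"]
```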
(This article belongs to the Section Drone Communications)

16 pages, 640 KB  
Systematic Review
A Systematic Review of Building Energy Management Systems (BEMSs): Sensors, IoT, and AI Integration
by Leyla Akbulut, Kubilay Taşdelen, Atılgan Atılgan, Mateusz Malinowski, Ahmet Coşgun, Ramazan Şenol, Adem Akbulut and Agnieszka Petryk
Energies 2025, 18(24), 6522; https://doi.org/10.3390/en18246522 - 12 Dec 2025
Abstract
The escalating global demand for energy-efficient and sustainable built environments has catalyzed the advancement of Building Energy Management Systems (BEMSs), particularly through their integration with cutting-edge technologies. This review presents a comprehensive and critical synthesis of the convergence between BEMSs and enabling tools such as the Internet of Things (IoT), wireless sensor networks (WSNs), and artificial intelligence (AI)-based decision-making architectures. Drawing upon 89 peer-reviewed publications spanning from 2019 to 2025, the study systematically categorizes recent developments in HVAC optimization, occupancy-driven lighting control, predictive maintenance, and fault detection systems. It further investigates the role of communication protocols (e.g., ZigBee, LoRaWAN), machine learning-based energy forecasting, and multi-agent control mechanisms within residential, commercial, and institutional building contexts. Findings across multiple case studies indicate that hybrid AI–IoT systems have achieved energy efficiency improvements ranging from 20% to 40%, depending on building typology and control granularity. Nevertheless, the widespread adoption of such intelligent BEMSs is hindered by critical challenges, including data security vulnerabilities, lack of standardized interoperability frameworks, and the complexity of integrating heterogeneous legacy infrastructure. Additionally, there remain pronounced gaps in the literature related to real-time adaptive control strategies, trust-aware federated learning, and seamless interoperability with smart grid platforms. By offering a rigorous and forward-looking review of current technologies and implementation barriers, this paper aims to serve as a strategic roadmap for researchers, system designers, and policymakers seeking to deploy the next generation of intelligent, sustainable, and scalable building energy management solutions. Full article

24 pages, 29424 KB  
Article
High-Degree Connectivity Sensor Networks: Applications in Pastured Cow Herd Monitoring
by Geunho Lee, Teruyuki Yamane, Kota Okabe, Fumiaki Sugino and Yeunwoong Kyung
Future Internet 2025, 17(12), 569; https://doi.org/10.3390/fi17120569 - 12 Dec 2025
Abstract
This paper explores the application of mobile sensor networks to cow herds, focusing on the challenge of achieving local communication under minimal computational constraints such as restricted locality, limited memory, and implicit coordination. To address this, we propose a high-connectivity-based sensor network scheme that enables individual sensors to self-organize and dynamically adapt to topological variations caused by cow movements. In this scheme, each sensor acquires local distribution data from neighboring sensors, identifies those with high connectivity, and forms a local network with a star topology. The overlap of these local networks results in a globally interconnected mesh topology. In addition, information exchanged through broadcasting and overhearing allows each sensor to incrementally update and adapt to dynamic changes in its local network. To validate the proposed scheme, a custom wireless sensor tag was developed and mounted on the necks of individual cows for experimental testing. Furthermore, large-scale simulations were performed to evaluate performance in herd environments. Both experimental and simulation results confirmed that the scheme effectively maintains network coverage and connectivity under dynamic herd conditions. Full article
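The hub-selection step, where each sensor picks its highest-connectivity one-hop neighbor as the center of its local star, can be sketched over a neighbor table; the adjacency-dict layout is illustrative:

```python
def choose_hub(neighbors_of: dict, me: str) -> str:
    """Each sensor inspects its one-hop neighbours and selects the one
    with the highest connectivity (degree) as the hub of its local star.
    `neighbors_of` maps a sensor id to the list of its neighbours."""
    candidates = neighbors_of[me]
    return max(candidates, key=lambda n: len(neighbors_of[n]))
```

Overlapping stars built this way yield the globally interconnected mesh the abstract describes.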
(This article belongs to the Special Issue Intelligent Telecommunications Mobile Networks)

17 pages, 5905 KB  
Article
Internet of Plants: Machine Learning System for Bioimpedance-Based Plant Monitoring
by Łukasz Matuszewski, Jakub Nikonowicz, Jakub Bonczyk, Mateusz Tychowski, Tomasz P. Wyka and Clément Duhart
Sensors 2025, 25(24), 7549; https://doi.org/10.3390/s25247549 - 12 Dec 2025
Abstract
Sensors in plant and crop monitoring play a key role in improving agricultural efficiency by enabling the collection of data on environmental conditions, soil moisture, temperature, sunlight, and nutrient levels. Traditionally, wide-scale wireless sensor networks (WSNs) gather this information in real time, supporting the optimization of cultivation processes and plant management. Our paper proposes a novel “plant-to-machine” interface, which uses a plant-based biosensor as a primary data source. This model allows for direct monitoring of the plant’s physiological parameters and environmental interactions via Electrical Impedance Spectroscopy (EIS), aiming to reduce the reliance on extensive sensor networks. We present simple data-gathering hardware, a non-invasive single-wire connection, and a machine learning-based framework that supports the automatic analysis and interpretation of collected data. This approach seeks to simplify monitoring infrastructure and decrease the cost of digitizing crop monitoring. Preliminary results demonstrate the feasibility of the proposed model in monitoring plant responses to sunlight exposure. Full article
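EIS characterizes tissue by its complex impedance across a frequency sweep. A minimal sketch using a parallel-RC element, a common simplified model in impedance spectroscopy; the component values in the test are illustrative, not measured plant parameters:

```python
import math

def parallel_rc_impedance(r_ohm: float, c_farad: float, freq_hz: float) -> complex:
    """Impedance of a resistor and capacitor in parallel:
    Z = R / (1 + j*2*pi*f*R*C).
    At low frequency Z approaches R; at high frequency Z approaches 0,
    which is the dispersion an EIS sweep measures."""
    omega = 2.0 * math.pi * freq_hz
    return r_ohm / (1 + 1j * omega * r_ohm * c_farad)
```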
(This article belongs to the Section Smart Agriculture)

22 pages, 13391 KB  
Article
LSCNet: A Lightweight Shallow Feature Cascade Network for Small Object Detection in UAV Imagery
by Zening Wang and Amiya Nayak
Future Internet 2025, 17(12), 568; https://doi.org/10.3390/fi17120568 - 11 Dec 2025
Abstract
Unmanned Aerial Vehicles have become essential mobile sensing nodes in Internet of Things ecosystems, with applications ranging from disaster monitoring to traffic surveillance. However, wireless bandwidth is severely strained when enormous amounts of high-resolution aerial video are sent to ground stations. To address these communication limitations, the current research paradigm is shifting toward UAV-assisted edge computing, where visual data is processed locally to extract semantic information for transmitting results to the ground or making autonomous decisions. Although deep detectors dominate general object detection, their heavy computational burden struggles to meet the stringent efficiency requirements of airborne edge platforms. Moreover, although recently proposed single-stage models like YOLOv10 can quickly detect objects in natural images, their over-reliance on deep features wastes computational resources, since shallow information is crucial for small object detection in aerial scenes. In this paper, we propose LSCNet (Lightweight Shallow Feature Cascade Network), a novel lightweight architecture designed for UAV edge computing to handle aerial object detection tasks. The lightweight cascade network focuses on feature extraction and shallow feature enhancement. LSCNet achieves 44.6% mAP50 on VisDrone2019 and 36.1% mAP50 on UAVDT while decreasing parameters by 33% to 1.48 M. These results not only show how effective LSCNet is for real-time object detection but also provide a foundation for future developments in semantic communication within aerial networks. Full article
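The mAP50 figures quoted are built on intersection-over-union (IoU) matching at a 0.5 threshold; the underlying overlap measure is standard and can be written directly:

```python
def iou(box_a: tuple, box_b: tuple) -> float:
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2),
    the overlap measure behind mAP50: a detection counts as correct when
    its IoU with a ground-truth box is at least 0.5."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Tiny objects make this metric unforgiving: a few pixels of localization error on a small box moves IoU far more than on a large one, which is why shallow, high-resolution features matter in aerial scenes.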

20 pages, 324 KB  
Review
LPWAN Technologies for IoT: Real-World Deployment Performance and Practical Comparison
by Dmitrijs Orlovs, Artis Rusins, Valters Skrastiņš and Janis Judvaitis
IoT 2025, 6(4), 77; https://doi.org/10.3390/iot6040077 - 10 Dec 2025
Abstract
Low Power Wide Area Networks (LPWANs) have emerged as essential connectivity solutions for the Internet of Things (IoT), addressing requirements for long-range, energy-efficient communication that traditional wireless technologies cannot meet. With LPWAN connections projected to grow at a 26% compound annual growth rate until 2027, understanding real-world performance is crucial for technology selection. This review examines four leading LPWAN technologies—LoRaWAN, Sigfox, Narrowband IoT (NB-IoT), and LTE-M—and analyzes 20 peer-reviewed studies from 2015–2025 reporting real-world deployment metrics across power consumption, range, data rate, scalability, availability, and security. Across these studies, practical performance diverges from vendor specifications. In the cited rural and urban deployments, LoRaWAN achieves 2+ year battery life and 11 km rural range but suffers collision limitations above 1000 devices per gateway. Sigfox demonstrates exceptional range (a 280 km record) with minimal power consumption but remains constrained by 12-byte payloads and security vulnerabilities. NB-IoT provides robust performance with 96–100% packet delivery ratios at −127 dBm on the tested commercial networks, and supports tens of thousands of devices per cell, though mobility increases energy consumption. In the cited trials, LTE-M offers the highest throughput and sub-200 ms latency but fails beyond −113 dBm, where NB-IoT maintains connectivity. NB-IoT emerges as optimal for large-scale stationary deployments, while LTE-M suits high-throughput mobile applications. Full article
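The sensitivity comparison (NB-IoT working at −127 dBm where LTE-M fails beyond −113 dBm) can be framed as a link-margin calculation; the transmit power and path-loss numbers in the test are illustrative, not figures from the reviewed deployments:

```python
def link_margin_db(tx_power_dbm: float, tx_gain_dbi: float, rx_gain_dbi: float,
                   path_loss_db: float, sensitivity_dbm: float) -> float:
    """Link margin = received power - receiver sensitivity.
    A positive margin means the link closes. NB-IoT's -127 dBm sensitivity
    gives it roughly 14 dB more margin than LTE-M's -113 dBm on the same
    link, matching the review's observation."""
    rx_power_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - path_loss_db
    return rx_power_dbm - sensitivity_dbm
```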
15 pages, 6102 KB  
Article
Design and Analysis of a Dual-Band Implantable Receiving Antenna for Wireless Power Transfer and Data Communication at 1.32 GHz and 2.58 GHz
by Ashfaq Ahmad, Sun-Woong Kim and Dong-You Choi
Sensors 2025, 25(24), 7507; https://doi.org/10.3390/s25247507 - 10 Dec 2025
Abstract
This paper presents the design and performance evaluation of a compact dual-band implantable antenna (Rx) operating at 1.32 GHz and 2.58 GHz for biomedical applications. The proposed antenna is designed to receive power and data from an external transmitting (Tx) antenna operating at 1.32 GHz. The measured impedance bandwidths of the Rx antenna are 190 MHz (1.23–1.42 GHz) and 230 MHz (2.47–2.70 GHz), covering both the power transfer and data communication bands. The wireless power transfer efficiency, represented by the transmission coefficient (S21), is observed to be −40 dB at a spacing of 40 mm, where the Rx is located in the far-field region of the Tx. Specific Absorption Rate (SAR) analysis is performed to ensure electromagnetic safety compliance, and the results are within the acceptable exposure limits. The proposed antenna achieves a realized gain of −25 dB at 1.32 GHz and −25.8 dB at 2.58 GHz, demonstrating suitable performance for low-power implantable medical device communication and power transfer systems. The proposed design offers a promising solution for reliable biotelemetry and wireless power transfer in implantable biomedical systems. Full article
(This article belongs to the Special Issue Novel Implantable Sensors and Biomedical Applications)
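As a quick sanity check on the figures above, dB power quantities convert to linear ratios as 10^(dB/10). The sketch below uses the abstract's reported S21 of −40 dB together with an assumed (purely illustrative) 1 W (30 dBm) external transmitter:

```python
def db_to_linear_power(db: float) -> float:
    """Convert a power ratio in dB to a linear ratio."""
    return 10 ** (db / 10)

# Value reported in the abstract: Tx -> Rx transmission coefficient at 40 mm.
s21_db = -40.0
efficiency = db_to_linear_power(s21_db)  # fraction of Tx power reaching the Rx
print(f"efficiency = {efficiency:.0e}")  # 0.01 % of the transmitted power

# Received power for an assumed 1 W (30 dBm) external transmitter:
tx_dbm = 30.0
rx_dbm = tx_dbm + s21_db           # dB quantities add along the link
rx_mw = db_to_linear_power(rx_dbm)
print(f"delivered = {rx_mw:.1f} mW")
```

A −40 dB S21 thus corresponds to a 10⁻⁴ power-transfer efficiency, which is typical for far-field links to deeply implanted antennas.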
14 pages, 725 KB  
Article
IRS-Assisted Dual-Mode Relay-Based Adaptive Transmission
by Dabao Wang, Yanhong Xu, Zhangbo Gao, Hanqing Ding, Shitong Zhu and Zhao Li
Sensors 2025, 25(24), 7492; https://doi.org/10.3390/s25247492 - 9 Dec 2025
Abstract
To address the increased power consumption of traditional active relays and the difficulty Intelligent Reflecting Surfaces (IRSs) face in countering channel fading, we propose a dual-mode relay (DMR) that can dynamically switch between two operational modes: active relaying and passive IRS reflection. The DMR allows its units (DMRUs) to select their operational modes based on channel conditions, enabling the transmission of composite-mode signals that consist of both actively relayed and IRS-reflected components and thus enhancing adaptation to the wireless environment. Furthermore, under a limited transmit-power constraint, we introduce a DMR-based Adaptive Transmission (DMRAT) method. This approach explores all possible DMR operational modes and, in each mode, employs the Alternating Optimization (AO) algorithm to jointly optimize the beamforming matrices of the transmitter and the DMR along with the reflection coefficient matrix of the IRS, thereby maximizing the data transmission rate of the target communication pair. The optimal DMR mode is then the one yielding the highest optimized data rate. Simulation results demonstrate that the proposed method significantly enhances the data transmission rate of the target communication pair. Full article
(This article belongs to the Special Issue Future Wireless Communication Networks: 3rd Edition)
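The alternating-optimization pattern described above can be illustrated with a deliberately simplified toy problem: alternately solving for a transmit and a receive beamformer, each sub-problem having a closed-form matched-filter solution. This is only a schematic of the AO idea, not the paper's DMRAT algorithm (which additionally optimizes the IRS reflection coefficients and the relay mode):

```python
import numpy as np

def alternating_beamforming(H: np.ndarray, iters: int = 500):
    """Toy AO: maximize |w_rx^H H w_tx| by fixing one beamformer and
    solving the other in closed form; converges to the top singular pair."""
    rng = np.random.default_rng(0)
    n_rx, n_tx = H.shape
    w_tx = rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)
    w_tx /= np.linalg.norm(w_tx)
    for _ in range(iters):
        # Sub-problem 1 (w_tx fixed): optimal receive filter is H w_tx.
        w_rx = H @ w_tx
        w_rx /= np.linalg.norm(w_rx)
        # Sub-problem 2 (w_rx fixed): optimal transmit filter is H^H w_rx.
        w_tx = H.conj().T @ w_rx
        w_tx /= np.linalg.norm(w_tx)
    gain = abs(w_rx.conj() @ H @ w_tx)
    return w_rx, w_tx, gain

rng = np.random.default_rng(1)
H = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
_, _, gain = alternating_beamforming(H)
rate = np.log2(1 + gain**2)  # achievable rate for unit noise power
print(f"beamforming gain {gain:.3f}, rate {rate:.3f} bit/s/Hz")
```

Each alternation is monotonically non-decreasing in the objective, which is the same property that makes AO attractive for the joint beamforming/reflection design described in the abstract.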
20 pages, 3599 KB  
Article
An Adaptative Wavelet Time–Frequency Transform with Mamba Network for OFDM Automatic Modulation Classification
by Hongji Xing, Xiaogang Tang, Lu Wang, Binquan Zhang and Yuepeng Li
AI 2025, 6(12), 323; https://doi.org/10.3390/ai6120323 - 9 Dec 2025
Abstract
Background: With the development of wireless communication technologies, the rapid advancement of 5G and 6G systems has created an urgent demand for low latency and high data rates. Orthogonal Frequency Division Multiplexing (OFDM) with high-order digital modulation has become a key technology owing to its high reliability, high data rate, and low latency, and it has been widely applied in various fields. As a component of cognitive radios, automatic modulation classification (AMC) plays an important role in remote sensing and electromagnetic spectrum sensing. However, current complex channel conditions involve low signal-to-noise ratio (SNR), Doppler frequency shift, and multipath propagation; coupled with the inherently indistinct characteristics of high-order modulation, these factors make AMC for OFDM and high-order digital modulation difficult. Methods: Existing methods are mainly based on a single model-driven or data-driven approach. The Adaptive Wavelet Mamba Network (AWMN) proposed in this paper combines model-driven adaptive wavelet-transform feature extraction with the Mamba deep learning architecture. A module based on the lifting wavelet scheme captures discriminative time–frequency features using learnable operations, while a Mamba network built on the State Space Model (SSM) captures long-term temporal dependencies. Results: Tests conducted on public datasets and a custom-built real-time received OFDM dataset show that the proposed AWMN achieves accuracies of 62.39% and 64.50% on the public RML2016.10a and RML2016.10b datasets and 74.95% on our EVAS dataset, while maintaining a compact parameter size of 0.44 M. Conclusions: These results highlight its potential for improving the automatic modulation classification of high-order OFDM modulation in 5G/6G systems. Full article
(This article belongs to the Topic AI-Driven Wireless Channel Modeling and Signal Processing)
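The lifting wavelet scheme mentioned above factors a wavelet transform into split, predict, and update steps. In AWMN those predict/update operators are stated to be learnable; the sketch below instead uses fixed Haar-style operators, purely to illustrate the mechanism and its exact invertibility:

```python
import numpy as np

def lifting_haar_forward(x: np.ndarray):
    """One level of the lifting scheme (Haar flavour):
    split -> predict -> update. In AWMN-style networks the predict and
    update operators are learnable; here they are fixed constants."""
    even, odd = x[0::2], x[1::2]
    detail = odd - even            # predict: odd samples from even ones
    approx = even + 0.5 * detail   # update: approx equals the pairwise mean
    return approx, detail

def lifting_haar_inverse(approx: np.ndarray, detail: np.ndarray):
    """Lifting is trivially invertible: undo the steps in reverse order."""
    even = approx - 0.5 * detail
    odd = detail + even
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.arange(8, dtype=float)
a, d = lifting_haar_forward(x)
print("approx:", a, "detail:", d)
print("reconstructed:", lifting_haar_inverse(a, d))
```

Because every lifting step is undone exactly by its mirror step, the transform remains perfectly invertible even when the predict/update operators are replaced by learned nonlinear functions, which is what makes the scheme attractive as a feature extractor.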