1. Introduction
The advancement of the digital age has significantly transformed the management of healthcare services. Wireless communication has become one of the most indispensable enablers of modern eHealth systems, supporting continuous patient monitoring, real-time data exchange, and seamless connectivity between healthcare professionals and medical devices. Hospitals and care facilities currently rely heavily on wireless technologies to connect a wide variety of devices, ranging from wearable sensors to ventilators and other critical medical equipment. In most modern healthcare settings, Wi-Fi serves as a backbone for connectivity due to its relatively low cost, high accessibility, and ability to support many devices simultaneously. However, with the rapid proliferation of interconnected devices, existing Wi-Fi systems are approaching their operational limits. The core challenge addressed in this paper is whether legacy Wi-Fi 5 (IEEE 802.11ac) can reliably support dense (large numbers of stations) Internet of Medical Things (IoMT) deployments with strict latency, reliability, and energy constraints, and to what extent Wi-Fi 6/7 mechanisms, specifically OFDMA and Target Wake Time (TWT), can mitigate these limitations.
Most existing healthcare facilities and hospitals still operate Wi-Fi 5, which is based on Orthogonal Frequency-Division Multiplexing (OFDM). Although OFDM has served effectively for the last decade, it was designed primarily for general-purpose connectivity rather than for high-density networks and energy-constrained devices. Combined with the stringent Quality of Service (QoS) requirements of modern eHealth and medical IoMT applications, this limits its suitability in such settings. A fundamental limitation of OFDM is that it transmits to a single user per channel at a time. In multi-user scenarios, such as hospital settings with numerous devices transmitting simultaneously, channel access contention therefore increases, causing excessive packet loss, delay, and jitter.
To overcome the challenges faced by Wi-Fi 5, IEEE introduced Wi-Fi 6 (IEEE 802.11ax) and, more recently, Wi-Fi 7 (IEEE 802.11be) incorporating transformative technologies such as OFDMA and TWT. Unlike OFDM that allows a single user per channel, OFDMA enables multiple users to share the same channel, improving spectral efficiency and reducing contention. This capability is particularly relevant in high-density IoMT environments as timely and reliable data delivery is critical. Complementing OFDMA, the TWT mechanism allows devices to negotiate specific wake and sleep intervals with the access point, resulting in significant energy savings that are an essential consideration for battery-powered medical devices.
To ensure realistic simulations, the traffic model reflects typical IoMT communication patterns, characterized by small, periodic payloads commonly observed in wearable sensors and medical monitoring devices. This allows for a practical assessment of network performance under conditions that closely mimic real-world hospital environments.
While the individual benefits of OFDMA and TWT are documented in general IoT contexts, their combined trade-offs under the specific constraints of dense IoMT, characterized by high-frequency, small-payload (64-byte) telemetry, have remained under-explored. Previous foundational research established that 802.11ax offers significant performance gains over 802.11ac for real-time traffic applications [1]; however, this study’s unique contribution lies in quantifying the threshold at which TWT’s energy preservation begins to compromise the aggregate transmission reliability (i.e., QoS performance) required for clinical applications. By synthesizing these findings into a targeted deployment framework, this study provides a decision matrix for hospital IT infrastructure that balances battery longevity against mission-critical telemetry integrity.
Despite the advantages of digital healthcare, dense eHealth and IoMT networks frequently suffer from critical performance bottlenecks, including excessive latency, packet loss, and rapid battery depletion. While Wi-Fi 6/7 introduces advanced features to mitigate these issues, the operational trade-offs between OFDMA and TWT remain insufficiently understood. Specifically, existing studies do not clearly demonstrate the precise conditions under which a network designer should prioritise OFDMA for throughput and reliability versus TWT for energy preservation. This study addresses that gap by quantifying these trade-offs in a simulated clinical environment. IoMT traffic was modelled in NS-3 using a 64-byte payload reflecting realistic medical device data, ensuring that the simulations capture the behaviour of small, frequent IoMT transmissions and bridging the gap between generic IoT studies and the specific needs of healthcare environments. In addressing these challenges, the study provides a comprehensive evaluation of Wi-Fi 6 Quality of Service (QoS) tailored to high-density eHealth traffic. It offers a detailed comparative performance analysis between OFDMA and legacy OFDM to identify the density thresholds at which multi-user access becomes superior, and it investigates the impact of Target Wake Time (TWT) on device energy consumption and its subsequent effect on network-wide QoS. The paper concludes with evidence-based deployment recommendations intended to guide hospital IT administrators and engineers in the transition from legacy Wi-Fi 5 to modern standards. The contributions of this study are as follows:
It examines how uplink OFDMA and traditional OFDM behave under high device density, using 64-byte medical telemetry packets to represent IoMT traffic in clinical environments.
It shows that, although OFDMA improves performance as the number of devices increases, the signalling and scheduling overhead can reduce its efficiency when packet sizes are small. This highlights a practical trade-off between spectral efficiency and protocol overhead in dense IoMT networks.
It provides a joint analysis of OFDMA and Target Wake Time (TWT), showing how energy savings achieved through TWT can affect throughput and latency when many medical devices transmit concurrently.
Based on these findings, the paper presents practical guidance to help network designers decide when OFDMA and selective TWT are suitable for different classes of medical IoT devices in hospital deployments.
The remainder of this paper is organized as follows:
Section 2 reviews related literature on wireless communication protocols and IoMT networking in healthcare environments.
Section 3 explains the methodology and details the NS-3 simulation setup, including ethical considerations.
Section 4 presents the simulation results and analysis, comparing the performance of Wi-Fi 5 and Wi-Fi 6/7.
Section 5 concludes with key findings, recommendations for implementation, and directions for future research.
2. Literature Review
2.1. Medical IoT Devices and Network Demands
Integration of eHealth and medical IoMT devices necessitates a robust, reliable, and low-latency wireless infrastructure. According to the Scottish Government’s eHealth strategy [
2], major objectives for eHealth in the NHS include improving patient information availability, supporting communication within the NHS, and enhancing care integration. Successful realisation of these objectives depends on reliable and timely data transmission, which remains essential despite advances in IoT technologies. Beyond the physical and MAC layer connectivity discussed in this study, the broader IoMT ecosystem requires integrated intelligence for data management. For instance, recent advancements suggest the use of autonomous online learning methods for tutorial generation in healthcare education [
3] and the implementation of federated learning combined with blockchain to secure access control for sensitive genomic and clinical data [
4]. While these higher-layer security and learning frameworks are critical for IoMT ecosystem integrity, they rely fundamentally on the stable, low-latency Wi-Fi 6/7 foundations evaluated in this research. Recent studies have highlighted how emerging wireless technologies can address these demands in IoMT networks. One study demonstrated how Industrial IoT (IIoT) systems leverage Wi-Fi 6 and OFDMA to significantly reduce channel access contention by scheduling multiple users simultaneously [
5]. Without OFDMA, stations (STAs) contend to access a channel one at a time thus causing congestion and inefficient spectrum utilization in legacy OFDM networks.
Several studies emphasized the stringent requirements of medical IoMT systems. It has been reported that eHealth diagnostic devices must reliably transmit accurate and secure data to prevent compromised clinical decisions and ensure patient safety [
6]. Similarly, the importance of low-cost and efficient devices to scale IoMT infrastructure effectively has been highlighted [
7]. Portability is also crucial for ubiquitous patient monitoring. The feasibility of developing a low-cost Raspberry Pi–based health monitoring system has been demonstrated [
8]; however, real-time operations still require consistent high QoS from the underlying network.
2.2. Legacy Technology: OFDM
Given the real-time data requirements of medical IoMT devices, the performance of the underlying wireless protocol is critical. Many healthcare networks continue to rely on Wi-Fi 5 (IEEE 802.11ac) that uses OFDM. OFDM divides a 20 MHz channel into 64 subcarriers, transmitting data to a single user at a time [
9]. While effective for low-density networks, OFDM becomes inefficient when many devices operate simultaneously. It is reported that OFDM struggles to meet low-latency, reliable, and scalable communication requirements in modern clinical settings [
10]. As hospitals increasingly deploy dense IoMT networks, channel contention under OFDM leads to increased delay, jitter, and packet loss, emphasizing the need for advanced multi-user wireless protocols.
2.3. Wi-Fi 6 Multi-User Protocol: OFDMA
To overcome OFDM limitations in dense medical IoMT environments, Wi-Fi 6 introduced OFDMA, which divides channels into smaller Resource Units (RUs) that can be allocated to multiple users simultaneously [
11]. This parallel transmission capability reduces contention, improves spectral efficiency, and increases overall network throughput. It has been reported that OFDMA reduces medium access latency by approximately 4 ms and improves throughput by about 10% compared to legacy OFDM networks [
12]. However, OFDMA performance depends on channel availability and precise synchronization to minimize inter-carrier interference (ICI).
System-level simulation models in NS-3, including the vBerlinV2N approach [
13] and updated OFDMA implementations [
14], provide high-fidelity, replicable evaluations of multi-user scheduling under realistic scenarios, including uplink/downlink behaviour, node mobility, and contention management. Additional schedulers, such as Fixed Bandwidth (FBW) and Flexible High Order Layer schedulers, optimize RU allocation depending on network load [
15]. Recent studies explored advanced Medium Access Control (MAC) protocols (HTFA, ERA, PRS) and polling algorithms (A2P) that leverage OFDMA to meet QoS demands in high-density, time-sensitive networks [
16,
17]. These approaches ensure steady throughput, low packet loss, and reduced retransmissions, directly addressing IoMT requirements. Emerging Wi-Fi 7 (IEEE 802.11be) continues to support OFDMA and introduces enhanced multi-user features, indicating that findings from Wi-Fi 6 OFDMA/TWT evaluations remain relevant for upcoming standards and reinforcing the applicability of this study to future IoMT deployments.
2.4. Power Efficiency Protocol: Target Wake Time
TWT is a key feature of Wi-Fi 6 that complements OFDMA by addressing energy consumption challenges in IoMT devices. TWT allows devices to negotiate specific wake and sleep intervals with the access point (AP). During sleep periods, devices remain in a low-power “doze” state, significantly reducing redundant radio activity and overall energy consumption, a critical consideration for battery-powered medical IoMT devices. This capability is particularly valuable in hospital settings, where multiple wearable and monitoring devices operate regularly.
Scheduled access under TWT uses a Trigger Frame (TF), which contains the list of selected STAs and their defined transmission parameters [
18,
19]. This mechanism allows multiple STAs to perform simultaneous uplink transmissions without packet collisions, improving both efficiency and reliability. By coordinating device transmissions, TWT minimizes channel contention, thereby reducing packet loss and improving overall QoS.
While TWT provides substantial merits, several challenges remain. For devices with infrequent traffic, clock drift can lead to missed wake-up intervals, resulting in failed transmissions [
20,
21]. Adjusting the wake interval and duration can help mitigate this, but may involve trade-offs between energy efficiency, throughput, and latency [
22,
23]. Additionally, TWT’s effectiveness depends on accurate scheduling in dense networks. Misconfigurations can reduce its benefits, particularly in high-traffic hospital environments.
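To illustrate the scale of the clock-drift problem described above, the sketch below estimates how far a station's clock can wander during one sleep interval; the drift rate and guard-interval values are illustrative assumptions, not measurements from the cited studies.

```python
def drift_offset_us(sleep_interval_s: float, drift_ppm: float) -> float:
    """Worst-case wake-up misalignment after one sleep interval.

    A crystal that is off by `drift_ppm` parts per million gains or loses
    drift_ppm microseconds for every second spent asleep.
    """
    return sleep_interval_s * drift_ppm

# Illustrative numbers: a 20 ppm clock sleeping for 10 s wakes about
# 200 us off-schedule, which a 500 us guard interval can absorb; a 60 s
# sleep interval (1200 us offset) would overrun that same guard.
assert drift_offset_us(10.0, 20.0) == 200.0
assert drift_offset_us(60.0, 20.0) > 500.0
```

This is why longer wake intervals trade energy savings against the risk of missed service periods: either the guard time must grow (wasting energy) or the station must resynchronize more often.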
This study simulated these networks in NS-3, where TWT can be integrated alongside OFDMA to evaluate energy efficiency and QoS simultaneously. By combining multi-user scheduling with controlled wake/sleep intervals, the simulator can replicate real-world medical IoMT scenarios in which devices transmit small payloads (e.g., 64 bytes) frequently but must remain energy efficient. These simulations help quantify the trade-offs between reduced power consumption and potential throughput decline, particularly under high-density conditions. For example, studies have shown that TWT can reduce device power consumption by up to 90%, but under extreme STA density, throughput may decrease by approximately 75% [
24].
Overall, TWT is a critical mechanism for energy-constrained medical IoMT devices, and when combined with OFDMA, it provides a framework for scalable, low-power, high-performance wireless networks suitable for dense healthcare environments. The simulations in this study explicitly modelled both uplink and downlink behaviours under TWT scheduling, allowing quantitative assessment of its benefits and limitations in realistic deployment scenarios.
2.5. Gaps in Current Research
Despite the documented benefits of IEEE 802.11ax, significant research gaps remain regarding the efficiency-to-overhead ratio in dense IoMT environments. While existing evaluations demonstrate the advantages of OFDMA and TWT in multi-client scenarios, they often highlight critical trade-offs, such as unexpected power spikes, which suggest that real-world network behaviour can deviate from theoretical models [
25,
26]. Furthermore, most studies have focused on general-purpose settings, frequently overlooking the unique constraints of clinical environments where numerous devices must transmit under strict QoS expectations [
25]. While Ref. [
25] assesses the general readiness of Wi-Fi 7 for eHealth, it does not provide an empirical performance evaluation of the combined OFDMA and TWT synergy; our study addresses this by quantifying the trade-offs of using both features simultaneously in a dense medical environment. A specific contradiction, which we term the ‘Efficiency Paradox’, remains under-explored: as station density increases in a hospital ward, the management overhead required for Resource Unit (RU) allocation can become disproportionate to the small 64-byte payloads being sent. This may eventually negate the efficiency gains typically expected from the transition from OFDM to OFDMA. However, this behaviour has not yet been examined through a focused threshold-based analysis of the combined interaction between OFDMA and TWT under dense medical telemetry.
Comparative studies across Wi-Fi generations have highlighted that latency and throughput may vary under high contention and interference. Deployment complexities, including interference, scheduling limitations, and clock drift, are rarely addressed, yet they directly impact patient monitoring and safety [
18,
20,
22]. As Wi-Fi 7 enhances OFDMA and TWT, further scenario-specific evaluations are required for practical IoMT deployment [
27]. Energy efficiency for battery-powered devices is also underexplored, despite the prevalence of wearable and bedside medical devices.
NS-3 simulations have been widely used for Wi-Fi 6/7 studies, but few considered OFDMA and TWT together under realistic IoMT conditions, including small payload traffic, uplink/downlink behaviour, and network contention [
13,
14,
28].
In summary, existing studies confirm that OFDMA and TWT improve efficiency in general Wi-Fi networks. However, there remains limited understanding of how their combined use behaves under dense, uplink-dominated IoMT traffic with very small payloads. In particular, the point at which protocol overhead begins to offset efficiency gains has not been clearly quantified. This unresolved issue motivates the focused evaluation presented in this study.
3. Methodology
3.1. Research Design
This study employed a quantitative experimental framework to investigate the performance of IEEE 802.11ax (Wi-Fi 6) protocols within dense Internet of Medical Things (IoMT) settings. It used two distinct simulation experiments to isolate the impact of specific protocol features on network reliability and energy efficiency:
Simulation 1 (Access Mechanism): Evaluated the transition from Contention-based OFDM to Scheduled OFDMA.
Simulation 2 (Power Management): Evaluated the impact of Target Wake Time (TWT) by comparing a baseline against a TWT-enabled environment.
The NS-3 simulation platform was selected as it allows accurate and repeatable modelling of the 802.11ax PHY and MAC layers. This approach overcame the ethical and safety constraints of conducting live experiments in hospital wards while providing granular control over network density and traffic patterns [
12]. To ensure statistical significance and minimize the impact of transient network fluctuations, each scenario was executed five times with varying random seeds, and the results were averaged to provide the final performance metrics.
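As a minimal sketch of this aggregation step (the per-run values below are invented for illustration, not actual simulation output):

```python
import statistics

# Hypothetical mean-delay results (ms) for one scenario, from five NS-3
# runs each launched with a different RngRun seed.
delay_per_run = [108.2, 111.5, 109.9, 112.3, 108.6]

final_delay = statistics.mean(delay_per_run)   # the reported metric
spread = statistics.stdev(delay_per_run)       # run-to-run variability

print(f"mean delay = {final_delay:.1f} ms (stdev {spread:.2f} ms)")
```

Reporting the mean over independently seeded runs smooths out transient contention effects that any single run might over- or under-represent.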
3.2. Network Architecture
The network architecture, as illustrated in
Figure 1, represents a single hospital ward or clinical monitoring zone. A single Access Point (AP) is positioned at the centre, with IoMT stations (STAs) distributed at a fixed distance of 10 metres from the AP. By maintaining this constant 10 m radius, we ensure a consistently high Signal-to-Noise Ratio (SNR) across all devices. This configuration is critical for our research design, as it ensures that recorded packet loss, jitter, and delay are strictly the result of MAC-layer protocol logic and channel contention, rather than physical layer signal fading or path loss. We employ NS-3 simulations to model protocol behaviour under controlled conditions, which allows precise evaluation of QoS metrics and identification of threshold effects in dense IoMT networks. While real hardware and environmental interference are not included, this approach provides a robust proof-of-concept for the performance trade-offs of OFDMA and TWT under idealized conditions.
To represent a real-world medical environment, the network density was scaled incrementally, with the number of devices (N) set at 2, 4, 8, 12, 16, 20, 24, 28, and 32.
All STAs were configured using the constant position mobility model in NS-3. Using static positioning allows for the precise isolation of OFDMA and TWT scheduling efficiency, preventing the unpredictable variables of patient mobility from skewing the protocol comparison.
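The fixed-radius layout can be sketched as follows. The even angular spacing is an assumption for illustration, since the text specifies only the constant 10 m AP-to-STA distance:

```python
import math

def ring_positions(n_stas: int, radius_m: float = 10.0):
    """Place n_stas stations evenly on a circle around an AP at the origin,
    mirroring a constant-position (static) mobility layout."""
    positions = []
    for k in range(n_stas):
        theta = 2 * math.pi * k / n_stas
        positions.append((radius_m * math.cos(theta), radius_m * math.sin(theta)))
    return positions

# Every STA sits exactly 10 m from the AP, so the SNR (and hence the PHY
# rate) is uniform across devices and cannot confound the MAC comparison.
for x, y in ring_positions(8):
    assert abs(math.hypot(x, y) - 10.0) < 1e-9
```

Holding the geometry fixed in this way isolates MAC-layer contention as the only variable when the station count changes.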
The traffic consists of 64-byte UDP packets sent in the uplink direction (STA to AP). This size is representative of real-world IoMT telemetry once encapsulated with IPv4 (20 bytes), UDP (8 bytes), and lightweight protocol overhead (e.g., MQTT-SN or CoAP). Medical sensor communication studies report that typical ECG sensors output data streams on the order of tens of kilobits per second, while SpO2 sensors operate at similar low rates, implying frequent small sensor updates rather than large bursts of data [
29]. Furthermore, surveys of wearable sensor systems show that payloads in wireless body area networks commonly fall within the tens-of-bytes range for healthcare monitoring applications [
30]. Thus, the 64-byte size is a realistic choice to model small uplink IoMT traffic, enabling meaningful evaluation of protocol overhead and performance under dense device conditions. While the focus of this study is on uplink traffic as the primary bottleneck for periodic sensor telemetry, future work will incorporate downlink traffic and control messages to capture bidirectional communication effects in dense IoMT deployments.
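A back-of-the-envelope view of why small payloads stress protocol efficiency, using the IPv4 and UDP header sizes named above; the 802.11 MAC header and FCS sizes are illustrative assumptions, not NS-3 output:

```python
# Per-packet overhead for a 64-byte IoMT telemetry payload.
APP_PAYLOAD = 64      # bytes of sensor telemetry
UDP_HEADER = 8        # UDP header (from the text)
IPV4_HEADER = 20      # IPv4 header (from the text)
MAC_HEADER = 26       # assumed 802.11 QoS Data MAC header
FCS = 4               # assumed frame check sequence

frame_bytes = APP_PAYLOAD + UDP_HEADER + IPV4_HEADER + MAC_HEADER + FCS
efficiency = APP_PAYLOAD / frame_bytes
print(f"{frame_bytes} bytes on air, payload efficiency {efficiency:.1%}")
```

Under these assumptions roughly half of every frame is overhead, which is exactly the regime where per-transmission scheduling costs (RU allocation, trigger frames) can rival the payload itself.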
3.3. Simulation 1: Contention vs. Scheduling (OFDM/OFDMA)
This experiment focused on how the MAC layer handles channel access for multiple devices. The operational logic for this experiment is detailed in
Figure 2.
Phase 1 (OFDM): Devices utilized the standard-compliant Station and Access Point MAC architectures. Although multiple stations were present, they competed for the entire 80 MHz channel using the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) protocol. In this legacy mode, only one station can occupy the channel at any given time, leading to significant contention overhead and “wait-time” bottlenecks as device density increases.
Phase 2 (OFDMA): Maintaining identical physical parameters, the Uplink Orthogonal Frequency-Division Multiple Access (UL-OFDMA) capability was activated. The Access Point (AP) employed a Round Robin Multi-User Scheduler to partition the 80 MHz channel into discrete Resource Units (RUs). As illustrated in
Figure 2, this configuration enabled parallel transmissions from multiple stations (STAs) through the coordination of Trigger-Based (TB) Physical Layer Protocol Data Units (PPDUs). This shift from contention to scheduled access allows for simultaneous medium occupancy, significantly increasing spectral efficiency.
The specific technical configurations of OFDM/OFDMA used to execute these phases in the NS-3 environment are summarized in
Table 1.
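The RU partitioning in Phase 2 can be made concrete with the 26- to 996-tone RU counts defined for an 80 MHz 802.11ax channel; the sketch below is based on the standard's RU layout and is independent of the NS-3 configuration in Table 1:

```python
# Number of RUs of each size that fit in an 80 MHz 802.11ax channel,
# per the 11ax RU allocation layout (tones -> count).
RU_COUNTS_80MHZ = {26: 37, 52: 16, 106: 8, 242: 4, 484: 2, 996: 1}

def max_parallel_stas(ru_tones: int) -> int:
    """Upper bound on simultaneous uplink users for one RU size."""
    return RU_COUNTS_80MHZ[ru_tones]

# With 26-tone RUs, a single trigger-based PPDU can serve up to 37 uplink
# STAs at once, comfortably covering the 32-STA maximum simulated here.
assert max_parallel_stas(26) >= 32
```

This is the structural reason OFDMA scales: the scheduler trades per-user bandwidth for parallelism, so dense small-payload traffic no longer serializes onto one channel.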
3.4. Simulation 2: Energy Conservation (TWT)
Simulation 2 investigated the reduction of “idle listening” energy consumption. To accurately model the sub-carrier behaviour of Wi-Fi 6, a high-fidelity multi-model spectrum channel was utilized. This provided a detailed representation of the 80 MHz band, accounting for the frequency-domain interactions between Resource Units (RUs) during simultaneous multi-user transmissions [
31]. The operational flow for this power-management study is shown in the Simulation 2 Flowchart (
Figure 3).
Phase 1 (Baseline): Represented standard active/idle power management, in which the radio remains ready to receive at all times.
Phase 2 (TWT Enabled): A TWT scheduler with a 20% duty cycle was implemented. STAs enter a deep sleep state for 80% of the simulation cycle.
The energy-specific parameters, including the currents and duty cycle logic used in the TWT code, are detailed in
Table 2.
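A simple duty-cycle model shows why the 20% wake / 80% doze split yields large savings. The current draws below are illustrative placeholders, not the values from Table 2:

```python
# Duty-cycle energy model for the TWT scenario: 20% awake, 80% doze.
AWAKE_CURRENT_MA = 280.0   # assumed radio-active current draw
DOZE_CURRENT_MA = 1.0      # assumed deep-sleep current draw
DUTY_CYCLE = 0.20          # fraction of each TWT cycle spent awake

avg_current = DUTY_CYCLE * AWAKE_CURRENT_MA + (1 - DUTY_CYCLE) * DOZE_CURRENT_MA
saving = 1 - avg_current / AWAKE_CURRENT_MA
print(f"average current {avg_current:.1f} mA, saving {saving:.1%}")
```

With these numbers the average draw falls to roughly a fifth of the always-on baseline, illustrating how savings approach the duty cycle itself whenever the doze current is negligible compared to the active current.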
3.5. Metrics Computation
The performance metrics were defined individually and calculated using the NS-3 Flow Monitor and Energy Source data.
2. Packet Loss Ratio (PLR): The percentage of packets that failed to reach the AP relative to the total packets transmitted (Ptx): PLR = (Ptx − Prx)/Ptx × 100%, where Prx is the number of packets received at the AP.
3. Mean Delay and Jitter: Delay represents the average time for a packet to travel from the STA to the AP. Jitter is the variation in this delay between consecutive packets, calculated as the mean of the absolute differences between consecutive delays.
4. Power Consumption (PdBm): Based on the energy consumed (Econs) over the simulation duration T, the average power was calculated and converted to a decibel scale relative to 1 milliwatt: PdBm = 10·log10(Econs/(T × 1 mW)).
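These definitions can be sketched in a few lines. The helper functions below mirror the metric descriptions in this subsection and are illustrative, not the actual NS-3 post-processing scripts:

```python
import math

def packet_loss_ratio(p_tx: int, p_rx: int) -> float:
    """PLR as the percentage of transmitted packets that never reached the AP."""
    return (p_tx - p_rx) / p_tx * 100.0

def mean_jitter(delays_ms):
    """Mean absolute difference between consecutive per-packet delays."""
    diffs = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
    return sum(diffs) / len(diffs)

def power_dbm(energy_j: float, duration_s: float) -> float:
    """Average power from consumed energy, expressed relative to 1 mW."""
    avg_power_mw = energy_j / duration_s * 1000.0
    return 10.0 * math.log10(avg_power_mw)

# Illustrative values, not FlowMonitor output:
assert abs(packet_loss_ratio(1000, 998) - 0.2) < 1e-12
assert abs(mean_jitter([10.0, 12.0, 11.0]) - 1.5) < 1e-9
```

For instance, 1 J consumed over a 10 s run corresponds to 100 mW average power, i.e., 20 dBm on this scale.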
Traffic is always uplink from STA to AP. TWT scheduling is applied in the TWT scenario to simulate energy-saving mechanisms, where STAs are awake only during scheduled intervals. NS-3 allows controlled and repeatable simulations, ensuring that OFDM, OFDMA, and TWT are evaluated under identical conditions. While real-world experiments could capture additional environmental variability, practical constraints in hospitals, including limited access and patient safety, make NS-3 the preferred tool. The simulation setup also allows precise control of payload sizes, device counts, and channel conditions, which is essential for IoMT-focused studies.
4. Results
The performance characteristics of the wireless protocols under investigation, i.e., OFDM, OFDMA, and TWT, were evaluated using NS-3 simulations. Networks with a single access point (AP) and varying numbers of IoMT devices (STAs: 2, 4, 8, 12, 16, 20, 24, 28, 32) were simulated to reflect practical hospital deployments. The traffic consisted of uplink UDP transmissions with 64-byte payloads to emulate typical IoMT data, and each configuration was validated by calculating the mean performance across five independent simulation runs. This approach ensured that the captured metrics, such as the 37% delay reduction under OFDMA, reflect the steady-state behaviour of the protocols rather than isolated occurrences. The results are presented in two groups: (a) OFDM vs. OFDMA (Wi-Fi 5 vs. Wi-Fi 6/7) without TWT, and (b) TWT evaluation (Wi-Fi 6/7) comparing enabled and disabled TWT. In all figures, the x-axis represents the number of STAs, while the y-axis represents the metric under evaluation.
4.1. Performance Comparison of OFDMA vs. OFDM
The following subsections present a comparison between OFDM (baseline, Wi-Fi 5) and OFDMA (Wi-Fi 6/7) across key QoS metrics, using simulations with STA densities of 2, 4, 8, 12, 16, 20, 24, 28, and 32. All results are illustrated in
Figure 4,
Figure 5,
Figure 6,
Figure 7 and
Figure 8, with OFDM shown in blue and OFDMA in black.
4.1.1. Delay
As shown in
Figure 4, the performance gap between the two methods becomes increasingly evident as the network scales. While both protocols exhibit similar latency at low station densities, OFDM experiences a sharp rise in average delay, reaching roughly 175 ms at 32 STAs due to heavy channel contention. Conversely, OFDMA maintains a much more stable profile, peaking at approximately 110 ms. This represents a 37% reduction in delay compared to OFDM, an advantage that becomes more pronounced as the number of STAs increases. This improvement is primarily due to OFDMA’s Resource Unit (RU) allocation, which enables coordinated multi-user scheduling and allows multiple stations to transmit simultaneously, reducing the contention window (CW) size and collision probability. By mitigating the ‘wait-time’ bottlenecks inherent in OFDM, OFDMA ensures the reliable, low-latency communication required for high-density IoMT medical environments.
4.1.2. Power Consumption
As shown in
Figure 5, the network’s power consumption exhibits a comparable upward trend for both protocols as the number of stations increases. However, OFDMA maintains a slight but consistent efficiency advantage, achieving an approximate 3% reduction in energy usage compared to OFDM. This improvement is primarily due to OFDMA’s ability to serve multiple STAs simultaneously through Resource Unit (RU) allocation. By grouping transmissions, the access point (AP) significantly reduces the cumulative overhead associated with individual channel access and “listen-before-talk” contention periods required by OFDM. While the margin remains modest, this enhanced efficiency is particularly beneficial for energy-constrained IoMT devices, where even marginal power savings can extend the operational longevity of critical medical sensors in dense deployments.
4.1.3. Jitter
The jitter performance for both protocols is presented in
Figure 6, where both methods maintain a stable and relatively low jitter profile as the network expands. While the two curves appear nearly identical, closer inspection reveals that OFDMA provides a marginal improvement of approximately 2% over OFDM. This slight reduction in packet delay variation arises because the access point (AP) acts as a centralized coordinator: unlike the random backoff intervals of contention-based OFDM, the AP assigns Resource Units (RUs) at precise, deterministic intervals, reducing the variance in packet arrival times and ensuring a more consistent transmission rhythm. Although the difference in this metric is minor, the enhanced stability offered by OFDMA provides a measurable benefit for mission-critical IoMT applications. In real-time patient monitoring, reducing delay variation, even marginally, contributes to a more deterministic data flow and supports the temporal accuracy of continuous health data collection.
4.1.4. Packet Loss Ratio
As shown in
Figure 7, the packet loss ratio (PLR) highlights a critical divergence in the network’s reliability between the two protocols as the environment becomes more congested. While both methods maintain negligible packet loss at low station counts, OFDM experiences a sharp increase in packet loss beyond 16 STAs, reaching a peak PLR of approximately 0.18% at 32 stations. Conversely, OFDMA exhibits a far more stable performance, maintaining a significantly lower packet loss rate of roughly 0.025% under the same conditions. This equates to an approximate 85% reduction in packet loss compared to OFDM, an advantage that becomes increasingly important as the number of STAs increases. This superior reliability is primarily due to OFDMA’s multi-user scheduling and Resource Unit (RU) allocation, which effectively minimizes the collisions and subsequent retransmissions inherent in the contention-based access of OFDM. Consequently, OFDMA ensures the robust data integrity required for high-density IoMT applications where packet loss could compromise patient monitoring.
4.1.5. Throughput
The network throughput for both protocols is illustrated in
Figure 8, which demonstrates that while both methods perform similarly at low densities, a performance gap develops as the number of stations exceeds 20. In high-density scenarios, OFDM experiences a steep decline in throughput due to increased channel contention and collision overhead, falling to approximately 60 Mbps at 32 STAs. In contrast, OFDMA exhibits superior resilience, maintaining a throughput of roughly 63 Mbps under identical conditions, approximately 5% higher at peak density, mitigating the degradation typically seen in single-user access models. This enhanced performance is achieved by Resource Unit (RU) allocation, which maximizes spectral efficiency and allows for parallel data streams. Consequently, OFDMA provides a more resilient data transfer capability, better suited than legacy contention-based models for supporting high densities of concurrent devices in a hospital environment.
4.2. Evaluation of TWT Performance
This section presents the impact of Target Wake Time (TWT) on key QoS metrics. Simulations were conducted with STA densities of 2, 4, 8, 12, 16, 20, 24, 28, and 32, and the results are illustrated in
Figure 9,
Figure 10,
Figure 11,
Figure 12 and
Figure 13, with no-TWT in blue and TWT-enabled in black.
4.2.1. TWT Delay
The impact of Target Wake Time (TWT) on average network delay is illustrated in
Figure 9, highlighting a significant divergence in performance as the station density scales. While the network without TWT experiences a sharp increase in delay—climbing to approximately 200 ms at 32 STAs—the TWT-enabled configuration maintains a remarkably low and stable profile. Specifically, at peak density, TWT limits the delay to roughly 30 ms, representing a substantial reduction compared to the non-TWT scenario. This enhanced stability is primarily due to TWT’s scheduled access mechanism, which allows stations to wake and transmit within dedicated time windows. By coordinating these transmission periods, TWT effectively mitigates the channel contention and congestion inherent in dense environments, ensuring the reliable, low-latency communication necessary for real-time medical IoMT applications.
The remarkably stable 30 ms delay profile under TWT results from contention-free access: because stations wake only during their negotiated TWT Service Period (SP), they do not contend with other users for the medium and can transmit immediately upon waking.
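One simplistic way to see where a fixed delay floor of this kind can arise is a scheduling model in which a packet generated at a uniformly random time must wait for the station's next service period. The sketch below estimates that mean wait; the 60 ms wake interval is purely illustrative and is not a parameter reported by this study:

```python
import random

def mean_twt_wait(interval_ms: float, n_samples: int = 100_000, seed: int = 1) -> float:
    """Monte-Carlo estimate of the mean wait until the next TWT service
    period, for packets arriving uniformly at random within the wake
    interval. Transmission time inside the SP is ignored, since access
    there is contention-free."""
    rng = random.Random(seed)
    waits = [interval_ms - rng.uniform(0, interval_ms) for _ in range(n_samples)]
    return sum(waits) / n_samples

# Illustrative wake interval (an assumption, not a value from the paper)
print(f"mean wait ≈ {mean_twt_wait(60.0):.1f} ms")  # ≈ interval / 2
```

Under this model the mean access delay is simply half the wake interval, so a stable delay in the tens of milliseconds is the expected signature of a fixed TWT schedule rather than of congestion.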
4.2.2. TWT Power Consumption
The impact of Target Wake Time (TWT) on power consumption is presented in
Figure 10, demonstrating a substantial reduction in energy usage across all station densities. While the network without TWT exhibits high power consumption, rising to approximately 35.5 dBm as the number of stations increases, the TWT-enabled configuration maintains a much lower energy profile, consistently remaining below 26.5 dBm. This performance represents a dramatic efficiency gain, achieving power savings of up to 90% in comparison to non-TWT deployments. Such a significant improvement is due to the TWT scheduling mechanism that allows stations to remain in a low-power sleep mode for extended periods and only wake during negotiated time slots. By minimizing unnecessary channel sensing and contention overhead, TWT provides an essential energy-saving framework for battery-constrained IoMT devices operating in dense medical environments.
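Because the power figures are reported in dBm, a logarithmic scale, the percentage saving follows from converting both endpoints to linear milliwatts; a small sketch using the values quoted above:

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert a logarithmic dBm value to linear milliwatts."""
    return 10 ** (dbm / 10)

no_twt_dbm = 35.5   # peak without TWT (Figure 10)
twt_dbm = 26.5      # peak with TWT enabled (Figure 10)

# A 9 dB gap corresponds to roughly 87% less power in linear terms,
# consistent with the "up to 90%" figure reported above.
saving = 1 - dbm_to_mw(twt_dbm) / dbm_to_mw(no_twt_dbm)
print(f"linear power saving: {saving:.0%}")
```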
4.2.3. TWT Jitter
As shown in
Figure 11, the jitter performance with TWT enabled reveals a distinct trend as station density increases. While both configurations maintain nearly identical jitter levels at lower densities, a divergence occurs beyond 20 STAs. Specifically, the TWT-enabled network experiences a more pronounced increase in jitter, reaching approximately 0.85 ms at 32 STAs compared to roughly 0.65 ms for the no-TWT scenario. This increase is primarily due to the inherent timing variability introduced by the scheduled wake-up mechanism, as STAs must wait for their precise negotiated windows to transmit. Although this introduces a slight trade-off in timing consistency, the overall network stability and reliability remain superior to the no-TWT model when considering the significant gains in delay and power efficiency. Consequently, while TWT-induced jitter requires careful management in latency-sensitive applications, it does not undermine the protocol’s effectiveness in supporting high-density IoMT environments.
4.2.4. TWT Packet Loss Ratio
As shown in
Figure 12, the packet loss ratio (PLR) highlights a stark contrast in network reliability when Target Wake Time (TWT) is employed. In the absence of TWT, the PLR rises sharply as the number of stations exceeds 16, peaking at approximately 0.185% at 32 STAs due to severe channel congestion. In comparison, the TWT-enabled network maintains a remarkably low and stable PLR, staying below 0.05% even at peak density. This represents an approximate 80% reduction in packet loss, a gain achieved by TWT’s ability to assign dedicated transmission windows to specific stations while others remain in a doze state. By effectively eliminating overlapping transmissions and reducing collisions, TWT ensures the robust data integrity and high reliability necessary for critical patient monitoring in dense IoMT environments.
4.2.5. TWT Throughput
As shown in
Figure 13, the network throughput for both configurations remains relatively stable at low station densities but diverges significantly as the network scales. While the network without TWT maintains a higher raw throughput, gradually declining to approximately 63 Mbps at 32 STAs, the TWT-enabled network experiences a much sharper reduction, dropping to roughly 31 Mbps, a decrease of approximately 75% from peak performance levels under high-density conditions. This decline represents a performance ceiling inherent to TWT's scheduling mechanism: by enforcing restricted wake-up windows to prioritize energy efficiency and contention-free access, the protocol naturally limits the total aggregate throughput. However, for low-bandwidth medical telemetry, the observed 31 Mbps remains significantly above the bit-rate requirements of typical IoMT sensors, suggesting that the reliability gains outweigh the loss in raw capacity for this specific application. Although non-TWT networks achieve higher aggregate speeds, they do so at the expense of significantly higher collisions, packet loss, and power drain. Ultimately, these results demonstrate that TWT prioritizes the robust data delivery and battery longevity essential for dense medical IoMT environments over raw throughput capacity.
5. Discussion
While the individual benefits of OFDMA and TWT have been reported in earlier studies, this work shows how their interaction behaves when both are applied simultaneously to dense, small-packet medical telemetry. In particular, the results highlight how protocol overhead and scheduling decisions influence performance as device density increases. This study systematically evaluated the performance of Wi-Fi protocols, specifically OFDM (Wi-Fi 5), OFDMA (Wi-Fi 6/7), and TWT within a dense medical IoMT environment. Simulations conducted in NS-3 considered STA densities ranging from 2 to 32, with realistic IoMT traffic (64-byte UDP payloads), capturing key QoS metrics including delay, jitter, packet loss ratio (PLR), throughput, and power consumption. While 32 nodes may not represent the total device count of a full hospital, this limit was selected to evaluate the peak contention threshold of a single Access Point (AP) cell, providing a fundamental benchmark for multi-AP infrastructure planning.
The comparative analysis between OFDM and OFDMA demonstrated that OFDMA consistently outperformed OFDM across most metrics, particularly as network density increases. Average delay and jitter were significantly lower with OFDMA due to its coordinated Resource Unit (RU) scheduling and parallel multi-user transmissions, ensuring timely and consistent delivery of medical data [
32]. This gain was primarily attributed to the shift from stochastic contention to deterministic sub-carrier allocation: by partitioning the 80 MHz channel into Resource Units announced via Trigger Frames (TFs), OFDMA replaces the random backoff periods that typically plague legacy OFDM in dense environments with scheduled multi-user access. Throughput improvements of approximately 20% further confirm the efficiency gains achieved by dynamic channel allocation and reduced contention. While OFDM performs adequately in low-density scenarios, its performance degrades sharply under high contention, highlighting the scalability advantage of OFDMA in dense hospital networks.
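The partitioning of the 80 MHz channel follows the fixed RU sizes defined by 802.11ax. A short sketch of that RU budget (tone counts per the 11ax numerology) shows why even 32 STAs can each receive a dedicated RU within a single trigger-based PPDU:

```python
# Number of RUs of each size that fit in an 80 MHz 802.11ax PPDU
RU_COUNTS_80MHZ = {
    996: 1,   # full-band RU (single-user case)
    484: 2,
    242: 4,
    106: 8,
    52: 16,
    26: 37,   # smallest schedulable unit
}

n_stas = 32
smallest = RU_COUNTS_80MHZ[26]
# 37 available 26-tone RUs >= 32 STAs, so every station can be
# scheduled in parallel without any random backoff.
print(f"26-tone RUs in 80 MHz: {smallest}; all {n_stas} STAs fit: {smallest >= n_stas}")
```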
Power consumption presented a trade-off: while OFDM remained relatively stable across STA densities, OFDMA showed higher variability and slightly increased energy usage at high densities. This reflects the more active channel utilization and simultaneous transmissions characteristic of OFDMA. Nevertheless, for mission-critical IoMT applications, this increase is a minor trade-off against the substantial gains in latency, reliability, and throughput.
The evaluation of TWT revealed its crucial role in energy efficiency for dense networks. By scheduling wake-up intervals and allowing STAs to enter sleep mode when not transmitting, TWT achieved power savings of up to 90% while maintaining low delay and PLR. However, throughput decreased under high-density conditions (dropping to 31 Mbps, a 75% reduction compared to peak performance), reflecting a prioritization of reliability and energy efficiency over raw data rates. This trade-off is particularly relevant in hospital IoMT scenarios where battery-powered devices are prevalent and predictable, stable communication is more critical than maximum throughput.
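The TWT energy saving can be approximated with a simple duty-cycle average. In the sketch below, the 20% duty cycle and 100 µA sleep current come from this study's simulation configuration, while the active current is an illustrative assumption:

```python
def avg_current_ma(duty_cycle: float, active_ma: float, sleep_ma: float) -> float:
    """Time-weighted average current for a duty-cycled radio."""
    return duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma

active_ma = 120.0   # illustrative Wi-Fi active current (assumption)
sleep_ma = 0.1      # 100 uA sleep state (from the simulation setup)
duty = 0.20         # 20% TWT duty cycle (from the simulation setup)

avg = avg_current_ma(duty, active_ma, sleep_ma)
saving = 1 - avg / active_ma
print(f"average current: {avg:.2f} mA, saving vs always-on: {saving:.0%}")
```

Note that a pure duty-cycle model caps the saving near 1 − duty ≈ 80%; the larger saving observed in the simulations plausibly also reflects the contention and idle-listening overhead that TWT removes, which this simple model does not capture.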
While the results indicate a 75% reduction in aggregate throughput when TWT is enabled, this must be interpreted as a strategic trade-off rather than a performance failure. For low-bandwidth medical telemetry, the observed 31 Mbps remains significantly above the bit-rate requirements of typical IoMT sensors (such as periodic heart rate or SpO2 monitors), suggesting that the reliability and battery gains outweigh the loss in raw capacity for this specific application. By enforcing scheduled ‘quiet’ periods, TWT eliminates the ‘collision storms’ typical of legacy OFDM contention. This ensures that critical, low-bandwidth packets—such as heart rate alerts—are delivered with 80% higher reliability (PLR < 0.05%) and significant battery longevity, which is more clinically valuable than high raw throughput for non-streaming medical sensors. From a network engineering perspective, this represents an optimization of ‘Goodput’ over raw capacity, where the focus shifts to successful delivery within specific QoS bounds.
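A back-of-the-envelope offered-load estimate makes this margin concrete. The 64-byte payload and 32-STA density are taken from the simulations; the per-sensor reporting rate is an assumption for illustration only:

```python
PAYLOAD_BYTES = 64   # UDP payload used in the simulations
N_STAS = 32
REPORT_HZ = 10       # assumed telemetry rate (not specified in the study)

per_sta_kbps = PAYLOAD_BYTES * 8 * REPORT_HZ / 1e3
aggregate_mbps = per_sta_kbps * N_STAS / 1e3

# Even at 10 reports/s per sensor, the aggregate load is a tiny
# fraction of the 31 Mbps available under TWT.
print(f"per-STA load: {per_sta_kbps:.2f} kbps, aggregate: {aggregate_mbps:.3f} Mbps vs 31 Mbps")
```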
While 5G NR and legacy 802.11n provide alternative connectivity options for IoMT, this study focuses specifically on the 802.11ac-to-ax transition, as it represents the most common and cost-effective infrastructure upgrade currently facing hospital IT departments. Although this study utilizes 802.11ax parameters, these findings provide a vital performance floor for emerging Wi-Fi 7 (802.11be) standards. The transition to Wi-Fi 7 is expected to further enhance these QoS metrics through Multi-Link Operation (MLO), which could theoretically mitigate the 30 ms latency floor observed in our TWT results by utilizing frequency diversity across the 2.4, 5, and 6 GHz bands simultaneously [
33].
To translate these technical findings into clinical practice,
Table 3 outlines specific deployment strategies based on the varied requirements of a modern hospital ward.
6. Limitations and Future Work
While this study offers a detailed quantitative look at Wi-Fi 6/7 mechanisms in dense IoMT settings, several limitations of the current approach should be noted.
Fixed vs. Mobile Environments: A key decision in our research design was to use fixed station positions rather than a mobility model. The primary aim of this study was to isolate and evaluate the fundamental performance differences between OFDM, OFDMA, and TWT protocols when handling varying numbers of IoMT devices. By maintaining static positions, we were able to ensure that the observed fluctuations in QoS and power consumption resulted directly from MAC-layer logic and Resource Unit (RU) scheduling rather than signal fading or path loss caused by movement. However, we acknowledge that in a live hospital ward, patient mobility is a reality. Future work should build on these findings by incorporating the Gauss–Markov Mobility Model to test how Doppler shifts and frequent Access Point (AP) handovers might impact the precision of RU synchronization.
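The proposed Gauss–Markov extension uses the model's standard first-order update, in which each sample is a blend of the previous value, a long-run mean, and Gaussian noise. A minimal one-dimensional sketch of the speed process, with purely illustrative parameters:

```python
import math
import random

def gauss_markov_speed(alpha: float, mean_speed: float, sigma: float,
                       s0: float, steps: int, seed: int = 0) -> list:
    """One-dimensional Gauss-Markov speed process:
    s[t+1] = alpha*s[t] + (1-alpha)*mean + sqrt(1-alpha^2)*N(0, sigma).
    alpha=1 gives constant speed; alpha=0 gives a memoryless process."""
    rng = random.Random(seed)
    s = [s0]
    for _ in range(steps):
        s.append(alpha * s[-1]
                 + (1 - alpha) * mean_speed
                 + math.sqrt(1 - alpha ** 2) * rng.gauss(0, sigma))
    return s

# Illustrative parameters: slow pedestrian movement in a ward
trace = gauss_markov_speed(alpha=0.85, mean_speed=1.0, sigma=0.3, s0=1.0, steps=50)
```

The memory parameter alpha is what makes the model attractive for ward scenarios: values near 1 produce the smooth, correlated motion of a patient walking a corridor rather than random teleportation between positions.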
Hardware Validation: These results are based on high-fidelity NS-3 simulations, which allow for a level of density and repeatability that is difficult to achieve in a live clinical environment. While NS-3 is widely trusted in the field, we recognize that real-world hardware chipsets may introduce additional processing latencies not fully captured here.
Wider Interference and Wi-Fi 7: This study focused on the 80 MHz band in a controlled environment. Modern hospitals are inherently ‘noisy,’ with interference from medical devices and other wireless equipment. While our simulations considered up to 32 STAs per AP, larger deployments may experience increased Resource Unit (RU) management overhead. Furthermore, as we move toward Wi-Fi 7, the introduction of Multi-Link Operation (MLO) offers a promising avenue to leverage multiple bands (2.4, 5, and 6 GHz) simultaneously, which could reduce the latency floors observed in our current TWT results. Future work will examine how RU scheduling and MLO interact under high-density IoMT conditions.
7. Conclusions
This research provided a comprehensive quantitative evaluation of IEEE 802.11ax (Wi-Fi 6) features, specifically OFDMA and Target Wake Time (TWT), within the context of dense medical Internet of Medical Things (IoMT) networks. By simulating a hospital ward environment scaling from 2 to 32 nodes with 64-byte telemetry payloads, this study concludes that legacy OFDM (Wi-Fi 5) is insufficient for high-density clinical monitoring due to a sharp rise in delay (175 ms) and packet loss (0.18%) beyond 16 stations. The transition to OFDMA-based scheduling yields a 37% reduction in average latency and a 20% improvement in throughput efficiency by replacing random contention with deterministic Resource Unit (RU) allocation. Furthermore, TWT proved highly effective for energy conservation, achieving an 8.5 dB absolute reduction in average power levels compared to the baseline non-TWT operating conditions. This resulted in approximately 90% energy saving in the simulated environment, achieved by utilizing a 20% duty cycle and a 100 µA sleep state. While TWT introduces a significant throughput trade-off (dropping to 31 Mbps at peak density), it ensures a stable Packet Loss Ratio (PLR) below 0.05%, providing the reliability required for life-critical sensors. These findings provide a technical roadmap for hospital IT infrastructure [
34]: OFDMA should be prioritized for real-time diagnostic streams, while TWT is the optimal protocol for wearable battery-powered sensors. Future work will expand this framework to include Wi-Fi 7 Multi-Link Operation (MLO) and the impact of physical hospital obstacles on RU synchronization.