J. Sens. Actuator Netw., Volume 10, Issue 2 (June 2021) – 15 articles

Cover Story (view full-size image): In this paper, we used network calculus to carry out a worst-case bound analysis for GTS utilization in IEEE 802.15.7 and complemented our model with an in-depth performance analysis. From our results, we infer that VLC communication in the star topology works better at lower superframe orders, and that performance varies predominantly with the burst size of the data packets. Furthermore, we explored how performance varies across the different PHY layers of this protocol. Although the PHY layers supporting higher data rates provide larger throughput, they exhibit behavior very similar to their lower-data-rate counterparts. Our results indicate that these PHY layers can support even video transmissions, which typically demand higher data rates. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Digital Twin-Driven Decision Making and Planning for Energy Consumption
J. Sens. Actuator Netw. 2021, 10(2), 37; https://doi.org/10.3390/jsan10020037 - 20 Jun 2021
Cited by 13 | Viewed by 2617
Abstract
The Internet of Things (IoT) is revolutionising how energy is delivered by energy producers and used throughout residential households. Optimising residential energy consumption is a crucial step toward greener and more sustainable energy production. Such optimisation requires a household-centric energy management system as opposed to a one-rule-fits-all approach. In this paper, we propose a data-driven multi-layer digital twin of the energy system that aims to mirror households’ actual energy consumption in the form of a household digital twin (HDT). When linked to the energy production digital twin (EDT), the HDT empowers the household-centric energy optimisation model to achieve the desired efficiency in energy use. The model intends to improve the efficiency of energy production by flattening the daily energy demand levels. This is done by collaboratively reorganising the energy consumption patterns of residential homes to avoid peak demands whilst accommodating the residents’ needs and reducing their energy costs. Indeed, our system incorporates the first HDT model to gauge the impact of various modifications on the household energy bill and, subsequently, on energy production. The proposed energy system is applied to a real-world IoT dataset that spans two years and covers seventeen households. Our experiments show that the model effectively flattened the collective energy demand by 20.9% on synthetic data and 20.4% on a real dataset. At the same time, the average energy cost per household was reduced by 10.7% for the synthetic data and 17.7% for the real dataset. Full article
(This article belongs to the Special Issue Machine Learning in IoT Networking and Communications)
Show Figures

Figure 1

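The demand-flattening idea described in this abstract can be sketched as a greedy scheduler that moves flexible appliance loads into the least-loaded hours. The data and the heuristic below are hypothetical illustrations, not the paper's actual HDT/EDT model.

```python
def flatten_demand(base_load, flexible_loads):
    """base_load: 24 hourly kWh values; flexible_loads: kWh amounts
    that may be scheduled into any hour of the day."""
    load = list(base_load)
    for kwh in flexible_loads:
        # place each flexible load into the currently least-loaded hour
        slot = min(range(24), key=lambda h: load[h])
        load[slot] += kwh
    return load

# invented profile: overnight valley, morning bump, evening peak
base = [0.4] * 7 + [1.2] * 3 + [0.6] * 7 + [2.0] * 4 + [0.8] * 3
flat = flatten_demand(base, [1.5, 1.0, 0.7])
print(round(max(flat), 2))  # peak stays at 2.0; the valleys are filled instead
```

Dumping the same flexible loads into the evening peak would instead raise the maximum, which is the behaviour collaborative rescheduling avoids.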
Article
Design, Analysis, and Experimental Evaluation of a New Secure Rejoin Mechanism for LoRaWAN Using Elliptic-Curve Cryptography
J. Sens. Actuator Netw. 2021, 10(2), 36; https://doi.org/10.3390/jsan10020036 - 18 Jun 2021
Cited by 4 | Viewed by 1657
Abstract
LoRaWAN (Long Range Wide Area Network) is a Low-Power Wide Area Network (LPWAN) technology that has seen very rapid uptake in recent years, developed by the LoRa (Long Range) Alliance as an open standard operating over the unlicensed band. The current LoRaWAN architecture foresees specific techniques for bootstrapping end-to-end encryption during network initialization. In particular, this work focuses on the Over-The-Air Activation (OTAA) method, which uses two keys (the Network key (NwkKey) and the Application key (AppKey)) that are hard-coded into the device and do not change throughout the entire lifetime of the deployment. The inability to refresh these two keys is a weak point in the overall security of the network, especially for deployments that are expected to operate for at least 10–15 years. In this paper, the security issues of OTAA are presented in detail, highlighting the vulnerabilities to specific types of attack. A new scheme for network activation is proposed that builds upon the current LoRaWAN architecture in a way that maintains backwards compatibility while resolving certain vulnerabilities. Under the new mechanism, the devices periodically negotiate new keys securely based on elliptic-curve cryptography. The security properties of the proposed mechanism are analyzed against specific types of attack. The analysis indicates that the new secure rejoin mechanism guarantees (i) computational key secrecy, (ii) decisional key secrecy, and (iii) key independence, forward and backward, for both root keys, thus properly addressing the considered security vulnerabilities of LoRaWAN. Moreover, the method is implemented in software using RIOT-OS, a hardware-independent operating system that supports many different architectures with 8-, 16-, 32- and 64-bit processors. The resulting software is evaluated on the FIT IoT-Lab real-world experimentation facility on a diverse set of ARM Cortex-M* devices targeting a broad range of IoT applications, from advanced wearable devices to interactive entertainment devices, home automation and industrial cyber-physical systems. The experiments indicate that the overall overhead incurred in energy and time by the proposed rejoin mechanism is acceptable given the low frequency of execution and the improvements to the overall security of the LoRaWAN 1.1 OTAA method. Full article
(This article belongs to the Special Issue Journal of Sensor and Actuator Networks: 10th Year Anniversary)
Show Figures

Figure 1

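The key-refresh idea underlying such a rejoin mechanism can be illustrated with a toy Diffie-Hellman exchange followed by key derivation. The paper uses elliptic-curve cryptography; the tiny multiplicative group and ad hoc KDF below are purely illustrative and not secure.

```python
import hashlib
import secrets

# Toy Diffie-Hellman key refresh: both sides derive fresh 128-bit root keys
# from a negotiated shared secret, so the hard-coded keys need never be reused.
# NOT secure: real deployments need a proper curve and KDF.
P = 0xFFFFFFFFFFFFFFC5  # largest 64-bit prime; far too small for real use
G = 5

# each side picks an ephemeral secret and publishes its public value
a = secrets.randbelow(P - 3) + 2  # device secret
b = secrets.randbelow(P - 3) + 2  # network server secret
A, B = pow(G, a, P), pow(G, b, P)

# both sides compute the same shared secret
shared_device = pow(B, a, P)
shared_server = pow(A, b, P)
assert shared_device == shared_server

# derive fresh NwkKey/AppKey material from the shared secret (sketch of a KDF)
secret_bytes = shared_device.to_bytes(8, "big")
nwk_key = hashlib.sha256(b"NwkKey" + secret_bytes).digest()[:16]
app_key = hashlib.sha256(b"AppKey" + secret_bytes).digest()[:16]
```

Because the exchanged secrets are ephemeral, compromising one epoch's keys reveals nothing about earlier or later epochs, which is the forward/backward key independence property the analysis establishes.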
Article
Capacity Control in Indoor Spaces Using Machine Learning Techniques Together with BLE Technology
J. Sens. Actuator Netw. 2021, 10(2), 35; https://doi.org/10.3390/jsan10020035 - 14 Jun 2021
Cited by 2 | Viewed by 1401
Abstract
At present, capacity control in indoor spaces is critical due to the pandemic. In this work, we propose a new solution using machine learning techniques with BLE technology. This study presents a real experiment in a university environment in which we study three different prediction models based on machine learning techniques: logistic regression, decision trees and artificial neural networks. The study shows that machine learning techniques, in particular decision trees, together with BLE technology, provide a solution to the problem. The prediction model obtained is capable of detecting when the COVID capacity of an enclosed space is exceeded. In addition, it ensures that no false negatives are produced, i.e., all the people inside the laboratory are correctly counted. Full article
(This article belongs to the Special Issue Smart City Applications of Sensor Networks and Intelligent Systems)
Show Figures

Figure 1

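A decision-tree classifier of this kind reduces, at prediction time, to a cascade of threshold tests on the BLE-derived features. The hand-rolled tree below is a minimal sketch with invented features and thresholds, not the model trained in the study.

```python
# Minimal hand-rolled decision tree in the spirit of the approach: predict
# whether an indoor space exceeds its COVID capacity from the number of
# distinct BLE devices observed and their mean RSSI. Thresholds are invented.

def over_capacity(n_devices, mean_rssi_dbm, limit=8):
    # first split: raw device count against the allowed capacity
    if n_devices > limit:
        return True
    # second split: near-limit counts with strong signals suggest crowding
    if n_devices > limit * 0.75 and mean_rssi_dbm > -60:
        return True
    return False

print(over_capacity(10, -70))  # True: count alone exceeds the limit
print(over_capacity(3, -80))   # False: clearly under capacity
```

Biasing the thresholds toward positive predictions is one way to obtain the no-false-negative behaviour the abstract reports, at the cost of some false alarms.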
Article
Networking for Cloud Robotics: The DewROS Platform and Its Application
J. Sens. Actuator Netw. 2021, 10(2), 34; https://doi.org/10.3390/jsan10020034 - 14 Jun 2021
Cited by 3 | Viewed by 1896
Abstract
With the advances in networking technologies, robots can use the almost unlimited resources of large data centers, overcoming the severe limitations imposed by onboard resources: this is the vision of Cloud Robotics. In this context, we present DewROS, a framework based on the Robot Operating System (ROS) which embodies the three-layer Dew-Robotics architecture, where computation and storage can be distributed among the robot, the network devices close to it, and the Cloud. After presenting the design and implementation of DewROS, we show its application in a real use case called SHERPA, which foresees a mixed ground and aerial robotic platform for search and rescue in an alpine environment. We used DewROS to analyze the video acquired by the drones in the Cloud and quickly spot signs of human beings in danger. We perform a wide experimental evaluation using different network technologies and Cloud services from Google and Amazon, evaluating the impact of several variables on the performance of the system. Our results show that, for example, the video length has a minimal impact on the response time with respect to the video size. In addition, we show that the response time depends on the Round Trip Time (RTT) of the network connection when the video is already loaded on the Cloud provider side. Finally, we present a model of the annotation time that considers the RTT of the connection used to reach the Cloud, discussing results and insights into how to improve current Cloud Robotics applications. Full article
Show Figures

Figure 1

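The shape of such an annotation-time model can be sketched as follows: once the video is already on the Cloud side, the upload term vanishes and the RTT term dominates. The coefficients and the assumed number of round trips are invented for illustration, not the model fitted in the paper.

```python
# Illustrative response-time model in the spirit of the DewROS findings.
# All parameters are hypothetical.

def annotation_time(video_mb, uplink_mbps, rtt_s, proc_s, already_uploaded):
    # upload cost disappears when the video already resides in the Cloud
    upload = 0.0 if already_uploaded else video_mb * 8 / uplink_mbps
    handshakes = 3  # assumed request/response round trips per annotation
    return upload + handshakes * rtt_s + proc_s

# with the video pre-loaded, only RTT and processing remain
print(annotation_time(50, 10, rtt_s=0.05, proc_s=1.0, already_uploaded=True))
```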
Perspective
Agents and Robots for Reliable Engineered Autonomy: A Perspective from the Organisers of AREA 2020
J. Sens. Actuator Netw. 2021, 10(2), 33; https://doi.org/10.3390/jsan10020033 - 14 May 2021
Cited by 1 | Viewed by 1564
Abstract
Multi-agent systems, robotics and software engineering are large and active research areas with many applications in academia and industry. The First Workshop on Agents and Robots for reliable Engineered Autonomy (AREA), organised for the first time in 2020, aims to encourage cross-disciplinary collaborations and the exchange of ideas among researchers working in these areas. This paper presents the organisers' perspective, highlighting the latest research trends, future directions, challenges, and open problems. It also includes feedback from the discussions held during the AREA workshop. The goal of this perspective is to provide a high-level view of current research trends for researchers who aim to work at the intersection of these research areas. Full article
(This article belongs to the Special Issue Agents and Robots for Reliable Engineered Autonomy)
Obituary
Obituary for Prof. Dr. Dharma Prakash Agrawal
J. Sens. Actuator Netw. 2021, 10(2), 32; https://doi.org/10.3390/jsan10020032 - 06 May 2021
Cited by 1 | Viewed by 1445
Abstract
Dharma Prakash Agrawal, 12 April 1945–15 February 2021[...] Full article
Review
Bluetooth Communication Leveraging Ultra-Low Power Radio Design
J. Sens. Actuator Netw. 2021, 10(2), 31; https://doi.org/10.3390/jsan10020031 - 26 Apr 2021
Cited by 3 | Viewed by 2527
Abstract
Energy-efficient wireless connectivity plays an important role in scaling both battery-less and battery-powered Internet-of-Things (IoT) devices. The power consumption in these devices is dominated by the wireless transceivers which limit the battery’s lifetime. Different strategies have been proposed to tackle these issues both in physical and network layers. The ultimate goal is to lower the power consumption without sacrificing other important metrics like latency, transmission range and robust operation under the presence of interference. Joint efforts in designing energy-efficient wireless protocols and low-power radio architectures result in achieving sub-100 μW operation. One technique to lower power is back-channel (BC) communication which allows ultra-low power (ULP) receivers to communicate efficiently with commonly used wireless standards like Bluetooth Low-Energy (BLE) while utilizing the already-deployed infrastructure. In this paper, we present a review of BLE back-channel communication and its forms. Additionally, a comprehensive survey of ULP radio design trends and techniques in both Bluetooth transmitters and receivers is presented. Full article
(This article belongs to the Special Issue Bluetooth Low Energy in Sensor and Actuator Networks)
Show Figures

Figure 1

Article
A Highly Effective Route for Real-Time Traffic Using an IoT Smart Algorithm for Tele-Surgery Using 5G Networks
J. Sens. Actuator Netw. 2021, 10(2), 30; https://doi.org/10.3390/jsan10020030 - 22 Apr 2021
Cited by 6 | Viewed by 1753
Abstract
Nowadays, networks use many different paths to exchange data. Our research constructs a reliable path through a network with a huge number of nodes for use in tele-surgery and other medical applications, such as healthcare tracking, with the aim of optimising medical quality of service (m-QoS) during the COVID-19 situation. Many people cannot currently travel for fear of spreading the COVID-19 virus; our paper therefore provides a trusted and reliable method of communication between a doctor and a patient, so that the patient can undergo an operation even from a great distance. The communication between the doctor and the patient is monitored by our proposed algorithm to make sure that the data are received without delay. We test how buffer space can be used efficiently to reduce delays between source and destination, avoiding the loss of high-priority data packets. The results are presented in three stages. First, we show how to obtain the greatest possible reduction in rate variability when the surgeon begins an operation using live streaming. Second, the proposed algorithm reduces congestion on the path determined for the online surgery. Third, we evaluate the effect of the optimal smoothing algorithm on network parameters such as the peak-to-mean ratio and delay to optimize m-QoS. We propose a new Smart-Rout Control algorithm (s-RCA) for creating a virtual smart path between source and destination to transfer the required data traffic between them, considering the number of hops and link delay. This provides a reliable connection that can be used in healthcare surgery to guarantee that all instructions are received without delay and executed instantly. This idea can improve m-QoS in distance surgery with trusted paths. The new s-RCA can be adapted to an existing routing protocol to track the primary path and monitor emergency packets received in node buffers, for direct forwarding via the demand path, with extended features. Full article
Show Figures

Figure 1

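Selecting a path by both hop count and link delay, as the abstract describes, can be sketched as a shortest-path search over a combined cost. The graph, weights and cost blend below are hypothetical, not the published s-RCA algorithm.

```python
import heapq

def best_path(graph, src, dst, hop_weight=1.0):
    """graph: {node: [(neighbour, link_delay_ms), ...]}. Each hop adds
    hop_weight to the cost on top of its link delay."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, delay in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + delay + hop_weight, nxt, path + [nxt]))
    return float("inf"), []

# invented topology: the two-hop route wins despite the extra hop penalty
g = {"A": [("B", 2.0), ("C", 9.0)], "B": [("C", 2.0)], "C": []}
print(best_path(g, "A", "C"))  # → (6.0, ['A', 'B', 'C'])
```

Raising `hop_weight` biases the choice toward shorter routes even when their links are slower, which is one way to trade path length against delay.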
Article
A Feed-Forward Neural Network Approach for Energy-Based Acoustic Source Localization
J. Sens. Actuator Netw. 2021, 10(2), 29; https://doi.org/10.3390/jsan10020029 - 22 Apr 2021
Cited by 5 | Viewed by 1824
Abstract
The localization of an acoustic source has attracted much attention in the scientific community and has been applied in several different real-life applications. At the same time, the use of neural networks for the acoustic source localization problem is not common; hence, this work aims to show their potential for this field of application. As such, the present work proposes a deep feed-forward neural network for solving the acoustic source localization problem based on energy measurements. Several network topologies are trained under ideal noise-free conditions, which simplifies the usual heavy training process, and a low mean squared error is obtained. The networks are implemented, simulated, and compared with conventional algorithms, namely deterministic and metaheuristic methods, and our results indicate improved performance when noise is added to the measurements. Therefore, the developed scheme opens up a new horizon for energy-based acoustic localization, a field where machine learning algorithms have not been applied in the past. Full article
(This article belongs to the Special Issue Machine Learning in WSN and IoT)
Show Figures

Figure 1

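Energy-based localization of this kind typically rests on an inverse-square decay model: each sensor's received energy shrinks with squared distance from the source. The sketch below pairs that forward model with a brute-force grid search, which stands in for the neural network; sensor layout and gain are invented.

```python
def energies(src, sensors, gain=100.0):
    # inverse-square energy decay; tiny floor avoids division by zero
    return [gain / max((src[0] - x) ** 2 + (src[1] - y) ** 2, 1e-6)
            for x, y in sensors]

def locate(meas, sensors, grid=21, size=10.0):
    """Pick the grid point whose predicted energies best match `meas`."""
    best, best_err = (0.0, 0.0), float("inf")
    for i in range(grid):
        for j in range(grid):
            cand = (i * size / (grid - 1), j * size / (grid - 1))
            err = sum((m - e) ** 2
                      for m, e in zip(meas, energies(cand, sensors)))
            if err < best_err:
                best, best_err = cand, err
    return best

sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
print(locate(energies((3.0, 7.0), sensors), sensors))  # → (3.0, 7.0)
```

A trained feed-forward network replaces the grid search with a single forward pass mapping the energy vector directly to coordinates, which is where the robustness to measurement noise can come from.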
Article
MINDS: Mobile Agent Itinerary Planning Using Named Data Networking in Wireless Sensor Networks
J. Sens. Actuator Netw. 2021, 10(2), 28; https://doi.org/10.3390/jsan10020028 - 22 Apr 2021
Cited by 4 | Viewed by 1487
Abstract
Mobile agents have the potential to offer benefits, as they are able to move either independently or cooperatively throughout networks and collect/aggregate sensory data samples. They are programmed to autonomously move and visit sensory data stations through optimal paths, which are established according to the application requirements. However, mobile agent routing protocols still suffer from heavy computation/communication overheads, a lack of route planning accuracy and long-delay mobile agent migrations. To address this, mobile agent route planning protocols aim to find the best-fitted paths for completing missions (e.g., data collection) with minimised delay, maximised performance and minimised transmitted traffic. This article proposes a mobile agent route planning protocol for sensory data collection called MINDS. The key goal of MINDS is to reduce network traffic, maximise data robustness and minimise delay at the same time. The protocol utilises the Hamming distance technique to partition a sensor network into a number of data-centric clusters. In turn, a named data networking approach is used to form the cluster-heads into a data-centric, tree-based communication infrastructure. The mobile agents utilise a modified version of the Depth-First Search algorithm to move through the tree infrastructure in a hop-count-aware fashion. As the simulation results show, MINDS reduces path length, reduces network traffic and increases data robustness as compared with two conventional benchmarks (ZMA and TBID) in dense and large wireless sensor networks. Full article
Show Figures

Figure 1

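Two of the building blocks the abstract names can be sketched directly: grouping sensor nodes by Hamming distance between their binary identifiers, and a depth-first agent itinerary over a cluster-head tree. Node IDs and the tree below are invented; MINDS's actual clustering and hop-count-aware DFS modification are more involved.

```python
def hamming(a, b):
    """Number of differing bits between two integer identifiers."""
    return bin(a ^ b).count("1")

def assign_clusters(nodes, heads):
    # each node joins the cluster-head with the nearest identifier
    return {n: min(heads, key=lambda h: hamming(n, h)) for n in nodes}

def dfs_itinerary(tree, root):
    """Depth-first visiting order of cluster-heads for the mobile agent."""
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(reversed(tree.get(node, [])))
    return order

print(assign_clusters([0b1010, 0b1110, 0b0001], heads=[0b1000, 0b0011]))
print(dfs_itinerary({"sink": ["h1", "h2"], "h1": ["h3"]}, "sink"))
```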
Article
A Programming Approach to Collective Autonomy
J. Sens. Actuator Netw. 2021, 10(2), 27; https://doi.org/10.3390/jsan10020027 - 19 Apr 2021
Cited by 4 | Viewed by 1695
Abstract
Research and technology developments on autonomous agents and autonomic computing promote a vision of artificial systems that are able to resiliently manage themselves and autonomously deal with issues at runtime in dynamic environments. Indeed, autonomy can be leveraged to unburden humans from mundane tasks (cf. driving and autonomous vehicles), from the risk of operating in unknown or perilous environments (cf. rescue scenarios), or to support timely decision-making in complex settings (cf. data-centre operations). Beyond the results that individual autonomous agents can carry out, a further opportunity lies in the collaboration of multiple agents or robots. Emerging macro-paradigms provide an approach to programming whole collectives towards global goals. Aggregate computing is one such paradigm, formally grounded in a calculus of computational fields enabling functional composition of collective behaviours that could be proved, under certain technical conditions, to be self-stabilising. In this work, we address the concept of collective autonomy, i.e., the form of autonomy that applies at the level of a group of individuals. As a contribution, we define an agent control architecture for aggregate multi-agent systems, discuss how the aggregate computing framework relates to both individual and collective autonomy, and show how it can be used to program collective autonomous behaviour. We exemplify the concepts through a simulated case study, and outline a research roadmap towards reliable aggregate autonomy. Full article
(This article belongs to the Special Issue Agents and Robots for Reliable Engineered Autonomy)
Show Figures

Figure 1

Article
SCATTER: Service Placement in Real-Time Fog-Assisted IoT Networks
J. Sens. Actuator Netw. 2021, 10(2), 26; https://doi.org/10.3390/jsan10020026 - 06 Apr 2021
Cited by 7 | Viewed by 2063
Abstract
Internet of Things (IoT) networks dependent on cloud services usually fail to support real-time applications, as there are no response-time guarantees. The fog computing paradigm has been used to alleviate this problem by executing tasks at the edge of the network, where it is possible to provide time bounds. One of the challenging topics in a fog-assisted architecture is task placement on edge devices in order to obtain good performance. The process of mapping tasks onto computational devices is known as the Service Placement Problem (SPP). In this paper, we present a heuristic algorithm to solve SPP, dubbed clustering of fog devices and requirement-sensitive service first (SCATTER). We provide simulations using the iFogSim toolkit and experimental evaluations using real hardware to verify the feasibility of the SCATTER algorithm, considering a smart home application. We compared SCATTER with two existing approaches, edge-ward and cloud-only, in terms of Quality of Service (QoS) metrics. Our experimental results demonstrate that SCATTER outperforms the edge-ward and cloud-only approaches, with 42.1% and 60.2% lower application response times, 22% and 27.8% less network usage, 45% and 65.7% lower average application loop delays, and 2.33% and 3.2% less energy consumption, respectively. Full article
(This article belongs to the Special Issue QoS in Wireless Sensor/Actuator Networks)
Show Figures

Figure 1

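A "requirement-sensitive service first" placement can be sketched as a greedy pass: the most demanding services are placed first onto fog devices with spare capacity, falling back to the cloud. This is a simplified reading of the idea with invented services and capacities, not the published SCATTER algorithm.

```python
def place(services, fog_capacity):
    """services: {name: required_mips}; fog_capacity: {device: free_mips}.
    Returns {service: device-or-'cloud'}."""
    placement = {}
    # most requirement-sensitive (demanding) services are placed first
    for name, need in sorted(services.items(), key=lambda s: -s[1]):
        fits = [d for d, free in fog_capacity.items() if free >= need]
        if fits:
            dev = min(fits, key=lambda d: fog_capacity[d])  # tightest fit
            fog_capacity[dev] -= need
            placement[name] = dev
        else:
            placement[name] = "cloud"  # no edge device can host it
    return placement

print(place({"camera": 400, "motion": 150, "ui": 50},
            {"fog1": 500, "fog2": 200}))
```

Placing big services first avoids the situation where small services fragment the edge capacity and force latency-critical workloads back to the cloud.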
Article
An Approach for Stego-Insider Detection Based on a Hybrid NoSQL Database
J. Sens. Actuator Netw. 2021, 10(2), 25; https://doi.org/10.3390/jsan10020025 - 30 Mar 2021
Cited by 7 | Viewed by 1731
Abstract
One of the sources of information security threats in organizations is the insider activity of their employees. It is a major challenge to detect stego-insiders: employees who create stego-channels to secretly receive malicious information and transfer confidential information across the organization’s perimeter. Especially at present, with the great popularity of wireless sensor networks (WSNs) and Internet of Things (IoT) devices, there is a large variety of information that can be gathered and processed by stego-insiders. Consequently, the problem arises of identifying such intruders and their transmission channels. This paper proposes an approach to solving this problem. The paper provides a review of related work on insider models and methods for their identification, including techniques for handling insider attacks in WSNs, as well as methods for embedding and detecting stego-embeddings. This allows us to single out the basic features of stego-insiders, which can be determined from their behavior in the network. To store these attributes of user behavior, including attributes from large-scale WSNs, a hybrid NoSQL database is created based on graph and document-oriented approaches. The algorithms for determining each of the features using the NoSQL database are specified. The general scheme of stego-insider detection is also provided. To confirm the efficiency of the approach, an experiment was carried out on a real network. During the experiment, a database of user behavior was collected. User behavior features were then retrieved from the database using special SQL queries. The results of these queries are analyzed, and their applicability for determining each attribute is justified. Weak points of the approach and ways to improve them are indicated. Full article
(This article belongs to the Special Issue Security Threats and Countermeasures in Cyber-Physical Systems)
Show Figures

Figure 1

Article
QoS Enabled Heterogeneous BLE Mesh Networks
J. Sens. Actuator Netw. 2021, 10(2), 24; https://doi.org/10.3390/jsan10020024 - 28 Mar 2021
Cited by 4 | Viewed by 2362
Abstract
Bluetooth Low Energy (BLE) is a widely known short-range wireless technology used for various Internet of Things (IoT) applications. Recently, with the introduction of BLE mesh networks, this short-range barrier of BLE has been overcome. However, the added advantage of an extended range can come at the cost of a lower performance of these networks in terms of latency, throughput and reliability, as the core operation of BLE mesh is based on advertising and packet flooding. Hence, efficient management of the system is required to achieve a good performance of these networks and a smoother functioning in dense scenarios. As the number of configuration points in a standard mesh network is limited, this paper describes a novel set of standard compliant Quality of Service (QoS) extensions for BLE mesh networks. The resulting QoS features enable better traffic management in the mesh network, providing sufficient redundancy to achieve reliability whilst avoiding unnecessary packet flooding to reduce collisions, as well as the prioritization of certain traffic flows and the ability to control end-to-end latencies. The QoS-based system has been implemented and validated in a small-scale BLE mesh network and compared against a setup without any QoS support. The assessment in a small-scale test setup confirms that applying our QoS features can enhance these types of non-scheduled and random access networks in a significant way. Full article
(This article belongs to the Special Issue Bluetooth Low Energy in Sensor and Actuator Networks)
Show Figures

Figure 1

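Two QoS levers such extensions can expose in a flooding-based mesh are sketched below: a message cache that suppresses duplicate relays (reducing collisions) and per-priority TTLs so critical traffic propagates further. The class, field names and TTL values are invented for illustration, not the standard-compliant extensions the paper defines.

```python
class QosRelay:
    """Relay decision logic for one node in a flooding-based mesh (sketch)."""

    TTL_BY_PRIORITY = {"high": 7, "normal": 4, "low": 2}  # invented values

    def __init__(self):
        self.seen = set()  # cache of message IDs already relayed

    def should_relay(self, msg_id, priority, hops_so_far):
        if msg_id in self.seen:
            return False  # duplicate: relaying again only causes collisions
        self.seen.add(msg_id)
        # higher-priority traffic is allowed to travel more hops
        return hops_so_far < self.TTL_BY_PRIORITY[priority]

relay = QosRelay()
print(relay.should_relay(1, "high", 5))  # True: within high-priority TTL
print(relay.should_relay(1, "high", 5))  # False: duplicate suppressed
print(relay.should_relay(2, "low", 3))   # False: low-priority TTL exceeded
```

Keeping redundancy for high-priority flows while curbing low-priority flooding is the trade-off between reliability and collision avoidance that the abstract describes.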
Article
A Comprehensive Worst Case Bounds Analysis of IEEE 802.15.7
J. Sens. Actuator Netw. 2021, 10(2), 23; https://doi.org/10.3390/jsan10020023 - 26 Mar 2021
Cited by 3 | Viewed by 1591
Abstract
Visible Light Communication (VLC) has been emerging as a promising technology to address the increasingly high data-rate and time-critical demands that the Internet of Things (IoT) and 5G paradigms impose on the underlying Wireless Sensor Actuator Networking (WSAN) technologies. In this line, the IEEE 802.15.7 standard proposes several physical layers and Medium Access Control (MAC) sub-layer mechanisms that support a variety of VLC applications. Particularly, at the MAC sub-layer, it can support contention-free communications using Guaranteed Timeslots (GTS), introducing support for time-critical applications. However, to effectively guarantee accurate usage of such functionalities, it is vital to derive the worst-case bounds of the network. In this paper, we use network calculus to carry out the worst-case bounds analysis for GTS utilization of IEEE 802.15.7 and complement our model with an in-depth performance analysis. We also propose the inclusion of an additional mechanism to improve the overall scalability and effective bandwidth utilization of the network. Full article
(This article belongs to the Special Issue QoS in Wireless Sensor/Actuator Networks)
Show Figures

Figure 1

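The textbook network-calculus result that such an analysis builds on is easy to state: for a token-bucket arrival curve α(t) = b + rt served under a rate-latency curve β(t) = R·max(t − T, 0) with r ≤ R, the worst-case delay is T + b/R and the worst-case backlog is b + rT. The numbers below are illustrative, not the paper's GTS parameters.

```python
def worst_case_bounds(b_bits, r_bps, R_bps, T_s):
    """Classic network-calculus bounds for a (b, r) flow over a
    rate-latency server (R, T); requires r <= R for stability."""
    assert r_bps <= R_bps, "flow must be sustainable by the service rate"
    delay = T_s + b_bits / R_bps       # worst-case delay bound
    backlog = b_bits + r_bps * T_s     # worst-case buffer occupancy
    return delay, backlog

delay, backlog = worst_case_bounds(b_bits=2000, r_bps=5000,
                                   R_bps=20000, T_s=0.01)
print(delay)    # 0.11 s
print(backlog)  # 2050.0 bits
```

The bound makes the abstract's observation concrete: the burst size b enters the delay bound directly, which is why performance varies predominantly with the burst size of the data packets.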