
Special Issue "Sensors, Robots, Internet of Things, and Smart Factories"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensor Networks".

Deadline for manuscript submissions: closed (30 November 2019).

Special Issue Editors

Prof. Dr. Jehn-Ruey Jiang
Guest Editor
Department of Computer Science and Information Engineering, National Central University, Jhongli City 32001, Taiwan
Interests: wireless sensor networks; Internet of Things; cyber-physical systems; smart manufacturing; deep learning
Prof. Dr. Tharek Abd Rahman
Guest Editor
Universiti Teknologi Malaysia, Malaysia
Interests: radio propagation; antenna and RF design; indoor and outdoor wireless communication; 5G communication
Prof. Dr. Haibo Zhang
Guest Editor
Department of Computer Science, University of Otago, Dunedin 9054, New Zealand
Interests: wireless sensor networks; routing protocol design; Internet of Things; cyber-physical systems; 5G networks; vehicular networks

Special Issue Information

Dear Colleagues,

Smart factories play an important role in the Industry 4.0 concept, which has been attracting much attention. With the advance of sensor, robot, and Internet of Things (IoT) technologies, a cyber-physical system (CPS) can be built to gather and analyze sensor data, as well as to intelligently control robots, in order to meet smart factories’ requirements, such as performance optimization, energy efficiency, and dependability. You are invited to contribute research results related to sensors, robots, and IoT for forming smart factories to this Special Issue. Related research includes the embedding of sensors (e.g., vision sensors, vibration sensors, and even RFID sensors/readers) and their applications (e.g., automated optical inspection, AOI), robot coordination and applications (e.g., path planning and navigation of automated guided vehicles, AGVs), networking for industry (e.g., fieldbus networking, industrial wireless networking, and 5G IoT networking), and big data storage and analysis (e.g., big data storing and retrieval, machine learning, and deep learning).

Prof. Dr. Jehn-Ruey Jiang
Prof. Dr. Tharek Abd Rahman
Prof. Dr. Haibo Zhang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Smart sensor data acquisition and fusion
  • RFID sensors/readers and related applications
  • Positioning in wireless networks
  • Fieldbus networking
  • 5G IoT networking
  • Intelligent robot coordination and control  
  • AGV navigation and path planning
  • Time series anomaly detection
  • Predictive and prescriptive maintenance
  • Machine learning/deep learning for smart manufacturing
  • Machine health prognostics (MHP)
  • Automated optical inspection (AOI)

Published Papers (11 papers)


Research

Jump to: Review

Open Access Article
Design and Implementation of a Virtual Sensor Network for Smart Waste Water Monitoring
Sensors 2020, 20(2), 358; https://doi.org/10.3390/s20020358 - 08 Jan 2020
Cited by 3
Abstract
Monitoring and analysis of open-air basins is a critical task in waste water plant management. These tasks generally require sampling waters at several hard-to-access points, either in real time with multiparametric sensor probes or by retrieving water samples. Full automation of these processes would require deploying hundreds (if not thousands) of fixed sensors, unless the sensors can be moved. This work proposes the utilization of robotized unmanned aerial vehicle (UAV) platforms to work as a virtual high-density sensor network, which could analyze in real time or capture samples depending on the robotic UAV equipment. To check the validity of the concept, an instance of the robotized UAV platform has been fully designed and implemented. A multi-agent system approach has been used (implemented over a Robot Operating System, ROS, middleware layer) to define a software architecture able to deal with the different problems, optimizing the modularity of the software; in terms of hardware, the UAV platform has been designed and built, together with a sample-capturing probe. The main features of the proposed multi-agent system, its architecture, and the behavior of several components are discussed. The experimental validation and performance evaluation of the system components have been performed independently for the sake of safety: autonomous flight performance has been tested on-site; the accuracy of the localization technologies deemed as deployable options has been evaluated in controlled flights; and the viability of the sample capture device designed and built has been experimentally tested. Full article
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Time Series Multiple Channel Convolutional Neural Network with Attention-Based Long Short-Term Memory for Predicting Bearing Remaining Useful Life
Sensors 2020, 20(1), 166; https://doi.org/10.3390/s20010166 - 26 Dec 2019
Cited by 6
Abstract
This paper proposes two deep learning methods for remaining useful life (RUL) prediction of bearings. The methods have the advantageous end-to-end property that they take raw data as input and generate the predicted RUL directly. They are TSMC-CNN, which stands for the time series multiple channel convolutional neural network, and TSMC-CNN-ALSTM, which stands for the TSMC-CNN integrated with the attention-based long short-term memory (ALSTM) network. The proposed methods divide a time series into multiple channels and take advantage of the convolutional neural network (CNN), the long short-term memory (LSTM) network, and the attention-based mechanism for boosting performance. The CNN performs well for extracting features from data with multiple channels; dividing a time series into multiple channels helps the CNN extract relationships among far-apart data points. The LSTM network is excellent for processing temporal data; the attention-based mechanism allows the LSTM network to focus on different features at different time steps for better prediction accuracy. PRONOSTIA bearing operation datasets are applied to the proposed methods for the purpose of performance evaluation and comparison. The comparison results show that the proposed methods outperform the others in terms of the mean absolute error (MAE) and the root mean squared error (RMSE) of RUL prediction. Full article
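The channel-splitting step that gives TSMC-CNN its name can be illustrated with a short sketch. This is an illustrative reconstruction, not the authors' code; the interleaved assignment of every n-th point to the same channel is an assumption based on the description above.

```python
def split_into_channels(series, n_channels):
    """Split a 1-D time series into n interleaved channels.

    Placing every n-th point in the same channel shortens the distance
    between originally far-apart points, which is the intuition behind
    feeding multiple channels to a CNN.
    """
    if len(series) % n_channels != 0:
        raise ValueError("series length must be divisible by n_channels")
    return [series[i::n_channels] for i in range(n_channels)]

# A 12-point series split into 3 channels:
channels = split_into_channels(list(range(12)), 3)
print(channels)  # [[0, 3, 6, 9], [1, 4, 7, 10], [2, 5, 8, 11]]
```

Each channel then becomes one input row for the convolutional layers, so a kernel spanning a few rows sees points that were originally several steps apart.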
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Kinematic Modeling of a Combined System of Multiple Mecanum-Wheeled Robots with Velocity Compensation
Sensors 2020, 20(1), 75; https://doi.org/10.3390/s20010075 - 21 Dec 2019
Abstract
In industry, combined configurations composed of multiple Mecanum-wheeled mobile robots are adopted to transport large-scale objects. In this paper, a kinematic model with velocity compensation for the combined mobile system is created, aiming to provide a theoretical kinematic basis for accurate motion control. Motion simulations of a single four-Mecanum-wheeled virtual robot prototype in RecurDyn and motion tests of a physical robot prototype are carried out, and the motions of a variety of combined mobile configurations are also simulated. Motion simulation and test results prove that the kinematic models of single- and multiple-robot combination systems are correct, and that the inverse kinematic correction model with a velocity compensation matrix is feasible. Through simulations or experiments, the velocity compensation coefficients of the robots can be measured and the velocity compensation matrix can be created. This modified inverse kinematic model can effectively reduce the errors of robot motion caused by wheel slippage and improve the motion accuracy of the mobile robot system. Full article
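The inverse kinematic correction described above can be sketched as the textbook four-Mecanum-wheel inverse kinematics followed by a diagonal compensation matrix. The wheel ordering, sign convention, and compensation coefficients below are illustrative assumptions, not values from the paper.

```python
def mecanum_inverse_kinematics(vx, vy, wz, r, l, w, comp=(1.0, 1.0, 1.0, 1.0)):
    """Wheel angular velocities for a four-Mecanum-wheel robot.

    vx, vy: body-frame linear velocities; wz: yaw rate;
    r: wheel radius; l, w: half the wheelbase and half the track width;
    comp: diagonal velocity-compensation coefficients (one per wheel)
    that scale the ideal speeds to counteract measured slippage.
    """
    k = l + w
    ideal = (
        (vx - vy - k * wz) / r,  # front-left
        (vx + vy + k * wz) / r,  # front-right
        (vx + vy - k * wz) / r,  # rear-left
        (vx - vy + k * wz) / r,  # rear-right
    )
    return [c * s for c, s in zip(comp, ideal)]

# Pure forward motion: all four wheels turn at the same speed (20 rad/s here).
print(mecanum_inverse_kinematics(1.0, 0.0, 0.0, r=0.05, l=0.2, w=0.15))
```

In the paper's scheme, the `comp` coefficients would be measured per wheel from simulations or experiments and assembled into the compensation matrix.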
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Novel Framework Based on HOSVD for Ski Goggles Defect Detection and Classification
Sensors 2019, 19(24), 5538; https://doi.org/10.3390/s19245538 - 14 Dec 2019
Abstract
Goggles are an essential part of skiing or snowboarding gear, protecting the eyes from harsh environmental elements and injury. In the ski goggles manufacturing industry, defects, especially on the lens surface, are unavoidable. However, defect detection and classification by visual inspection in the manufacturing process are very difficult. To overcome this problem, a novel framework based on machine vision, named the ski goggles lens defect detection framework, is presented; it uses five high-resolution cameras and a custom-made lighting field to acquire high-quality ski goggles lens images. Next, the defects on the lens of the ski goggles are detected by using parallel projection in opposite directions based on adaptive energy analysis. Before being put into the classification system, the defect images are enhanced by an adaptive method based on high-order singular value decomposition (HOSVD). Finally, dust and five types of defect images are classified into six types, i.e., dust, spotlight (type 1, type 2, type 3), string, and watermark, by using the developed classification algorithm. The defect detection and classification results for the ski goggles lenses are compared to the manufacturer's quality standard. Experiments using 120 ski goggles lens samples collected from the largest manufacturer in Taiwan were conducted to validate the performance of the proposed framework. The defect detection rate is 100% and the classification accuracy rate is 99.3%, while the total running time is short. The results demonstrate that the proposed method is sound and useful for ski goggles lens inspection in industry. Full article
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Use of Thermistor Temperature Sensors for Cyber-Physical System Security
Sensors 2019, 19(18), 3905; https://doi.org/10.3390/s19183905 - 10 Sep 2019
Cited by 6
Abstract
The last few decades have seen a large proliferation in the prevalence of cyber-physical systems. This has been especially highlighted by the explosive growth in the number of Internet of Things (IoT) devices. Unfortunately, the increasing prevalence of these devices has begun to draw the attention of malicious entities which exploit them for their own gain. What makes these devices especially attractive targets is the various resource constraints present in them, which make it difficult to add standard security features. Therefore, one intriguing research direction is creating security solutions out of already-present components such as sensors. Physically Unclonable Functions (PUFs) are one potential solution that use intrinsic variations of the device manufacturing process for provisioning security. In this work, we propose a novel weak PUF design using thermistor temperature sensors. Our design uses the differences in resistance variation between thermistors in response to temperature change. To generate a PUF that is reliable across a range of temperatures, we use a response-generation algorithm that helps mitigate the effects of temperature variation on the thermistors. We tested the performance of our proposed design across a range of environmental operating conditions. From this we were able to evaluate the reliability of the proposed PUF with respect to variations in temperature and humidity. We also evaluated the PUF’s uniqueness using Monte Carlo simulations. Full article
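The abstract does not give the response-generation algorithm itself; pairwise comparison of nominally identical components is a common weak-PUF construction, so the sketch below uses it purely as an assumption to show the general idea.

```python
def puf_response(resistances):
    """Derive a response bit-string from thermistor resistances by
    pairwise comparison: split the list into disjoint pairs and emit
    a 1 when the first element of the pair is larger.

    Comparing same-batch thermistors cancels much of the common
    temperature drift, which is the usual motivation for this scheme.
    """
    bits = []
    for a, b in zip(resistances[0::2], resistances[1::2]):
        bits.append('1' if a > b else '0')
    return ''.join(bits)

# Six nominally identical 10 kΩ thermistors with manufacturing variation:
print(puf_response([10012, 9987, 9950, 10033, 10081, 10002]))  # '101'
```

The response depends only on relative orderings fixed by manufacturing variation, which is what makes it usable as a device fingerprint.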
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Bandwidth-Aware Traffic Sensing in Vehicular Networks with Mobile Edge Computing
Sensors 2019, 19(16), 3547; https://doi.org/10.3390/s19163547 - 14 Aug 2019
Cited by 1
Abstract
Traffic sensing is one of the promising applications to guarantee safe and efficient traffic systems in vehicular networks. However, due to the unique characteristics of vehicular networks, such as limited wireless bandwidth and the dynamic mobility of vehicles, traffic sensing often faces high estimation error, since the collected traffic data have missing elements, as well as excessive communication cost between terminal users and the central server. Hence, this paper investigates the traffic sensing system in vehicular networks with mobile edge computing (MEC), where each MEC server enables traffic data collection and recovery in its local server. On this basis, we formulate the bandwidth-constrained traffic sensing (BCTS) problem, aiming at minimizing the estimation error based on the collected traffic data. To tackle the BCTS problem, we first propose the bandwidth-aware data collection (BDC) algorithm to select the optimal uploaded traffic data by evaluating the priority of each road segment covered by the MEC server. Then, we propose the convex-based data recovery (CDR) algorithm to minimize estimation error by transforming the BCTS into an l2-norm minimization problem. Finally, we implement the simulation model and conduct a performance evaluation. The comprehensive simulation results verify the superiority of the proposed algorithm. Full article
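The BDC step can be sketched as a greedy selection: upload the highest-priority road segments until the bandwidth budget runs out. The tuple layout, priority values, and greedy rule here are assumptions for illustration; the paper's actual priority evaluation is not given in the abstract.

```python
def bandwidth_aware_collection(segments, budget):
    """Greedy sketch of bandwidth-aware data collection (BDC):
    pick road segments in decreasing priority until the upload
    budget is exhausted.

    segments: list of (segment_id, priority, data_size) tuples;
    budget: available uplink bandwidth in the same units as data_size.
    """
    chosen = []
    for seg_id, _prio, size in sorted(segments, key=lambda s: -s[1]):
        if size <= budget:
            chosen.append(seg_id)
            budget -= size
    return chosen

segments = [("s1", 0.9, 4), ("s2", 0.7, 3), ("s3", 0.8, 5), ("s4", 0.2, 1)]
print(bandwidth_aware_collection(segments, budget=8))  # ['s1', 's2', 's4']
```

Note how `s3` is skipped despite its high priority because it no longer fits the remaining budget; the CDR step would then recover the missing segments from the uploaded ones.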
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Deep CNN for Indoor Localization in IoT-Sensor Systems
Sensors 2019, 19(14), 3127; https://doi.org/10.3390/s19143127 - 15 Jul 2019
Cited by 7
Abstract
Currently, indoor localization is among the most challenging issues related to the Internet of Things (IoT). Most of the state-of-the-art indoor localization solutions require high computational complexity to achieve a satisfying localization accuracy and do not respect the memory limitations of IoT devices. In this paper, we develop a localization framework that shifts the online prediction complexity to an offline preprocessing step, based on Convolutional Neural Networks (CNN). Motivated by the outstanding performance of such networks in the image classification field, the indoor localization problem is formulated as 3D radio image-based region recognition. It aims to localize a sensor node accurately by determining its location region. 3D radio images are constructed based on Received Signal Strength Indicator (RSSI) fingerprints. The simulation results justify the choice of the different parameters, optimization algorithms, and model architectures used. Considering the trade-off between localization accuracy and computational complexity, our proposed method outperforms other popular approaches. Full article
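The construction of a 3D radio image from RSSI fingerprints might look like the following sketch, where each access point contributes one channel of a small image fed to a CNN region classifier. The grid layout, normalization range, and function name are assumptions, not the paper's actual preprocessing.

```python
def rssi_to_radio_image(fingerprint, side, rssi_min=-100.0, rssi_max=0.0):
    """Sketch of turning an RSSI fingerprint into a 3-D 'radio image'.

    fingerprint: {ap_id: list of side*side RSSI readings (dBm)}.
    Each access point becomes one channel; readings are normalized to
    [0, 1] and reshaped into a side x side grid, so the result has
    shape (n_aps, side, side), ready for a CNN region classifier.
    """
    image = []
    for ap_id in sorted(fingerprint):
        readings = fingerprint[ap_id]
        if len(readings) != side * side:
            raise ValueError("each AP needs side*side readings")
        norm = [(r - rssi_min) / (rssi_max - rssi_min) for r in readings]
        image.append([norm[row * side:(row + 1) * side] for row in range(side)])
    return image

img = rssi_to_radio_image({"ap1": [-50, -60, -70, -80]}, side=2)
print(img)  # [[[0.5, 0.4], [0.3, 0.2]]]
```

Casting localization as image classification is what lets the heavy CNN training happen offline, leaving only a cheap forward pass on the device.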
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
RGB-D-Based Pose Estimation of Workpieces with Semantic Segmentation and Point Cloud Registration
Sensors 2019, 19(8), 1873; https://doi.org/10.3390/s19081873 - 19 Apr 2019
Cited by 7
Abstract
As an important part of a factory’s automated production line, industrial robots can perform a variety of tasks by integrating external sensors. Among these tasks, grasping scattered workpieces on the industrial assembly line has always been a prominent and difficult point in robot manipulation research. By using RGB-D (color and depth) information, we propose an efficient and practical solution that fuses the approaches of semantic segmentation and point cloud registration to perform object recognition and pose estimation. Unlike objects in an indoor environment, workpieces have relatively simple characteristics; thus, we create and label an RGB image dataset from a variety of industrial scenarios and train the modified FCN (Fully Convolutional Network) on this homemade dataset to infer the semantic segmentation results of the input images. Then, we determine the point cloud of the workpieces by incorporating the depth information to estimate the real-time pose of the workpieces. To evaluate the accuracy of the solution, we propose a novel pose error evaluation method based on the robot vision system. This method does not rely on expensive measuring equipment and can still obtain accurate evaluation results. In an industrial scenario, our solution has a rotation error of less than two degrees and a translation error of less than 10 mm. Full article
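The pose error metrics reported above (rotation error in degrees, translation error in millimetres) can be computed from an estimated pose and a reference pose as follows. This is a generic sketch of such a metric, not the paper's evaluation code.

```python
import math

def pose_errors(r_est, t_est, r_gt, t_gt):
    """Rotation error (degrees) and translation error between an
    estimated pose (r_est, t_est) and a reference pose (r_gt, t_gt).

    r_*: 3x3 rotation matrices as nested lists; t_*: translation vectors.
    The rotation error is the angle of the relative rotation
    R_gt^T @ R_est, recovered from its trace.
    """
    # trace(R_gt^T @ R_est) equals the elementwise (Frobenius) inner product
    trace = sum(r_gt[i][j] * r_est[i][j] for i in range(3) for j in range(3))
    cos_angle = max(-1.0, min(1.0, (trace - 1.0) / 2.0))  # clamp for safety
    rot_err_deg = math.degrees(math.acos(cos_angle))
    trans_err = math.dist(t_est, t_gt)
    return rot_err_deg, trans_err

# A 5-degree yaw offset and an 8 mm translation offset:
theta = math.radians(5)
rz = [[math.cos(theta), -math.sin(theta), 0.0],
      [math.sin(theta), math.cos(theta), 0.0],
      [0.0, 0.0, 1.0]]
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
rot, trans = pose_errors(rz, [0.0, 0.0, 8.0], identity, [0.0, 0.0, 0.0])
print(round(rot, 1), trans)  # 5.0 8.0
```

Clamping the cosine before `acos` guards against floating-point values marginally outside [-1, 1] when the two rotations are nearly identical.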
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
PlantTalk: A Smartphone-Based Intelligent Hydroponic Plant Box
Sensors 2019, 19(8), 1763; https://doi.org/10.3390/s19081763 - 12 Apr 2019
Cited by 7
Abstract
This paper proposes an IoT-based intelligent hydroponic plant factory solution called PlantTalk. The novelty of our approach is that the PlantTalk intelligence can be built through an arbitrary smartphone. We show that PlantTalk can flexibly configure the connections of various plant sensors and actuators through a smartphone. One can also conveniently write Python programs for plant-care intelligence through the smartphone. The developed plant-care intelligence includes automatic LED lighting, water spraying, water pumping, and so on. As an example, we show that the PlantTalk intelligence effectively lowers the CO2 concentration, with a reduction speed 53% faster than that of a traditional plant system. PlantTalk has been extended for a plant factory application called AgriTalk. Full article
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Open Access Article
Energy and Distance-Aware Hopping Sensor Relocation for Wireless Sensor Networks
Sensors 2019, 19(7), 1567; https://doi.org/10.3390/s19071567 - 01 Apr 2019
Cited by 7
Abstract
Recent advances in big data technology for collecting and analyzing large amounts of valuable data have attracted a lot of attention. When information from non-reachable areas is required, IoT wireless sensor network technologies have to be applied. Sensors fundamentally have energy limitations, and it is almost impossible to replace energy-depleted sensors that have been deployed in an inaccessible region. Therefore, moving healthy sensors into a sensing hole can recover the faulty sensor area. On rough surfaces, hopping sensors are more appropriate than wheel-driven mobile sensors. Sensor relocation algorithms to recover sensing holes have been researched in various ways in the past. However, the majority of studies to date have been impractical, since they are theoretical studies that assume the entire network topology is known and then compute the shortest path based on this unrealistic knowledge. In this paper, we propose a distributed hopping sensor relocation protocol. The protocol considers the movement capability of hopping sensors when recovering sensing holes and is not limited to the shortest-path strategy. Finally, a performance analysis using OMNeT++ demonstrates the superiority of the proposed protocol. Full article
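The core decision in such a protocol is which redundant sensor should hop to the hole. The linear energy/distance scoring below is a hypothetical illustration of an "energy and distance-aware" trade-off; the weights, tuple layout, and function name are not from the paper.

```python
import math

def select_relocation_sensor(candidates, hole, w_energy=0.6, w_dist=0.4):
    """Pick which redundant hopping sensor should move to a sensing hole.

    candidates: list of (sensor_id, residual_energy, position) tuples;
    hole: (x, y) position of the sensing hole. A sensor scores higher
    with more residual energy and a shorter travel distance; the linear
    weighting is an illustrative assumption, not the paper's formula.
    """
    def score(c):
        _sid, energy, (x, y) = c
        dist = math.hypot(x - hole[0], y - hole[1])
        return w_energy * energy - w_dist * dist

    return max(candidates, key=score)[0]

candidates = [("a", 80.0, (0.0, 0.0)),
              ("b", 95.0, (30.0, 40.0)),
              ("c", 60.0, (3.0, 4.0))]
print(select_relocation_sensor(candidates, hole=(0.0, 0.0)))  # 'a'
```

Sensor "b" has the most energy but sits 50 units away, so the nearby sensor "a" wins; in a distributed protocol each cluster would evaluate such a score locally instead of relying on global topology knowledge.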
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)

Review

Jump to: Research

Open Access Feature Paper Review
Routing Protocols for Low Power and Lossy Networks in Internet of Things Applications
Sensors 2019, 19(9), 2144; https://doi.org/10.3390/s19092144 - 09 May 2019
Cited by 23
Abstract
The emergence of the Internet of Things (IoT) and its applications has attracted the attention of many researchers. In an effort to provide interoperability and IPv6 support for IoT devices, the Internet Engineering Task Force (IETF) proposed the 6LoWPAN stack. However, the particularities and hardware limitations of networks associated with IoT devices lead to several challenges, mainly for routing protocols. In its stack proposal, the IETF standardized RPL (IPv6 Routing Protocol for Low-Power and Lossy Networks) as the routing protocol for Low-power and Lossy Networks (LLNs). RPL is a tree-based proactive routing protocol that creates acyclic graphs among the nodes to allow data exchange. Although widely considered and used by current applications, different recent studies have shown its limitations and drawbacks. Among these, it is possible to highlight the weak support of mobility and P2P traffic, restrictions for multicast transmissions, and poor adaptation to dynamic throughput. Motivated by the presented issues, several new solutions have emerged during recent years. The approaches range from the consideration of different routing metrics to entirely new solutions inspired by other routing protocols. In this context, this work aims to present an extensive survey study about routing solutions for IoT/LLN, not limited to RPL enhancements. In the course of the paper, the routing requirements of LLNs, the initial protocols, and the most recent approaches are presented. The IoT routing enhancements are divided according to their main objectives and then studied individually to point out their most important strengths and weaknesses. Furthermore, as the main contribution, this study presents a comprehensive discussion about the considered approaches, identifying the remaining open issues and suggesting future directions for new proposals. Full article
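The rank-based parent selection at the heart of RPL can be sketched as follows. This is a heavily simplified, ETX-flavoured illustration in the spirit of MRHOF, not an implementation of RFC 6550; the minHopRankIncrease default of 256 matches the RFC, but the link-cost formula here is an assumption.

```python
def choose_preferred_parent(candidates, min_hop_rank_increase=256):
    """Sketch of RPL preferred-parent selection under an ETX-style
    objective function: a node's rank is its parent's rank plus a
    link-cost increment, and the candidate yielding the lowest rank
    wins. Lower rank means closer (in cost) to the DODAG root.

    candidates: list of (parent_id, parent_rank, link_etx) tuples.
    """
    def resulting_rank(c):
        _pid, parent_rank, etx = c
        return parent_rank + int(etx * min_hop_rank_increase)

    best = min(candidates, key=resulting_rank)
    return best[0], resulting_rank(best)

# Root has rank 256; a lossy short link can still beat a clean long one:
print(choose_preferred_parent([("p1", 256, 1.5), ("p2", 512, 1.0)]))
# ('p1', 640)
```

Because ranks increase monotonically away from the root, this rule is what keeps the resulting graph acyclic, which is the property the survey's RPL discussion builds on.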
(This article belongs to the Special Issue Sensors, Robots, Internet of Things, and Smart Factories)
