Development of a Low Cost and Path-free Autonomous Patrol System Based on Stereo Vision System and Checking Flags
Abstract
1. Introduction
2. System Architecture
2.1. Patrol Robot
2.2. Server
3. Materials and Methods
3.1. Imaging Space Conversion and Patrol Path Determination
- Initially, compare each $z_n$ in $Z$ with the depth threshold $z_{th}$. The $z_n$ that are consecutively larger than $z_{th}$ are selected as a sub-section $S_k$, where $S_k \subseteq Z$.
- Select the largest section $S_k$ of $Z$, and calculate the corresponding real width of the selected section. Assume the selected section is $S_k = \{s_{k,1}, s_{k,2}, \ldots, s_{k,M}\}$, and the width of the corresponding world coordinates is $w_k$; then the width of the section can be estimated from the real-world x-offsets at its two edges as follows: $w_k = x_{(u_{k,M},v)} - x_{(u_{k,1},v)}$, where $u_{k,1}$ and $u_{k,M}$ are the image columns of the first and last elements of $S_k$.
- As long as the calculated width $w_k$ exceeds the width threshold $w_{th}$, this route (the region of the selected section $S_k$) is adopted as a suitable path. The angle from the center of the image plane (the head of the robot) to the center of the path is then calculated to control the rotation of the robot. Assume $u_c$ is the center of the FOV and $u_s$ is the middle of the selected section on the horizontal axis of the image coordinate; then the angle $\theta$ between the head of the robot and the center of the desired path can be calculated as follows: $\theta = \tan^{-1}\!\left(x_{(u_s,v)} / z_{(u_s,v)}\right)$, i.e., the lateral offset of the path center divided by its depth.
- If the analysis results fail to provide a viable path, the robot rotates counterclockwise (CCW) in search of another route.
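The path-determination steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `find_patrol_path`, its argument layout, and the returned angle convention are all assumptions; only the thresholding, sub-section grouping, width check, and steering-angle computation follow the steps described.

```python
import math

def find_patrol_path(z_row, x_row, z_th, w_th):
    """Select a passable path from one row of SVS depth data.

    z_row: per-column minimum depth of the ROI (the z_n values)
    x_row: per-column real-world x-offset from the SVS center
    z_th:  depth threshold for a candidate path
    w_th:  minimum real-world width of a passable path
    Returns the steering angle (radians) toward the path center,
    or None if no viable path exists (the robot then rotates CCW).
    """
    # Group consecutive columns whose depth exceeds the
    # threshold into sub-sections S_k.
    sections, current = [], []
    for n, z in enumerate(z_row):
        if z > z_th:
            current.append(n)
        elif current:
            sections.append(current)
            current = []
    if current:
        sections.append(current)
    if not sections:
        return None  # no candidate path in view

    # Take the widest sub-section and estimate its real-world
    # width from the x-offsets of its two edges.
    s = max(sections, key=len)
    width = x_row[s[-1]] - x_row[s[0]]
    if width < w_th:
        return None  # too narrow for the robot to pass

    # Angle from the robot's heading to the middle of the section:
    # lateral offset over depth at the section center.
    mid = s[len(s) // 2]
    return math.atan2(x_row[mid], z_row[mid])
```

A row with a wide open region straight ahead yields an angle near zero, so the robot drives straight; an off-center region yields a left or right correction.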
3.2. Signing in at Inspection Points
3.3. Detection and Tracking of Intruders
Algorithm 1 Tracking of intruders. | |
1: | Detect the skeleton of an intruder |
2: | Get the image coordinates and depths of the left and right hip nodes, $(u_{lh}, z_{lh})$ and $(u_{rh}, z_{rh})$ |
3: | Calculate the tracking target, $u_t = (u_{lh} + u_{rh})/2$ and $z_t = (z_{lh} + z_{rh})/2$ |
4: | if $u_t < u_L$ then |
5: | The robot is turned CCW |
6: | else if $u_t > u_R$ then |
7: | The robot is turned CW |
8: | else if $u_L \le u_t \le u_R$ then |
9: | if $z_t > d_f$ then |
10: | The robot moves forward |
11: | else if $z_t < d_n$ then |
12: | The robot moves backward |
13: | else if $d_n \le z_t \le d_f$ then |
14: | The robot remains in the same position |
15: | end if |
16: | end if |
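Algorithm 1 can be sketched as a simple decision function. This is an assumed illustration: the function name `track_intruder`, the string return values, and the parameter order are not from the paper; only the hip-averaging and the center-region/distance-band logic follow the algorithm.

```python
def track_intruder(u_lh, u_rh, z_lh, z_rh, u_left, u_right, d_near, d_far):
    """Decide a tracking action from the intruder's hip nodes.

    u_lh, u_rh: left/right hip columns on the image's horizontal axis
    z_lh, z_rh: depths of the left/right hip nodes
    u_left, u_right: edges of the center region of the image plane
    d_near, d_far: near/far thresholds of the tracking length
    """
    u_t = (u_lh + u_rh) / 2          # tracking target column
    z_t = (z_lh + z_rh) / 2          # tracking target distance

    if u_t < u_left:                 # target left of the center region
        return "turn CCW"
    if u_t > u_right:                # target right of the center region
        return "turn CW"
    if z_t > d_far:                  # centered but too far away
        return "move forward"
    if z_t < d_near:                 # centered but too close
        return "move backward"
    return "hold position"           # centered and within the band
```

The robot first centers the target horizontally, then regulates its distance so the intruder stays within the tracking band.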
3.4. Wireless Backup and Query Service
4. Experiment and Results
- From the start point until 21 s, the robot searched for a passable path and followed the resulting path.
- At 31 s, the robot detected the 1st inspection point marker and began approaching it.
- At 44 s, the robot reached the 1st inspection point. The RFID tag number was recorded by the robot, thereby completing the 1st inspection point sign-in.
- From 44 s to 54 s, the robot rotated CCW in search of its next route to follow.
- From 54 s to 74 s, the robot moved from the 1st to the 2nd inspection point.
- At 84 s, the robot located the 2nd inspection point. The center of the marker image was not located in the target region of the image plane (between 310 and 330 pixels); therefore, the robot rotated until the image marker was near the center of the image plane (at 94 s), as shown in the figure.
- At 94 s, the robot proceeded forward to sign in at this inspection point at 101 s, whereupon it rotated and began patrolling from the 2nd inspection point to the 3rd inspection point (109 s to 139 s).
- The 3rd and 4th inspection points were identified at 159 s and 203 s, respectively. After checking in at the final (4th) inspection point, the robot continued patrolling for approximately 30 s before halting.
- At 11 s and 129 s, the center of the section was near the origin of the polar space, which means that the robot was moving toward the center of the patrol path.
- Between 21 s and 31 s, the robot was approaching the 1st inspection point, as indicated by the decrease in recorded depth.
- While the robot was signing in at the 1st inspection point (at 44 s), the distance between the wall and the robot was less than the minimum distance required by the SVS for detection. Thus, no depth information appears in the polar space during this time. Similarly, the depth values disappeared at 94 s, 159 s, and 203 s, corresponding to the sign-in process at the 2nd, 3rd, and 4th inspection points.
- After signing in, the robot rotated CCW to search for a suitable path, whereupon the depth values in the polar space increased considerably from the left side to the right side between 44 s and 64 s. Similar patterns appear in the records from 94 s to 119 s, 159 s to 173 s, and 193 s to 216 s.
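The marker-centering behavior observed in the experiment (rotating until the marker image lies in the 310–330 pixel target region, then driving forward to sign in) can be sketched as follows. The function name, return strings, and the choice of CCW for a marker left of the target region are assumptions for illustration; the 310–330 pixel target region is taken from the experiment above.

```python
def approach_marker(u_o, u_left=310, u_right=330):
    """Steer toward an inspection-point marker.

    u_o: detected circle center of the marker on the image's
         horizontal axis (pixels)
    u_left, u_right: edges of the target region of the image plane
    """
    if u_o < u_left:
        return "rotate CCW"   # marker left of the target region
    if u_o > u_right:
        return "rotate CW"    # marker right of the target region
    return "move forward"     # marker centered: approach and sign in
```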
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Nomenclature
$x_{(u,v)}$ | Distance from the center of the SVS on the x-axis (width) at pixel $(u,v)$ |
$y_{(u,v)}$ | Distance from the center of the SVS on the y-axis (height) at pixel $(u,v)$ |
$z_{(u,v)}$ | Distance from the center of the SVS on the z-axis (depth) at pixel $(u,v)$ |
$z_{th}$ | Threshold of depth used to determine a possible path |
$z_n$ | Minimum depth at the n-th column of the ROI |
$Z$ | Set of $z_n$ |
$S_k$ | k-th set of $z_n$ that are consecutively larger than $z_{th}$, $S_k \subseteq Z$ |
$s_{k,m}$ | m-th element of $S_k$ |
$w_{th}$ | Threshold of width used to determine a possible path |
$w_k$ | Width of the range of a selected $S_k$ in the real world |
$u_c$ | Center of the FOV on the horizontal axis of the image coordinate |
$u_s$ | Middle of the selected section on the horizontal axis of the image coordinate |
$u_o$ | Detected circle center on the horizontal axis of the image coordinate |
$u_L$ | Left edge of the center region on the horizontal axis of the image coordinate |
$u_R$ | Right edge of the center region on the horizontal axis of the image coordinate |
$u_{lh}$ | Left hip node on the horizontal axis of the image coordinate |
$u_{rh}$ | Right hip node on the horizontal axis of the image coordinate |
$z_{lh}$ | Depth of the left hip node |
$z_{rh}$ | Depth of the right hip node |
$u_t$ | Center between the left and right hip nodes on the horizontal axis of the image coordinate |
$z_t$ | Distance from the center between the left and right hip nodes to the SVS |
$d_n$ | Near threshold value of the tracking length |
$d_f$ | Far threshold value of the tracking length |
Inspection Points Placed | Starting Time (hh:mm:ss) | Stopping Time (hh:mm:ss) | Elapsed Time (s) | Average Time (s)
---|---|---|---|---
Yes | 10:53:55 | 10:57:17 | 202 | 206
| 15:22:02 | 15:25:48 | 226 |
| 17:40:33 | 17:43:55 | 202 |
| 18:11:19 | 18:14:33 | 194 |
No | 13:41:57 | 13:44:40 | 163 | 205.75
| 16:38:24 | 16:41:56 | 212 |
| 16:54:04 | 16:57:12 | 188 |
| 09:24:43 | 09:29:03 | 260 |
Proposed System | Other Solutions (e.g., Security Camera) | |
---|---|---|
Price | V | X |
Path Randomness | V | X |
Mobility | V | X |
Convenience | X | V |
Security and Privacy | V | X |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lan, C.-W.; Chang, C.-Y. Development of a Low Cost and Path-free Autonomous Patrol System Based on Stereo Vision System and Checking Flags. Appl. Sci. 2020, 10, 974. https://doi.org/10.3390/app10030974