Effective Waterline Detection of Unmanned Surface Vehicles Based on Optical Images
Abstract

Real-time, accurate detection of the navigable water area is essential for unmanned surface vehicle (USV) systems. Although several methods use optical images for USV-oriented environmental modeling, the robustness and precision of published waterline detection methods remain comparatively low for a real USV moving through a complicated environment. This paper proposes an efficient waterline detection method based on structure extraction and texture analysis of optical images and presents a practical application to a USV system for validation. First, the basic principles of local binary patterns (LBPs) and the gray-level co-occurrence matrix (GLCM) were analyzed, and their advantages were integrated to calculate the texture information of river images. Then, structure extraction was introduced to preprocess the original river images so that textures caused by USV motion, wind, and illumination were removed. In the practical application, the waterlines in many images captured by the USV system moving along an inland river were detected with the proposed method, and the results were compared with those of edge detection and superpixel segmentation. The experimental results showed that the proposed algorithm is effective and robust: the average error of the proposed method was 1.84 pixels, and the mean square deviation was 4.57 pixels.
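The texture step described above combines LBP and GLCM descriptors. As a rough illustration only, not the authors' implementation, the following NumPy sketch computes a basic 3x3 LBP code image and a normalized horizontal-offset GLCM; the function names, the fixed (0, 1) offset, and the neighbor ordering are assumptions for the sketch.

```python
import numpy as np

def lbp(image):
    """Basic 3x3 local binary pattern: each interior pixel gets an 8-bit
    code, one bit per neighbor, set when the neighbor >= the center."""
    img = np.asarray(image, dtype=np.int32)
    c = img[1:-1, 1:-1]  # interior pixels (centers)
    # 8 neighbors in clockwise order starting at top-left (assumed ordering)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        h, w = img.shape
        n = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]  # shifted neighbor view
        code |= (n >= c).astype(np.int32) << bit
    return code

def glcm(image, levels):
    """Gray-level co-occurrence matrix for horizontally adjacent pixel
    pairs (offset (0, 1)), normalized to a joint probability."""
    img = np.asarray(image)
    m = np.zeros((levels, levels), dtype=np.float64)
    a = img[:, :-1].ravel()  # left pixel of each horizontal pair
    b = img[:, 1:].ravel()   # right pixel of each horizontal pair
    np.add.at(m, (a, b), 1)  # accumulate co-occurrence counts
    return m / m.sum()
```

In a pipeline like the one the abstract outlines, scalar texture features (e.g. GLCM contrast or energy, or an LBP-code histogram) would then be computed per image patch and used to separate water from bank regions.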
Wei, Y.; Zhang, Y. Effective Waterline Detection of Unmanned Surface Vehicles Based on Optical Images. Sensors 2016, 16, 1590.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.