We propose a robust approach to detecting and tracking moving objects for a naval unmanned aircraft system (UAS) landing on an aircraft carrier. The frame difference algorithm follows a simple principle and achieves real-time tracking, whereas Faster Region-based Convolutional Neural Network (Faster R-CNN) offers highly precise detection. We therefore combine Faster R-CNN with the frame difference method and demonstrate that the joint approach exhibits robust, real-time detection and tracking performance. In our UAS landing experiments, two cameras placed on either side of the runway capture the moving UAS. When the UAS first appears, the joint algorithm uses frame difference to detect the moving target. As soon as Faster R-CNN accurately detects the UAS, detection priority is handed to Faster R-CNN. In this manner, we also perform motion segmentation and object detection under environmental changes such as illumination variation or "walking persons" near the runway. By combining the two algorithms, we accurately detect and track objects with a tracking accuracy of up to 99% at a frame rate of up to 40 frames per second. Thus, a solid foundation is laid for subsequent landing guidance.
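The switching logic described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes grayscale NumPy frames, a hypothetical per-pixel difference threshold, and illustrative function names; the actual Faster R-CNN detector is stubbed out as an optional bounding box.

```python
import numpy as np

def frame_difference_mask(prev_frame, curr_frame, threshold=25):
    """Binary mask of pixels that changed between two grayscale frames.

    The threshold value is an illustrative assumption, not from the paper.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def bounding_box(mask):
    """Bounding box (x_min, y_min, x_max, y_max) of nonzero mask pixels, or None."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

def fuse_detections(fd_box, rcnn_box):
    """Joint rule from the abstract: once Faster R-CNN yields a detection,
    it takes priority; otherwise fall back to the frame-difference box."""
    return rcnn_box if rcnn_box is not None else fd_box

# Toy example: a small bright target appears between two frames.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:14, 20:24] = 200  # moving target region

fd_box = bounding_box(frame_difference_mask(prev, curr))
print(fuse_detections(fd_box, rcnn_box=None))   # (20, 10, 23, 13)
print(fuse_detections(fd_box, (19, 9, 24, 14))) # (19, 9, 24, 14)
```

In this fusion rule, frame difference supplies a fast coarse detection until the detector locks on, matching the priority hand-off described above.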