Open Access Article

Correlation Filter-Based Visual Tracking for UAV with Online Multi-Feature Learning

1 School of Mechanical Engineering, Tongji University, Shanghai 201804, China
2 School of Automotive Studies, Tongji University, Shanghai 201804, China
* Authors to whom correspondence should be addressed.
Remote Sens. 2019, 11(5), 549; https://doi.org/10.3390/rs11050549
Received: 20 February 2019 / Revised: 26 February 2019 / Accepted: 28 February 2019 / Published: 6 March 2019
(This article belongs to the Special Issue Trends in UAV Remote Sensing Applications)

Abstract

In this paper, a novel online learning-based tracker is presented for unmanned aerial vehicle (UAV) tracking applications such as pedestrian following, automotive chasing, and building inspection. Instead of relying only on the histogram of oriented gradient (HOG) feature, the presented tracker uses additional features, i.e., intensity, color names, and saliency, to represent both the tracked object and its background information in a background-aware correlation filter (BACF) framework. In other words, four different voters, each combining one of the aforementioned features with the BACF framework, locate the object independently. After the response maps generated by these voters are obtained, a new strategy is proposed to fuse them effectively. In this response map fusion strategy, the peak-to-sidelobe ratio (PSR), which measures the peak strength of a response, is used to weight each response map, thereby suppressing noise in each response and improving the final fused map. The fused response map is then used to accurately locate the object. Qualitative and quantitative experiments on 123 challenging UAV image sequences, i.e., UAV123, show that the novel tracking approach, i.e., the OMFL tracker, performs favorably against 13 state-of-the-art trackers in terms of accuracy, robustness, and efficiency. In addition, the multi-feature learning approach improves object tracking performance compared to single-feature learning methods applied in the literature.
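
For illustration only, the following sketch shows one plausible form of the PSR-weighted response map fusion described above. It is a minimal Python example, not the authors' implementation: the sidelobe exclusion window, the weight normalization, and the function names (peak_to_sidelobe_ratio, fuse_responses) are assumptions introduced here.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude_radius=5):
    # PSR = (peak - mean(sidelobe)) / std(sidelobe), where the sidelobe is the
    # response map with a small window around the peak excluded.
    # NOTE: the window radius of 5 is an assumed value, not taken from the paper.
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]

    mask = np.ones_like(response, dtype=bool)
    r0, c0 = max(peak_idx[0] - exclude_radius, 0), max(peak_idx[1] - exclude_radius, 0)
    r1, c1 = peak_idx[0] + exclude_radius + 1, peak_idx[1] + exclude_radius + 1
    mask[r0:r1, c0:c1] = False

    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def fuse_responses(responses):
    # Weight each per-feature response map by its normalized PSR and sum them,
    # then take the peak of the fused map as the estimated object location.
    psrs = np.array([peak_to_sidelobe_ratio(r) for r in responses])
    weights = psrs / psrs.sum()
    fused = sum(w * r for w, r in zip(weights, responses))
    return fused, np.unravel_index(np.argmax(fused), fused.shape)

# Example usage with four synthetic response maps (standing in for the HOG,
# intensity, color names, and saliency voters of a BACF-style framework):
rng = np.random.default_rng(0)
maps = [rng.random((50, 50)) for _ in range(4)]
fused_map, location = fuse_responses(maps)
```

Weighting by PSR in this way gives more influence to voters whose response has a sharp, unambiguous peak and down-weights noisy responses, which matches the intuition stated in the abstract; the exact weighting scheme used in the paper may differ.
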
Keywords: visual tracking; unmanned aerial vehicle (UAV); background-aware correlation filter; online multi-feature learning; peak-to-sidelobe ratio (PSR); response map fusion
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Fu, C.; Lin, F.; Li, Y.; Chen, G. Correlation Filter-Based Visual Tracking for UAV with Online Multi-Feature Learning. Remote Sens. 2019, 11, 549.
