Open Access Article

Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos

Department of Geomatics Engineering, University of Calgary, 2500 University Drive NW, Calgary, AB T2N 1N4, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(18), 2155; https://doi.org/10.3390/rs11182155
Received: 29 July 2019 / Revised: 2 September 2019 / Accepted: 11 September 2019 / Published: 16 September 2019
(This article belongs to the Special Issue Trends in UAV Remote Sensing Applications)
Along with the advancement of lightweight sensing and processing technologies, unmanned aerial vehicles (UAVs) have recently become popular platforms for intelligent traffic monitoring and control. UAV-mounted cameras can capture traffic-flow videos from various perspectives, providing comprehensive insight into road conditions. To analyze traffic flow from remotely captured videos, a reliable and accurate vehicle detection-and-tracking approach is required. In this paper, we propose a deep-learning framework for vehicle detection and tracking from UAV videos for monitoring traffic flow in complex road structures. The approach is designed to be invariant to the significant orientation and scale variations present in such videos. Detection is performed by fine-tuning a state-of-the-art object detector, You Only Look Once (YOLOv3), on several custom-labeled traffic datasets. Vehicle tracking follows a tracking-by-detection paradigm, in which deep appearance features are used for vehicle re-identification and Kalman filtering is used for motion estimation. The proposed methodology is tested on a variety of real videos collected by UAVs under various conditions, e.g., in late afternoon with long vehicle shadows, at dawn with vehicle lights on, over roundabouts and interchange roads where vehicle directions change considerably, and from various viewpoints where vehicles' appearance undergoes substantial perspective distortion. The proposed tracking-by-detection approach runs efficiently at 11 frames per second on color videos of 2720p resolution. Experiments demonstrated that high detection accuracy can be achieved, with an average F1-score of 92.1%. In addition, the tracking technique performs accurately, with an average multiple-object tracking accuracy (MOTA) of 81.3%. The proposed approach also addresses a shortcoming of the state of the art in multi-object tracking, namely frequent identity switching, resulting in only one identity switch for every 305 tracked vehicles.
Keywords: traffic monitoring; vehicle detection; multi-vehicle tracking; vehicle re-identification; unmanned aerial vehicles; deep convolutional neural network
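
The tracking-by-detection pipeline described in the abstract combines per-frame detections, appearance-based re-identification, and Kalman-filter motion prediction. The sketch below is a minimal, hypothetical illustration of one frame's data-association step, not the authors' implementation: the vehicle centroids and L2-normalised appearance embeddings are assumed to come from an external detector (e.g., a fine-tuned YOLOv3) and a re-identification network, and the matching uses the Hungarian algorithm on cosine distances.

```python
# Minimal sketch of a tracking-by-detection update step (assumption: centroids and
# L2-normalised appearance embeddings are provided by external detector/re-ID models).
import numpy as np
from scipy.optimize import linear_sum_assignment


class Track:
    """One tracked vehicle: constant-velocity Kalman state plus an appearance feature."""

    def __init__(self, track_id, centroid, feature):
        self.id = track_id
        self.state = np.array([centroid[0], centroid[1], 0.0, 0.0])  # [x, y, vx, vy]
        self.P = np.eye(4)                                           # state covariance
        self.feature = feature / np.linalg.norm(feature)

    def predict(self, dt=1.0):
        # Constant-velocity motion model: x' = x + vx*dt, y' = y + vy*dt.
        F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.state = F @ self.state
        self.P = F @ self.P @ F.T + 0.01 * np.eye(4)

    def update(self, centroid, feature, alpha=0.9):
        # Standard Kalman correction using the detected centroid as the measurement.
        H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        S = H @ self.P @ H.T + np.eye(2)
        K = self.P @ H.T @ np.linalg.inv(S)
        self.state = self.state + K @ (np.asarray(centroid, float) - H @ self.state)
        self.P = (np.eye(4) - K @ H) @ self.P
        # Smooth the appearance feature so re-identification tolerates viewpoint changes.
        self.feature = alpha * self.feature + (1 - alpha) * feature
        self.feature /= np.linalg.norm(self.feature)


def associate(tracks, features, max_cost=0.4):
    """Match detections to tracks by cosine distance between appearance embeddings."""
    if not tracks or len(features) == 0:
        return [], list(range(len(features)))
    cost = np.array([[1.0 - t.feature @ f for f in features] for t in tracks])
    rows, cols = linear_sum_assignment(cost)             # Hungarian assignment
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
    unmatched = [j for j in range(len(features)) if j not in {c for _, c in matches}]
    return matches, unmatched


def step(tracks, centroids, features, next_id):
    """Advance all tracks by one frame given the current frame's detections."""
    for t in tracks:
        t.predict()
    matches, unmatched = associate(tracks, features)
    for r, c in matches:
        tracks[r].update(centroids[c], features[c])
    for c in unmatched:                                   # start a new track per unmatched vehicle
        tracks.append(Track(next_id, centroids[c], features[c]))
        next_id += 1
    return tracks, next_id
```

In a full tracker, the association cost would typically also incorporate motion cues (e.g., distance from the Kalman prediction) and track-management logic (tentative and lost states), which is what keeps identity switches rare over long UAV sequences.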

MDPI and ACS Style

Wang, J.; Simeonova, S.; Shahbazi, M. Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos. Remote Sens. 2019, 11, 2155. https://doi.org/10.3390/rs11182155

AMA Style

Wang J, Simeonova S, Shahbazi M. Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos. Remote Sensing. 2019; 11(18):2155. https://doi.org/10.3390/rs11182155

Chicago/Turabian Style

Wang, Jie, Sandra Simeonova, and Mozhdeh Shahbazi. 2019. "Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos" Remote Sensing 11, no. 18: 2155. https://doi.org/10.3390/rs11182155
