Open Access Article

Drone-Action: An Outdoor Recorded Drone Video Dataset for Action Recognition

1 School of Engineering, University of South Australia, Adelaide, SA 5095, Australia
2 Joint and Operations Analysis Division, Defence Science and Technology Group, Melbourne, VIC 3207, Australia
* Author to whom correspondence should be addressed.
Drones 2019, 3(4), 82; https://doi.org/10.3390/drones3040082
Received: 6 November 2019 / Revised: 23 November 2019 / Accepted: 24 November 2019 / Published: 28 November 2019
Aerial human action recognition is an emerging topic in drone applications. Commercial drone platforms capable of detecting basic human actions such as hand gestures have been developed. However, a limited number of aerial video datasets are available to support increased research into aerial human action analysis. Most existing datasets are confined to indoor scenes or object tracking, and many outdoor datasets lack sufficient human body detail to apply state-of-the-art machine learning techniques. To fill this gap and enable research in wider application areas, we present an action recognition dataset recorded in an outdoor setting. A free-flying drone was used to record 13 dynamic human actions. The dataset contains 240 high-definition video clips consisting of 66,919 frames. All of the videos were recorded at low altitude and low speed to capture maximum human pose detail at relatively high resolution. This dataset should be useful to many research areas, including action recognition, surveillance, situational awareness, and gait analysis. To provide a baseline, we evaluated the dataset with a pose-based convolutional neural network (P-CNN) and high-level pose feature (HLPF) descriptors. The overall baseline action recognition accuracy obtained using P-CNN was 75.92%.
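For readers who want to explore a dataset of this kind programmatically, the minimal sketch below indexes clips by action class and tallies frame counts with OpenCV. The directory layout (one sub-folder per action class under a drone_action/ root) and the .mp4 extension are assumptions made for illustration; the abstract does not specify the dataset's on-disk structure.

```python
# Minimal sketch: index an action-recognition video dataset and count frames.
# Assumes (hypothetically) a layout of drone_action/<action_class>/<clip>.mp4.
from pathlib import Path

import cv2  # OpenCV, used here only to read per-clip frame counts


def index_clips(root: str) -> dict[str, list[Path]]:
    """Map each action-class folder name to the video clips it contains."""
    dataset = {}
    for class_dir in sorted(Path(root).iterdir()):
        if class_dir.is_dir():
            dataset[class_dir.name] = sorted(class_dir.glob("*.mp4"))
    return dataset


def clip_frame_count(clip_path: Path) -> int:
    """Return the number of frames in a clip, or 0 if it cannot be opened."""
    capture = cv2.VideoCapture(str(clip_path))
    if not capture.isOpened():
        return 0
    count = int(capture.get(cv2.CAP_PROP_FRAME_COUNT))
    capture.release()
    return count


if __name__ == "__main__":
    clips_by_action = index_clips("drone_action")  # hypothetical root folder
    total_clips = sum(len(clips) for clips in clips_by_action.values())
    total_frames = sum(
        clip_frame_count(p) for clips in clips_by_action.values() for p in clips
    )
    print(f"{len(clips_by_action)} action classes, "
          f"{total_clips} clips, {total_frames} frames")
```

Run against the 240-clip dataset described above, a script along these lines would be expected to report 13 action classes and 66,919 frames in total, which is a quick sanity check before training or evaluation.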
Keywords: drone; dataset; human action recognition; aerial video analysis; P-CNN

Perera, A.G.; Law, Y.W.; Chahl, J. Drone-Action: An Outdoor Recorded Drone Video Dataset for Action Recognition. Drones 2019, 3, 82.
