Review

Urban Traffic Monitoring and Analysis Using Unmanned Aerial Vehicles (UAVs): A Systematic Literature Review

by Eugen Valentin Butilă and Răzvan Gabriel Boboc *
Department of Automotive and Transport Engineering, Transilvania University of Brasov, RO-500036 Brasov, Romania
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(3), 620; https://doi.org/10.3390/rs14030620
Submission received: 30 December 2021 / Revised: 23 January 2022 / Accepted: 25 January 2022 / Published: 27 January 2022

Abstract

Unmanned aerial vehicles (UAVs) are gaining considerable interest in transportation engineering for monitoring and analyzing traffic. This systematic review surveys the scientific contributions on the application of UAVs in civil engineering, especially those related to traffic monitoring. Following the PRISMA framework, 34 papers were identified in five scientific databases. First, this paper introduces previous work in this field. The selected papers were then analyzed, and conclusions were drawn to complement the findings. It can be stated that this is still a field in its infancy and that progress in advanced image processing techniques and in the technologies used in the construction of UAVs will lead to a rapid growth in the number of applications, which will result in increased benefits for society by reducing unpleasant situations, such as congestion and collisions, in major urban centers of the world.

1. Introduction

Unmanned aerial vehicles (UAVs), commonly known as drones, are gaining considerable interest in applications such as surveillance, mapping, and remote sensing [1]. The growing interest in the use of UAVs is based on many factors, including the low cost of acquiring these systems, the availability of trained operators, the low risk to human life, and their ease of use. Due to these advantages, as well as their good resolution and tracking capabilities, they are increasingly being used in more and more fields.
After they were first used in geomatics applications, providing alternatives to classical photogrammetry [2] and 3D mapping [3] to present data in a suitable format for architects and engineers [4], UAVs became commonly used tools for data acquisition. Through photogrammetry techniques and remote sensing, structure from motion (SfM) applications allow for the creation of 3D models of different objects, buildings, or areas [5].
In the last few years, UAVs have found applications in the field of civil engineering, especially in transportation engineering, in order to supervise and monitor traffic [6]. The main benefit of traffic monitoring with UAVs is that they can be deployed to many different places where, for example, a local council may want to gather information on the use of infrastructure, such as roads, bridges, train tracks, and so on. The same applies to monitoring people and animals for conservation purposes [7,8], and, more recently, UAVs have been used to combat the coronavirus disease (COVID-19) pandemic [9]. Because of their mobility, traffic monitoring UAVs are able to collect high-resolution data, which can then be analyzed in real time. The results can then be displayed or printed, and in some cases sent to a central server or cloud for further analysis.
The growth in traffic volume and in global travel makes traffic monitoring a problem of interest and a major challenge in many countries around the world. In this context, UAVs are expected to be an emerging solution to this challenge [10]. The bird's-eye view provided by UAV cameras improves on the traditional methodologies used in traffic monitoring [11], but the recognition and tracking of moving vehicles still remains a challenging problem, depending on the accuracy of image registration methods [12].
The use of UAVs for monitoring purposes is a relatively new field that requires the development and validation of new solutions [13]. UAVs represent a potential solution to support many aspects of existing traffic monitoring systems, such as surveillance and collision avoidance [14]. In a similar way, UAVs have been applied to the monitoring of environmental parameters, e.g., air pollution, land surface temperature, flood risk, forest fire, road surface distress, land terrain monitoring, etc. [15,16,17,18,19,20], but also to pedestrian traffic monitoring and disaster evacuation [21,22,23].
Currently, UAVs are very vulnerable to adverse weather conditions, such as wind, fog, and rain [24]. However, there have been significant efforts to improve the robustness of the systems using different types of sensors. For instance, GPS-enabled UAVs have been shown to provide a reasonable degree of robustness and accuracy in challenging environments [25]. Moreover, the use of inertial measurement units (IMUs) for UAV stabilization has received significant attention and has been applied to different types of UAVs including quadcopters [26].
In traffic monitoring, precision is needed to collect and send real-time vehicle data to traffic processing centers for efficient traffic management. This is of special significance to cities where traffic and road conditions are monitored every day. In most cases, wireless sensors are deployed on the road and connected to each other via wireless communication networks to obtain real-time traffic data within the intelligent transportation system (ITS) [27]. In addition, in the case of vehicle monitoring, it is necessary to identify the speed, distance, and current location of the vehicle. In this context, UAVs have significant advantages in traffic information collection because they provide a global perspective of the road and they can obtain traffic parameters that cannot be extracted by conventional monitoring methods [28].
Traffic monitoring represents a challenge not only for police and traffic authority departments, but also for individual drivers. There is great potential in UAVs for assisting drivers in a variety of traffic-related applications, including safety, incident detection, and vehicle tracking [11]. Some problems that can be addressed with UAVs in the future are traffic congestion [29,30], collision avoidance [31], safety analysis [32], and roundabout flow analysis [33]. Driver assistance can be provided via UAV-to-car communication [34].
The aim of this paper is to analyze the main applications regarding the use of UAVs in traffic monitoring. A systematic literature review was conducted for this purpose and the results provide a base for future research and development in this field. The study also highlights the current surveys related to the use of UAVs in the civil engineering field.

2. Related Work

Research on the use of UAVs in civil engineering related to transportation is relatively limited, including several literature reviews that summarize a wide range of applications (Table 1). These studies address the following topics:
  • In [35], a review of optimization approaches for drone operations and drone–truck combined operations in civil applications is provided. The study presents drone operations and applications, some previous works, and issues such as mathematical models, solution methods, and synchronization between a drone and a truck, and also suggests some possible research directions.
  • The recent advances of UAVs and their roles in current and future transportation systems are presented in [10]. The paper summarizes the emerging technologies of UAV in transportation, highlighting performance measures, network and communications, software architecture, privacy, and security concerns. The challenges and opportunities of integrating UAVs in ITS are discussed and some potential research directions are identified in the paper.
  • In [36], a literature review of 111 publications related to the use of civil drones for transportation is provided. The focus is on passenger transportation drones, but applications from the urban and transportation planning fields are also reviewed. Potential problems are identified, and proposed solutions are given for different areas of application.
  • Emerging issues in civilian UAV usage and case studies for various fields are presented in [37], a review article that tries to analyze the potential implementations of drones in the economic system and how these implementations can be managed.
  • The state of the art of UAV for geomatics applications is reported in [3]. The survey gives an overview of different UAV platforms, also presenting various applications, approaches, and perspectives for UAV image processing.
  • Ref. [38] provides an extensive review of optimization approaches for the civil application of UAVs. The study addresses different aspects related to UAV operation, such as area coverage, search operations, routing, data gathering and recharging, communication links, and computing power.
  • In [11], the applications of UAVs in three domains of transportation (road safety, traffic monitoring, and highway infrastructure management) are reviewed. The paper discusses topics related to vision algorithms and image processing systems used in accident investigation, traffic flow analysis, and road monitoring.
  • An overview of advances in the vision-based condition assessment of civil infrastructure, civil infrastructure inspection, and monitoring applications is presented in [39]. The study reviews relevant findings in computer vision, machine learning, and structural engineering, highlighting some key challenges and concluding with ongoing work.
  • Another study [40] presents the research on using UAVs for vehicle detection by means of deep learning techniques. The work is focused on accuracy improvements and computation overhead reduction, showing similarities and differences of various techniques.
  • A comprehensive study focused on UAV civil applications and their challenges is presented in [12]. Research trends, key challenges related to charging, collision avoidance, networking and security, and future insights are featured in the paper.
  • In [41], a critical review of UAV remote sensing data processing and its applications is performed, focusing on land-cover classification and change detection and discussing potential improvements and algorithmic aspects.
Table 1. Related review papers on UAV applications in civil engineering.
No. | Ref. | Title | Year | Journal | Application Domain
1 | [35] | Optimization for drone and drone-truck combined operations: A review of the state of the art and future directions | 2020 | Computers and Operations Research | civil applications including construction/infrastructure, agriculture, transportation/logistics, security/disaster management, entertainment/media, etc.
2 | [10] | Advances of UAVs toward Future Transportation: The State-of-the-Art, Challenges, and Opportunities | 2021 | Future Transportation | transportation sector: surveillance, urban planning, traffic monitoring, emergency response, road maintenance and safety, warehouse inventory management, UAV delivery, disaster management, search and rescue
3 | [36] | Drones for parcel and passenger transportation: A literature review | 2020 | Transportation Research Interdisciplinary Perspectives | safety and security, environment and sustainability, urban planning and infrastructure
4 | [37] | Managing the drone revolution: A systematic literature review into the current use of airborne drones and future strategic directions for their effective control | 2020 | Journal of Air Transport Management | monitoring, inspection and data collection, photography/image collection, recreation, logistics
5 | [3] | UAV for 3D mapping applications: a review | 2014 | Applied Geomatics | archeological site 3D recording and modeling, geological and mining studies, urban areas
6 | [38] | Optimization approaches for civil applications of unmanned aerial vehicles (UAVs) or aerial drones: A survey | 2018 | Networks | agriculture, environmental protection and disaster management, rescue, transport, infrastructure and construction, air traffic management, manufacturing, traffic surveillance, telecommunications, entertainment and media
7 | [11] | Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges | 2020 | Transportation Research Part A | road safety, traffic monitoring and highway infrastructure management
8 | [39] | Advances in Computer Vision-Based Civil Infrastructure Inspection and Monitoring | 2019 | Engineering | inspection, monitoring
9 | [40] | A survey of deep learning techniques for vehicle detection from UAV images | 2021 | Journal of Systems Architecture | traffic management: vehicle detection
10 | [12] | Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges | 2019 | IEEE Access | search and rescue, remote sensing, construction and infrastructure inspection, precision agriculture, delivery of goods, real-time monitoring of road traffic, surveillance, providing wireless coverage
11 | [41] | Unmanned Aerial Vehicle for Remote Sensing Applications—A Review | 2019 | Remote Sensing | precision agriculture and vegetation, urban environment and management, disaster, hazards and rescue
Despite the diversity of UAV analyses in transportation-related areas, less attention has been paid to advances in traffic monitoring techniques using UAV data, which are only briefly addressed in the studies presented above. Thus, to our knowledge, there is no overview of the acquisition and processing of data received from UAVs in traffic monitoring applications in urban areas. Only one slightly older conference paper deals exclusively with this topic, presenting the advantages and disadvantages of various research efforts at universities and research centers [42]. This paper is therefore a first attempt at a review that strictly addresses this topic and opens the door to further research on drone monitoring applications that use various detection algorithms. Although many articles have investigated the use of UAVs in various fields (mining [43], architecture and urbanism [44], glacial and periglacial geomorphology [45], agriculture [46], geology [5], forest regeneration [47], water monitoring [48], etc.), there is not yet a study that exclusively summarizes the applications of UAVs in urban traffic monitoring and analysis.
The broad uptake of drones across applications can be seen in the large number of review studies that systematize the work in various fields related to the latest technologies, such as: path planning techniques [49], computer vision algorithms [50], application of blockchain [51], swarm communication and routing protocols [52], configurations and flight mechanisms [53], optical remote sensing applications [54], communication and networking [55], regulation policies and technologies [56], mobile edge computing for Internet of Things (IoT) applications [57], photogrammetry and remote sensing [58], deep learning approaches for road extraction [59], and advances toward future transportation. As shown in [49], the share of UAVs in the transportation system is expected to reach 81% by the year 2022.

3. Materials and Methods

A systematic review of the literature covering relevant research over the last 10 years was performed. The papers were selected according to the recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (Salameh, 2020).

3.1. Protocol and Registration

The methods and the hypothesis of the review were prepared a priori, but they were not registered on PROSPERO.

3.2. Eligibility Criteria

The papers were selected according to the following inclusion criteria: articles addressing UAVs with a focus on traffic monitoring or traffic analysis; articles published in English; articles published in peer-reviewed journals; articles published from 2010 onwards; research articles.
The exclusion criteria were the following: duplicate articles; articles addressing the use of UAV in contexts other than car traffic; articles published in languages other than English; articles published before 2010; conference papers, book sections, editorial letters; reviews, conceptual papers. Articles that focused on simulations instead of using real-world data were also excluded.
Although they may provide interesting and valuable works, publications that did not meet these criteria were not included in the study in order to ensure a high-quality standard of investigation.

3.3. Information Sources

The research was carried out on five electronic databases: Scopus, Web of Science, Science Direct, IEEE Xplore, and Springer. The filtering facilities provided by the electronic databases were used to identify the items according to the eligibility criteria presented above. The search was performed on 25 August 2021.

3.4. Search

Search terms included were: UAV, unmanned aerial vehicle, uncrewed aerial vehicle, drone, unmanned aerial system, traffic, transport, flow, road, analysis, monitoring, surveillance, management, observation, vehicle detection, congestion, urban, city, intersection. The keywords were combined with Boolean operators according to the search possibilities provided by each database.
The results obtained were exported to EndNote (Clarivate™, Philadelphia, PA, USA). Using this software, the duplicates were removed and an initial screening of titles and abstracts followed to extract relevant studies.
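As an illustration of how such a Boolean combination might look, the sketch below assembles a Scopus-style query string from the listed keywords; it is a hypothetical reconstruction for clarity, not the exact query submitted to any of the databases.

```python
# Hypothetical example of combining the listed keywords with Boolean operators
# (illustrative only; the exact strings differed per database).
uav_terms = ["UAV", "unmanned aerial vehicle", "uncrewed aerial vehicle", "drone", "unmanned aerial system"]
traffic_terms = ["traffic", "transport", "road", "vehicle detection", "congestion"]
task_terms = ["monitoring", "surveillance", "analysis", "management", "observation"]
scope_terms = ["urban", "city", "intersection"]

def or_group(terms):
    """Join a list of keywords into a parenthesized OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = "TITLE-ABS-KEY(" + " AND ".join(
    or_group(g) for g in (uav_terms, traffic_terms, task_terms, scope_terms)
) + ")"
print(query)
```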
Figure 1 shows the author keyword visualization related to UAV use for traffic monitoring. It was obtained using the VOSviewer software, and it can be observed that several clusters are formed around the author keywords: aerial vehicle, drone, traffic, image, and communication.

3.5. Study Selection

The authors of this article (R.G.B. and E.V.B.) performed the search and selection of papers to be included in the study. When disagreements arose between the two reviewers, they were resolved by consensus. Papers were included in this review if they were relevant to a UAV system used for traffic surveillance purposes.

3.6. Data Extraction

Data extraction was performed independently by the two reviewers (R.G.B. and E.V.B.), and disagreements were also resolved by consensus. The following data were extracted from each study: author, publication year, country, paper objective, UAV type, camera resolution, flying height, software technique, urban area, outcomes, vehicle type, main findings, and future work. This information was added to Microsoft Excel (Microsoft, Redmond, WA, USA) for further analysis.

4. Results

4.1. Study Selection

A flow-chart diagram showing the selection process according to the PRISMA guidelines is presented in Figure 2. A total of 2557 articles were found, of which 191 were duplicates. After their removal, 2366 studies were screened by title and abstract, and 2278 were excluded because they were not relevant to our study. The remaining 88 papers were selected for full-text screening. Of these articles, 54 were excluded (no full text available: 2; magazine article: 1; review article: 3; not relevant to the study: 48). Finally, 34 papers were considered eligible to be included in the review.

4.2. Study Characteristics

The review process identified a total of 34 studies: Ahmed et al., 2021 [60], Apeltauer et al., 2015 [61], Balamuralidhar et al., 2021 [62], Barmpounakis et al., 2018 [63], Barmpounakis et al., 2019 [64], Barmpounakis and Geroliminis 2020 [65], Brkić et al., 2020 [66], Chen et al., 2019 [67], Chen et al., 2021 [68], Guido et al., 2016 [69], Javadi et al., 2021 [70], Kang and Mattyus 2015 [71], Kaufmann et al., 2018 [72], Ke et al., 2017 [73], Khan et al., 2017 [74], Khan et al., 2018 [75], Khan et al., 2020 [76], Kujawski and Dudek 2021 [77], Li et al., 2019 [78], Li et al., 2020 [79], Liu and Zhang 2021 [28], Luo et al., 2020 [80], Moranduzzo and Melgani 2014 [81], Shan et al., 2021 [82], Wan et al., 2019 [83], Wang et al., 2016a [84], Wang et al., 2016b [85], Wang et al., 2019 [86], Wang et al., 2019 [87], Xing et al., 2020a [88], Xing et al., 2020b [89], Xu et al., 2016 [90], Zhu et al., 2018a [91], Zhu et al., 2018b [92].
The quantitative analysis of the publications is presented in Figure 3, where the distribution of papers in terms of journal, publication year, and country is shown. This analysis was realized online [93]. As can be seen, the analyzed articles were published in top-ranking journals such as Automation in Construction (AC), Transportation Research: Part A (TR_A), the IEEE Internet of Things (IoT) Journal, and so on. Most of the identified articles were published in Remote Sensing (six studies), followed by Accident Analysis & Prevention (AAP) (four studies), IEEE Access (two studies), IEEE Transactions on Intelligent Transportation Systems (two studies), and Transportation Research: Part C (two studies).
Regarding the year of publication, there is a growing trend from 2014 to 2021. This is to be expected given the continuous growth of the UAV market [94]. Further, the country where the experiment was conducted or where the research center is located was considered. As can be seen, the country that dominates overwhelmingly in terms of the number of publications on this topic is China, with 16 studies, followed by Greece with three studies and Belgium, Germany, and Italy with two studies each. For one of the studies, even though the authors belong to an institution in Switzerland, the experiment described was performed in Athens, so Greece was considered the host country.

4.3. Synthesis of Results

The synthesis of the results is provided in Table 2, where some significant data are extracted, and Table A1, where objective findings and future work for each study are summarized.

4.4. Main Purpose of the Study

The selected works were classified according to their main purpose into two main categories, as can be seen in Table 3: traffic analysis and traffic monitoring. For each of these categories, the basic objective of the study was extracted, and several subcategories could be identified. As can be observed, most studies in the first category address issues related to vehicle trajectory extraction, traffic parameter estimation, congestion analysis, or conflict evaluation. The category of articles referring to traffic monitoring includes works that mainly address vehicle detection, vehicle tracking, or vehicle collision detection.
The articles were classified into the two categories, taking into account the following criteria: in the traffic monitoring category, studies that focused only on the identification and/or tracking of vehicles in traffic, often in real time, were included. In the traffic analysis category, studies that present a more detailed analysis of certain traffic parameters, such as traffic density estimation, recognizing vehicle behavior, or assessing the risk of collisions, were included. In most cases, traffic analysis studies include elements of traffic monitoring: vehicles are first detected and tracked, and then further analyses are performed. The analysis, visualization, and interpretation of data obtained from UAV cameras require intelligent processing systems [77]. Thus, for the trajectory extraction of multiple vehicles, several steps are required: preprocessing, stabilization, georegistration, vehicle detection and tracking, and trajectory management [74]. The first four steps are usually common to both traffic monitoring and traffic analysis applications.
There is a variety of techniques implemented in the analyzed studies that focus on vehicle detection, tracking, and/or extraction of traffic parameters. The vast majority of efforts used UAVs to record a certain area and then extract significant information from the videos. The information can be extracted manually [60], semi-automatically [63], or fully automatically [28]. For vehicle detection, some of the works used conventional computer vision techniques focused on feature extraction, such as interest point detection (Shi–Tomasi features) [73], scale-invariant feature transform (SIFT) [67,70], histogram of oriented gradients (HOG) features [58,76], local binary patterns (LBP) [49], the Viola–Jones object detection scheme [58,76], and Haar-like features [56], together with classifiers such as the support vector machine (SVM) [81], the AdaBoost classifier [49,58,76], or k-means clustering [70]. Moreover, among fully automatic tracking techniques, traditional motion-based methods can be identified, e.g., optical flow (e.g., the Kanade–Lucas algorithm) [73,74,75,84,86], background subtraction [61,74,75,77,80], particle filters [28,61,83], correlation filters [68], and Kalman filters [65,75,78,82,87,92].
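As a minimal sketch of the conventional motion-based approach described above, assuming an already stabilized and geo-registered clip and using OpenCV's MOG2 background subtractor as one representative technique, vehicle candidates could be extracted per frame as follows; the file name and thresholds are placeholders rather than values taken from any of the reviewed studies.

```python
# Illustrative sketch only: background-subtraction-based vehicle candidate
# extraction on a stabilized aerial video (OpenCV).
import cv2

cap = cv2.VideoCapture("stabilized_aerial_clip.mp4")  # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                      # foreground (moving) pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,       # remove small noise blobs
                            cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 150:                    # skip blobs too small to be vehicles
            continue
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:                            # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In a complete pipeline, the resulting bounding-box centroids would typically be passed to a tracker such as a Kalman or particle filter to link detections into trajectories.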
In addition to these classic methods, deep learning-based methods have been developed using two-stage detectors such as deep neural networks (DNN) [70], RetinaNet [91], convolutional neural networks (CNN) [28,79], Faster R-CNN [66], and fully connected neural networks (fcNN) [70], or one-stage detectors: You Only Look Once (YOLO) [28,70,78,82,87] and the single shot multibox detector (SSD) [92]. These studies showed that deep learning-based methods are more effective than traditional computer vision techniques in traffic video analysis [92]. There are several well-defined steps that researchers follow to detect and track moving vehicles and their trajectories: pre-processing, stabilization, geo-registration, vehicle detection and tracking, and trajectory management [75].
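As an illustration of the one-stage detector family mentioned above, the sketch below runs a pretrained YOLO model on a single aerial frame, assuming the open-source ultralytics package and a generic COCO checkpoint; this is not the setup of any specific reviewed paper, only a hedged example of how such a detector is typically invoked.

```python
# Illustrative sketch only: one-stage deep learning detection of vehicles
# in an aerial frame, using an assumed pretrained model.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")               # generic pretrained COCO weights (placeholder choice)
VEHICLE_CLASSES = {"car", "bus", "truck", "motorcycle"}

results = model("aerial_frame.jpg")      # hypothetical aerial image
for r in results:
    for box in r.boxes:
        label = model.names[int(box.cls)]
        if label in VEHICLE_CLASSES:     # keep only vehicle-like classes
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            print(f"{label}: conf={float(box.conf):.2f}, bbox=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```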
Regarding the variables that were taken into account for the evaluation of the proposed systems, some authors used parameters such as speed [60,64,78,85], traffic density [66,73], vehicle counts [77,92], and vehicle trajectories [80,91], as well as parameters related to the performance of the developed method: precision [79], accuracy [81], F1 score [87], and correctness, completeness, and quality [84,90]. In the vast majority of studies, there is no distinction between the types of vehicles identified, but in some of them, vehicles are classified into various categories, such as cars, buses, trucks, and motorbikes, and pedestrians are even detected in several studies.
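For reference, the precision, recall, and F1 metrics named above follow directly from true-positive, false-positive, and false-negative counts; the short sketch below computes them with invented counts purely for illustration.

```python
# Minimal sketch of the detection metrics mentioned above (counts are invented).
def precision_recall_f1(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

print(precision_recall_f1(tp=180, fp=12, fn=20))  # approx. (0.94, 0.90, 0.92)
```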
The drones used for data acquisition are of various types, the most used being produced by DJI Technology Co., Ltd., Shenzhen, China, especially the Phantom models (Phantom 2, 3, and 4 were used in seven studies), Inspire 1 (three studies), Mavic Pro (two studies), and Matrice 100 (two studies). Thirteen studies did not mention the UAV model. The flying height varies from 50 to 281 m, but this parameter is reported in only a few studies. The resolution of the images varies from 960 × 540 at 24 frames per second (fps) to 5184 × 3456.
The objective of the studies, the main findings, and the future work for each selected study are presented in Table A1 from Appendix A.

5. Discussion

The applications related to traffic monitoring and analysis identified in the literature review include different techniques for vehicle detection and tracking and for the estimation or extraction of different traffic parameters. The variety of approaches can be divided into two categories: conventional machine vision techniques and deep learning machine vision techniques [95]. The conventional motion-based methods use traditional machine learning and computer vision techniques to detect and track vehicles, e.g., background subtraction, optical flow, blob analysis [74], histogram of oriented gradients (HOG), Haar-like features, speeded-up robust features (SURF), and so on. The most recent techniques are based on deep learning, and it has been shown that they outperform the traditional ones, providing better feature representation and processing time [70]. There are many object detection methods based on deep learning, but they are often divided into one-stage and two-stage detectors [87]. While one-stage detectors predict objects directly from the image, two-stage detectors first use techniques such as a region proposal network (RPN) to propose regions of interest (ROI) containing potential objects [66]. YOLO and SSD are one-stage CNN detectors [79], while R-CNN, Fast R-CNN, Faster R-CNN, and Mask R-CNN are two-stage detectors [70]. Even if two-stage detectors are more advanced, they require high hardware performance [62]. Moreover, it was shown that YOLO v3 outperforms R-CNN and runs significantly faster [78].
In the following, some aspects related to the types of UAVs used for traffic monitoring purposes will be discussed. Depending on the construction of the flying mechanism, UAVs can be classified into fixed-wing, rotary-wing, and hybrid UAVs [57]. Fixed-wing UAVs were prevalent for traffic monitoring applications a few years ago [75], but today small rotary-wing UAVs are preferred [11] due to the fact that they are low-cost and require less experience and training [64]. The first type of UAV has some advantages, e.g., increased flight endurance [53], faster travel and the ability to carry heavier payloads [51], and the ability to fly along linear distances [62], but they are larger in size and depend on an airfield for take-off [96]. Rotary-wing UAVs are lighter in weight, capable of vertical take-off and landing (VTOL), provide significant advantages in enclosed or constrained environments [43], and can hover and get very close to objectives [62], providing very high spatial resolution [64]. On the other hand, they have less mobility and consume more power [51]. A compromise between these types is the hybrid fixed/rotary-wing UAV, which can provide operation modes for both high-speed and low-speed flight, including hovering [80]. However, all of the types presented are limited by climate factors, the presence of physical obstructions, and instrumental or legal factors [97]. More details, classifications, and characteristics of UAVs can be found in [50]. In the analyzed studies, the rotary-wing type predominates, especially quadcopter DJI drones (Phantom 2, Phantom 3, Phantom 4, Mavic Pro, Inspire 1 Pro, Matrice 100), the Argus-One quadcopter, and hexacopters (see Table 3). This is consistent with the findings of a study showing that rotary-wing UAVs are more efficient for use in urban environments [98].
Another important issue when it comes to using UAVs for traffic monitoring in urban environments is the safety of UAVs with respect to ground vehicles. The certification of UAV operation is regulated by authorities in each state in order to avoid situations such as crashing into pedestrians or buildings, collision with other aircraft, or disturbance [87]. With the spread of drones and their types, the risk of accidents also increases [35], and that is why strict rules are needed to control UAV operations and avoid their unsafe and unnecessary use. Cooperation between all authorities is of great importance to ensure the uniformity of regulations [3]. In some countries, the operation of UAVs can be performed only if the operator holds a certificate recognized by the Federal Aviation Administration (FAA) [99]. In order to ensure the safety of UAVs and to reduce the risk of collisions, in most cases adequate separation from people, buildings, and traffic is sufficient. Thus, some countries have imposed on UAV operators well-defined minimum distances (e.g., 30 or 50 m) between drones and any person or structure [100]. However, this separation is difficult to achieve in urban environments, and different solutions have been proposed for this problem: an air tunnel designed for the movement of UAVs in areas of transport infrastructure facilities [101], risk maps that define the risk associated with accidents [102], and safe landing systems able to identify obstacles [103] or landing zones [104]. The various techniques for safe landing zone detection are reviewed in [96].
In most of the cases in the analyzed literature, a single UAV was used for urban traffic monitoring, capturing portions of roads [60,66,72,85], intersections (one intersection [64,74,75], two [67], five [92], or ten intersections [86]), roundabouts [61,69], a toll plaza area [88,89], and so on. A single paper presented a large-scale field experiment, using observations taken by a swarm of 10 UAVs over a large congested area covering 1.3 km2 with around 100 busy intersections [65]. It is obvious that a collaborative formation of UAVs can provide faster, more effective, and more flexible monitoring [55]. A performance comparison of single- and multi-UAV systems is provided in [52]. Moreover, different solutions for traffic monitoring and management using multiple cooperative UAVs have been proposed [105,106]. However, there are also limitations of multi-cooperative UAV systems, such as 'blind' gaps, as can be seen in [76].
Low-altitude traffic management should also be taken into account when developing systems for monitoring traffic in urban areas, since the flight environment in these areas is becoming increasingly complex as UAVs proliferate. Progress has been made in this regard as well, through the development of public air route networks for UAVs [107] based on aerial corridor systems [108] or airspace geofencing volumization algorithms to support unmanned aircraft management of low-altitude airspace [109]. Another solution is represented by a multilayer network of nodes and airways [110]. Nevertheless, these aspects are not discussed in the papers selected for this analysis because they are strictly focused on describing the algorithms for traffic monitoring. As stated in [82], the requirements for real-time traffic management and control have generated broad attention in the field of traffic monitoring, and new frameworks for low-altitude UAV systems have been developed in many countries.
In the field of urban traffic monitoring, appropriate spatial and temporal resolutions are required to capture details related to three-dimensional traffic. Spatial resolution determines the quality of an image, i.e., the smallest pixel area that can be identified [111]. It represents a key aspect for the determination of traffic flow parameters [66]. Since UAVs fly at lower altitudes, they can achieve high spatial resolutions [64]. For instance, in [61] the spatial resolution is 10.5 cm, in [66] it is 13 cm, and in [81] it is 2 cm. Compared to satellite remote sensing, which can achieve resolutions of up to 0.3 m [112], UAVs provide ultra-high spatial resolutions at the cm level. Research on the evaluation of the impact of spatial resolution on the classification of vegetation types is provided in [113]. Another study, on post-fire mapping and vegetation recovery, highlights the advantages of UAV-based systems compared to satellite-collected imagery in terms of spatial and temporal resolutions [114]. Temporal resolution is also of great importance for remote-sensing applications in urban environments because of their dynamic nature. Traffic monitoring and analysis must be performed promptly and consistently. UAV platforms are efficient in this regard, providing appropriately high temporal resolutions [112,115]. However, in the analyzed studies, these parameters were reported to a very small extent.
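A back-of-the-envelope ground sampling distance (GSD) calculation illustrates why low flying altitudes translate into cm-level spatial resolution. The sketch below is illustrative only: the camera parameters are assumed values typical of a small consumer drone, not figures reported in the reviewed studies.

```python
# Illustrative GSD calculation under assumed camera parameters.
def gsd_cm(flight_height_m: float, pixel_size_um: float, focal_length_mm: float) -> float:
    """Ground footprint of one image pixel, in centimetres."""
    return (flight_height_m * pixel_size_um * 1e-6) / (focal_length_mm * 1e-3) * 100

# e.g., 100 m flying height, 2.4 um pixel pitch, 8.8 mm focal length -> about 2.7 cm per pixel
print(gsd_cm(flight_height_m=100, pixel_size_um=2.4, focal_length_mm=8.8))
```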
Finally, some issues related to the different laws and regulations of airspace management will be addressed. In light of the growing number of UAVs, countries around the world are trying to develop policies and means to control the operations of low-altitude aircraft, to ensure their safety and that of the environment in which they fly. The main regulations are related to the controlled use of airspace, operational limitations, and administrative procedures [10]. There are three categories of measures for the management of low-altitude UAVs: the registration of flight activity, the limitation of the maximum flight height for different types of UAVs, and the designation of areas for different flight activities [107]. As an example, the Federal Aviation Administration of the USA specifies that the maximum allowable altitude is 400 ft (about 122 m) above the ground [116]. The same maximum height is stated by the European Aviation Safety Agency (EASA) [117]. A review of the maximum flying height for different countries is provided in [56]. Regarding the zones where UAVs are allowed to fly, there are different legal requirements defined by each country. For instance, in Belgium and the Netherlands, the use of UAVs above crowds of people or urban areas is restricted, and in Canada, UAVs must not approach closer than 120 m to people, animals, buildings, or vehicles [118]. In Romania, the Romanian Civil Aviation Authority (AACR) provides that a safety distance of 500 m from buildings, people, vehicles, and animals is required [119]. Moreover, in order to take pictures or videos in this country, pre-approval is required from the Ministry of Defense. Other EU regulations and requirements regarding policies and authorizations can be found in [120].
Given these regulations, authorities around the world are trying to find solutions to implement a secure UAV operating environment. At the EU level, a report states that there is a need to develop and validate UAV capabilities in certain key areas, such as urban air mobility, air traffic management, and advanced services and technologies [121]. Since the airspace will become very crowded as multi-purpose UAV applications are developed, it is necessary to implement policies and regulations for the safe, reliable, and efficient use of these flying vehicles; this can be achieved by digitally sharing flight details with traffic management systems, with minimal human intervention [10]. It has become obvious that there is a need for a common system to control the flight of UAVs and large aircraft, referred to by specialists as 'low-altitude airspace management (LAAM) systems' [37]. Moreover, future smart cities must provide the necessary infrastructure for UAV-to-vehicle (UAV-2-V) communication [122], which ultimately has to be adopted in the vehicle-to-everything (V2X) space, involving serious data security issues [85] and issues of processing large volumes of data. A new approach to addressing these issues is proposed in [123], a blockchain-based solution for unmanned traffic management. Beyond the current limitations and barriers of UAVs (such as reduced flight time, legal issues, lack of acceptance, economic barriers, and so on [36]), solutions must be found for optimal route planning, the development of computer vision systems, infrastructure for processing large data sets [124], and UAV positioning algorithms [125]. Some future research directions are provided in [11,12,35,56,57].

6. Conclusions

In this work, we provided an overview of UAV applications for traffic monitoring and analysis. The main conclusions that can be drawn are the following:
  • There is a growing trend in the use of drones to monitor traffic in recent years, with a significant increase in the last three years.
  • China dominates in terms of the number of applications in this field, as well as being the source of the data acquisition equipment (i.e., UAV models).
  • In terms of the construction of flying mechanisms, rotary-wing UAVs were preferred for data collection, especially quadcopters.
  • Various image processing methods were proposed for vehicle detection and tracking, but approaches based on deep learning have been preferred in recent years.
  • Most of the identified studies are based on vehicle detection and tracking techniques, but also on the extraction of vehicle trajectories and the evaluation or prediction of collisions.
  • There is a vast literature on the use of drones in various fields, but there is still much to add on traffic monitoring. This article is part of a series aiming to help researchers and practitioners who contribute to this field.
For future work, we plan to expand the investigation, to include more studies in addition to the current ones, and to analyze in more detail every aspect related to the use of drones in the transportation field. Obviously, this article has limitations, which will be addressed in a future paper.

Author Contributions

Conceptualization, R.G.B. and E.V.B.; methodology, R.G.B.; software, E.V.B.; validation, E.V.B.; formal analysis, R.G.B.; data curation, E.V.B.; writing—original draft preparation, R.G.B.; writing—review and editing, R.G.B.; visualization, E.V.B.; supervision, E.V.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Summary of the purpose, results and future work for the analyzed studies.
Author | Paper Aim | Findings | Future Work
Ahmed et al., 2021 [60]The utilization of a UAV-based geospatial analysis technique for accurate extraction of longitudinal and lateral distances between vehicles to determine the relationship between macroscopic and microscopic parameters of traffic flow.
  • Lateral gaps between vehicles are inversely related to traffic density;
  • Higher heterogeneity and aggressive driver behaviour, which also increases the risk of accidents;
  • A policy framework is needed to reduce the heterogeneity of the traffic stream and induce some discipline in the traffic stream.
  • Studying the relation of traffic mix on the behaviour of fundamental diagrams and drivers’ behaviour.
Apeltauer et al., 2015 [61]A new approach for simultaneous detection and tracking of vehicles moving through an intersection in aerial images acquired by an unmanned aerial vehicle (UAV).
  • The approach showed sufficient performance for automatic extraction of vehicles’ trajectories for further traffic inspection
  • Handling the road junctions with grade separations, as well as using data fusion from more UAVs.
Balamuralidhar et al., 2021 [62]Presentation of a traffic monitoring system that can detect, track, and estimate the velocity of vehicles in a sequence of aerial images. The solution has been optimized to execute these tasks in real-time on an embedded computer installed on an Unmanned Aerial Vehicle (UAV)
  • Vehicle detection model performed 4.8% better than the state-of-the-art algorithms for vehicle detection in aerial images, in terms of accuracy (mAP) while preserving the processing speed;
  • The vehicle detection network showed that it is able to generalize to real-world data, such as the state-of-the-art dataset used.
  • The acquisition of more realistic datasets over traffic scenes, capturing cars moving at high speed and in varying lighting conditions;
  • Different flight heights.
Barmpounakis et al., 2018 [63]Address PTW (Powered Two-Wheeler) overtaking phenomena using a two-step modelling approach based on optimized and meta-optimized decision trees.
  • UAV can contribute substantially towards creating detailed naturalistic trajectory datasets;
  • A detailed dataset is the most important step when it comes to data mining techniques and understanding a phenomenon.
  • The combination of the advanced data gathering tools and ML models that can advance the design of Advanced Driver Assistance Systems (ADAS).
Barmpounakis et al., 2019 [64]Examination of the potential of using UAVs as part of the ITS infrastructure as a way of extracting naturalistic trajectory data from aerial video footage from a low volume four-way intersection and a pedestrian passage.
  • Accuracy is highly dependent on the stabilization of the video and the geo-reference procedure.
  • High accuracy and fast communication protocols are required to send the information back to the ground (for example researchers, traffic centres and managers).
Barmpounakis and Geroliminis 2020 [65]Recording traffic streams in a multi-modal congested environment over an urban setting using UAS that can allow the deep investigation of critical traffic phenomena.
  • Tremendous possibilities for the specific dataset to be shared with the rest of the community;
  • It can be a benchmark dataset for both existing and future modelling approaches for several disciplines.
  • This dataset can be utilized by the whole research community of transportation science and other disciplines, such as Machine Learning or Artificial Intelligence, to study, model and improve traffic congestion.
Brkić et al., 2020 [66]Proposing a new, low-cost framework for the determination of highly accurate traffic flow parameters.
  • The proposed framework provides a simple and accurate method for plotting vehicle trajectories and continuous headway measurements at road sections;
  • Vehicle detection achieved a recall of 0.994 and vehicle speed estimation with a MAPE was 0.92%.
  • Creating large datasets containing labelled vehicles in images from different videos and different contexts.
Chen et al., 2019 [67]Assessing how simulation can be utilized for vehicle-pedestrian conflict assessment at crosswalks. Empirical models have been established to represent the stochastic behaviour of right-turning vehicles and pedestrians under different geometric layouts and operational conditions at signalized intersections.
  • The simulation model can reasonably represent the frequency and severity of conflict occurrence at signalized crosswalks;
  • Large dimensions and turning angles of intersections tend to result in undesirable safety performance.
  • It is necessary to update some key behavioural models, such as non-free-flow right-turning vehicle path/speed models and pedestrian behaviour outside the crosswalk;
  • Other conflict types, such as left-turning vehicles versus opposing through traffic conflict should also be incorporated into the future development of the simulation model.
Chen et al., 2021 [68]Proposing a novel methodological framework for automatic and accurate vehicle trajectory extraction from aerial videos.
  • The proposed method successfully extracts vehicle trajectories with a high accuracy: the measurement error of Mean Squared Deviation is 2.301 m, the Root-mean-square deviation is 0.175 m, and the Pearson correlation coefficient is 0.999.
  • Enhancing the usability of the proposed vehicle trajectory extraction algorithms under challenging situations such as moving UAV with multi-dimensional camera motions (rolling, heaving and surging, combination of the two motions, etc.);
  • Investigating the performance of the approaches for other traffic states or complex state transition status;
  • Enhancing the framework and improving the model performance for poor visibility situations such as night time, raining, snowing, etc.;
  • Employing deep learning models for accurate vehicle detections and assemble vehicle trajectories by solving the vehicle matching tasks in different image frames.
Guido et al., 2016 [69]Presenting a methodology to extract vehicle trajectories and speeds from Unmanned Aerial Vehicles (UAV) video processing.
  • The results of the experiments highlight the versatility of the Unmanned Aerial Vehicles technology combined with video processing technique in monitoring real traffic data.
  • Calibrating simulation models with a high level of detail by using spatial information acquired from a UAV;
  • The observed level of accuracy in speed estimation can be used for safety assessment where the differential speed between a pair of vehicles is the most important factor;
  • Considering other additional intrinsic camera parameters (including, for example, lens distortion) to better match the ground coordinate systems.
Javadi et al., 2021 [70] Investigating the ability of three-dimensional (3D) feature maps to improve the performance of deep neural network (DNN) for vehicle detection. First, we propose a DNN based on YOLOv3 with various base networks, including DarkNet-53, SqueezeNet, MobileNet-v2, and DenseNet-201.
  • The experimental results show that 3D features improved the precision of DNNs from 88.23% to 96.43% and from 97.10% to 100% when using DNN confidence thresholds of 0.01 and 0.05, respectively;
  • The proposed system was able to successfully remove 72.22% to 100% of false positives from the DNN outputs.
  • Developing a unified deep neural network that includes 3D features as part of the input signal.
Kang and Mattyus 2015 [71]Presenting a method which can detect vehicles with orientation and type information on aerial images in a few seconds on large images. The application of Integral Channel Features in a Soft Cascade structure results in both good detection performance and fast speed
  • The application of Integral Channel Features in a Soft Cascade structure results in both good detection performance and fast speed;
  • The detector works on original images where no georeference and resolution information is available.
  • Performance could be improved by using a deep neural network after the binary detector like R-CNN.
Kaufmann et al., 2018 [72]Showing a suitable way to perform spatiotemporal measurements of city traffic using aerial observations with an unmanned aerial vehicle.
  • The conducted video analyses using supervised tracking together with key frame techniques proved to be a robust solution to gain complete microscopic data (single-vehicle data) for empirical traffic research.
  • More detailed measurements and studies of single vehicle data in city traffic are required.
Ke et al., 2017 [73]Proposing a novel framework for real-time traffic flow parameter estimation from aerial videos. The proposed system identifies the directions of traffic streams and extracts traffic flow parameters of each traffic stream separately.
  • The experimental results show that the proposed method achieves about 96% and 87% accuracy in estimating average traffic stream speed and vehicle count, respectively;
  • The method also achieves a fast-processing speed that enables real-time traffic information estimation.
  • Testing the method on heavily congested traffic conditions and curved road segments, and adjusting it to improve performance would also be insightful;
  • Improving accuracy for estimation of large vehicles and improving the overall performances.
Khan et al., 2017 [74]Processing and analysis of UAV-acquired traffic footage. A detailed methodological framework for automated UAV video processing is proposed to extract the trajectories of multiple vehicles at a particular road segment.
  • The results generated depict the overall applicability of the system. Such a systematic framework may prove to be helpful for future traffic-related UAV studies as well by streamlining the processes involved. It may also serve as a comprehensive guide for the automated and quick extraction of multivehicle trajectories from UAV-acquired data.
  • The extension of the applications of the proposed framework within the context of UAV-based traffic monitoring and analysis;
  • The proposed framework will be extended to implement real-time processing and analysis of UAV-acquired data.
Khan et al., 2018 [75]Analysis of vehicle trajectories acquired via small rotary-wing UAV footage. The experimental data to analyse traffic flow conditions at a signalized intersection was obtained in the city of Sint-Truiden, Belgium.
  • The results reflect the value of flexibility and bird’s-eye view provided by UAV videos; thereby depicting the overall applicability of the UAV-based traffic analysis system.
  • Various approaches for automation and optimization of vehicle trajectories’ analysis, including the ‘critical point’ approach, will be explored in more detail;
  • The prospects of real-time processing and analysis of traffic data obtained via UAVs will be inspected.
Khan et al., 2020 [76]Proposing a smart traffic surveillance system based on Unmanned Aerial Vehicle (UAV) using 5G technology.
  • The results show that, if those violations are overcome, the number of accidents per year falls to 299,317, leading to 4868 deaths and 33,199 injuries in the first year; over the next five years, the number of deaths would decrease to 3745 and injuries to 16,600, based on the currently available data.
  • Autodetection of lane switching, other traffic violations, and warning to drivers about these violations will promote better lane discipline among drivers in Saudi Arabia.
Kujawski and Dudek 2021 [77]Presenting methods of data acquisition from cameras mounted on unmanned aerial vehicles (UAV) and their further analysis, which may be used to improve urban transportation systems and its sustainability. The analysed data concerning the situation of urban transport in points of intersection of national and local roads.
  • Results can be used in the future together with the data from existing intelligent transportation systems (a fusion of such data will be needed);
  • The application of the used methods allows the existing approaches to managing public and freight transport in cities to be extended and supported.
  • Improving the power supply of flying vehicles so that it is possible to power them continuously, because currently it is only possible to fly for up to 30 min on one battery.
Li et al., 2019 [78]Proposing a novel adaptive framework for multi-vehicle ground speed estimation in airborne videos.
  • The proposed system has a unique ability to detect, track, and estimate the speed of ground vehicles simultaneously even with a single downward-looking camera.
  • The system can obtain effective and accurate speed estimation results, even in various complex scenes.
  • Using SLAM to estimate the 3D information of the scene to improve the accuracy of vehicle speed estimation.
Li et al., 2020 [79]Proposing a robust vehicle detection model for aerial images. First, image pre-processing to deal with IoU distribution imbalance problem and greatly improve the recall rate were performed. Then, SSP-SSD to enhance feature representation of vehicles with different scales and improve the precision was proposed.
  • Extensive experiments demonstrate the superiority of the proposed algorithm by comparison with other SOTA solutions.
  • Focusing the research on few-shot learning to improve the performance of our detector on unbalanced data.
Liu and Zhang 2021 [28]Fusing the target detection network, YOLO v4, with the detection-based multitarget tracking algorithm, DeepSORT, a method based on deep learning for automatic vehicle detection and tracking in urban environments, has been designed.
  • Results of the simulation show that the algorithm proposed can detect and track vehicles automatically in urban environments;
  • The particle filter algorithm based on an interactive multimodel significantly improves the performance of the UAV in terms of positioning the manoeuvring targets, and this has good engineering application value.
NR
Luo et al., 2020 [80]Proposing a traffic collisions early warning scheme aided by small unmanned aerial vehicle (UAV) companion. Basically, it is a vision-based driver assistance system, and the difference in comparison with the available schemes is that the camera is flying along with the host vehicle.
  • Extensive experimental results and examples demonstrate the effectiveness of the proposed method, and its real-time performance outperforms typical tracking methods such as that based on Gaussian mixture model.
  • Incorporating a high level intelligent onboard processing system in UAV for detection, classification and understanding various types of vehicle crashes, such as head-on, road departure, rear-end, side collisions, rollovers, etc. In particular, the innovations of artificial intelligence, data mining and machine learning can be used.
Moranduzzo and Melgani 2014 [81]Presenting a solution to solve the car detection and counting problem in images acquired by means of unmanned aerial vehicles (UAVs).
  • The experimental results obtained on a real UAV scene characterized by a spatial resolution of 2 cm show that the proposed method exhibits a promising car counting accuracy.
  • Other colour-based SIFT methods could be envisioned in order to better exploit the discrimination potential conveyed by the original image colour space;
  • Efforts should be made in the direction of assessing other kinds of descriptors and detectors (e.g., Harris and Gabor filters, local binary patterns) as potentially alternatives to SIFT in terms of complexity and discrimination power.
Shan et al., 2021 [82]: Proposing a systematic approach to detect and track vehicles based on the YOLO v3 model and the deep SORT algorithm for further extracting key traffic parameters.
  • The proposed approach exhibits strong robustness and reliability, owing to its 90.88% object detection accuracy and 98.9% vehicle tracking precision.
  • The absolute and relative error of extracted speed falls within ±3 km/h and 2%, respectively.
  • The overall accuracy of the extracted parameters reaches up to 98%.
  • Different traffic scenarios (for example, intersection, roundabout, low visibility, etc.) may also be considered to test the robustness of the proposed approach;
  • Further analysis about driving behaviour could be conducted utilizing real-time traffic data obtained from the present work.
Wan et al., 2019 [83]: Proposing a computer vision-based target tracking algorithm aimed at locating UAV-captured targets, such as pedestrians and vehicles, using sparse representation theory.
  • Both qualitative and quantitative experiments implemented on visible (Vis) and infrared (IR) UAV videos prove that the presented tracker can achieve better performance in terms of precision rate and success rate when compared with other state-of-the-art trackers.
  • Focusing on optimizing the code and porting the algorithm to hardware devices, such as FPGAs and GPUs, to achieve real-time operation.
Wang et al., 2016a [84]: Presenting a crash prediction method based on a bivariate extreme value theory (EVT) framework and UAV trajectory data processing.
  • The crash prediction method appeared to be a promising tool for safety evaluation on signalized intersections.
  • Incorporate more intersections;
  • Using other state-of-the-art computer vision techniques, such as deep learning methods;
  • A multivariate EVT model framework could also be attempted, incorporating multiple conflict metrics.
Wang et al., 2016b [85]: Proposing an improved start-wave velocity model in which the speed and density of traffic flow are converted into vehicle space headway, i.e., vehicle length and other auxiliary parameters that can be recognized from aerial video or by other means.
  • The improved model is accurate enough to be used for model calibration and validation in signal timing optimization.
  • The correlation between the weighted average of PCU and the average start-up acceleration should be analysed;
  • The relationship between characteristics of queues dissipating and driver behaviours needs a more specific study.
Wang et al., 2019 [86]: Proposing a deep-learning framework for vehicle detection and tracking from UAV videos for monitoring traffic flow in complex road structures. The approach is designed to be invariant to significant orientation and scale variations in the videos. The detection procedure is performed by fine-tuning a state-of-the-art object detector, You Only Look Once (YOLOv3), using several custom-labelled traffic datasets.
  • Experiments demonstrated that high detection accuracy could be achieved, with an average F1-score of 92.1%. In addition, the tracking technique performed accurately, with an average multiple-object tracking accuracy (MOTA) of 81.3%.
  • More custom UAV traffic images with different lighting and weather conditions as well as vehicle models and colours will be beneficial to train a more robust vehicle tracker;
  • Traditional data augmentation approaches, as well as generative adversarial networks, can be helpful in improving training data;
  • Geometric calibration of the camera will be performed, and images will be rectified in order to reduce mis-detection errors near image corners.
Wang et al., 2019 [87]: Introducing a new vehicle detection and tracking system based on image data collected by a UAV. The system uses consecutive frames to generate vehicles' dynamic information, such as positions and velocities.
  • Field tests demonstrate that the present system exhibits high accuracy in traffic information acquisition at different UAV altitudes with different view scopes, which can be used in future traffic monitoring and control in metropolitan areas.
  • Vehicle detection and tracking methods for different altitudes and different accuracy levels should be considered.
Xing et al., 2020a [88]: Investigating the traffic conflict risks at the upstream approach of a toll plaza during the vehicles' diverging period, from the time of arrival at the diverging area to that of entering the tollbooths. Based on vehicle trajectory data extracted from unmanned aerial vehicle (UAV) videos using an automated video analysis system, vehicle collision risk is computed by the extended time-to-collision (TTC).
  • The results indicate that the T-RPLR model has the highest prediction accuracy. Eight influencing factors, including the following vehicle's travel distance, the following vehicle's initial lane, the following vehicle's toll collection type, the leading vehicle's toll collection type, the distance between the two vehicles' centroids, and the following vehicle's speed, are found to have time-varying effects on collision risk.
  • Validation should be conducted on various toll plaza diverging areas, and a general model suitable for all kinds of toll plaza diverging areas should be developed.
Xing et al., 2020b [89]: Developing a logistic regression (LR) model and five typical non-parametric models, including K-Nearest Neighbour (KNN), Artificial Neural Networks (ANN), Support Vector Machines (SVM), Decision Trees (DT), and Random Forest (RF), to examine the relationship between influencing factors and vehicle collision risk.
  • Results of model performance comparison indicate that not all non-parametric models have a better prediction performance than the LR model;
  • The KNN, SVM, DT and RF models have better model performance than the LR model in model training, while the ANN model has the worst model performance;
  • The accuracy of the LR model is higher than that of the other five non-parametric models under various ETTC threshold conditions.
  • The detailed influence needs to be investigated in order to further enhance prediction accuracy;
  • The interpretability and convenience of non-parametric models could be improved, as their lack may impede practicability compared with statistical models;
  • The unobserved heterogeneity should be analysed by employing advanced modelling techniques.
Xu et al., 2016 [90]: Proposing a new hybrid vehicle detection scheme that integrates the Viola-Jones (V-J) method and a linear SVM classifier with HOG features (HOG + SVM) for vehicle detection from low-altitude unmanned aerial vehicle (UAV) images.
  • A comprehensive evaluation shows that the switching strategy, combined with the road orientation adjustment method, can significantly improve the efficiency and effectiveness of the vehicle detection from UAV images;
  • The proposed vehicle detection method is competitive compared with other existing vehicle detection methods.
  • Future research will focus on expanding the current method to detect other transportation modes, such as buses, trucks, motorcycles, bicycles, and pedestrians.
Zhu et al., 2018a [91]: Presenting an all-in-one behaviour recognition framework for moving vehicles based on the latest deep learning techniques.
  • The approach outperformed all other methods in terms of both single class performance and overall performance;
  • T-BiLSTM achieved an accuracy of 0.965 on the “go straight” types.
  • Vehicle trajectory analysis also has further applications to be considered in future works, for example, illegal lane changes, violations of traffic lines, overtaking in prohibited places, and illegal wrong-way driving;
  • An artificial-intelligence-based transportation analytical platform can be implemented and integrated into the existing intelligent transportation system in order to improve the driving experience and safety of drivers.
Zhu et al., 2018b [92]: Presenting an advanced urban traffic density estimation solution using the latest deep learning techniques to intelligently process ultrahigh-resolution traffic videos taken from an unmanned aerial vehicle (UAV).
  • The enhanced single shot multibox detector (Enhanced-SSD) outperforms other DNN-based techniques and the deep learning techniques are more effective than traditional computer vision techniques in traffic video analysis;
  • Ultrahigh-resolution video provides more information that enables more accurate vehicle detection and recognition than lower-resolution contents.
  • Developing more effective vehicle detection and tracking algorithms while achieving a high processing speed and robustness;
  • Designing a method to automatically select wanted regions (city roads) to further reduce human supervision and improve the overall efficiency.
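
Several of the entries above (in particular [28,78,82,86]) follow the same tracking-by-detection pattern: a deep detector, typically a YOLO variant, returns per-frame bounding boxes, and a Kalman-filter-based tracker links them across frames in the spirit of SORT/DeepSORT. The Python sketch below illustrates only this generic pattern under simplifying assumptions (detections supplied by an arbitrary detector, constant-velocity motion, greedy IoU association instead of the Hungarian algorithm and appearance features); it is not the implementation of any cited study.

```python
# Minimal tracking-by-detection sketch (illustrative only, not any cited pipeline).
# Detections are [x1, y1, x2, y2] boxes produced by an arbitrary per-frame detector.
import numpy as np

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

class Track:
    """Constant-velocity Kalman filter over the box centre (cx, cy, vx, vy)."""
    def __init__(self, box, track_id, dt=1.0):
        cx, cy = (box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0
        self.x = np.array([cx, cy, 0.0, 0.0])               # state vector
        self.P = np.eye(4) * 10.0                            # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q, self.R = np.eye(4) * 0.01, np.eye(2)
        self.box, self.id = list(box), track_id

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        w, h = self.box[2] - self.box[0], self.box[3] - self.box[1]
        cx, cy = self.x[0], self.x[1]
        self.box = [cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2]

    def update(self, box):
        z = np.array([(box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.box = list(box)

def track_frame(tracks, detections, next_id, iou_thr=0.3):
    """Greedy IoU association of the current frame's detections with existing tracks."""
    for t in tracks:
        t.predict()
    unmatched = list(range(len(detections)))
    for t in tracks:
        if not unmatched:
            break
        best = max(unmatched, key=lambda j: iou(t.box, detections[j]))
        if iou(t.box, detections[best]) >= iou_thr:
            t.update(detections[best])
            unmatched.remove(best)
    for j in unmatched:                   # unmatched detections start new tracks
        tracks.append(Track(detections[j], next_id))
        next_id += 1
    return tracks, next_id

# Two synthetic frames with one detection each: the box is linked to the same track.
tracks, next_id = track_frame([], [[100, 100, 140, 120]], next_id=0)
tracks, next_id = track_frame(tracks, [[105, 100, 145, 120]], next_id)
print(tracks[0].id, tracks[0].x[:2])
```

The operational systems reviewed here replace the greedy matching with the Hungarian algorithm, add appearance descriptors for re-identification, and run the detector on GPU or edge hardware; the sketch is only meant to make the shared data flow concrete.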

References

  1. Azar, A.; Koubaa, A.; Mohamed, N.A.; Ibrahim, H.; Ibrahim, Z.; Kazim, M.; Ammar, A.; Benjdira, B.; Khamis, A.; Hameed, I.; et al. Drone Deep Reinforcement Learning: A Review. Electronics 2021, 10, 999. [Google Scholar] [CrossRef]
  2. Elkhrachy, I. Accuracy Assessment of Low-Cost Unmanned Aerial Vehicle (UAV) Photogrammetry. Alex. Eng. J. 2021, 60, 5579–5590. [Google Scholar] [CrossRef]
  3. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  4. de Silva, I. Geomatics Applied to Civil Engineering State of the Art. In Applications of Geomatics in Civil Engineering; Ghosh, J., de Silva, I., Eds.; Springer: Singapore, 2020; pp. 31–46. [Google Scholar]
  5. Giordan, D.; Adams, M.S.; Aicardi, I.; Alicandro, M.; Allasia, P.; Baldo, M.; De Berardinis, P.; Dominici, D.; Godone, D.; Hobbs, P.; et al. The use of unmanned aerial vehicles (UAVs) for engineering geology applications. Bull. Eng. Geol. Environ. 2020, 79, 3437–3481. [Google Scholar] [CrossRef] [Green Version]
  6. Alioua, A.; Djeghri, H.-e.; Cherif, M.E.T.; Senouci, S.-M.; Sedjelmaci, H. UAVs for traffic monitoring: A sequential game-based computation offloading/sharing approach. Comput. Netw. 2020, 177, 107273. [Google Scholar] [CrossRef]
  7. Cummings, A.R.; Cummings, G.R.; Hamer, E.; Moses, P.; Norman, Z.; Captain, V.; Bento, R.; Butler, K. Developing a UAV-based monitoring program with indigenous peoples. J. Unmanned Veh. Syst. 2017, 5, 115–125. [Google Scholar] [CrossRef]
  8. Chamoso, P.; Raveane, W.; Parra, V.; González, A. UAVs Applied to the Counting and Monitoring of Animals. In Ambient Intelligence-Software and Applications; Ramos, C., Novais, P., Nihan, C., Corchado Rodríguez, J., Eds.; Springer: Cham, Switzerland, 2013; pp. 71–80. [Google Scholar]
  9. Kumar, A.; Sharma, K.; Singh, H.; Naugriya, S.G.; Gill, S.S.; Buyya, R. A drone-based networked system and methods for combating coronavirus disease (COVID-19) pandemic. Futur. Gener. Comput. Syst. 2021, 115, 1–19. [Google Scholar] [CrossRef]
  10. Gupta, A.; Afrin, T.; Scully, E.; Yodo, N. Advances of UAVs toward Future Transportation: The State-of-the-Art, Challenges, and Opportunities. Futur. Transp. 2021, 1, 19. [Google Scholar] [CrossRef]
  11. Outay, F.; Mengash, H.A.; Adnan, M. Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges. Transp. Res. Part A Policy Pract. 2020, 141, 116–129. [Google Scholar] [CrossRef]
  12. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  13. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.A. Monitoring road traffic with a UAV-based system. In Proceedings of the 2018 IEEE Wireless Communications and Networking Conference (WCNC), Barcelona, Spain, 15–18 April 2018; pp. 1–6. [Google Scholar]
  14. Degas, A.; Kaddoum, E.; Gleizes, M.-P.; Adreit, F.; Rantrua, A. Cooperative multi-agent model for collision avoidance applied to air traffic management. Eng. Appl. Artif. Intell. 2021, 102, 104286. [Google Scholar] [CrossRef]
  15. Villa, T.F.; Jayaratne, E.R.; Gonzalez, L.F.; Morawska, L. Determination of the vertical profile of particle number concentration adjacent to a motorway using an unmanned aerial vehicle. Environ. Pollut. 2017, 230, 134–142. [Google Scholar] [CrossRef] [Green Version]
  16. Naughton, J.; McDonald, W. Evaluating the Variability of Urban Land Surface Temperatures Using Drone Observations. Remote Sens. 2019, 11, 1722. [Google Scholar] [CrossRef] [Green Version]
  17. Yalcin, E. Two-dimensional hydrodynamic modelling for urban flood risk assessment using unmanned aerial vehicle imagery: A case study of Kirsehir, Turkey. J. Flood Risk Manag. 2018, 12, e12499. [Google Scholar] [CrossRef]
  18. De Vivo, F.; Battipede, M.; Johnson, E. Infra-red line camera data-driven edge detector in UAV forest fire monitoring. Aerosp. Sci. Technol. 2021, 111, 106574. [Google Scholar] [CrossRef]
  19. Biçici, S.; Zeybek, M. An approach for the automated extraction of road surface distress from a UAV-derived point cloud. Autom. Constr. 2021, 122, 103475. [Google Scholar] [CrossRef]
  20. Agarwal, A.; Kumar, S.; Singh, D. Development of Neural Network Based Adaptive Change Detection Technique for Land Terrain Monitoring with Satellite and Drone Images. Def. Sci. J. 2019, 69, 474–480. [Google Scholar] [CrossRef]
  21. Sutheerakul, C.; Kronprasert, N.; Kaewmoracharoen, M.; Pichayapan, P. Application of Unmanned Aerial Vehicles to Pedestrian Traffic Monitoring and Management for Shopping Streets. Transp. Res. Procedia 2017, 25, 1717–1734. [Google Scholar] [CrossRef]
  22. Zhu, J.; Chen, S.; Tu, W.; Sun, K. Tracking and Simulating Pedestrian Movements at Intersections Using Unmanned Aerial Vehicles. Remote Sens. 2019, 11, 925. [Google Scholar] [CrossRef] [Green Version]
  23. Sahil; Sood, S.K. Fog-Cloud centric IoT-based cyber physical framework for panic oriented disaster evacuation in smart cities. Earth Sci. Inf. 2020, 1–22. [Google Scholar] [CrossRef]
  24. Ranquist, E.; Steiner, M.; Argrow, B. Exploring the Range of Weather Impacts on UAS Operations. In Proceedings of the 18th Conference on Aviation, Range and Aerospace Meteorology, Seattle, WA, USA, 22–26 January 2017. [Google Scholar]
  25. Vanegas Alvarez, F.; Gonzalez, L. Enabling UAV Navigation with Sensor and Environmental Uncertainty in Cluttered and GPS-Denied Environments. Sensors 2016, 16, 666. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Pascua, D.A.; Abellanosa, C.; Lugpatan, R. Position Estimation using Inertial Measurement Unit (IMU) on a Quadcopter in an Enclosed Environment. Int. J. Comput. Commun. Instrum. Eng. 2016, 2, 332. [Google Scholar] [CrossRef]
  27. Saboor, A.; Coene, S.; Vinogradov, E.; Tanghe, E.; Joseph, W.; Pollin, S. Elevating the future of mobility: UAV-enabled Intelligent Transportation Systems. arXiv 2021, arXiv:2110.09934. [Google Scholar] [CrossRef]
  28. Liu, X.; Zhang, Z. A Vision-Based Target Detection, Tracking, and Positioning Algorithm for Unmanned Aerial Vehicle. Wirel. Commun. Mob. Comput. 2021, 2021, 5565589. [Google Scholar] [CrossRef]
  29. Utomo, W.; Bhaskara, P.W.; Kurniawan, A.; Juniastuti, S.; Yuniarno, E.M. Traffic Congestion Detection Using Fixed-Wing Unmanned Aerial Vehicle (UAV) Video Streaming Based on Deep Learning. In Proceedings of the 2020 International Conference on Computer Engineering, Network and Intelligent Multimedia (CENIM 2020), Surabaya, Indonesia, 13–18 November 2020. [Google Scholar]
  30. Zhang, H.; Liptrott, M.; Bessis, N.; Cheng, J. Real-Time Traffic Analysis using Deep Learning Techniques and UAV based Video. In Proceedings of the 2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Taipei, Taiwan, 18–21 September 2019; pp. 1–5. [Google Scholar]
  31. Kumar, A.; Krishnamurthi, R.; Nayyar, A.; Luhach, A.K.; Khan, M.S.; Singh, A. A novel Software-Defined Drone Network (SDDN)-based collision avoidance strategies for on-road traffic monitoring and management. Veh. Commun. 2021, 28, 100313. [Google Scholar] [CrossRef]
  32. Chen, P.; Zeng, W.; Yu, G.; Wang, Y. Surrogate Safety Analysis of Pedestrian-Vehicle Conflict at Intersections Using Unmanned Aerial Vehicle Videos. J. Adv. Transp. 2017, 2017, 1–12. [Google Scholar] [CrossRef] [Green Version]
  33. Khan, M.A.; Ectors, W.; Bellemans, T.; Ruichek, Y.; Yasar, A.-U.-H.; Janssens, D.; Wets, G. Unmanned Aerial Vehicle-based Traffic Analysis: A Case Study to Analyze Traffic Streams at Urban Roundabouts. Procedia Comput. Sci. 2018, 130, 636–643. [Google Scholar] [CrossRef]
  34. Hadiwardoyo, S.A.; Hernández-Orallo, E.; Calafate, C.T.; Cano, J.C.; Manzoni, P. Experimental characterization of UAV-to-car communications. Comput. Netw. 2018, 136, 105–118. [Google Scholar] [CrossRef]
  35. Chung, S.H.; Sah, B.; Lee, J. Optimization for drone and drone-truck combined operations: A review of the state of the art and future directions. Comput. Oper. Res. 2020, 123, 105004. [Google Scholar] [CrossRef]
  36. Kellermann, R.; Biehle, T.; Fischer, L. Drones for parcel and passenger transportation: A literature review. Transp. Res. Interdiscip. Perspect. 2020, 4, 100088. [Google Scholar] [CrossRef]
  37. Merkert, R.; Bushell, J. Managing the drone revolution: A systematic literature review into the current use of airborne drones and future strategic directions for their effective control. J. Air Transp. Manag. 2020, 89, 101929. [Google Scholar] [CrossRef]
  38. Otto, A.; Agatz, N.; Campbell, J.; Golden, B.; Pesch, E. Optimization approaches for civil applications of unmanned aerial vehicles (UAVs) or aerial drones: A survey. Networks 2018, 72, 1–48. [Google Scholar] [CrossRef]
  39. Spencer, B.F.; Hoskere, V.; Narazaki, Y. Advances in Computer Vision-Based Civil Infrastructure Inspection and Monitoring. Engineering 2019, 5, 199–222. [Google Scholar] [CrossRef]
  40. Srivastava, S.; Narayan, S.; Mittal, S. A survey of deep learning techniques for vehicle detection from UAV images. J. Syst. Arch. 2021, 117, 102152. [Google Scholar] [CrossRef]
  41. Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef] [Green Version]
  42. Kanistras, K.; Martins, G.; Rutherford, M.J.; Valavanis, K.P. A Survey of Unmanned Aerial Vehicles (UAVs) for Traffic Monitoring. In Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; pp. 221–234. [Google Scholar]
  43. Ren, H.; Zhao, Y.; Xiao, W.; Hu, Z. A review of UAV monitoring in mining areas: Current status and future perspectives. Int. J. Coal Sci. Technol. 2019, 6, 320–333. [Google Scholar] [CrossRef] [Green Version]
  44. Videras Rodríguez, M.; Melgar, S.G.; Cordero, A.S.; Márquez, J.M.A. A Critical Review of Unmanned Aerial Vehicles (UAVs) Use in Architecture and Urbanism: Scientometric and Bibliometric Analysis. Appl. Sci. 2021, 11, 9966. [Google Scholar] [CrossRef]
  45. Śledź, S.; Ewertowski, M.W.; Piekarczyk, J. Applications of unmanned aerial vehicle (UAV) surveys and Structure from Motion photogrammetry in glacial and periglacial geomorphology. Geomorphology 2021, 378, 107620. [Google Scholar] [CrossRef]
  46. Eskandari, R.; Mahdianpari, M.; Mohammadimanesh, F.; Salehi, B.; Brisco, B.; Homayouni, S. Meta-analysis of Unmanned Aerial Vehicle (UAV) Imagery for Agro-environmental Monitoring Using Machine Learning and Statistical Models. Remote Sens. 2020, 12, 3511. [Google Scholar] [CrossRef]
  47. Mohan, M.; Richardson, G.; Gopan, G.; Aghai, M.M.; Bajaj, S.; Galgamuwa, G.A.P.; Vastaranta, M.; Arachchige, P.S.P.; Amorós, L.; Corte, A.P.; et al. UAV-Supported Forest Regeneration: Current Trends, Challenges and Implications. Remote Sens. 2021, 13, 2596. [Google Scholar] [CrossRef]
  48. Sibanda, M.; Mutanga, O.; Chimonyo, V.G.P.; Clulow, A.D.; Shoko, C.; Mazvimavi, D.; Dube, T.; Mabhaudhi, T. Application of Drone Technologies in Surface Water Resources Monitoring and Assessment: A Systematic Review of Progress, Challenges, and Opportunities in the Global South. Drone 2021, 5, 84. [Google Scholar] [CrossRef]
  49. Aggarwal, S.; Kumar, N. Path planning techniques for unmanned aerial vehicles: A review, solutions, and challenges. Comput. Commun. 2020, 149, 270–299. [Google Scholar] [CrossRef]
  50. Al-Kaff, A.; Martín, D.; García, F.; Escalera, A.D.L.; María Armingol, J. Survey of computer vision algorithms and applications for unmanned aerial vehicles. Expert Syst. Appl. 2018, 92, 447–463. [Google Scholar] [CrossRef]
  51. Alladi, T.; Chamola, V.; Sahu, N.; Guizani, M. Applications of blockchain in unmanned aerial vehicles: A review. Veh. Commun. 2020, 23, 100249. [Google Scholar] [CrossRef]
  52. Chen, X.; Tang, J.; Lao, S. Review of Unmanned Aerial Vehicle Swarm Communication Architectures and Routing Protocols. Appl. Sci. 2020, 10, 3661. [Google Scholar] [CrossRef]
  53. Darvishpoor, S.; Roshanian, J.; Raissi, A.; Hassanalian, M. Configurations, flight mechanisms, and applications of unmanned aerial systems: A review. Prog. Aerosp. Sci. 2020, 121, 100694. [Google Scholar] [CrossRef]
  54. Emilien, A.-V.; Thomas, C.; Thomas, H. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens. 2021, 3, 100019. [Google Scholar] [CrossRef]
  55. Jawhar, I.; Mohamed, N.; Al-Jaroodi, J.; Agrawal, D.P.; Zhang, S. Communication and networking of UAV-based systems: Classification and associated architectures. J. Netw. Comput. Appl. 2017, 84, 93–108. [Google Scholar] [CrossRef]
  56. Xu, C.; Liao, X.; Tan, J.; Ye, H.; Lu, H. Recent Research Progress of Unmanned Aerial Vehicle Regulation Policies and Technologies in Urban Low Altitude. IEEE Access 2020, 8, 74175–74194. [Google Scholar] [CrossRef]
  57. Yazid, Y.; Ez-Zazi, I.; Guerrero-González, A.; El Oualkadi, A.; Arioua, M. UAV-Enabled Mobile Edge-Computing for IoT Based on AI: A Comprehensive Review. Drones 2021, 5, 148. [Google Scholar] [CrossRef]
  58. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  59. Abdollahi, A.; Pradhan, B.; Shukla, N.; Chakraborty, S.; Alamri, A. Deep Learning Approaches Applied to Remote Sensing Datasets for Road Extraction: A State-Of-The-Art Review. Remote Sens. 2020, 12, 1444. [Google Scholar] [CrossRef]
  60. Ahmed, A.; Ngoduy, D.; Adnan, M.; Baig, M.A.U. On the fundamental diagram and driving behavior modeling of heterogeneous traffic flow using UAV-based data. Transp. Res. Part A Policy Pr. 2021, 148, 100–115. [Google Scholar] [CrossRef]
  61. Apeltauer, J.; Babinec, A.; Herman, D.; Apeltauer, T. Automatic vehicle trajectory extraction for traffic analysis from aerial video data. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2015, 40, 9–15. [Google Scholar] [CrossRef] [Green Version]
  62. Balamuralidhar, N.; Tilon, S.; Nex, F. MultEYE: Monitoring System for Real-Time Vehicle Detection, Tracking and Speed Estimation from UAV Imagery on Edge-Computing Platforms. Remote Sens. 2021, 13, 573. [Google Scholar] [CrossRef]
  63. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C. Identifying Predictable Patterns in the Unconventional Overtaking Decisions of PTW for Cooperative ITS. IEEE Trans. Intell. Veh. 2018, 3, 102–111. [Google Scholar] [CrossRef]
  64. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C.; Babinec, A. How accurate are small drones for measuring microscopic traffic parameters? Transp. Lett. 2019, 11, 332–340. [Google Scholar] [CrossRef]
  65. Barmpounakis, E.; Geroliminis, N. On the new era of urban traffic monitoring with massive drone data: The pNEUMA large-scale field experiment. Transp. Res. Part C Emerg. Technol. 2020, 111, 50–71. [Google Scholar] [CrossRef]
  66. Brkić, I.; Miler, M.; Ševrović, M.; Medak, D. An Analytical Framework for Accurate Traffic Flow Parameter Calculation from UAV Aerial Videos. Remote Sens. 2020, 12, 3844. [Google Scholar] [CrossRef]
  67. Chen, P.; Zeng, W.; Yu, G. Assessing right-turning vehicle-pedestrian conflicts at intersections using an integrated microscopic simulation model. Accid. Anal. Prev. 2019, 129, 211–224. [Google Scholar] [CrossRef] [PubMed]
  68. Chen, X.Q.; Li, Z.B.; Yang, Y.S.; Qi, L.; Ke, R.M. High-Resolution Vehicle Trajectory Extraction and Denoising from Aerial Videos. IEEE Trans. Intell. Transp. Syst. 2021, 22, 3190–3202. [Google Scholar] [CrossRef]
  69. Guido, G.; Gallelli, V.; Rogano, D.; Vitale, A. Evaluating the accuracy of vehicle tracking data obtained from Unmanned Aerial Vehicles. Int. J. Transp. Sci. Technol. 2016, 5, 136–151. [Google Scholar] [CrossRef]
  70. Javadi, S.; Dahl, M.; Pettersson, M.I. Vehicle Detection in Aerial Images Based on 3D Depth Maps and Deep Neural Networks. IEEE Access 2021, 9, 8381–8391. [Google Scholar] [CrossRef]
  71. Kang, L.; Mattyus, G. Fast Multiclass Vehicle Detection on Aerial Images. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1938–1942. [Google Scholar] [CrossRef] [Green Version]
  72. Kaufmann, S.; Kerner, B.S.; Rehborn, H.; Koller, M.; Klenov, S.L. Aerial observations of moving synchronized flow patterns in over-saturated city traffic. Transp. Res. Part C Emerg. Technol. 2018, 86, 393–406. [Google Scholar] [CrossRef]
  73. Ke, R.; Li, Z.; Kim, S.; Ash, J.; Cui, Z.; Wang, Y. Real-Time Bidirectional Traffic Flow Parameter Estimation from Aerial Videos. IEEE Trans. Intell. Transp. Syst. 2017, 18, 890–901. [Google Scholar] [CrossRef]
  74. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. Unmanned aerial vehicle-based traffic analysis: Methodological framework for automated multivehicle trajectory extraction. Transp. Res. Rec. 2017, 2626, 25–33. [Google Scholar] [CrossRef]
  75. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. Unmanned aerial vehicle-based traffic analysis: A case study for shockwave identification and flow parameters estimation at signalized intersections. Remote Sens. 2018, 10, 458. [Google Scholar] [CrossRef] [Green Version]
  76. Khan, N.A.; Jhanjhi, N.Z.; Brohi, S.N.; Usmani, R.S.A.; Nayyar, A. Smart traffic monitoring system using Unmanned Aerial Vehicles (UAVs). Comput. Commun. 2020, 157, 434–443. [Google Scholar] [CrossRef]
  77. Kujawski, A.; Dudek, T. Analysis and visualization of data obtained from camera mounted on unmanned aerial vehicle used in areas of urban transport. Sustain. Cities Soc. 2021, 72, 103004. [Google Scholar] [CrossRef]
  78. Li, J.; Chen, S.; Zhang, F.; Li, E.; Yang, T.; Lu, Z. An Adaptive Framework for Multi-Vehicle Ground Speed Estimation in Airborne Videos. Remote Sens. 2019, 11, 1241. [Google Scholar] [CrossRef] [Green Version]
  79. Li, X.; Li, X.; Pan, H. Multi-Scale Vehicle Detection in High-Resolution Aerial Images with Context Information. IEEE Access 2020, 8, 208643–208657. [Google Scholar] [CrossRef]
  80. Luo, H.; Chu, S.C.; Wu, X.; Wang, Z.; Xu, F. Traffic collisions early warning aided by small unmanned aerial vehicle companion. Telecommun. Syst. 2020, 75, 169–180. [Google Scholar] [CrossRef]
  81. Moranduzzo, T.; Melgani, F. Automatic Car Counting Method for Unmanned Aerial Vehicle Images. IEEE Trans. Geosci. Remote. Sens. 2014, 52, 1635–1647. [Google Scholar] [CrossRef]
  82. Shan, D.; Lei, T.; Yin, X.; Luo, Q.; Gong, L. Extracting Key Traffic Parameters from UAV Video with On-Board Vehicle Data Validation. Sensors 2021, 21, 5620. [Google Scholar] [CrossRef] [PubMed]
  83. Wan, M.; Gu, G.; Qian, W.; Ren, K.; Maldague, X.; Chen, Q. Unmanned Aerial Vehicle Video-Based Target Tracking Algorithm Using Sparse Representation. IEEE Internet Things J. 2019, 6, 9689–9706. [Google Scholar] [CrossRef]
  84. Wang, L.; Chen, F.; Yin, H. Detecting and tracking vehicles in traffic by unmanned aerial vehicles. Autom. Constr. 2016, 72, 294–308. [Google Scholar] [CrossRef]
  85. Wang, H.W.; Cheng, K.; Lu, Q.C.; Peng, Z.R. Improved model of start-wave velocity at intersections based on unmanned aerial vehicle data. J. Donghua Univ. Eng. Ed. 2016, 33, 13–19. [Google Scholar]
  86. Wang, C.; Xu, C.; Dai, Y. A crash prediction method based on bivariate extreme value theory and video-based vehicle trajectory data. Accid. Anal. Prev. 2019, 123, 365–373. [Google Scholar] [CrossRef]
  87. Wang, J.; Simeonova, S.; Shahbazi, M. Orientation- and Scale-Invariant Multi-Vehicle Detection and Tracking from Unmanned Aerial Videos. Remote Sens. 2019, 11, 2155. [Google Scholar] [CrossRef] [Green Version]
  88. Xing, L.; He, J.; Li, Y.; Wu, Y.; Yuan, J.; Gu, X. Comparison of different models for evaluating vehicle collision risks at upstream diverging area of toll plaza. Accid. Anal. Prev. 2020, 135, 105343. [Google Scholar] [CrossRef]
  89. Xing, L.; He, J.; Abdel-Aty, M.; Wu, Y.; Yuan, J. Time-varying Analysis of Traffic Conflicts at the Upstream Approach of Toll Plaza. Accid. Anal. Prev. 2020, 141, 105539. [Google Scholar] [CrossRef] [PubMed]
  90. Xu, Y.; Yu, G.; Wang, Y.; Wu, X.; Ma, Y. A hybrid vehicle detection method based on viola-jones and HOG + SVM from UAV images. Sensors 2016, 16, 1325. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  91. Zhu, J.; Sun, K.; Jia, S.; Lin, W.; Hou, X.; Liu, B.; Qiu, G. Bidirectional long short-term memory network for vehicle behavior recognition. Remote Sens. 2018, 10, 887. [Google Scholar] [CrossRef] [Green Version]
  92. Zhu, J.S.; Sun, K.; Jia, S.; Li, Q.Q.; Hou, X.X.; Lin, W.D.; Liu, B.Z.; Qiu, G.P. Urban Traffic Density Estimation Based on Ultrahigh-Resolution UAV Video and Deep Neural Network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 4968–4981. [Google Scholar] [CrossRef]
  93. SankeyMATIC. Available online: https://sankeymatic.com/ (accessed on 22 December 2021).
  94. Kovalev, I.; Voroshilova, A.; Karaseva, M. Analysis of the current situation and development trend of the international cargo UAVs market. J. Phys. Conf. Ser. 2019, 1399, 055095. [Google Scholar] [CrossRef] [Green Version]
  95. Smith, M.L.; Smith, L.N.; Hansen, M.F. The quiet revolution in machine vision-a state-of-the-art survey paper, including historical review, perspectives, and future directions. Comput. Ind. 2021, 130, 103472. [Google Scholar] [CrossRef]
  96. Alam, S.; Oluoch, J. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs). Expert Syst. Appl. 2021, 179, 115091. [Google Scholar] [CrossRef]
  97. Salvo, G.; Caruso, L.; Scordo, A.; Guido, G.; Vitale, A. Traffic data acquirement by unmanned aerial vehicle. Eur. J. Remote Sens. 2017, 50, 343–351. [Google Scholar] [CrossRef]
  98. Milić, A.; Randjelovic, A.; Radovanović, M. Use of Drones in Operations in the Urban Environment. In Proceedings of the 5th International Scientific Conference Safety and Crisis Management-Theory and Practise Safety for the Future–SecMan 2019, Belgrad, Serbia, 3–4 October 2019. [Google Scholar]
  99. Watkins, S.; Burry, J.; Mohamed, A.; Marino, M.; Prudden, S.; Fisher, A.; Kloet, N.; Jakobi, T.; Clothier, R. Ten questions concerning the use of drones in urban environments. Build. Environ. 2020, 167, 106458. [Google Scholar] [CrossRef]
  100. Schmidt, T.; Hauer, F.; Pretschner, A. Understanding Safety for Unmanned Aerial Vehicles in Urban Environments. In Proceedings of the 2021 IEEE Intelligent Vehicles Symposium (IV), Nagoya, Japan, 11–17 July 2021; pp. 638–643. [Google Scholar]
  101. Shvetsova, S.; Shvetsov, A. Safety when Flying Unmanned Aerial Vehicles at Transport Infrastructure Facilities. Transp. Res. Procedia 2021, 54, 397–403. [Google Scholar] [CrossRef]
  102. Primatesta, S.; Rizzo, A.; la Cour-Harbo, A. Ground Risk Map for Unmanned Aircraft in Urban Environments. J. Intell. Robot. Syst. 2020, 97, 489–509. [Google Scholar] [CrossRef]
  103. Lee, J.Y.; Chung, A.Y.; Shim, H.; Joe, C.; Park, S.; Kim, H. UAV Flight and Landing Guidance System for Emergency Situations. Sensors 2019, 19, 4468. [Google Scholar] [CrossRef] [Green Version]
  104. Guerin, J.; Delmas, K.; Guiochet, J. Certifying Emergency Landing for Safe Urban UAV. In Proceedings of the 2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W), Taipei, Taiwan, 21–24 June 2021. [Google Scholar]
  105. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.; Fer, A. Traffic Monitoring on City Roads using UAVs. In Proceedings of the International Conference on Ad-Hoc Networks and Wireless, Luxembourg, 1–3 October 2019. [Google Scholar]
  106. Sharma, V.; Chen, H.-C.; Kumar, R. Driver behaviour detection and vehicle rating using multi-UAV coordinated vehicular networks. J. Comput. Syst. Sci. 2017, 86, 3–32. [Google Scholar] [CrossRef]
  107. Zhai, W.; Han, B.; Li, D.; Duan, J.; Cheng, C. A low-altitude public air route network for UAV management constructed by global subdivision grids. PLoS ONE 2021, 16, e0249680. [Google Scholar] [CrossRef]
  108. Feng, D.; Du, P.; Shen, H.; Liu, Z. UAS Traffic Management in Low-Altitude Airspace Based on Three Dimensional Digital Aerial Corridor System. In Urban Intelligence and Applications. Studies in Distributed Intelligence; Yuan, X., Elhoseny, M., Eds.; Springer: Cham, Switzerland, 2020; pp. 179–188. [Google Scholar]
  109. Kim, J.; Atkins, E. Airspace Geofencing and Flight Planning for Low-Altitude, Urban, Small Unmanned Aircraft Systems. Appl. Sci. Switz. 2022, 12, 576. [Google Scholar] [CrossRef]
  110. Samir Labib, N.; Danoy, G.; Musial, J.; Brust, M.R.; Bouvry, P. Internet of Unmanned Aerial Vehicles—A Multilayer Low-Altitude Airspace Model for Distributed UAV Traffic Management. Sensors 2019, 19, 4779. [Google Scholar] [CrossRef] [Green Version]
  111. Messina, G.; Peña, J.M.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An Application in the ‘Cipolla Rossa di Tropea’ (Italy). Remote Sens. 2020, 12, 3424. [Google Scholar] [CrossRef]
  112. Xiang, T.; Xia, G.; Zhang, L. Mini-Unmanned Aerial Vehicle-Based Remote Sensing: Techniques, applications, and prospects. IEEE Geosci. Remote Sens. Mag. 2019, 7, 29–63. [Google Scholar] [CrossRef] [Green Version]
  113. Liu, M.; Yu, T.; Gu, X.; Sun, Z.; Yang, J.; Zhang, Z.; Mi, X.; Cao, W.; Li, J. The Impact of Spatial Resolution on the Classification of Vegetation Types in Highly Fragmented Planting Areas Based on Unmanned Aerial Vehicle Hyperspectral Images. Remote Sens. 2020, 12, 146. [Google Scholar] [CrossRef] [Green Version]
  114. Samiappan, S.; Hathcock, L.; Turnage, G.; McCraine, C.; Pitchford, J.; Moorhead, R. Remote Sensing of Wildfire Using a Small Unmanned Aerial System: Post-Fire Mapping, Vegetation Recovery and Damage Analysis in Grand Bay, Mississippi/Alabama, USA. Drones 2019, 3, 43. [Google Scholar] [CrossRef] [Green Version]
  115. Zhang, X.; Jin, J.; Lan, Z.; Li, C.; Fan, M.; Wang, Y.; Yu, X.; Zhang, Y. ICENET: A Semantic Segmentation Deep Network for River Ice by Fusing Positional and Channel-Wise Attentive Features. Remote Sens. 2020, 12, 221. [Google Scholar] [CrossRef] [Green Version]
  116. FAA. Small Unmanned Aircraft Systems (UAS) Regulations (Part 107). Available online: https://www.faa.gov/newsroom/small-unmanned-aircraft-systems-uas-regulations-part-107 (accessed on 15 December 2021).
  117. EASA. 2021. Available online: https://www.easa.europa.eu/document-library/easy-access-rules/online-publications/easy-access-rules-unmanned-aircraft-systems?page=5 (accessed on 23 December 2021).
  118. Lewandowski, K. Sustainable Usage of Freight Drones in City Centers, Proposition of Regulations for Safe Usage of Drones. Sustainability 2021, 13, 8634. [Google Scholar] [CrossRef]
  119. Drone-Laws. Drone Laws in Romania. Available online: https://drone-laws.com/drone-laws-in-romania/ (accessed on 23 December 2021).
  120. Alamouri, A.; Lampert, A.; Gerke, M. An Exploratory Investigation of UAS Regulations in Europe and the Impact on Effective Use and Economic Potential. Drones 2021, 5, 63. [Google Scholar] [CrossRef]
  121. U-space. Supporting Safe and Secure Drone Operations in Europe; Publications Office of the European Union; European Union: Luxembourg, 2020. [Google Scholar]
  122. Li, T.; Ye, J.; Dai, J.; Lei, H.; Yang, W.; Pan, G.; Chen, Y. Secure UAV-to-Vehicle Communications. IEEE Trans. Commun. 2021, 69, 5381–5393. [Google Scholar] [CrossRef]
  123. Allouch, A.; Cheikhrouhou, O.; Koubâa, A.; Toumi, K.; Khalgui, M.; Nguyen Gia, T. UTM-Chain: Blockchain-Based Secure Unmanned Traffic Management for Internet of Drones. Sensors 2021, 21, 3049. [Google Scholar] [CrossRef]
  124. Mukhamediev, R.I.; Symagulov, A.; Kuchin, Y.; Zaitseva, E.; Bekbotayeva, A.; Yakunin, K.; Assanov, I.; Levashenko, V.; Popova, Y.; Akzhalova, A.; et al. Review of Some Applications of Unmanned Aerial Vehicles Technology in the Resource-Rich Country. Appl. Sci. 2021, 11, 171. [Google Scholar] [CrossRef]
  125. Oliveira, F.; Luís, M.; Sargento, S. Machine Learning for the Dynamic Positioning of UAVs for Extended Connectivity. Sensors 2021, 21, 4618. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Network visualization of author keywords co-occurrence.
Figure 2. Study selection process.
Figure 3. Sankey diagram showing journal vs. publication year vs. country.
Table 2. Summary of information related to data acquisition and analysis.
Author | UAV Type | Camera Resolution, fps | Flying Height | Video Dataset Duration | Software Techniques | Vehicle Type | Urban Area | Measures
Ahmed et al., 2021 [60] | DJI Phantom 3 | 4K, NR | NR | 15 min | manual extraction, speed-density model—least-squares method (LSM) | cars, motorbikes, rickshaws, loading pickups, buses, trucks | University Road in Karachi—100-ft long, four marked lanes | traffic flow, traffic density, average speed, longitudinal and lateral gap
Apeltauer et al., 2015 [61] | NR | 1920 × 980, 29 fps | 100 m | NR | Viola and Jones's AdaBoost algorithm, sequential particle filter | vehicles | the site of the roundabout junction of Hamerska road and Lipenska road near Olomouc, Czech Republic | relative number of missed targets, relative number of false tracks, average number of swaps in tracks, temporal average of measure of completeness, spatial precision
Balamuralidhar et al., 2021 [62] | DJI Phantom 3 | 3269 × 720, 30 fps | 50 m | NR | CSPDarkNet53 backbone, EnEt segmentation head, YOLO v4, Minimum Output Sum of Squared Error (MOSSE) algorithm, Ground Sampling Distance (GSD); comparison of YOLO v4, Tiny YOLO v4, YOLO v3, SSD, Faster RCNN | vehicles | NR | performance of vehicle detection and vehicle tracking algorithms, speed estimation, inference on Jetson Xavier NX
Barmpounakis et al., 2018 [63] | hexacopter | 4K, 30 fps | NR | NR | manual or semi-automatic extraction, frame-by-frame analysis, machine learning—meta-optimized Decision Trees | motorcycles, scooters, cars and heavy vehicles | National Technical University of Athens campus—arterial with three lanes per direction | the type of each vehicle, the lane each vehicle is moving in, speeds of all vehicles present, accelerations of all vehicles present, spatial distances between vehicles, duration between each state and general information for the PTW driver
Barmpounakis et al., 2019 [64] | hexacopter | 4K, 30 fps | 70 m | 15 min | positive and negative vehicle ‘detectors’ on image content, matching the peaks of probability | 140 vehicles, 23 pedestrians | intersection in the National Technical University of Athens campus—four-legged intersection | trajectory, speed
Barmpounakis and Geroliminis 2020 [65] | DJI Phantom 4 Advanced | 4K, 25 fps | NR | 59 h | virtual loop detectors (gates) used to calculate several traffic variables and extract valuable information | cars, taxis, motorcycles, buses, heavy vehicles | a congested area of 1.3 km2 with more than 100 km-lanes of road network, around 100 busy intersections | arterial travel time, congestion propagation, lane changing
Brkić et al., 2020 [66] | NR | 4096 × 2160, 24 fps | 50 m | 13:52 min | deep learning object detection—Faster R-CNN with ResNet50 backbone network | vehicles | 500 m long section of Zagreb bypass motorway | traffic flow rate, speed estimation, traffic flow density, distance headways and gaps, time headways and gaps
Chen et al., 2019 [67] | NR | 1920 × 1080, NR | 100 m | NR | Viola-Jones (V-J) and linear SVM classifier with HOG feature (HOG + SVM), KLT (Kanade-Lucas-Tomasi) feature tracker, image processing system, surrogate safety measures (SSMs) | pedestrians and right-turning vehicles | two urban intersections in Beijing, China | vehicle turning path, turning speed, gap acceptance model and pedestrian behavior model, post encroachment time (PET)
Chen et al., 2021 [68] | DJI Mavic professional | 3840 × 2160, 25 fps | 223, 281 m | 22, 43 s | Canny-based ensemble detector, kernelized correlation filter (KCF), wavelet transform | vehicles | two urban expressway sections in Nanjing, China | root-mean-square deviation (RMSE), mean squared deviation (MSD), and Pearson product-moment correlation coefficient (Pearson's r)
Guido et al., 2016 [69] | UAV drone with eight propellers | 4K, 23 fps | 60 m | 19, 21 min, 11, 5 min | identifying pixels associated with the objects of interest, Haar classifier, video stabilization, conversion to grayscale space and Gaussian-blurring filter, extraction of vehicle trajectories—Haar classifier, ROI | vehicles | a large urban roundabout at the intersection of the “Asse Viario” with De Gasperi road | normalized root mean square error in positioning, speed profile, root mean square percentage error in speed evaluation
Javadi et al., 2021 [70] | NR | 3840 × 2160, NR | NR | NR | YOLO v3 + DarkNet-53, SqueezeNet, MobileNet-v2 and DenseNet-201 + 3D depth maps, Levenberg-Marquardt algorithm, fcNN | trucks, semi-trailers, and trailers | two industrial harbors | average precision, performance evaluation
Kang and Mattyus 2015 [71] | NR | 5616 × 3744, NR | NR | NR | Integral Channel Features (ICF), HOG features, AdaBoost classifier in Soft Cascade structure | cars and trucks | area of Munich, Germany | orientation estimation, type classification, baseline comparison, computation time
Kaufmann et al., 2018 [72] | DJI Inspire 1—a small-scale quadcopter | 4K, 25 fps | 100 m | 14.5 min | Levenberg-Marquardt optimization, moving linear regression (MLR), vehicle trajectories—supervised tracking method | vehicles | the street “Völklinger Straße” in Düsseldorf—600 m street, starting with two lanes, broadening to three lanes at location 500 m and to four lanes at location 530 m | speed, location, trajectories, lane changes per minute
Ke et al., 2017 [73] | NR | 960 × 540, 24 fps | NR | 1.17 min | vehicle tracking (Shi-Tomasi features, Kanade-Lucas optical flow algorithm), motion-vector clustering (k-means algorithm), connected-based graph method to detect clusters | vehicles | six lanes of traffic moving in two directions | speed, density, volume
Khan et al., 2017 [74] | Argus-One (from ArgusVision) | 4K, 25 fps | 80, 60 m | 14 min | optical flow tracking (Lucas-Kanade algorithm), background subtraction threshold, blob analysis, vehicle extraction—computer vision | vehicles | an urban intersection near the city of Sint-Truiden in Belgium—four-leg intersection | trajectory, speed profile, space-time trajectories
Khan et al., 2018 [75] | Argus-One (from Argus-Vision) | 4K, 25 fps | 80, 60 m | 10–12 min | optical flow tracking, blob analysis, Kalman filter | vehicles | a four-legged sub-urban signalized intersection in Sint-Truiden, Belgium | trajectory, speed profile
Khan et al., 2020 [76] | NR | NR | NR | NR | detection of speeding and other traffic violations in real time | vehicles | Saudi Arabia | excess speed limit and other traffic safety violations on highways and roads
Kujawski and Dudek 2021 [77] | NR | 720p, 60 fps | NR | 8 h | image processing—blob detection | vehicles in/out | city of Szczecin in Poland—two lanes of traffic each from and to the city | number of cars per hour on a holiday and a workday
Li et al., 2019 [78] | DJI Matrice 100 | 1280 × 960, NR | 80 m | | YOLO v3, tracking-by-detection method, Kalman filter, Hungarian algorithm, motion compensation based on homography, optical flow—RANSAC, adaptive vehicle speed estimation | vehicles | an intersection, country road, parking entrance, highways, and crossroads | vehicle speed estimation, velocity measurement
Li et al., 2020 [79] | DJI Matrice 100 | NR | NR | NR | CNN + SSD—scale-specific prediction based single shot detector (SSP-SSD), ResNet 101, removal of redundant detections—OA-NMS (Outlier-Aware Non-Maximum Suppression); comparison with SSD, Cascade RCNN, FRCNN, YOLOv3, YOLOv4, YOLOv5(x), FCOS, RetinaNet and CenterNet | vehicles—small, medium, large | dataset containing 312,071 vehicles | performance evaluation—precision, recall rate, F1-score, average precision
Liu and Zhang 2021 [28] | NR | NR | NR | NR | YOLO v4, DeepSORT (KF prediction), trajectory estimation—eight-dimensional space, high-precision positioning—interacting multiple model (IMM)—particle filter (PF) algorithm; IMM-PF, CV-EKF, IMM-EKF comparison | cars, buses, trucks, and vans | dataset containing 15,741 images | position, normalized distance, model probabilities
Luo et al., 2020 [80] | small UAV similar to SkyProwler | 640 × 360, 570 × 640 | NR | NR | blob detection, classifier, dot-product kernels and radial basis function (RBF) kernels, tracking-by-detection, crash decision | vehicles | environment including city, suburban and rural areas | vehicle trajectory
Moranduzzo and Melgani 2014 [81] | hexacopter | 5184 × 3456, NR | NR | NR | feature extraction process based on the scale-invariant feature transform (SIFT), classification by means of a support vector machine (SVM) classifier, grouping of the key points belonging to the same car | vehicles | NR | accuracy
Shan et al., 2021 [82] | DJI Phantom 4 Pro | 3840 × 2160, 25 fps | 150–350 m | NR | pre-processing, YOLO v3, deep SORT algorithm | vehicles | 1 km long section of the Xi'an Ring Expressway—upstream of the ZHANG-BA interchange exit | precision of vehicle detection, precision of extracted speed
Wan et al., 2019 [83] | NR | NR | NR | NR | joint dictionary, L2 regularization based on temporal consistency, Markov Random Field (MRF)-based binary support vector, particle filter framework along with a dynamic template update scheme; comparison of 9 state-of-the-art visual tracking algorithms, including IV, L1, PCOM, CT, MTT, WMIL, OFDS, STC and CNT | vehicles, pedestrians | UAV videos | precision and success plots, time complexity, execution time
Wang et al., 2016a [84] | NR | NR | 80–90 m | 4 h | Shi-Tomasi features, optical flow (Kanade-Lucas algorithm), prediction method—bivariate extreme value theory (EVT) | vehicles | ten urban signalized intersections in Fengxian District in Shanghai | time-to-accident (TA), post-encroachment time (PET), minimum time-to-collision (mTTC), and maximum deceleration rate (MaxD)
Wang et al., 2016b [85] | MD3-1000 by the German company Microdrones | NR | 42.6 m | NR | calculation of start-wave velocity at signalized intersections | large, medium and small vehicles | straight lanes at the intersection of Cao-an Highway and North Jia-song Road in Shanghai | speed, density of traffic flow
Wang et al., 2019 [86] | DJI Phantom 4 Pro | 2720 × 1530, 30 fps | 60–150 m | NR | YOLOv3, motion estimation based on Kalman filtering integrated with deep appearance features | vehicles | N/A | true positive (TP), false positive (FP), true negative (TN), false negative (FN), identification precision (IDP), identification recall (IDR), F1 score, multiple-object tracking accuracy (MOTA), mostly tracked (MT), mostly lost (ML), and identity switching (IDSW)
Wang et al., 2019 [87] | DJI Phantom 2 | 1920 × 1080, 30 fps | 100–150 m | NR | image registration, image feature extraction—edge (Prewitt edge detection), optical flow (Lucas–Kanade operator), local feature point (SIFT), vehicle detection—shape detection, vehicle tracking—optical flow, matched local feature points | vehicles | the north part of the 5th Ring Road in Beijing, China | correctness, completeness and quality—vehicle detection, number of vehicles, error rate
Xing et al., 2020a [88] | NR | 4K, 30 fps | NR | NR | logistic regression; two time-varying mixed logit models, the time-varying random effects logistic regression (T-RELR) model and the time-varying random parameters logistic regression (T-RPLR) model, developed to examine the time-varying effects of influencing factors on vehicle collision risk | cars, buses and trucks | a toll plaza area on the G42 freeway in Nanjing, China—12 toll collection lanes in the east-west direction (north side) and 6 toll collection lanes in the west-east direction (south side) | model performance, time-varying logistic regression model, TTC
Xing et al., 2020b [89] | NR | 4K, 30 fps | NR | 50 min | logistic regression model vs. K-Nearest Neighbour (KNN), Artificial Neural Networks (ANN), Support Vector Machines (SVM), Decision Trees (DT), and Random Forest (RF) | vehicles | a toll plaza area on the G42 freeway in Nanjing, China | surrogate safety measure (SSM)—extended TTC, model performance
Xu et al., 2016 [90] | DJI Phantom 2 | 1920 × 1080, 24 fps | NR | 10 min | Viola-Jones, linear support vector machine (SVM) + histogram of oriented gradients (HOG) features, comparison with 9 other methods | vehicles | NR | detection speed (f/s), correctness, completeness, and quality
Zhu et al., 2018a [91] | DJI Inspire 1 Pro | 4K (3840 × 2178), 30 fps | NR | 2 min 47 s | Retina object detector (RetinaNet), associated detections, trajectory modeling and extraction, semi-supervised nearest neighbor search, double spectral clustering (DSC), deep learning model based on Long Short-Term Memory (LSTM) | cars, buses, and trucks | busy road intersection of a modern megacity | trajectory, tracking speed, vehicle behavior recognition
Zhu et al., 2018b [92] | DJI Inspire 1 Pro | 4K, 30 fps | NR | 56 min 39 s | deep learning (enhanced single shot multibox detector), support vector machine; comparison with SSD, Faster RCNN (FRC), and YOLO | cars, buses, trucks | five key road intersections in Shenzhen | vehicle counting, counting accuracy
Note: NR—not reported, fps—frames per second.
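
A recurring step behind the speed and trajectory measures in Table 2 is the conversion of tracked pixel displacements into ground distances via the ground sampling distance (GSD) of a nadir-looking camera. The short sketch below shows this basic conversion under simplifying assumptions (nadir view, flat terrain, no lens distortion); the camera parameters in the example are illustrative values only, not those of any study in the table.

```python
# Illustrative GSD-based speed estimation from a nadir-view UAV video (sketch only).
def ground_sampling_distance(flying_height_m, focal_length_mm,
                             sensor_width_mm, image_width_px):
    """Metres on the ground covered by one image pixel (nadir view, flat terrain)."""
    return (flying_height_m * sensor_width_mm) / (focal_length_mm * image_width_px)

def speed_kmh(track_px, fps, gsd_m_per_px):
    """Average speed of a track given its per-frame (x, y) pixel positions."""
    dist_px = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                  for (x1, y1), (x2, y2) in zip(track_px[:-1], track_px[1:]))
    seconds = (len(track_px) - 1) / fps
    return (dist_px * gsd_m_per_px / seconds) * 3.6

# Example with assumed camera parameters (for illustration only).
gsd = ground_sampling_distance(flying_height_m=100, focal_length_mm=8.8,
                               sensor_width_mm=13.2, image_width_px=3840)
print(speed_kmh([(100, 200), (110, 200), (121, 201)], fps=25, gsd_m_per_px=gsd))
```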
Table 3. Synthesis of the results related to the main purpose of the studies.
Application Field | Main Purpose | Paper
Traffic analysis | congestion analysis | [77]
Traffic analysis | crash prediction | [86]
Traffic analysis | vehicle collision detection | [80]
Traffic analysis | driving behavior modeling | [60]
Traffic analysis | vehicle trajectories extraction | [61,64,68,74,75]
Traffic analysis | traffic parameters extraction | [82]
Traffic analysis | moving synchronized flow patterns observation | [72]
Traffic analysis | traffic flow parameter estimation | [73]
Traffic analysis | traffic density estimation | [92]
Traffic analysis | traffic information collection | [85]
Traffic analysis | unconventional overtaking decisions identification | [63]
Traffic analysis | vehicle behavior recognition | [91]
Traffic analysis | vehicle collision risk evaluation | [88,89]
Traffic analysis | vehicle-pedestrian conflicts evaluation | [67]
Traffic monitoring | smart monitoring system | [76]
Traffic monitoring | traffic streams recording | [65]
Traffic monitoring | vehicle detection | [28,70,71,79,81,90]
Traffic monitoring | vehicle detection and tracking | [62,84,87]
Traffic monitoring | vehicle tracking | [83]
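
Several of the safety-oriented studies listed above ([67,84,88,89]) rely on surrogate safety measures computed from the extracted trajectories. As a minimal illustration, the sketch below gives the textbook definitions of time-to-collision (TTC) and post-encroachment time (PET); the extended TTC and model-based risk measures used in the cited papers add further refinements on top of these basic quantities.

```python
# Textbook surrogate safety measures (illustrative definitions, not the extended
# or model-specific variants used in the cited studies).
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """TTC = spacing / closing speed; defined only when the follower is faster."""
    closing = v_follow_ms - v_lead_ms
    return gap_m / closing if closing > 0 else float("inf")

def post_encroachment_time(t_first_leaves, t_second_arrives):
    """PET = time between the first road user leaving the conflict area
    and the second one arriving at it."""
    return t_second_arrives - t_first_leaves

print(time_to_collision(gap_m=20.0, v_follow_ms=15.0, v_lead_ms=10.0))      # 4.0 s
print(post_encroachment_time(t_first_leaves=12.3, t_second_arrives=13.8))   # 1.5 s
```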
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
