Article

Unmanned Aerial Vehicle-Based Traffic Analysis: A Case Study for Shockwave Identification and Flow Parameters Estimation at Signalized Intersections

UHasselt-Hasselt University, Transportation Research Institute (IMOB), Agoralaan, 3590 Diepenbeek, Belgium
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(3), 458; https://doi.org/10.3390/rs10030458
Submission received: 8 February 2018 / Revised: 6 March 2018 / Accepted: 13 March 2018 / Published: 14 March 2018
(This article belongs to the Special Issue Remote Sensing from Unmanned Aerial Vehicles (UAVs))

Abstract
Owing to their dynamic and multidisciplinary characteristics, Unmanned Aerial Vehicles (UAVs), or drones, have become increasingly popular. However, the civil applications of this technology, particularly for traffic data collection and analysis, still need to be thoroughly explored. For this purpose, the authors previously proposed a detailed methodological framework for automated UAV video processing to extract multi-vehicle trajectories at a particular road segment. In this paper, the main emphasis is on the comprehensive analysis of vehicle trajectories extracted via a UAV-based video processing framework. An analytical methodology is presented for: (i) the automatic identification of flow states and shockwaves based on processed UAV trajectories, and (ii) the subsequent extraction of various traffic parameters and performance indicators in order to study flow conditions at a signalized intersection. The experimental data used to analyze traffic flow conditions were obtained in the city of Sint-Truiden, Belgium. The generation of simplified trajectories, shockwaves, and fundamental diagrams helps in analyzing the interrupted-flow conditions at a signalized four-legged intersection using UAV-acquired data. The analysis conducted on such data may serve as a benchmark for actual traffic-specific applications of UAV-acquired data. The results reflect the value of the flexibility and bird's-eye view provided by UAV videos, thereby demonstrating the overall applicability of a UAV-based traffic analysis system. Future research will mainly focus on further extensions of UAV-based traffic applications.

Graphical Abstract

1. Introduction

The management of ever-increasing traffic volumes and congestion levels is one of the most critical challenges faced by modern society. The problem is further magnified at urban intersections. Moreover, there are only limited viable options available for expanding the existing infrastructure. Therefore, transport managers are bound to employ “soft measures or policies” in order to ensure smooth and efficient traffic operations. For this purpose, it has become critical to monitor and analyze the state of traffic flow at urban and suburban intersections. However, this requires an accurate, dynamic, and quick inflow of traffic data [1].
The collection of detailed traffic data with traditional equipment, such as manual counters, induction loops, and fixed video camera systems, is an expensive and difficult process, as it requires either a large number of installed sensors or a large deployed staff in order to cover the entire network [2]. Such data cannot be used to estimate densities or more complex traffic flow phenomena, such as the accumulation and dissipation of queues at intersections. Additionally, as it is not practically possible to cover the entire network with fixed sensors or deployed staff, certain ‘hidden points’ emerge in the network [3,4]. On the other hand, advanced ITS data collection technologies, e.g., vehicle-to-infrastructure (V2I) communication, floating cars (probe vehicles with GPS), and other smartphone sensor technologies, have also been employed in recent years. These technologies provide detailed and dynamic traffic data; however, they result in large datasets which are difficult to handle, especially in a short time span [5]. Additionally, such technologies might influence the actual behavior of the travelers, since they know they are being observed [4,6]. Another alternative for traffic data collection is aerial photography or remote sensing. Satellites and manned aircraft have been used over the years for dynamic traffic data collection. These technologies provide a wide field of view and unbiased data; however, cost and deployment issues restrict their practical employment. Recently, unmanned aerial systems have started to take center stage for traffic monitoring, management, and control [3,7].
Unmanned Aerial Vehicles (UAVs), commonly referred to as drones, are being used in the transportation field to monitor and analyze traffic flow and safety conditions [3,7]. Traditionally, only fixed-wing UAVs were employed for traffic monitoring purposes; in recent years, however, small rotary-wing UAVs have also been used for traffic-related applications [8]. This non-intrusive and low-cost technology has improved rapidly and is now capable of providing high-resolution data (both in space and time) that can be used to extract vehicle trajectories and estimate traffic parameters. UAVs can be particularly useful for data collection in suburban areas or other parts of the network where the installation of fixed sensor infrastructure is not viable. Mobility and flexibility are the key assets of this technology [9]. As this is a recent technology whose actual applications, particularly for traffic data collection, have not yet been fully developed [3,4], some considerable concerns and limitations still exist, such as limited battery time and safety concerns. In order to streamline the processes involved in the application of UAV technology to traffic analysis, a universal guiding framework was proposed in [9]. Additionally, a detailed methodological framework for automated UAV traffic video processing and vehicle trajectory extraction was presented in [1]. This paper is a detailed application and pilot study of the methodology presented by the authors in [1].
In this paper, the main focus is on the traffic flow analysis of vehicle trajectories acquired via small rotary-wing UAV footage. The experimental data used to analyze traffic flow conditions at a signalized intersection were obtained in the city of Sint-Truiden, Belgium. With the help of a case study, this paper attempts to evaluate the performance of the presented analytical methodology at a signalized intersection using UAV-acquired trajectory data. An analytical methodology is presented for: (i) the automatic identification of shockwaves based on processed trajectories, and (ii) the subsequent extraction of various traffic parameters and performance indicators in order to study flow conditions at a signalized intersection. The paper constitutes an in-depth flow analysis of traffic streams crossing a signalized four-legged intersection. Firstly, the trajectories are processed based on the critical point approach. These processed trajectories are then employed for shockwave analysis and queue estimation at the signalized intersection. This type of analysis conducted on UAV-based data may serve as a benchmark for further research into practical applications of UAV-based traffic analysis systems. With a significant increase in the number of UAV-based traffic studies expected in the coming years, such analytical studies based on an automated systematic framework could become a useful resource for practitioners and researchers alike.
This paper is organized as follows: first of all, the relevant UAV-based traffic analysis studies are discussed concisely. The methodology section consists of a brief description of the UAV video processing and trajectory extraction framework along with the presentation of the signalized intersection flow analysis methodology. In the next section, a case study is presented to support the proposed methodology. This includes the vehicle trajectory extraction and the subsequent traffic flow analysis. Finally, the paper is briefly concluded along with some critical discussion regarding the use of UAVs for traffic data collection, analytical applications of the framework, and proposed future developments.

2. Related Work

According to the literature, various applications of UAVs for traffic monitoring and analysis are currently being researched [2,10,11]. Various researchers [3,7,12,13] summarized the current research trends around the world regarding the use of UAVs for traffic surveillance and analysis applications.
Numerous UAV-based studies specifically targeting traffic analysis have been conducted in the past few years. These studies can be broadly classified into two types depending on the video processing technique, i.e., (i) manual or semi-automatic studies and (ii) automatic studies. Studies employing the semi-automatic approach have shown high accuracy, but are laborious, as vehicles have to be detected and then manually tracked over a number of frames [4,6,14]. In [14], the authors make use of UAV traffic footage of a stop-controlled intersection to study drivers’ behavior. They determine the gap acceptance and waiting time of vehicles entering a major road at an urban stop-sign-controlled intersection. The same authors in [6] have also attempted to determine various traffic parameters (flow, velocity, etc.) from UAV-acquired video data, and further compare the calculated values with theoretical macro-simulation models. Similarly, the authors in [4] have used UAVs to conduct an experiment over an intersection, extracting the vehicle trajectories with the semi-automated approach and consequently determining various traffic parameters. As stated earlier, the semi-automatic approach requires a great deal of processing time while, on the other hand, the automatic approach promises a quick processing and analysis procedure, ultimately leading to the real-time analysis of UAV-acquired data. Recently, the number of studies based on the automated approach has increased [1,15,16,17,18]. Their authors attempt to extract various traffic parameters and vehicle trajectories automatically by using state-of-the-art object detection and tracking algorithms.
A great deal of research has been conducted to analyze traffic flow at signalized intersections using data acquired from different sources. This includes shockwave and queue analysis based on traffic video data [19,20,21]. Additionally, a number of studies have employed vehicle trajectories from the NGSIM data for shockwave analysis and queue estimation [22,23,24,25]. All of these studies have devised and demonstrated various ways to evaluate the performance of signalized intersections. However, all of the existing studies mentioned so far have been principally based on fixed video camera systems. Only a couple of studies in the existing literature have employed UAV-based traffic data in order to analyze the traffic safety and flow conditions specifically at a signalized intersection. In [26], the authors analyze the traffic safety conditions at an urban signalized intersection using data acquired via a small quadcopter UAV. They focus on pedestrian-vehicle conflicts and estimate different parameters, e.g., time-to-collision (TTC) and post-encroachment time (PET), as safety performance measures. On the other hand, the authors in [27] present a computational model based on the well-known traffic wave theory in order to determine the velocity of stop-start waves at an urban intersection. They employ UAV-acquired traffic data to validate the proposed model; however, the paper is focused mainly on the derivation and validation of the model equations.

3. Methodology

This paper is principally based on the vehicle trajectories extracted via the UAV-based video processing and vehicle trajectory extraction framework originally presented in [1]. The extracted trajectories are further analyzed based on the proposed analytical methodology. This section consists of a brief description of the UAV based trajectory extraction framework, which is then followed by an explanation of the signalized intersection flow analysis process.

3.1. UAV Video Processing Framework

The authors in [1] proposed a detailed UAV-based traffic video processing framework in order to automatically extract multi-vehicle trajectories for an area of interest. The framework consisted of an in-depth description of the steps involved in the systematic and efficient processing of the UAV-based traffic data. The whole process as categorized in the framework included five modules: (i) pre-processing, (ii) stabilization, (iii) geo-registration, (iv) vehicle detection and tracking, and (v) trajectory management. Moreover, certain additions have been made in the framework recently in order to further optimize the final trajectories. Figure 1 illustrates the components of the UAV-based traffic video data processing framework.
The different modules of the proposed framework are elaborated in detail in [1]. However, in this paper the different stages are discussed briefly so as to give an overview of the UAV video processing and analysis.
Firstly, the traffic videos acquired via UAVs are pre-processed. The main target of this step is to prepare the UAV video for the actual processing and analysis steps by removing or minimizing the undesirable aspects of the recorded video, e.g., the fish-eye effect, the ascending/descending of the UAV, etc. The pre-processing step ensures an optimal usage of the computational power, hence increasing the processing speed. This step is followed by the video stabilization and geo-registration of the UAV-acquired videos. Since a slight camera vibration can induce undesired movement in captured images, the stabilization step is critical to minimize the level of instability or shakiness in UAV videos. Once the UAV videos are stabilized, the efficiency of all other modules of the framework, especially the vehicle detection and tracking process, improves significantly. Further, the geo-registration process ensures an efficient calibration and conversion of the UAV-acquired mono-vision 2D image coordinates into a real-world coordinate system. This is done in order to enhance the applicability of the extracted vehicle trajectory data. For this purpose, the UAV image is first calibrated according to real-world distances and assigned a coordinate system. These image coordinates can then be converted to real-world coordinates based on a 3 × 3 homography matrix, which is computed after comparing the extracted frames with the referenced map. The details of the process are given in [1]. In a nutshell, this process enables the user to integrate the geo-referenced calibrated trajectories with any GIS application, thereby assisting in an actual-scale visualization and estimation of various traffic parameters.
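The homography-based coordinate conversion described above can be sketched as follows. This is a minimal illustration, not the framework's actual code: the matrix H below is a hypothetical pure-scaling homography (0.05 m per pixel), whereas in practice the matrix would be computed by matching extracted frames against the referenced map.

```python
import numpy as np

def pixel_to_world(points_px, H):
    """Map Nx2 pixel coordinates to world coordinates using a
    3x3 homography matrix H (projective transform)."""
    pts = np.asarray(points_px, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T                              # apply the homography
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the scale

# Illustrative homography: pure scaling, 1 px = 0.05 m (hypothetical value)
H = np.diag([0.05, 0.05, 1.0])
print(pixel_to_world([[100, 200]], H))  # pixel (100, 200) -> (5.0, 10.0) m
```

A full projective homography (with perspective terms in the bottom row) is handled by the same normalization step, which is why the division by the third homogeneous coordinate is needed even though it is trivially 1 here.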
Once the UAV images are geo-referenced or calibrated to a specific coordinate system, the next step is the automatic detection and tracking of multiple road users. The efficiency and accuracy of this process is critical in order to obtain a reliable and consistent set of trajectory data. The stabilized and calibrated UAV videos are fed into the detection and tracking module, which consists of a number of sub-modules, as indicated in Figure 1. The vehicles in motion are detected and tracked by a series of algorithms implemented using the OpenCV library in C++. First of all, the optical flow tracking and background subtraction algorithms identify the pixels that are in motion. These moving pixels representing the vehicles are then tracked over a series of frames with the help of blob tracking. Finally, the Kalman filter algorithm helps achieve smooth tracking data.
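As a rough illustration of the final smoothing stage, the sketch below runs a constant-velocity Kalman filter over noisy 1-D positions. It is a simplified stand-in for the framework's actual C++/OpenCV implementation; the time step and noise parameters (`dt`, `q`, `r`) are assumed values, not the ones used in [1].

```python
import numpy as np

def kalman_smooth_positions(zs, dt=0.04, q=1.0, r=4.0):
    """Smooth noisy 1-D position measurements with a constant-velocity
    Kalman filter; dt = 0.04 s matches a 25-fps video."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    Hm = np.array([[1.0, 0.0]])             # only position is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # process-noise covariance
    R = np.array([[r]])                     # measurement-noise covariance
    x = np.array([[zs[0]], [0.0]])          # initial state estimate
    P = np.eye(2) * 10.0                    # initial uncertainty
    out = []
    for z in zs:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        S = Hm @ P @ Hm.T + R               # innovation covariance
        K = P @ Hm.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (np.array([[z]]) - Hm @ x)
        P = (np.eye(2) - K @ Hm) @ P
        out.append(x[0, 0])
    return out
```

Feeding it a noisy constant-position signal, the filtered output settles near the true position with much less jitter than the raw measurements.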
A proper management system for the handling of extracted vehicle trajectories is critical to efficiently deal with data extracted during the vehicle detection and tracking process. The data has to be easily accessible so that it can be effectively used for further traffic analysis. For this purpose, a text file is generated which contains all the extracted coordinates of the vehicles detected and tracked in the area of interest. Such a data file allows the analyst to conveniently sort and post-process the data in order to extract various traffic parameters and create different types of graphical displays and illustrations of the vehicle trajectory data.

3.2. Signalized-Intersection Flow Analysis Methodology

The vehicle trajectories extracted via the UAV-based traffic data collection process can be employed for various traffic-related applications. This paper presents an analytical methodology specifically aimed at a systematic traffic flow analysis at signalized intersections. The proposed methodology streamlines the steps involved in the efficient employment of the extracted vehicle trajectory data in order to analyze the flow at signalized intersections. The methodology, as shown in Figure 2, consists of four modules: (i) the automated simplified trajectories extraction module, (ii) the automated shockwave identification module, (iii) the traffic parameters estimation module, and (iv) the performance indicators’ estimation module.
First of all, the extracted vehicle trajectories are fed as an input into the automated simplified trajectories extraction module. The raw trajectories are processed in order to simplify the visualization of the transformation of traffic flow at a signalized intersection. For this purpose, the ‘critical point’ concept presented in [22] is employed with some modifications. The critical point is defined as the point in a vehicle trajectory after which the motion of the vehicle changes significantly; e.g., a critical point may be detected on a vehicle trajectory before the vehicle starts accelerating or decelerating. Based on this approach, the critical points identified on the vehicle trajectories represent the major or definitive changes in the motion of vehicles along the road. The critical point approach not only helps simplify the further analysis of trajectories, but also increases the efficiency of the system by reducing the amount of data to be processed and analyzed. The logic of the critical point extraction and traffic flow state identification algorithm is illustrated in Figure 3, where xi is any point on the trajectory at time ti with velocity vi and acceleration ai. The critical points (CPs) are generated by comparing trajectory points with certain threshold values for velocity and acceleration. In order to minimize the chances of false CP detection, the succeeding ‘n’ (usually 5 or 10) acceleration values are checked. Moreover, the stopping velocity threshold vs (normally less than 5 km/h) is compared with the velocity vcp at the critical point. This assists in identifying the true traffic flow regime. Depending on the condition it satisfies, each trajectory segment can be classified into one of several regimes, i.e., uniform motion, accelerated/decelerated motion, or a stationary regime.
It is worth mentioning here that the proposed approach is, numerically, very efficient to compute, thereby having an advantage over the more complex approaches (e.g., piece-wise linear regression methods), particularly for cases requiring a quick processing and analysis system.
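The critical-point logic of Figure 3 can be sketched roughly as follows. The thresholds and confirmation window (`a_thr`, `v_stop`, `n`) are illustrative stand-ins for the values discussed above, and the exact decision rules in [1,22] may differ.

```python
def find_critical_points(vels, accels, a_thr=0.5, v_stop=5 / 3.6, n=3):
    """Classify each trajectory sample into a flow regime and flag the
    critical points (CPs) where the regime changes.

    vels/accels are per-frame speed (m/s) and acceleration (m/s^2); a
    regime is only accepted if the next n acceleration samples agree,
    which guards against spurious CP detections.
    """
    states, cps = [], []
    for i, v in enumerate(vels):
        window = accels[i:i + n]
        if v <= v_stop:                                    # ~5 km/h stop threshold
            state = "stationary"
        elif window and all(a > a_thr for a in window):
            state = "accelerating"
        elif window and all(a < -a_thr for a in window):
            state = "decelerating"
        else:
            state = "uniform"
        if states and state != states[-1]:
            cps.append(i)                                  # regime change -> CP
        states.append(state)
    return cps, states
```

For a synthetic decelerate-stop-accelerate profile, the CPs land at the frames where braking begins, the vehicle joins the standing queue, and discharge starts, which is exactly the reduction to "simplified trajectories" used in the next step.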
The resulting simplified trajectories of multiple vehicles can then be utilized efficiently for the identification of different shockwaves that are generated within the proximity of a signalized intersection. The critical points of a series of trajectories are grouped together in order to identify shockwaves. Moreover, the processed vehicle trajectories can also be used to estimate various traffic flow parameters, e.g., density, flow, speed, etc., for each flow regime. These parameters can then be used to develop the speed-flow-density fundamental diagrams which further assist in determining the unknown parameters.
A number of performance indicators can also be extracted to evaluate the performance of the traffic infrastructure under consideration. The combination of space-time and fundamental diagrams is essential for determining the characteristics of traffic flow at the intersection in detail. These diagrams can be used effectively not only to study the signal cycle lengths, but also to determine the speed of generation and dissipation of shockwaves. Additionally, the extracted parameters and the shockwave speeds can also be used for detailed queue analysis at an intersection or any other interrupted flow situation. All these parameters and performance indicators have been estimated and described in detail in the next section with the help of a case study.

4. Case Study

In this section, a detailed case study is presented in order to validate and demonstrate the practicality of the UAV-based traffic data collection, processing, and analytical methodology. The data collection experiment is followed by the automated extraction of the vehicle trajectories, which are then further employed for detailed traffic flow analysis. The following sub-sections present an in-depth description of the whole experiment and the analytical process.

4.1. Experiment Specifications

In order to obtain an experimental dataset, UAV flights were conducted in the suburbs of the city of Sint-Truiden, Belgium. A four-legged suburban signalized intersection was selected as the area under observation. The location, as shown in Figure 4, is a linking junction between the Belgian national highways N80 (speed limit: 120 km/h) and N718 (speed limit: 90 km/h), with two lanes in either direction. The specified four-legged intersection primarily handles the traffic leading to and from the city of Hasselt into the center and suburbs of Sint-Truiden. A detailed flight planning process, covering operational as well as safety and legal considerations, was carried out before the flights were conducted. The UAV flights were conducted in order to capture the Friday evening rush hour (16:30 to 18:00 h). Importantly, the weather and wind conditions were ideal for UAV flights, i.e., mostly clear skies with a gentle breeze (18 km/h, Beaufort scale 3).
A high-end custom-built octocopter UAV, the Argus-One (from Argus-Vision), with an attached Panasonic Lumix GH4 DSLM camera (Panasonic Corporation: Kadoma, Osaka, Japan), was employed for a series of UAV flights. The equipment used for this experiment belonged to a UAV-imaging company named Argus-Vision, registered in Tongeren, Belgium. Table 1 lists the detailed technical specifications of the equipment used. The equipment, as shown in Figure 5, provides stable and high-resolution (4K@25 fps) video data with nearly 10–12 min of flight time. Importantly, the 4K video from this camera does not contain any fish-eye effects (curvature, wide field of view). Additionally, an attached live-feed transmission system allowed the operators to optimize the camera angles for the best view of the intersection during the flight. The UAV hovered (constant altitude, zero velocity) over the intersection at altitudes of 80 m and 60 m above ground level. After a series of flights over the intersection, the recorded video was trimmed in order to remove insignificant parts, e.g., the take-off and landing maneuvers of the UAV. Eventually, a nearly 15-min useful traffic video with 22,649 frames was obtained after this pre-processing or trimming step. It is also worth mentioning that the UAV camera covered approximately 0.07 km2 of intersection area from the 80 m height.

4.2. Vehicle Trajectories

The extraction of the vehicle trajectories crossing the intersection under observation was done using the proposed UAV video processing framework. Specifically, the developed computer vision algorithm, as described in the vehicle detection and tracking module of the framework, was utilized for this purpose. All the processing and analysis was done on an HP (Hewlett Packard Enterprise: Palo Alto, CA, USA) ProBook 650 G1 machine with an Intel® Core™ i5-4210M (2.60 GHz) processor, 4 GB of RAM, and Windows 8.1 (64-bit). It is important to mention that the UAV-acquired images were calibrated and a Cartesian coordinate axis was assigned, with the center of the intersection designated as the origin.
Figure 6 consists of various graphs depicting a selected set of extracted vehicle trajectories. These trajectories and speed profiles can be used to make a number of interpretations regarding the drivers’ behavior and overall traffic flow. Figure 6a,b depict the trajectory of a sample vehicle along with its corresponding speed profile. Figure 6a reflects that the sample car initially came to rest upon reaching the queue at the signalized intersection; however, it moved after a few seconds in order to reduce the queue spacing or headway. Figure 6b includes the instantaneous speed accompanied by the running average and smoothed-function speed curves. Running or moving averages help illustrate the trend of the instantaneous speed over time, as shown in Figure 6b. An interval of 50 frames, implying a 2-s time window, was used to determine the running average speed of the sample vehicle. Similarly, a smoothed curve is also used to demonstrate the overall speed trend. The smoothed curve is a second-degree (quadratic) polynomial regression fit. These curves make the interpretation of instantaneous speed data more efficient. In addition, Figure 6c shows the trajectories of a platoon of vehicles on a space-time diagram, while Figure 6d illustrates a sample set of extracted trajectories overlaid on the UAV-acquired image.
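The two speed-trend curves described above can be sketched as follows: a 50-frame centered moving average and a second-degree polynomial fit, both as described in the text; the use of numpy here is an assumption, not the paper's actual implementation.

```python
import numpy as np

def running_average(speeds, window=50):
    """Centered moving average of instantaneous speeds; a 50-frame
    window corresponds to ~2 s at 25 fps, as in the case study."""
    kernel = np.ones(window) / window
    return np.convolve(speeds, kernel, mode="valid")

def quadratic_smooth(times, speeds):
    """Second-degree (quadratic) polynomial regression fit used as a
    global speed-trend curve."""
    coeffs = np.polyfit(times, speeds, deg=2)
    return np.polyval(coeffs, times)
```

The moving average suppresses frame-to-frame tracking jitter locally, while the quadratic fit summarizes the whole maneuver (deceleration, stop, re-acceleration) with a single smooth curve.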
The drivers’ behavior while approaching a signalized intersection can be observed from the graph in Figure 6c. It can be inferred from the vehicle trajectories that each individual driver has a specific braking behavior when halting at the traffic signal. Some drivers decelerate smoothly to a stationary position, e.g., car 8 in Figure 6c. On the other hand, some drivers tend to decelerate strongly, as indicated by the steep curve of car 1’s trajectory (Figure 6c). The average speed of the sample vehicle (car 7) based on the smoothing function was measured to be 29.52 km/h (8.2 m/s) and 24.48 km/h (6.8 m/s) for the intersection arrival and departure maneuvers, respectively (Figure 6b).
Apart from the observation of through traffic, the space-time diagrams can also be used to study the behavior of turning vehicles. In this regard, Figure 6c indicates the movement of a right-turning vehicle (car 9). The changing slope of car 9’s trajectory suggests that the vehicle completed the turning maneuver safely by adjusting its speed slightly. All these examples show that the trajectory data can be effectively used to study and analyze the traffic flow, as well as safety conditions on a specific intersection.
It is important to mention here that the precision of all these calculations is highly dependent on the calibration of the extracted frames. Therefore, a two-step calibration process was conducted in order to ensure high accuracy and minimal errors. This involved several on-site measurements followed by a point correspondence step. In this step, the on-site measurements were verified with the referenced maps. Various prominent stationary objects visible in the UAV image were matched with their coordinates and distances on a referenced satellite image. All this helped in the extraction of an accurate trajectory dataset, ultimately leading to a precise calculation of other traffic parameters, as well.
Moreover, the accuracy of the estimated speed was also verified against the ground-truth speed of the sample vehicle (car 7) during different flow states. This was done by calculating the mean absolute percentage error (MAPE), as shown in Equation (1). The ground-truth speed, as given in Equation (2), was determined by observing the number of frames (Nf) taken by the vehicle to travel between two points a known distance (d) apart. In order to determine the time (t) for speed estimation, the number of frames was divided by the frame rate (fps) of the UAV video. The ground-truth speed was evaluated for both the intersection arrival and departure phases. Eventually, the mean absolute percentage error was found to be 5.85%. This shows a good level of accuracy, especially for an oblique-angle UAV video.
$$ \mathrm{MAPE} = \frac{1}{N} \sum_{i=1}^{N} \left| \frac{v_{\mathrm{ground\ truth},i} - v_{\mathrm{estimated},i}}{v_{\mathrm{ground\ truth},i}} \right| \times 100\% $$

$$ v_{\mathrm{ground\ truth}} = \frac{d}{t} = \frac{d}{N_f/\mathrm{fps}} = \frac{d \cdot \mathrm{fps}}{N_f} $$
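Equations (1) and (2) translate directly into code. The sketch below assumes speeds in m/s and the case study's 25-fps video; the function names are illustrative.

```python
def ground_truth_speed(d_m, n_frames, fps=25.0):
    """Equation (2): speed over a known distance d (m), where the travel
    time is the observed frame count divided by the video frame rate."""
    return d_m / (n_frames / fps)                 # m/s

def mape(ground_truth, estimated):
    """Equation (1): mean absolute percentage error between two series."""
    pairs = list(zip(ground_truth, estimated))
    return 100.0 * sum(abs((g - e) / g) for g, e in pairs) / len(pairs)

print(ground_truth_speed(50.0, 125))    # 10.0 m/s: 50 m in 125 frames (5 s)
print(mape([10.0, 20.0], [9.0, 22.0]))  # 10.0 (% error)
```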

4.3. Traffic Flow Analysis

As mentioned earlier, the extracted vehicle trajectories can be used for various types of traffic analyses, e.g., traffic safety analysis, drivers’ behavior analysis, traffic flow analysis, etc. In this paper, however, the focus is on traffic flow analysis. In this regard, shockwave analysis is of particular interest for signalized intersections, where the vehicle flow is distinctly transformed from one state to another.
The set of trajectories shown in Figure 6c can be processed based on the critical point concept presented in the previous section. Figure 7 shows the transformation of vehicle trajectories into simplified trajectories, which are then used for further traffic analysis.
Figure 7a demonstrates the extracted critical points on a sample trajectory, while Figure 7b shows the resulting simplified trajectory. This process results in a series of simplified trajectories with distinct flow states. Figure 7c shows the space-time diagram of the processed trajectories of the vehicles while approaching, waiting at, and eventually crossing the signalized intersection. It can be deduced from the figure that the motion of the vehicles comprises basically three flow states, i.e., (A) the free-flowing state before reaching the signal, (B) the formation of a queue during the red phase of the signal (stationary state), and (C) the dissipation flow state during the green phase of the signal cycle. These changes in flow states at a signalized intersection result in the generation of backward shockwaves, as indicated in Figure 7c. Moreover, the signal cycle length can also be determined from such space-time shockwave diagrams. In this particular case, it can be observed that the signalized intersection had a 40-s red phase interval, while the green phase interval was also approximately 40 s. These estimated times were verified against the site observations and video recordings. It is clearly evident from Figure 7c that simplified trajectories make the analysis and interpretation of traffic flow much simpler, thereby assisting in an efficient estimation of various traffic parameters and performance indicators. Additionally, Figure 7d illustrates the positioning of the three traffic flow states, i.e., A, B, and C, on the UAV-acquired image of the intersection. It is important to emphasize that the areas of these flow states (specifically A and B) can vary depending on various factors, such as the traffic volume and the length of the traffic signal cycle.
The space-time diagram shown in Figure 7c can also be used to produce a flow-density fundamental diagram. The flows, densities and speeds for different traffic states, i.e., A, B, and C can be inferred from the trajectory data obtained through UAV-based traffic videos. Figure 8a highlights the overall grid-area used for density estimation at the signalized intersection, whereas Figure 8b demonstrates the specific strips of area on the space-time diagram that are utilized for density estimation of the three traffic states.
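A minimal sketch of how density and flow might be computed from counts taken in such space-time strips is shown below; the strip length, vehicle counts, and function names are illustrative assumptions rather than the paper's exact procedure.

```python
def density_veh_per_km(positions_m, strip_length_m):
    """Density: vehicles present in a road strip at one instant, per km per lane."""
    return len(positions_m) / (strip_length_m / 1000.0)

def flow_veh_per_h(crossing_times_s, interval_s):
    """Flow: vehicles passing a fixed location during an interval, per hour per lane."""
    return len(crossing_times_s) / (interval_s / 3600.0)

# Illustrative: 8 queued vehicles observed in a 50 m strip
print(density_veh_per_km([10, 17, 23, 30, 36, 42, 48, 55], 50))  # 160.0 veh/km
# Illustrative: 16 vehicles discharging past the stop line in 30 s
print(flow_veh_per_h(list(range(16)), 30))  # 1920.0 veh/h
```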
Table 2 below shows the values of the traffic parameters, i.e., flow, speed, and density, for each of the three states. It is worth mentioning that the density for state B represents the jam density (kj), while the flow during state C is the maximum flow rate (qmax). Additionally, the calculated values of kj and qmax can be verified against the default values given for this type of infrastructure and the prevailing traffic conditions in the Highway Capacity Manual [28]. All these parameters can not only be used to link the space-time diagram with the fundamental diagram, but also for further analysis, including the determination of shockwave speeds and queue lengths.
Figure 9 presents the flow-density fundamental diagram along with the shockwaves and the speeds of the vehicles in each traffic state. The speeds of the shockwaves can be determined using Equations (3) and (4). The computed values are −10 km/h for the accumulating wave (AB) and −24 km/h for the dissipating wave (BC). The negative sign reflects the backward direction of wave propagation:
ω_AB = q_A / (k_A − k_j)    (3)
ω_BC = q_max / (k_C − k_j)    (4)
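As a quick numerical check, substituting the state values from Table 2 into Equations (3) and (4) reproduces the reported wave speeds:

```python
q_A, k_A = 1200.0, 40.0    # state A: approach flow (veh/h/lane) and density (veh/km/lane)
q_max, k_C = 1920.0, 80.0  # state C: maximum (saturation) flow and its density
k_j = 160.0                # state B: jam density (veh/km/lane)

omega_AB = q_A / (k_A - k_j)    # Eq. (3): accumulating (queue-forming) wave
omega_BC = q_max / (k_C - k_j)  # Eq. (4): dissipating wave
print(omega_AB, omega_BC)  # -10.0 -24.0 (km/h, negative = backward propagation)
```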
Moreover, other performance indicators for the signalized intersection can be estimated from the available data. One such indicator is the maximum queue length, which can be used to verify that the end of the queue does not affect the flow at a neighboring intersection. Equation (5) gives the maximum queue length (QM), where γ is the duration of the red phase of the signal. With a 40 s red phase, the maximum queue length amounts to 190.47 m:
Q_M = (γ / 3600) × (ω_AB × ω_BC) / (ω_BC − ω_AB)    (5)
Similarly, Equation (6) gives the time (TM) required for the complete dissipation of the queue after the signal turns green. Substituting the calculated shockwave speeds and the red-phase duration of 40 s into the equation yields a dissipation time of 28.57 s. This value can also be verified against the shockwave diagram in Figure 7c, where the point at which the shockwaves intersect marks the moment of queue dissipation:
T_M = γ × ω_AB / (ω_BC − ω_AB)    (6)
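The queue-length and dissipation-time calculations of Equations (5) and (6) can likewise be verified numerically with the shockwave speeds computed from Equations (3) and (4):

```python
gamma = 40.0       # red-phase duration (s)
omega_AB = -10.0   # accumulating wave speed (km/h), from Eq. (3)
omega_BC = -24.0   # dissipating wave speed (km/h), from Eq. (4)

# Eq. (5): maximum queue length in km; the sign encodes the backward direction
Q_M = (gamma / 3600.0) * (omega_AB * omega_BC) / (omega_BC - omega_AB)
# Eq. (6): time for the queue to fully dissipate after the signal turns green (s)
T_M = gamma * omega_AB / (omega_BC - omega_AB)

print(abs(Q_M) * 1000, T_M)  # ≈ 190.48 m and ≈ 28.57 s
```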
Additionally, the accuracy of the values calculated for various performance indicators can be verified by measuring the mean absolute percentage error (MAPE). The estimated quantity can be compared with the observed ground truth value as shown in Equation (1). A mean error of 7.5% was calculated in the estimation of the maximum queue length in the above example. The ground truth queue length was calculated by multiplying the number of vehicles in the queue by the minimum headway distance (normally 25 feet (~7.6 m)).
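A small sketch of the error check is given below. The queued-vehicle count used for the ground truth is a hypothetical value chosen for illustration; the actual count observed in the study is not reproduced here.

```python
def mape(estimated, observed):
    """Mean absolute percentage error between paired estimates and ground-truth values."""
    return 100.0 * sum(abs(e - o) / o for e, o in zip(estimated, observed)) / len(observed)

# Hypothetical single-cycle check: ground truth = vehicle count x minimum headway
n_queued, headway_m = 27, 7.62           # ~25 ft per stopped vehicle (assumed count)
ground_truth_m = n_queued * headway_m    # 205.74 m
print(round(mape([190.47], [ground_truth_m]), 1))  # ≈ 7.4% for this hypothetical count
```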

5. Discussion

As mentioned earlier, UAVs, or drones, have several potential applications in traffic analysis and management. Therefore, there is a need to streamline the complete process and conduct validation studies. Accordingly, this paper has aimed to demonstrate a real-life application of UAVs for traffic analysis, particularly for signalized intersection flow analysis. The overall analytical process is based on vehicle trajectories extracted via a previously-proposed automated UAV video processing framework [1]. Based on these vehicle trajectories, a signalized intersection traffic flow analysis was conducted for a suburban four-legged intersection in Sint-Truiden, Belgium. The proposed methodological analysis conducted on such experimental data may serve as a proof-of-concept for actual traffic-specific applications of UAV-acquired data. Such studies can be of particular interest not only to researchers, but also to practitioners and traffic experts responsible for transport planning and management operations. Furthermore, this study can also lead to the integration of UAV-based traffic data with more conventional traffic data collected via fixed cameras, loop detectors, etc. UAV data can add a further dimension to these existing traditional data sources.
The UAV-acquired intersection traffic data was used to estimate various traffic parameters, including speed, flow, density, shockwave speeds, signal cycle length, queue length, and queue dissipation time. Importantly, the values of the estimated traffic parameters were found to be in accordance with the ground-truth values as well as with the values reported in the literature for signalized intersections. The ground-truth values were calculated manually based on the recorded videos and site observations. The estimated and ground-truth values were then used to evaluate the mean absolute percentage error (MAPE). The mean error for the speed of the sample vehicle was approximately 5%, while the error for the estimated queue length was approximately 7.5%. Additionally, the values of flow and density were verified against the default values provided in the Highway Capacity Manual for the specific type of infrastructure and prevailing traffic conditions. Overall, the estimated traffic parameters showed no major errors.
Although vehicle trajectories and the corresponding traffic parameters were extracted successfully from the UAV-acquired data, some limitations remain in the automated UAV video processing. Various errors can occur in vehicle detection and tracking for different reasons, such as partial occlusions, shadows, objects in close proximity, and false detections. The resulting trajectories may therefore contain noise and errors which have to be dealt with accordingly. Additionally, some limitations of current UAV technology also exist, including the limited flight time of small UAVs and concerns regarding the safety of flight operations. The flight time of a UAV depends on internal as well as external factors. Internal factors include the size, payload, battery type, etc., whereas external factors include weather conditions, wind conditions, the status of GPS satellites, etc. Apart from limited flight times, legal considerations, including safety and privacy concerns, also limit the use of UAVs for practical applications. In particular, current Belgian law prohibits small UAVs from flying directly above vehicles and people. Therefore, in this study, the UAV hovered at an oblique angle to the intersection traffic, which compromised the accuracy of the extracted trajectories and complicated the overall video processing. Nevertheless, these concerns should gradually diminish as more reliable and robust technology is developed in the coming years.

6. Conclusions

In this paper, the main focus has been on the traffic flow analysis of extracted vehicle trajectories. For this purpose, an analytical methodology has been presented for analyzing traffic flow conditions at a signalized intersection. To validate the methodology, an experimental UAV-based dataset was collected at a suburban four-legged signalized intersection. A dataset of vehicle trajectories was extracted and illustrated graphically in the form of space-time diagrams. These extracted trajectories were then employed for further traffic flow analyses relevant to signalized intersection traffic. The generation of simplified trajectories, shockwaves, and fundamental diagrams helps in analyzing the interrupted-flow conditions at the signalized intersection. Importantly, the values of the estimated traffic parameters did not show significant errors. The results of the analysis reflect the value of the flexibility and bird's-eye view provided by UAV videos, thereby demonstrating the overall applicability of the UAV-based traffic analysis system. However, the factors affecting the robustness of the system have to be addressed in future research in order to further optimize the use of UAVs for traffic data collection. Future research will also focus on further improving and extending traffic-related UAV applications. Various approaches for further automation and optimization of vehicle trajectory analysis, including the 'critical point' approach, will be explored in more detail. Additionally, the prospects of real-time processing and analysis of traffic data obtained via UAVs will also be investigated.

Acknowledgments

This research was supported by Transportation Research Institute (IMOB) of Hasselt University.

Author Contributions

Muhammad Arsalan Khan conceived and designed the concept; Muhammad Arsalan Khan, and Wim Ectors conducted the UAV data collection; Tom Bellemans, Davy Janssens, and Geert Wets reviewed the manuscript; and Muhammad Arsalan Khan performed the analysis and also wrote the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. Unmanned Aerial Vehicle-Based Traffic Analysis: A Methodological Framework for Automated Multi-Vehicle Trajectory Extraction. Transp. Res. Rec. J. Transp. Res. Board 2017, 32, 25–33.
2. Coifman, B.; McCord, M.; Mishalani, R.G.; Iswalt, M.; Ji, Y. Roadway traffic monitoring from an unmanned aerial vehicle. IEE Proc. Intell. Transp. Syst. 2006, 153, 11–20.
3. Puri, A. A Survey of Unmanned Aerial Vehicles (UAV) for Traffic Surveillance; Technical Report; Department of Computer Science and Engineering, University of South Florida: Tampa, FL, USA, 2005.
4. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C. Extracting Kinematic Characteristics from Unmanned Aerial Vehicles. In Proceedings of the Transportation Research Board 95th Annual Meeting, Washington, DC, USA, 10–14 January 2016; p. 16.
5. Vlahogianni, E.I. Computational Intelligence and Optimization for Transportation Big Data: Challenges and Opportunities. In Engineering and Applied Sciences Optimization; Springer: Berlin/Heidelberg, Germany, 2015.
6. Salvo, G.; Caruso, L.; Scordo, A. Urban Traffic Analysis through an UAV. Procedia Soc. Behav. Sci. 2014, 111, 1083–1091.
7. Kanistras, K.; Martins, G.; Rutherford, M.J.; Valavanis, K.P. Survey of unmanned aerial vehicles (UAVs) for traffic monitoring. In Handbook of Unmanned Aerial Vehicles; Springer: Dordrecht, The Netherlands, 2015; pp. 2643–2666.
8. Lee, J.; Zhong, Z.; Kim, K.; Dimitrijevic, B.; Du, B.; Gutesa, S. Examining the Applicability of Small Quadcopter Drone for Traffic Surveillance and Roadway Incident Monitoring. In Proceedings of the Transportation Research Board 94th Annual Meeting, Washington, DC, USA, 11–15 January 2015.
9. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. UAV-Based Traffic Analysis: A Universal Guiding Framework Based on Literature Survey. Transp. Res. Procedia 2017, 22, 541–550.
10. Puri, A.; Valavanis, K.; Kontitsis, M. Statistical profile generation for traffic monitoring using real-time UAV based video data. In Proceedings of the 2007 Mediterranean Conference on Control & Automation, Athens, Greece, 27–29 June 2007.
11. Heintz, F.; Rudol, P.; Doherty, P. From images to traffic behavior—A UAV tracking and monitoring application. In Proceedings of the 2007 10th International Conference on Information Fusion, Quebec, QC, Canada, 9–12 July 2007.
12. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
13. Barmpounakis, E.N.; Vlahogianni, E.I.; Golias, J.C. Unmanned Aerial Aircraft Systems for transportation engineering: Current practice and future challenges. Int. J. Transp. Sci. Technol. 2017, 5, 111–122.
14. Salvo, G.; Caruso, L.; Scordo, A. Gap acceptance analysis in an urban intersection through a video acquired by an UAV. In Proceedings of the 5th European Conference of Civil Engineering (ECCIE ’14), Florence, Italy, 22–24 November 2014; pp. 199–205.
15. Apeltauer, J.; Babinec, A.; Herman, D.; Apeltauer, T. Automatic Vehicle Trajectory Extraction for Traffic Analysis from Aerial Video Data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-3/W2, 9–15.
16. Gao, H.; Kong, S.L.; Zhou, S.; Lv, F.; Chen, Q. Automatic Extraction of Multi-Vehicle Trajectory Based on Traffic Videotaping from Quadcopter Model. Appl. Mech. Mater. 2014, 552, 232–239.
17. Oh, H.; Kim, S.; Shin, H.S.; Tsourdas, A.; White, B. Behaviour recognition of ground vehicle using airborne monitoring of unmanned aerial vehicles. Int. J. Syst. Sci. 2014, 45, 2499–2514.
18. Zheng, C.; Breton, A.; Iqbal, W.; Sadiq, I.; Elsayed, E.; Li, K. Driving-Behavior Monitoring Using an Unmanned Aircraft System (UAS). In Proceedings of the Digital Human Modeling: Applications in Health, Safety, Ergonomics and Risk Management: Ergonomics and Health: 6th International Conference, Los Angeles, CA, USA, 2–7 August 2015; pp. 305–312.
19. Chai, Q.; Cheng, C.; Liu, C.; Chen, H. Vehicle Queue Length Measurement Based on a Modified Local Variance and LBP. In Emerging Intelligent Computing Technology and Applications, ICIC 2013; Springer: Berlin/Heidelberg, Germany, 2013.
20. Hourdos, J.; Zitzow, S. Investigation of the Impact the I-94 ATM System has on the Safety of the I-94 Commons High Crash Area; Minnesota Department of Transportation Research Services & Library: St. Paul, MN, USA, 2014.
21. Morris, B.T.; Shirazi, M.S. Intersection Monitoring Using Computer Vision Techniques for Capacity, Delay, and Safety Analysis. In Computer Vision and Imaging in Intelligent Transportation Systems in Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2017; pp. 163–193.
22. Cheng, Y.; Qin, X.; Jin, J.; Ran, B. An Exploratory Shockwave Approach for Signalized Intersection Performance Measurements Using Probe Trajectories. In Proceedings of the Transportation Research Board 89th Annual Meeting, Washington, DC, USA, 10–14 January 2010; pp. 1–23.
23. Cheng, Y.; Qin, X.; Jin, J.; Ran, B.; Anderson, J. Cycle-by-Cycle Queue Length Estimation for Signalized Intersections Using Sampled Trajectory Data. Transp. Res. Rec. J. Transp. Res. Board 2011, 2257, 87–94.
24. Izadpanah, P.; Hellinga, B.; Fu, L. Automatic Traffic Shockwave Identification Using Vehicles’ Trajectories. In Proceedings of the Transportation Research Board 88th Annual Meeting, Washington, DC, USA, 11–15 January 2009; p. 14.
25. Lu, X.; Skabardonis, A. Freeway Traffic Shockwave Analysis: Exploring NGSIM Trajectory Data. In Proceedings of the Transportation Research Board 86th Annual Meeting, Washington, DC, USA, 21–25 January 2007; p. 19.
26. Chen, P.; Zeng, W.; Yu, G.; Wang, Y. Surrogate Safety Analysis of Pedestrian-Vehicle Conflict at Intersections Using Unmanned Aerial Vehicle Videos. J. Adv. Transp. 2017, 2017, 5202150.
27. Cheng, K.; Chang, Y.; Peng, Z. A Computational Model for Stop-Start Wave Propagation Velocity Estimation Based on Unmanned Aerial Vehicle. In Proceedings of the Transportation Research Board 92nd Annual Meeting, Washington, DC, USA, 13–17 January 2013; p. 13.
28. Transportation Research Board. HCM 2010: Highway Capacity Manual; Transportation Research Board: Washington, DC, USA, 2010.
Figure 1. The automated UAV video processing and analysis framework [1] with some modifications.
Figure 2. The proposed methodology for the extraction of signalized intersection traffic flow parameters.
Figure 3. The critical point (CP) extraction and traffic flow state identification algorithm.
Figure 4. Location map of the observed area, along with the satellite and UAV images of the studied intersection (shown in the inset).
Figure 5. The Argus-one UAV: (left) take-off position, and (right) in-flight.
Figure 6. (a) The space-time diagram of the trajectory of a sample vehicle; (b) the speed profile of the sample vehicle; (c) the space-time diagram of extracted trajectories (labelled with assigned number) of a group of vehicles; and (d) an illustration of extracted trajectories.
Figure 7. (a) Extracted critical points (CP) on vehicle trajectory; (b) simplified trajectories with identified traffic states; (c) the generated shockwaves and signal cycle length; and (d) an illustration of various traffic states on the UAV-acquired image.
Figure 8. (a) Highlighted area for density estimation; and (b) density estimation for various traffic states.
Figure 9. Density-flow fundamental diagram with shockwaves.
Table 1. Technical specifications of Argus-one UAV and the attached camera.
UAV technical features:
Body: Carbon fiber
Dimensions: 1200 mm × 1000 mm × 600 mm
Number of Rotors: 8
Battery: 16,000 mAh LiPo battery
Flight Time: Around 12 min
Payload: 0–3 kg
GPS: DJI A2 GPS-Compass Pro
Range: 1200 m
Speed: 0–80 km/h

Camera technical features:
Camera: Panasonic Lumix GH4 DSLM
Body Type: SLR-style mirrorless
Weight: 560 g
Megapixels: 16 MP
Video Resolution: 4K (3840 × 2160 pixels)
Frame Rate: 25 fps
Table 2. The traffic parameters for each traffic state.
Traffic State | Flow q (veh/h/lane) | Speed v (km/h) | Density k (veh/km/lane)
A | 1200 | 30 | 40
B | 0 | 0 | 160 (kj)
C | 1920 (qmax) | 24 | 80
