Communication

Application of Data Sensor Fusion Using Extended Kalman Filter Algorithm for Identification and Tracking of Moving Targets from LiDAR–Radar Data

by Oscar Javier Montañez 1, Marco Javier Suarez 2 and Eduardo Avendano Fernandez 1,*
1 School of Electronic Engineering, Pedagogical and Technological University of Colombia, Sogamoso 152210, Colombia
2 School of Systems and Computing Engineering, Pedagogical and Technological University of Colombia, Sogamoso 152210, Colombia
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(13), 3396; https://doi.org/10.3390/rs15133396
Submission received: 30 April 2023 / Revised: 4 June 2023 / Accepted: 14 June 2023 / Published: 4 July 2023

Abstract:
In surveillance and monitoring systems, the use of mobile vehicles or unmanned aerial vehicles (UAVs), like the drone type, provides advantages in terms of access to the environment with enhanced range, maneuverability, and safety due to the ability to move omnidirectionally to explore, identify, and perform some security tasks. These activities must be performed autonomously by capturing data from the environment; usually, the data present errors and uncertainties that impact the recognition and resolution in the detection and identification of objects. The resolution in the acquisition of data can be improved by integrating data sensor fusion systems that measure the same physical phenomenon from two or more sensors by retrieving information simultaneously. This paper uses the constant turn rate and velocity (CTRV) kinematic model of a drone, but includes the angular velocity not considered in previous works, as a complementary alternative in LiDAR and Radar data sensor fusion retrieved using UAVs, applying the extended Kalman filter (EKF) for the detection of moving targets. The performance of the EKF is evaluated by using a dataset that jointly includes position data captured from a LiDAR and a Radar sensor for a moving object following a trajectory with sudden changes. Additive white Gaussian noise is then introduced to degrade the data. Then, the root mean square error (RMSE) versus the increase in noise power is evaluated, and the results show an improvement of 0.4 for object detection over other conventional kinematic models that do not consider significant trajectory changes.

1. Introduction

In surveillance and monitoring systems, the use of unmanned aerial vehicles (UAVs), such as drones, or mobile vehicles provides advantages in terms of access to the environment for exploration, such as extended range, maneuverability, and safety, due to their omnidirectional displacement capacity. These tasks must be performed autonomously by capturing information from sensors in the environment at scheduled or random points, at specific times, and over specific areas. The collected data present errors and uncertainties that make object recognition difficult, and detection and identification depend on the resolution of the sensors. Data acquisition resolution can be improved by integrating sensor data fusion systems that measure the same physical phenomenon by capturing information from two or more sensors simultaneously and applying filtering or pattern recognition techniques to obtain better results than those obtained with a single sensor.
Sensor data fusion comprises different techniques inspired by the human cognitive ability to extract information from the environment by integrating different stimuli. In sensor fusion, measurement variables are integrated through a set of sensors, often different from each other, to make inferences that are not possible from a single sensor [1].
The fusion of Radar (Radio Detecting and Ranging) and LiDAR (Light Detection and Ranging or Laser Imaging Detection and Ranging) sensor data presents a better response considering two key aspects: (i) the use of two coherent systems that allow an accurate phase capture and (ii) the improvement in the extraction of data from the environment with the combination of two or more sensors arranged on the mobile vehicle or UAV [1,2]. This integration allows the error in the detection of objects in a juxtaposition relationship to be decreased by determining the distances through the reflection of radio frequency signals in the Radar case and through the reflection of a light beam (photons) in the LiDAR case, generating a double observer facing the same event, in this case the measurement of proximity and/or angular velocity [3,4]. Thus, the choice of Radar and LiDAR sensors requires special care, mainly regarding technical characteristics and compatibility [5,6], coherence in range, and data acquisition. This allows a complementary performance to be achieved with the associated element in the data fusion, facilitating a better understanding of the three-dimensional environment that feeds the data processing system integrated into the UAV [7] or at a remote site.
A proper sensor fusion of LiDAR and Radar data must rely on the use of estimators to achieve higher consistency in the measurements and to mitigate the uncertainties by using three elements: Radar measurements, LiDAR measurements, and Kalman filtering. This improves the estimation of the measured variable. The Kalman filtering technique allows the real world to be described by linear differential equations expressed as a function of state variables. In most real-world problems, however, the measurements may not be a linear function of the states of the system. Applying extended Kalman filtering (EKF) techniques counteracts this situation by modeling the phenomenon with a set of nonlinear differential equations in the state vector X_k that describe the dynamics of the system. The EKF allows the behavior of the system to be "projected" in time, with variables that are not measurable but are calculable from the measurable variables. Then, by predicting the future data and their deviation with respect to the measured data, the Kalman gain K_k is calculated, and it continuously adapts to the dynamics of the system. Finally, the state vector x̄_k and the covariance matrix P_k associated with the filtered system are updated. This process is graphically described in Figure 1.
In this work, sensor data fusion was performed for target tracking from a UAV, using an EKF and taking into consideration the results of data fusions performed in autonomous driving. The Constant Turn Rate and Velocity (CTRV) kinematic model [8] was taken as a reference; this model includes in its description the angular velocity variable, provided by the Radar, a parameter that improves omnidirectional motion detection.
This paper shows the performance of an implementation of LiDAR and Radar data sensor fusion through an EKF for the tracking of moving targets, taking into account changes in their direction and trajectory, to generate a three-dimensional reconstruction when the information is captured from a UAV.

2. Dynamic Model of UAV

The UAV dynamics were obtained from the 2D CTRV model [8] used for vehicle and pedestrian detection on highways. It is assumed that the possible movements of the elements around the UAV are not completely arbitrary and are non-holonomic, in which case the displacements occur in a bi-dimensional plane. The curvilinear CTRV model includes angular velocities and angular movements in its formulation, which allows a better description of the changes in the direction and velocity of an object than a linear model. The CTRV model is shown in Figure 2.
The velocity variable provides the system model with the ability to calculate the target's lateral position variations for a correct prediction of the future position of the target, starting from initial positions x and y and projecting this location over time, defined as x + Δx and y + Δy for the target, as shown in Figure 3.
The CTRV model for the moving target of the UAV system in the three-dimensional case determines the projection of the position of the target x_{i+1} on the axis, starting from the values of the angular frequency w and the angle θ [9,10,11] for x_i, and equally for y_i and its position projection. Therefore, the variables of interest in the system are the positions x and y; these are calculated by modeling their projection through the frontal velocity v, the angle θ formed between the Radar and the target, the angular frequency w of the target, and, finally, the angular frequency w_d of the UAV. The set of state variables involved in the system is the following:
$$\bar{x} = [x,\ y,\ v,\ \theta,\ w,\ w_d]$$
The kinematic equations describing the change from an initial position of the UAV to a future position are as follows:
$$\bar{x}_{i+1} = x_i + \left[v_{\mathrm{object}} - v_{\mathrm{drone}}\right]\Delta T$$
$$\bar{y}_{i+1} = y_i + \left[v_{\mathrm{object}} - v_{\mathrm{drone}}\right]\Delta T$$
The state variables are the frontal velocity, the theta angle, the target angular velocity, and the angular velocity of the UAV.
$$\dot{v} = 0, \qquad \dot{\theta} = w, \qquad \dot{w} = 0, \qquad \dot{w}_d = 0$$
Because the data sensor fusion operation is bi-dimensional, the CTRV model does not include motion along the z-axis in its state variables. To maintain a bi-dimensional analysis, the UAV velocity [10] is projected as
$$v_x = V\cos\phi$$
$$v_y = V\sin\phi$$
Here, ϕ represents the elevation of the UAV with respect to the sensed target; this angle allows the velocities of the drone to be projected in the xz plane and the x + Δx or y + Δy to be determined, as shown in Figure 3, concerning the position prediction. To simplify the model and to maintain congruence between the LiDAR and Radar models in the sensor data fusion, a data acquisition method is proposed in which the UAV only uses pitch (rotation about the lateral Y axis) and yaw (rotation about the vertical Z axis) movements and their projection in a three-dimensional coordinate system. These motions are included in the CTRV model through the projection of the UAV velocity v_d through the angles ϕ and θ, as shown below.
$$v_d = \begin{bmatrix} V_{dx} \\ V_{dy} \\ V_{dz} \\ W_{dz} \end{bmatrix} = \begin{bmatrix} V_0\cos\phi\cos\theta \\ V_0\cos\phi\sin\theta \\ V_0\sin\phi \\ 0 \end{bmatrix}$$
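This projection can be written as a short routine. The following is a minimal sketch in Python/NumPy (not the authors' Matlab implementation; the function name and argument names are illustrative):

import numpy as np

def uav_velocity_projection(v0, phi, theta):
    """Project the UAV speed v0 through the elevation (pitch) angle phi
    and the azimuth (yaw) angle theta, as in the v_d vector above."""
    return np.array([
        v0 * np.cos(phi) * np.cos(theta),   # V_dx
        v0 * np.cos(phi) * np.sin(theta),   # V_dy
        v0 * np.sin(phi),                   # V_dz
        0.0,                                # W_dz
    ])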
The difference between the estimated position and the actual position of the target is determined by the displacement generated by w and θ, i.e., (ΔT·w + θ) [8], so the space and velocity projections are also a function of these variations. The velocity equations are obtained from x_i and y_i, which correspond to the first derivative, such that v_{dx} and v_{dy} are expressed as
$$\dot{x}_{i+1} = \frac{v}{w}\left[\sin(\Delta T\,w + \theta) - \sin\theta\right] - v_{dx}\cos\theta\cos\phi$$
$$\dot{y}_{i+1} = \frac{v}{w}\left[\cos\theta - \cos(\Delta T\,w + \theta)\right] - v_{dy}\cos\theta\sin\phi$$
$$\dot{v} = a, \qquad \dot{\theta} = w, \qquad \dot{w} = 0, \qquad \dot{w}_d = 0$$
When the target has an initial angular velocity w = 0, the expressions change to [8]
$$\bar{x}_{i+1} = x_i + v\cos\theta\,\Delta T - v_{dx}\cos\theta\cos\phi\,\Delta T$$
$$\bar{y}_{i+1} = y_i + v\sin\theta\,\Delta T - v_{dy}\cos\theta\sin\phi\,\Delta T$$
$$\dot{v} = 0, \qquad \dot{\theta} = 0, \qquad \dot{w} = 0, \qquad \dot{w}_d = 0$$
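A sketch of this prediction step for the six-state vector [x, y, v, θ, w, w_d] in Python/NumPy follows, switching between the w ≠ 0 and w = 0 forms above. The names are placeholders, and multiplying the drone-velocity terms by ΔT in the turning branch is an assumption made here for consistency with the w = 0 case:

import numpy as np

def ctrv_predict(state, v_dx, v_dy, phi, dt, eps=1e-4):
    """Predict the next state of the CTRV model with drone motion.

    state = [x, y, v, theta, w, w_d]; v_dx, v_dy are the projected
    drone velocities and phi its elevation angle.
    """
    x, y, v, theta, w, w_d = state
    if abs(w) > eps:
        # Turning target (w != 0)
        x_new = x + (v / w) * (np.sin(w * dt + theta) - np.sin(theta)) \
                - v_dx * np.cos(theta) * np.cos(phi) * dt
        y_new = y + (v / w) * (np.cos(theta) - np.cos(w * dt + theta)) \
                - v_dy * np.cos(theta) * np.sin(phi) * dt
        theta_new = theta + w * dt
    else:
        # Straight-line target (w = 0)
        x_new = x + v * np.cos(theta) * dt - v_dx * np.cos(theta) * np.cos(phi) * dt
        y_new = y + v * np.sin(theta) * dt - v_dy * np.cos(theta) * np.sin(phi) * dt
        theta_new = theta
    # v, w, and w_d are assumed constant over the step
    return np.array([x_new, y_new, v, theta_new, w, w_d])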
The EKF performs the filtering in a bi-dimensional plane formed by the intersection of the ranges of the LiDAR and Radar sensors. To achieve a three-dimensional reconstruction of the sensed target, a rotation is performed about the x-axis of the sensors, using cylindrical coordinates as the orientation reference. Figure 4 shows the dynamics between the UAV, which generates a three-dimensional reconstruction from the bi-dimensional data gathered by the sensors, and the target in the XYZ plane.
For the data fusion design, the RPLIDAR S1 LiDAR sensor and the Position2Go BGT24MTR12 Radar were used as references. The LiDAR sensor operates in 2D with rotation capability, delivering data for a 360° scan, and the Radar achieves a range of 10 m. The range of the sensors, according to the implementation of data fusion in the UAV, is shown in Figure 5.
Now, to improve the estimation of the measured variable from the noisy sensors, the Kalman filter is implemented through sequential steps: (i) the estimation or prediction of the system behavior from the nonlinear equations; (ii) the calculation of the Kalman gain to reduce the error of the prediction of the current state versus the previous state; and (iii) the update of the measurement matrix, as well as the covariance associated with the uncertainty of the system. The representation from the selected state variables corresponds to the following equation:
$$\dot{x} = f(x) + w$$
where ẋ is the derivative of the vector of system states and f(x) is a nonlinear function of the states. This state–space model of the system allows the future states to be determined, and the output is obtained by filtering the input signal. The Kalman filter performs estimations and corrections iteratively, where the possible errors of the system are reflected in the covariance values between the measured values and the values estimated by the filter. The forward projection of the covariance error has the following representation:
$$M_k = \Phi_k P_k \Phi_k^{T} + Q_k$$
The system update is implemented according to the following equations:
$$K_k = M_k H^{T}\left(H M_k H^{T} + R_k\right)^{-1}$$
$$P_k = \left(I - K_k H\right) M_k$$
$$\bar{z}_k = H\bar{x} + V_k$$
$$\bar{x}_k = \Phi_k \bar{x}_{k-1} + K_k\left(z_k - H\Phi_k\bar{x}_{k-1}\right)$$
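These projection and update equations map directly to code. The following is a minimal Python/NumPy sketch of one filter cycle (illustrative only; the paper's implementation was carried out in Matlab, and the function and variable names are assumptions):

import numpy as np

def ekf_update(x_prev, P_prev, z, Phi, H, Q, R):
    """One filter cycle following the covariance projection and update
    equations above (M_k, K_k, P_k, x_k)."""
    # Project the error covariance forward: M_k = Phi P Phi^T + Q
    M = Phi @ P_prev @ Phi.T + Q
    # Kalman gain: K_k = M H^T (H M H^T + R)^-1
    K = M @ H.T @ np.linalg.inv(H @ M @ H.T + R)
    # Updated covariance: P_k = (I - K H) M
    P = (np.eye(len(x_prev)) - K @ H) @ M
    # State update with the measurement residual
    x_pred = Phi @ x_prev   # the nonlinear prediction f(x) from Section 2 can be used instead
    x = x_pred + K @ (z - H @ x_pred)
    return x, P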
Based on these state variables, the fundamental matrix for the extended Kalman filter is calculated and must satisfy the condition
$$F = \left.\frac{\partial f(x)}{\partial x}\right|_{x = \hat{x}}$$
$$\Phi_k = I + F\,\Delta T$$
Making α = ΔT·w + θ, β = −sinθ + sinα, and χ = −cosθ + cosα in Equation (18), the fundamental matrix is
$$\Phi_k = \begin{bmatrix}
1 & 0 & \frac{\beta}{w} & \frac{v}{w}\chi & \Delta T\,\frac{v}{w}\cos\alpha - \frac{v}{w^{2}}\beta & v_d\sin\theta\cos\phi \\
0 & 1 & -\frac{\chi}{w} & \frac{v}{w}\beta & \Delta T\,\frac{v}{w}\sin\alpha + \frac{v}{w^{2}}\chi & v_d\cos\theta\sin\phi \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & \Delta T & 0 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}$$
When the angular velocity of the target is zero, the fundamental matrix reduces to the following matrix:
$$\Phi_k = \begin{bmatrix}
1 & 0 & \cos\theta\,\Delta T & -v\sin\theta\,\Delta T & 0 & v_d\sin\theta\cos\phi \\
0 & 1 & \sin\theta\,\Delta T & v\cos\theta\,\Delta T & 0 & v_d\cos\theta\sin\phi \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}$$
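A sketch assembling the fundamental matrix numerically for both cases is given below (Python/NumPy; the signs of the v_d entries and of the −χ/w term follow the matrices as reconstructed above and are assumptions to be checked against the original derivation):

import numpy as np

def fundamental_matrix(state, v_d, phi, dt, eps=1e-4):
    """Build the 6x6 fundamental matrix Phi_k (Jacobian of the CTRV
    prediction), mirroring the two matrices above."""
    x, y, v, theta, w, w_d = state
    Phi = np.eye(6)
    if abs(w) > eps:
        alpha = dt * w + theta
        beta = np.sin(alpha) - np.sin(theta)
        chi = np.cos(alpha) - np.cos(theta)
        Phi[0, 2] = beta / w
        Phi[0, 3] = (v / w) * chi
        Phi[0, 4] = dt * (v / w) * np.cos(alpha) - (v / w**2) * beta
        Phi[0, 5] = v_d * np.sin(theta) * np.cos(phi)
        Phi[1, 2] = -chi / w
        Phi[1, 3] = (v / w) * beta
        Phi[1, 4] = dt * (v / w) * np.sin(alpha) + (v / w**2) * chi
        Phi[1, 5] = v_d * np.cos(theta) * np.sin(phi)
        Phi[3, 4] = dt
    else:
        Phi[0, 2] = np.cos(theta) * dt
        Phi[0, 3] = -v * np.sin(theta) * dt
        Phi[0, 5] = v_d * np.sin(theta) * np.cos(phi)
        Phi[1, 2] = np.sin(theta) * dt
        Phi[1, 3] = v * np.cos(theta) * dt
        Phi[1, 5] = v_d * np.cos(theta) * np.sin(phi)
    return Phi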
The matrix associated with the system noise Q_k was calculated from the discrete output matrix G_k. The matrix must consider the output variables on which the Kalman filter can act. For the proposed model, the angular acceleration of the target has been taken into account, as well as the angular velocity and acceleration of the UAV. The output matrix G_k for the EKF is presented below.
$$G_k\,\mu = \int_{0}^{T_s} \Phi_k(\tau)\,G_d(\tau)\,d\tau$$
$$G_k\,\mu = \begin{bmatrix}
\frac{\Delta T^{2}}{2}\cos\theta & 0 & \cos\theta\cos\phi\,\Delta T \\
\frac{\Delta T^{2}}{2}\sin\theta & 0 & \cos\theta\sin\phi\,\Delta T \\
\Delta T & 0 & 0 \\
0 & 0 & 0 \\
0 & \Delta T & 0 \\
0 & 0 & \Delta T
\end{bmatrix}
\begin{bmatrix} \mu_a \\ \mu_w \\ \mu_{w_d} \end{bmatrix}$$
The noise matrix from the output matrix is calculated through the following expression:
$$Q_k = G_k\,E\!\left[\mu\,\mu^{T}\right] G_k^{T}$$
where
$$E\!\left[\mu\,\mu^{T}\right] = \begin{bmatrix} \sigma_a^{2} & 0 & 0 \\ 0 & \sigma_w^{2} & 0 \\ 0 & 0 & \sigma_{w_d}^{2} \end{bmatrix}$$
With γ = cosθ cosφ, ν = cosφ sinφ, η = sinθ sinφ, κ = sinθ cosθ, σ = cosθ sinφ, λ = cosφ sinφ, the noise matrix is defined as
$$Q_k = \begin{bmatrix}
\left(\frac{\Delta T^{2}}{2}\sigma_a\cos\theta\right)^{2} + \left(\Delta T\,\sigma_{w_d}\gamma\right)^{2} &
\frac{\Delta T^{4}}{4}\sigma_a^{2}\kappa + \left(\Delta T\,\sigma_{w_d}\cos\theta\right)^{2}\lambda &
\frac{\Delta T^{3}}{4}\sigma_a^{2}\kappa & 0 & 0 & \left(\Delta T\,\sigma_{w_d}\right)^{2}\gamma \\
\left(\frac{\Delta T^{2}}{2}\sigma_a\sin\theta\right)^{2} + \left(\Delta T\,\sigma_{w_d}\cos\theta\right)^{2}\nu &
\frac{\Delta T^{4}}{4}\sigma_a^{2}\kappa + \left(\Delta T\,\sigma_{w_d}\sigma\right)^{2} &
\frac{\Delta T^{3}}{2}\sigma_a^{2}\sin\theta & 0 & 0 & \left(\Delta T\,\sigma_{w_d}\right)^{2}\eta \\
\frac{\Delta T^{3}}{2}\sigma_a^{2}\cos\theta & \frac{\Delta T^{3}}{2}\sigma_a^{2}\sin\theta & \Delta T^{2}\sigma_a^{2} & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & \frac{\Delta T^{2}}{2}\sigma_{\dot{w}} & 0 \\
\left(\Delta T\,\sigma_{w_d}\right)^{2}\gamma & \left(\Delta T\,\sigma_{w_d}\right)^{2}\cos\theta\sin\phi & 0 & 0 & 0 & \left(\Delta T\,\sigma_{w_d}\right)^{2}
\end{bmatrix}$$
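Rather than hard-coding the closed form above, Q_k can be assembled numerically from the output matrix G_k and E[μμᵀ] of the two previous equations. A minimal sketch follows (Python/NumPy; the noise standard deviations are input parameters, not values taken from the paper):

import numpy as np

def process_noise(theta, phi, dt, sigma_a, sigma_w, sigma_wd):
    """Assemble Q_k = G E[mu mu^T] G^T from the discrete output matrix G_k."""
    G = np.array([
        [dt**2 / 2 * np.cos(theta), 0.0, np.cos(theta) * np.cos(phi) * dt],
        [dt**2 / 2 * np.sin(theta), 0.0, np.cos(theta) * np.sin(phi) * dt],
        [dt,                        0.0, 0.0],
        [0.0,                       0.0, 0.0],
        [0.0,                       dt,  0.0],
        [0.0,                       0.0, dt],
    ])
    E = np.diag([sigma_a**2, sigma_w**2, sigma_wd**2])
    return G @ E @ G.T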
Regarding the variables obtained from the sensors, it should be taken into account that the LiDAR and Radar sensors provide the measurements in different formats. In the LiDAR case, position data are retrieved in rectangular coordinates x and y, which correspond to the first two variables of the state vector. Since x_k has six state variables, the measurement matrix for LiDAR data processing should operate only on the x and y variables, making the product between the state vector x_k and H conformable, i.e.,
$$H_L = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \end{bmatrix}$$
For Radar, the measurement matrix changes to
$$H_R = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \end{bmatrix}$$
The measurement error covariance matrix of the LiDAR sensor, obtained from the statistical analysis of this sensor's dataset, is as follows:
$$R_{kL} = \begin{bmatrix} 0.0222 & 0 \\ 0 & 0.0222 \end{bmatrix}$$
R_kL is obtained from the variance in the LiDAR dataset. Likewise, the measurement error covariance matrix obtained for the Radar sensor is
$$R_{kR} = \begin{bmatrix} 0.088 & 0 & 0 & 0 & 0 \\ 0 & 0.00088 & 0 & 0 & 0 \\ 0 & 0 & 0.088 & 0 & 0 \\ 0 & 0 & 0 & 0.0088 & 0 \\ 0 & 0 & 0 & 0 & 0.08 \end{bmatrix}$$
These covariance values are directly related to the resolution and reliability of both the LiDAR and the Radar sensors. In the LiDAR case, the uncertainty is present in its measurement of the target distance, measured and represented as x and y coordinates, while for the Radar, this uncertainty is found in the same measurement but represented as a vector distance of magnitude ρ and angle θ. Likewise, the covariance matrix for the Radar includes the estimated velocity of the target.
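A sketch of how the sensor-specific measurement model can be selected at run time, using the H and R matrices given above, is shown next (Python/NumPy; the constant and function names are illustrative):

import numpy as np

# LiDAR observes the first two states (x, y)
H_LIDAR = np.hstack([np.eye(2), np.zeros((2, 4))])
R_LIDAR = np.diag([0.0222, 0.0222])

# Radar observes the first five states, as in the matrices above
H_RADAR = np.hstack([np.eye(5), np.zeros((5, 1))])
R_RADAR = np.diag([0.088, 0.00088, 0.088, 0.0088, 0.08])

def measurement_model(sensor):
    """Return the (H, R) pair used in the update for a given sensor."""
    return (H_LIDAR, R_LIDAR) if sensor == "lidar" else (H_RADAR, R_RADAR)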

3. Results and Discussion

The EKF was implemented and evaluated numerically in Matlab®. To determine its performance, a dataset combining position measurements from a LiDAR and a Radar sensor for a pedestrian, together with the real positions of the pedestrian, was used, and with these results an estimation of the performance was obtained using the RMSE [12]. To evaluate the robustness of the model, the dataset was contaminated with different levels of additive white Gaussian noise (AWGN).
The system was initialized by predefining values for the state matrix, as shown in Figure 6, as well as the fundamental matrix, the system covariance matrix, and the noise matrix. Each new LiDAR or Radar sensor input triggers the filtering, starting by determining the time step ΔT with respect to the previous measurement. Next, the state matrix is estimated using the set of Equations (8), or (9) when w = 0; the fundamental matrix according to Equation (18), or (19) if w = 0; the noise matrix given by (24); and the system covariance matrix as given by Equation (11). Next, the measurement and uncertainty matrices are configured: (25) and (27) for the LiDAR case and (26) and (28) for the Radar case. The Kalman gain given by Equation (13) is determined, and, finally, the measurement matrix given by (15), as well as the system and state covariance matrices, is updated.
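A hypothetical driver loop tying the previous sketches together, following the sequence just described (initialization, ΔT computation, prediction, and sensor-specific update), is shown below. It reuses the ctrv_predict, fundamental_matrix, process_noise, and measurement_model sketches, and the drone-velocity inputs and noise settings are placeholders rather than the authors' Matlab configuration:

import numpy as np

def run_fusion(measurements, x0, P0, v_d, v_dx, v_dy, phi,
               sigma_a=0.2, sigma_w=0.2, sigma_wd=0.2):
    """Process a time-ordered stream of (timestamp, sensor, z) tuples."""
    x, P = np.asarray(x0, float), np.asarray(P0, float)
    t_prev = measurements[0][0]
    track = []
    for t, sensor, z in measurements:
        dt = t - t_prev
        t_prev = t
        # Prediction with the CTRV model, its Jacobian, and the process noise
        Phi = fundamental_matrix(x, v_d, phi, dt)
        Q = process_noise(x[3], phi, dt, sigma_a, sigma_w, sigma_wd)
        x = ctrv_predict(x, v_dx, v_dy, phi, dt)
        P = Phi @ P @ Phi.T + Q
        # Measurement update with the sensor-specific H and R
        H, R = measurement_model(sensor)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.asarray(z, float) - H @ x)
        P = (np.eye(6) - K @ H) @ P
        track.append(x.copy())
    return np.array(track)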
The CTRV model proposed in [8] in the context of autonomous driving was designed taking as a reference highways and locations commonly used by automotive vehicles. In these scenarios, the tangential velocity changes relative to the sensors are smaller than in the same scenario measured from a UAV. This behavior is accentuated when it is necessary to perform a three-dimensional reconstruction of the moving target. The CTRV model developed in this work includes the angular velocity of the drone, modifying the fundamental matrix of the system as well as the associated noise matrix, and a favorable response of the filter to the newly established changes was observed.
To evaluate the response of the x and y position variables to measurements contaminated with noise, the equations of the CTRV model were implemented in Matlab®, and a sweep of the position variables contaminated with AWGN was performed. The response of the x and y state variables of the EKF to these contaminated measurements is shown in Figure 7.
The response of the EKF to significant changes in the angular velocity of the target, represented as a change in the direction of the trajectory on the x-axis, is presented below. The EKF succeeds in predicting the target (green band), even at the point of greatest deflection. A zoomed view of the filter's response to the change in trajectory is shown in Figure 8. The EKF was tested with the help of a dataset that provides 1225 positions and angles from simultaneous Radar and LiDAR measurements, where the Radar sensor provides the distance along with the angle of displacement with respect to the horizontal of the Radar, as well as the angular velocity detected by the Radar; the LiDAR sensor gives the position in x and y coordinates. This dataset was contaminated with AWGN by progressively increasing the noise power and testing the EKF.
The representation of the real data versus the measured data from the Radar and LiDAR sensors is visualized in Figure 9.
To draw a comparison with other fusion models using Radar data, the root mean square error (RMSE) was computed, and it was possible to validate the effectiveness of the proposed CTRV model against previously used sensor data fusions. The root mean square error is reduced from 0.21 to 0.163 with respect to the linear model; however, in contrast to the state of the art [13], the unscented Kalman filter (UKF) maintains a better response compared to the EKF based on the CTRV model. The response of the EKF to AWGN variations in the input data is presented below.
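The noise contamination and the error metric used in this evaluation can be expressed compactly. The following is a sketch of AWGN injection at a prescribed signal-to-noise ratio and of the RMSE computation (Python/NumPy; the SNR parameterization is an assumption, since the paper specifies increasing noise power levels):

import numpy as np

def add_awgn(signal, snr_db, rng=None):
    """Contaminate a signal with additive white Gaussian noise at a given SNR in dB."""
    rng = np.random.default_rng() if rng is None else rng
    noise_power = np.mean(signal ** 2) / (10.0 ** (snr_db / 10.0))
    return signal + rng.normal(0.0, np.sqrt(noise_power), signal.shape)

def rmse(estimate, ground_truth):
    """Root mean square error between the filtered positions and the real track."""
    return np.sqrt(np.mean((np.asarray(estimate) - np.asarray(ground_truth)) ** 2, axis=0))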
Figure 10 shows that the EKF acts by reducing the difference between the real values (red signal) and the values contaminated with Gaussian noise (blue signal). In [13], the authors state that the RMSE response can be improved with the unscented Kalman filter; however, it should be noted that this filter implies a higher computational complexity than the EKF. On the other hand, the response of the KF with a linear model and increasing AWGN is shown in Figure 11.

4. Conclusions

An architecture for LiDAR and Radar sensor data fusion through the extended Kalman filter was implemented based on the CTRV model and the angular velocity projection of a UAV (a parameter not identified in related previous research). The robustness against trajectory changes of a moving target was demonstrated, determined by the angular velocity and angle of the target with respect to the UAV provided by the LiDAR and Radar sensors. The evaluation of this model on the dataset allowed accurate tracking of the target in the face of position changes. In the CTRV modeling, the Radar angle and the angular velocity of the drone, when working together, ensure a better response of the EKF. In the review of the state of the art, no references were found that include the angular velocity of the drone. The projection of the UAV angular velocity on an xz-plane allows a bi-dimensional analysis to be performed, as well as the modeling of the drone–moving-target system, without negatively affecting the EKF response.
The CTRV model proposed in this article for the drone–moving-target system was validated by numerical analysis using real data captured from LiDAR and Radar sensors. In future work, when the system including the proposed kinematic model is implemented in a UAV and a performance validation of the data sensor fusion using the EKF is required, the implementation of the EKF algorithm should be evaluated on a Field-Programmable Gate Array (FPGA) or System-on-Chip module due to their parallel processing capacity.

Author Contributions

Conceptualization, O.J.M., E.A.F. and M.J.S.; methodology, O.J.M.; software, O.J.M. and E.A.F.; validation, O.J.M., E.A.F. and M.J.S.; formal analysis, O.J.M.; investigation, O.J.M. and E.A.F.; resources O.J.M.; writing—original draft preparation, O.J.M., E.A.F. and M.J.S.; writing—review and editing, O.J.M. and E.A.F.; visualization, E.A.F. and M.J.S.; supervision, E.A.F.; project administration, E.A.F. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the UPTC SGI 3139 Clarifier Research Project, as a partner of an International Research Cooperation agreement under the NATO Science for Peace Program—SPS G5888.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

All authors would like to sincerely thank the reviewers and editors for their suggestions and opinions for improving this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, S.; Har, D.; Kum, D. Drone-Assisted Disaster Management: Finding Victims via Infrared Camera and Lidar Sensor Fusion. In Proceedings of the 2016 3rd Asia-Pacific World Congress on Computer Science and Engineering (APWC on CSE), Nadi, Fiji, 5–6 December 2016; pp. 84–89.
  2. Ki, M.; Cha, J.; Lyu, H. Detect and avoid system based on multi-sensor fusion for UAV. In Proceedings of the 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, Korea, 17–19 October 2018; pp. 1107–1109.
  3. Steinbaeck, J.; Steger, C.; Holweg, G.; Druml, N. Design of a low-level radar and time-of-flight sensor fusion framework. In Proceedings of the 2018 21st Euromicro Conference on Digital System Design (DSD), Prague, Czech Republic, 29–31 August 2018; pp. 268–275.
  4. De Silva, V.; Roche, J.; Kondoz, A. Fusion of LiDAR and Camera Sensor Data for Environment Sensing in Driverless Vehicles. arXiv 2017. Available online: http://arxiv.org/abs/1710.06230 (accessed on 21 January 2023).
  5. Na, K.; Byun, J.; Roh, M.; Seo, B. Fusion of multiple 2D LiDAR and RADAR for object detection and tracking in all directions. In Proceedings of the 2014 International Conference on Connected Vehicles and Expo (ICCVE), Vienna, Austria, 3–7 November 2014; pp. 1058–1059.
  6. Kwon, S.K.; Son, S.H.; Hyun, E.; Lee, J.H.; Lee, J. Radar-Lidar Sensor Fusion Scheme Using Occluded Depth Generation for Pedestrian Detection. In Proceedings of the 2017 International Conference on Computational Science and Computational Intelligence (CSCI), Las Vegas, NV, USA, 14–16 December 2017; pp. 1811–1812.
  7. Wittmann, D.; Chucholowski, F.; Lienkamp, M. Improving lidar data evaluation for object detection and tracking using a priori knowledge and sensor fusion. In Proceedings of the 2014 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Vienna, Austria, 1–3 September 2014; Volume 1, pp. 794–801.
  8. Wang, Y.; Liu, D.; Matson, E. Accurate Perception for Autonomous Driving: Application of Kalman Filter for Sensor Fusion. In Proceedings of the 2020 IEEE Sensors Applications Symposium (SAS), Kuala Lumpur, Malaysia, 9–11 March 2020.
  9. Lee, H.; Chae, H.; Yi, K. A Geometric Model based 2D LiDAR/Radar Sensor Fusion for Tracking Surrounding Vehicles. IFAC-PapersOnLine 2019, 52, 277–282.
  10. Kim, B.; Yi, K.; Yoo, H.J.; Chong, H.J.; Ko, B. An IMM/EKF approach for enhanced multitarget state estimation for application to integrated risk management system. IEEE Trans. Veh. Technol. 2015, 64, 876–889.
  11. Han, J.; Kim, J.; Son, N.S. Persistent automatic tracking of multiple surface vessels by fusing radar and lidar. In Proceedings of the OCEANS 2017—Aberdeen, Aberdeen, UK, 19–22 June 2017; pp. 1–5.
  12. Farag, W. Kalman-filter-based sensor fusion applied to road-objects detection and tracking for autonomous vehicles. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2021, 235, 1125–1138.
  13. Tian, K.; Radovnikovich, M.; Cheok, K. Comparing EKF, UKF, and PF Performance for Autonomous Vehicle Multi-Sensor Fusion and Tracking in Highway Scenario. In Proceedings of the 2022 IEEE International Systems Conference (SysCon), Montreal, QC, Canada, 25–28 April 2022; pp. 1–6.
Figure 1. The schematic diagram for an extended Kalman filter.
Figure 2. CTRV model for a moving object.
Figure 3. Position prediction through the CTRV model.
Figure 4. Three-dimensional reconstruction with UAV.
Figure 5. LiDAR and Radar sensor range.
Figure 6. Schematic for EKF implementation.
Figure 7. Response of the EKF in the state variables (a) x and (b) y to measurements contaminated with AWGN. Source: authors.
Figure 8. EKF response to trajectory changes in the moving target.
Figure 9. The plot of x and y position values for a 16 dB AWGN-contaminated LiDAR and Radar fusion.
Figure 10. RMSE vs. AWGN in CTRV model.
Figure 11. RMSE vs. AWGN in the linear model.