Article

A Computationally Efficient Labeled Multi-Bernoulli Smoother for Multi-Target Tracking †

Rang Liu, Hongqi Fan, Tiancheng Li and Huaitie Xiao

1 National Key Laboratory of Science and Technology on ATR, College of Electronic Science, National University of Defense Technology, Changsha 410073, China
2 Key Laboratory of Information Fusion Technology (Ministry of Education), School of Automation, Northwestern Polytechnical University, Xi’an 710072, China
* Authors to whom correspondence should be addressed.
This paper is an extended and modified version of our conference paper “A Forward-Backward Labeled Multi-Bernoulli Smoother” published in Proceedings of the 16th International Conference on Distributed Computing and Artificial Intelligence, Avila, Spain, 26–28 June 2019.
Sensors 2019, 19(19), 4226; https://doi.org/10.3390/s19194226
Submission received: 8 September 2019 / Revised: 23 September 2019 / Accepted: 26 September 2019 / Published: 28 September 2019
(This article belongs to the Special Issue Sensors Localization in Indoor Wireless Networks)

Abstract

A forward–backward labeled multi-Bernoulli (LMB) smoother is proposed for multi-target tracking. The proposed smoother consists of two components, corresponding to forward LMB filtering and backward LMB smoothing, respectively. The former is the standard LMB filter, and the latter is proved to be closed under the LMB prior. It is also shown that the proposed LMB smoother improves both the cardinality estimation and the state estimation, and that its dominant computational complexity is linear in the number of targets. A Sequential Monte Carlo implementation in a representative scenario demonstrates the effectiveness and computational efficiency of the proposed smoother in comparison with existing approaches.

1. Introduction

Multi-target tracking (MTT) has been widely used in many engineering fields, including aerospace surveillance, biomedical analytics, autonomous driving, indoor localization, robotic networks and so on [1,2,3,4,5]. In these applications, both the number of targets and their states may vary over time, and the measurements are corrupted by clutter and missed detections [2,3]. Traditionally, approaches to MTT are built on explicit data association, typically joint probabilistic data association [1] or multiple hypothesis tracking [6,7]. An alternative approach, developed by Mahler based on random finite set (RFS) theory [2], models the multi-target state and the measurements explicitly as RFSs and has attracted substantial attention over the last decade [8]. A variety of RFS filters have been proposed, including the probability hypothesis density (PHD) filter [9,10], the cardinalized PHD (CPHD) filter [11], the Cardinality Balanced multi-Bernoulli (CBMeMBer) filter [12], the generalized labeled multi-Bernoulli (GLMB) filter [13,14] and the labeled multi-Bernoulli (LMB) filter [15]. Whereas filtering estimates the current state, smoothing [16,17,18,19] estimates a past target state using all measurements up to the current time; it achieves better accuracy than filtering but at a higher computational cost. It is therefore of great practical significance to develop an RFS smoother that is computationally efficient, reliable and accurate.
The Bernoulli smoother was first proposed in [20,21]; it performs better than the Bernoulli filter but handles at most one target. For the multi-target case, forward–backward PHD smoothers were proposed in [17,18,22,23,24]. It has been shown in [17] that the PHD smoother improves the accuracy of the position estimation compared with the PHD filter, but does not necessarily yield better cardinality estimation. The CPHD smoother proposed in [25] uses an approximation to overcome the intractability of the classic CPHD smoother [26], but has a complicated algorithm structure. A forward–backward multi-target multi-Bernoulli (MeMBer) smoother was proposed in [27], which likewise does not necessarily improve the cardinality estimation. However, these smoothers, like the corresponding filters, do not provide track-level information. Labeled RFS-based filters and smoothers [28,29,30,31,32] have therefore been developed to generate track estimates, which is also the focus of this paper.
The GLMB smoother proposed in [32] is the first exact closed-form solution to the labeled-RFS smoothing recursion, but it has not been implemented in practice because of its prohibitive data association burden. Recently, Chen pointed out the challenge of forming tracks, since the optimal solution is not as simple as labeling [33]. Relatedly, a multi-scan GLMB filter based on the smoothing-while-filtering framework was proposed in [34] for better labeling; however, its truncation requires solving an NP-hard multi-dimensional assignment problem. In short, even when practically implementable, these existing labeled smoothers have a high computational complexity. Also related to the smoothing-while-filtering framework [34] is joint smoothing and filtering [19,35,36], which so far has only been considered for a single target.
In this paper, we derive a forward–backward LMB smoother for multi-target tracking. Preliminary and limited results were published in [37]; this paper provides additional results, complete proofs, and additional experiments. The proposed smoother consists of two parts, the forward LMB filtering and the backward LMB smoothing. While the former applies the standard LMB filter [15], the key contribution of our work lies in the design of the backward smoothing algorithm. We prove that the proposed backward LMB smoothing is closed under the LMB prior for the standard multi-target system models, and that the backward smoothed density of each track has the same form as the Bernoulli backward smoothed density [20]. In contrast to approximate parametric/Gaussian (mixture) approaches [38], the Sequential Monte Carlo (SMC) method can represent arbitrary, non-Gaussian models [4]. Based on the SMC implementation, the proposed smoother reduces both the state error and the cardinality error compared with the PHD smoother [17], the MeMBer smoother [27] and the LMB filter [15], and has a lower computational complexity than the PHD smoother and the MeMBer smoother.
The rest of the paper is organized as follows. Basic definitions of labeled RFSs and the multi-target Bayes forward–backward smoother are briefly reviewed in Section 2. The proposed forward–backward LMB smoother is derived in Section 3 and its SMC implementation is given in Section 4. Simulation results are presented in Section 5, and we conclude in Section 6.

2. Background

2.1. Notation

Single-target states are denoted by lowercase letters, e.g., $x$ and $\mathbf{x}$, and multi-target states by uppercase letters, e.g., $X$ and $\mathbf{X}$, where $x$ and $X$ are unlabeled state representations and the bold $\mathbf{x}$ and $\mathbf{X}$ are labeled state representations. State spaces are denoted by blackboard-bold letters: a label space containing a countable number of distinct labels is denoted by $\mathbb{L}$, and the unlabeled single-target state space by $\mathbb{X}$. A labeled single-target state has the form $\mathbf{x} = (x, \ell)$, where $x \in \mathbb{X}$ and $\ell \in \mathbb{L}$. A labeled multi-target state has the form $\mathbf{X} = \{\mathbf{x}_1, \ldots, \mathbf{x}_i, \ldots, \mathbf{x}_{|\mathbf{X}|}\}$, where $\mathbf{x}_i$ denotes a labeled single-target state in $\mathbf{X}$, $|\cdot|$ denotes the cardinality (i.e., the number of targets) of a multi-target set, and $\mathbf{X} \subset \mathbb{X} \times \mathbb{L}$.

The projection $\mathcal{L}: \mathbb{X} \times \mathbb{L} \to \mathbb{L}$ is defined by $\mathcal{L}(\mathbf{x}) = \ell$ and $\mathcal{L}(\mathbf{X}) = \{\mathcal{L}(\mathbf{x}): \mathbf{x} \in \mathbf{X}\}$. The generalized Kronecker delta function and the inclusion function are defined as
$$\delta_Y(X) \triangleq \begin{cases} 1, & X = Y, \\ 0, & \text{otherwise}, \end{cases} \tag{1}$$
$$1_Y(X) \triangleq \begin{cases} 1, & X \subseteq Y, \\ 0, & \text{otherwise}. \end{cases} \tag{2}$$
Define $\Delta(\mathbf{X}) \triangleq \delta_{|\mathbf{X}|}(|\mathcal{L}(\mathbf{X})|)$. The inner product of $f(\mathbf{x})$ and $g(\mathbf{x})$ on the labeled single-target space is defined as
$$\langle f, g \rangle \triangleq \int f(\mathbf{x})\, g(\mathbf{x})\, d\mathbf{x} = \sum_{\ell \in \mathbb{L}} \int f(x, \ell)\, g(x, \ell)\, dx \tag{3}$$
The multi-target exponential of $h$ over $\mathbf{X}$ is defined as
$$h^{\mathbf{X}} \triangleq \begin{cases} 1, & \mathbf{X} = \emptyset, \\ \prod_{\mathbf{x} \in \mathbf{X}} h(\mathbf{x}), & \mathbf{X} \neq \emptyset. \end{cases} \tag{4}$$

2.2. GLMB and LMB RFS

The GLMB RFS $\mathbf{X}$ is a labeled RFS constructed from labeled multi-target states. The distribution of a GLMB RFS [13] is exactly closed under the prediction and update of the multi-target Bayes filter. The GLMB distribution is given by
$$\boldsymbol{\pi}(\mathbf{X}) = \Delta(\mathbf{X}) \sum_{c \in \mathbb{C}} \omega^{(c)}(\mathcal{L}(\mathbf{X}))\, \big[p^{(c)}\big]^{\mathbf{X}} \tag{5}$$
where $\mathbb{C}$ is a discrete index set, $p^{(c)}(\cdot, \ell)$ is the density of track $\ell$, $\omega^{(c)}(I)$ is the nonnegative weight of the hypothesis $(c, I)$, and $\sum_{I \subseteq \mathbb{L}} \sum_{c \in \mathbb{C}} \omega^{(c)}(I) = 1$, $\int p^{(c)}(x, \ell)\, dx = 1$.

The distribution of an LMB RFS with parameter set $\boldsymbol{\pi} = \{(r^{(\ell)}, p^{(\ell)})\}_{\ell \in \mathbb{L}}$ is given by [15]
$$\boldsymbol{\pi}(\mathbf{X}) = \Delta(\mathbf{X})\, \omega(\mathcal{L}(\mathbf{X}))\, p^{\mathbf{X}} \tag{6}$$
where
$$\omega(I) = \prod_{\ell' \in \mathbb{L}} \big(1 - r^{(\ell')}\big) \prod_{\ell \in I} \frac{1_{\mathbb{L}}(\ell)\, r^{(\ell)}}{1 - r^{(\ell)}} \tag{7}$$
$$p(x, \ell) = p^{(\ell)}(x) \tag{8}$$
in which $r^{(\ell)}$ denotes the existence probability of track $\ell$, $p^{(\ell)}(\cdot)$ denotes its density, and $\omega(I)$ denotes the weight of the hypothesis $I = \{\ell_1, \ldots, \ell_{|I|}\}$. Note that $\mathbb{L}$ is a discrete countable space, and the number of labels in $\mathbb{L}$ equals the number of Bernoulli components (with non-zero existence probability) in the LMB RFS.

An LMB RFS is a special case of a GLMB RFS, and both the densities and the existence probabilities of different tracks in an LMB RFS are uncorrelated. Two properties associated with an LMB RFS are used in the sequel. First, the cardinality distribution [2] is given by
$$p(n) = \prod_{\ell \in \mathbb{L}} \big(1 - r^{(\ell)}\big)\, \sigma_{|\mathbb{L}|, n}\!\left(\frac{r^{(\ell_1)}}{1 - r^{(\ell_1)}}, \cdots, \frac{r^{(\ell_{|\mathbb{L}|})}}{1 - r^{(\ell_{|\mathbb{L}|})}}\right) \tag{9}$$
where $\sigma_{\nu, n}(x_1, \cdots, x_\nu)$ is the elementary homogeneous symmetric function of degree $n$ in $\nu$ variables. The second property is given by Lemma 1 below.
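For concreteness, the cardinality distribution (9) coincides with the Poisson-binomial distribution of the existence probabilities and can be evaluated with a simple convolution recursion over the Bernoulli components. The following Python sketch illustrates this (the function name and the example probabilities are ours, not from the paper):

```python
import numpy as np

def lmb_cardinality_distribution(r):
    """Cardinality distribution (9) of an LMB RFS with existence
    probabilities r = [r_1, ..., r_M].  Equivalent to the Poisson-binomial
    PMF: p(n) = prod_l (1 - r_l) * sigma_{M,n}(r_l / (1 - r_l))."""
    p = np.array([1.0])                       # empty LMB: p(0) = 1
    for r_ell in r:
        # Convolving with [1 - r_ell, r_ell] adds one Bernoulli component,
        # which is exactly the recursion for elementary symmetric functions.
        p = np.convolve(p, [1.0 - r_ell, r_ell])
    return p                                   # p[n] = Pr(|X| = n), n = 0..M

# Example: three tracks with existence probabilities 0.9, 0.6 and 0.2.
print(lmb_cardinality_distribution([0.9, 0.6, 0.2]))   # sums to 1
```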
Lemma 1.
If $\mathbf{X}_a$ and $\mathbf{X}_b$ are both LMB RFSs with probability densities $\boldsymbol{\pi}_a(\mathbf{X}_a)$ and $\boldsymbol{\pi}_b(\mathbf{X}_b)$, respectively, where $\mathbf{X}_a \subset \mathbb{X} \times \mathbb{L}_a$, $\mathbf{X}_b \subset \mathbb{X} \times \mathbb{L}_b$ and $\mathbb{L}_a \cap \mathbb{L}_b = \emptyset$, then $\mathbf{X} = \mathbf{X}_a \cup \mathbf{X}_b$ is an LMB RFS with probability density $\boldsymbol{\pi}(\mathbf{X})$, and vice versa. The three densities are related by
$$\boldsymbol{\pi}(\mathbf{X}) = \boldsymbol{\pi}_a(\mathbf{X}_a)\, \boldsymbol{\pi}_b(\mathbf{X}_b) \tag{10}$$
The proof of Lemma 1 is given in Appendix A.

2.3. Multi-Target Bayes Forward–Backward Smoother

The recursion of the multi-target Bayes forward–backward smoother is shown in Figure 1; it consists of forward Bayes filtering and backward Bayes smoothing. The multi-target state at time $t$ is $\mathbf{X}_t$, where $\mathbf{X}_t \subset \mathbb{X} \times \mathbb{L}_{1:t}$ and $\mathbb{L}_{1:t}$ denotes the label space of the targets at $t$ (including those born prior to $t$). The measurement set at $t$ is $Z_t$. The prediction and update of the forward Bayes filtering are given by [2]
$$\boldsymbol{\pi}_{t|t-1}(\mathbf{X}_t) = \int f_{t|t-1}(\mathbf{X}_t | \mathbf{X}_{t-1})\, \boldsymbol{\pi}_{t-1|t-1}(\mathbf{X}_{t-1})\, \delta \mathbf{X}_{t-1} \tag{11}$$
$$\boldsymbol{\pi}_{t|t}(\mathbf{X}_t) = \frac{g_t(Z_t | \mathbf{X}_t)\, \boldsymbol{\pi}_{t|t-1}(\mathbf{X}_t)}{\int g_t(Z_t | \mathbf{X})\, \boldsymbol{\pi}_{t|t-1}(\mathbf{X})\, \delta \mathbf{X}} \tag{12}$$
where $\boldsymbol{\pi}_{t|t-1}$ denotes the predicted multi-target density from $t-1$ to $t$, $f_{t|t-1}$ the multi-target Markov transition density at $t$, $\boldsymbol{\pi}_{t|t}$ the multi-target posterior density at $t$, and $g_t$ the multi-target likelihood function at $t$. The integrals above are set integrals. If the backward smoothed density from $t$ to $k$ ($k \leq t$), initialized with $\boldsymbol{\pi}_{t|t}(\mathbf{Y})$, is denoted by $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$, then the backward Bayes smoothed density $\boldsymbol{\pi}_{k-1|t}(\mathbf{X})$ at $k-1$ can be written as [32]
$$\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) = \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \int \frac{f_{k|k-1}(\mathbf{Y} | \mathbf{X})\, \boldsymbol{\pi}_{k|t}(\mathbf{Y})}{\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})}\, \delta \mathbf{Y} \tag{13}$$

2.4. Multi-Target Motion and Measurement Models

Let $p_{s,t|t-1} \equiv p_{s,t|t-1}(x, \ell)$ denote the survival probability of target $(x, \ell)$ from $t-1$ to $t$, and let $f_{t|t-1}(x_+ | x, \ell)$ denote the single-target Markov transition density from $t-1$ to $t$. $\mathbf{X}$ denotes the multi-target state at $t-1$, and $\mathbf{Y}_-$ denotes the surviving multi-target state from $t-1$ to $t$, where $\mathbf{X} \subset \mathbb{X} \times \mathbb{L}_{1:t-1}$ and $\mathcal{L}(\mathbf{Y}_-) \subseteq \mathcal{L}(\mathbf{X})$. Considering the surviving targets only, the multi-target Markov transition density $f_{s,t|t-1}$ is [2,13]
$$f_{s,t|t-1}(\mathbf{Y}_- | \mathbf{X}) = \Delta(\mathbf{Y}_-)\, \Delta(\mathbf{X})\, \big[1 - p_{s,t|t-1}\big]^{\mathbf{X}}\, 1_{\mathcal{L}(\mathbf{X})}(\mathcal{L}(\mathbf{Y}_-)) \left[\frac{p_{s,t|t-1}\, f_{t|t-1}(y | x, \ell)}{1 - p_{s,t|t-1}}\right]^{\mathbf{Y}_-} \tag{14}$$
It is assumed that the newborn targets form an LMB RFS denoted by $\mathbf{Y}_+$, with $\mathbf{Y}_+ \subset \mathbb{X} \times \mathbb{L}_t$, where $\mathbb{L}_t$ denotes the label space of the newborn targets at $t$. $r_{B,t|t-1}^{(\ell)}$ denotes the birth probability of target $(x, \ell)$ at $t$ and $p_{B,t|t-1}(y, \ell)$ the corresponding density. The density $f_{B,t|t-1}$ of the newborn targets is [2,13]
$$f_{B,t|t-1}(\mathbf{Y}_+) = \Delta(\mathbf{Y}_+) \prod_{\ell' \in \mathbb{L}_t} \big(1 - r_{B,t|t-1}^{(\ell')}\big) \left[\frac{1_{\mathbb{L}_t}(\ell)\, r_{B,t|t-1}^{(\ell)}\, p_{B,t|t-1}(y, \ell)}{1 - r_{B,t|t-1}^{(\ell)}}\right]^{\mathbf{Y}_+} \tag{15}$$
The multi-target state at $t$ can be written as $\mathbf{Y} = \mathbf{Y}_+ \cup \mathbf{Y}_-$, where $\mathbf{Y}_+$ and $\mathbf{Y}_-$ are disjoint and independent. From Lemma 1, the joint multi-target Markov transition density $f_{t|t-1}$ is [2,13]
$$f_{t|t-1}(\mathbf{Y} | \mathbf{X}) = f_{B,t|t-1}(\mathbf{Y}_+)\, f_{s,t|t-1}(\mathbf{Y}_- | \mathbf{X}) \tag{16}$$
The multi-target likelihood function $g_t(Z_t | \mathbf{X})$ is given by [2]
$$g_t(Z_t | \mathbf{X}) = e^{-\langle \kappa, 1 \rangle}\, \kappa^{Z_t}\, \big[1 - p_{D,t}\big]^{\mathbf{X}} \sum_{\theta \in \Theta}\ \prod_{\ell \in \mathcal{L}(\mathbf{X}):\, \theta(\ell) > 0} \frac{p_{D,t}\, g_t(z_{\theta(\ell)} | x, \ell)}{\big(1 - p_{D,t}\big)\, \kappa(z_{\theta(\ell)})} \tag{17}$$
where $p_{D,t} \equiv p_{D,t}(x, \ell)$ denotes the detection probability of target $(x, \ell)$ at $t$, $g_t(z | x, \ell)$ denotes the likelihood that target $(x, \ell)$ generates measurement $z$ if detected, and $\kappa(\cdot)$ is the intensity function (i.e., the PHD) of the Poisson clutter. The association function $\theta: \mathcal{L}(\mathbf{X}) \to \{0, 1, \ldots, |Z_t|\}$ has the property that $\theta(i) = \theta(i') > 0$ implies $i = i'$. $\Theta$ denotes the set of all association functions and $\theta \in \Theta$. If $\theta(\ell) = 0$, target $(x, \ell)$ is missed in detection; if $\theta(\ell) > 0$, $z_{\theta(\ell)}$ denotes the measurement associated with target $(x, \ell)$. $Z_t \setminus \{z_{\theta(\ell)}: \ell \in \mathcal{L}(\mathbf{X}), \theta(\ell) > 0\}$ denotes the false alarms at time $t$.

3. LMB Smoother

In this section, we detail the proposed LMB smoother and discuss the cardinality estimation. The LMB smoother framework is depicted in Figure 2 and consists of forward LMB filtering and backward LMB smoothing. The forward LMB filtering used in our approach is the standard LMB filter [15]. At each time step, the backward LMB smoothing performs $L_d$-step backward smoothing recursions, where $L_d$ denotes the fixed lag.

3.1. Forward LMB Filtering

3.1.1. Prediction

Given that the multi-target prior at time $t-1$ is an LMB (distribution) parameterized as $\boldsymbol{\pi}_{t-1|t-1} = \{(r_{t-1|t-1}^{(\ell)}, p_{t-1|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t-1}}$ and the density of the newborn targets is an LMB parameterized as $\boldsymbol{\pi}_{B,t} = \{(r_{B,t}^{(\ell)}, p_{B,t}^{(\ell)})\}_{\ell \in \mathbb{L}_t}$, the predicted multi-target density at $t$ remains an LMB, given by [12,15]
$$\boldsymbol{\pi}_{t|t-1} = \{(r_{S,t|t-1}^{(\ell)}, p_{S,t|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t-1}} \cup \{(r_{B,t}^{(\ell)}, p_{B,t}^{(\ell)})\}_{\ell \in \mathbb{L}_t} = \{(r_{t|t-1}^{(\ell)}, p_{t|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t}} \tag{18}$$
where
$$\eta_{S,t|t-1}^{(\ell)} = \big\langle p_{s,t|t-1}(\cdot, \ell),\, p_{t-1|t-1}(\cdot, \ell) \big\rangle \tag{19}$$
$$r_{S,t|t-1}^{(\ell)} = \eta_{S,t|t-1}^{(\ell)}\, r_{t-1|t-1}^{(\ell)} \tag{20}$$
$$p_{S,t|t-1}(x_+, \ell) = \frac{\big\langle p_{s,t|t-1}(\cdot, \ell)\, f_{t|t-1}(x_+ | \cdot, \ell),\, p_{t-1|t-1}(\cdot, \ell) \big\rangle}{\eta_{S,t|t-1}^{(\ell)}} \tag{21}$$
$$p_{t|t-1}(x_+, \ell) = 1_{\mathbb{L}_{1:t-1}}(\ell)\, p_{S,t|t-1}(x_+, \ell) + 1_{\mathbb{L}_t}(\ell)\, p_{B,t}(x_+, \ell) \tag{22}$$
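As a structural sketch of (18) under our own naming conventions (an LMB is represented here as a Python dict mapping labels to track parameters, and `predict_bernoulli` stands for any single-track realization of (19)–(21); neither is defined in the paper), the prediction step is simply the union of the survival-predicted parameter set and the birth parameter set:

```python
def lmb_predict(lmb_prior, lmb_birth, predict_bernoulli):
    """LMB prediction (18): survival-predicted tracks plus birth tracks.

    lmb_prior, lmb_birth : dict mapping label -> (r, density)
    predict_bernoulli    : callable implementing (19)-(21) for one track,
                           returning the predicted (r_S, density_S).
    """
    predicted = {}
    for label, (r, density) in lmb_prior.items():
        predicted[label] = predict_bernoulli(r, density)   # (19)-(21)
    # Birth labels are disjoint from the surviving labels, so by Lemma 1
    # the union of the two parameter sets is again an LMB, cf. (18).
    predicted.update(lmb_birth)
    return predicted
```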

3.1.2. Update

Assume that the predicted multi-target density at $t$ is an LMB with parameter set $\boldsymbol{\pi}_{t|t-1} = \{(r_{t|t-1}^{(\ell)}, p_{t|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t}}$, i.e.,
$$\boldsymbol{\pi}_{t|t-1}(\mathbf{X}) = \Delta(\mathbf{X})\, \omega_{t|t-1}(\mathcal{L}(\mathbf{X}))\, \big[p_{t|t-1}\big]^{\mathbf{X}} \tag{23}$$
Under the likelihood function (17), the updated multi-target posterior density is a GLMB (strictly speaking, a $\delta$-GLMB, which is a special case of a GLMB [13]) given by [13]
$$\boldsymbol{\pi}_{t|t}(\mathbf{X}) = \Delta(\mathbf{X}) \sum_{(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}} \omega_{t|t}^{(I_+, \theta)}(Z)\, \delta_{I_+}(\mathcal{L}(\mathbf{X}))\, \big[p_{t|t}^{(\theta)}\big]^{\mathbf{X}} \tag{24}$$
where $\mathcal{F}(\mathbb{L}_{1:t})$ denotes the class of finite subsets of $\mathbb{L}_{1:t}$, $I_+ = \{\ell_1, \ldots, \ell_{|I_+|}\}$ denotes the label set of a hypothesis, $\Theta_{I_+}$ is the set of association functions $\theta: I_+ \to \{0, 1, \ldots, |Z|\}$, and
$$\omega_{t|t}^{(I_+, \theta)} \propto \omega_{t|t-1}(I_+)\, \big[\eta_Z^{(\theta)}\big]^{I_+} \tag{25}$$
$$p_{t|t}^{(\theta)}(x, \ell) = \frac{p_{t|t-1}(x, \ell)\, \psi_Z(x, \ell; \theta)}{\eta_Z^{(\theta)}(\ell)} \tag{26}$$
$$\eta_Z^{(\theta)}(\ell) = \big\langle p_{t|t-1}(\cdot, \ell),\, \psi_Z(\cdot, \ell; \theta) \big\rangle \tag{27}$$
$$\psi_Z(x, \ell; \theta) = \begin{cases} \dfrac{p_D(x, \ell)\, g(z_{\theta(\ell)} | x, \ell)}{\kappa(z_{\theta(\ell)})}, & \theta(\ell) > 0, \\[2mm] 1 - p_D(x, \ell), & \theta(\ell) = 0. \end{cases} \tag{28}$$
In the update of the forward LMB filtering, the multi-target posterior density is approximated by matching the first-order moment of (24), for computational simplicity, i.e., [15]
$$\boldsymbol{\pi}_{t|t} \approx \{(r_{t|t}^{(\ell)}, p_{t|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t}} \tag{29}$$
where
$$r_{t|t}^{(\ell)} = \sum_{(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}} \omega_{t|t}^{(I_+, \theta)}(Z)\, 1_{I_+}(\ell) \tag{30}$$
$$p_{t|t}(x, \ell) = \frac{1}{r_{t|t}^{(\ell)}} \sum_{(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}} \omega_{t|t}^{(I_+, \theta)}(Z)\, 1_{I_+}(\ell)\, p_{t|t}^{(\theta)}(x, \ell) \tag{31}$$
The approximate posterior density (29) of the forward LMB filtering preserves the first-order moment of the posterior density (24). It is proved in [29,31] that, within the LMB class, the approximation (29) minimizes the Kullback–Leibler divergence (KLD) relative to the posterior density (24).

3.2. Backward LMB Smoothing

In this subsection, the backward LMB smoothing is derived. Our derivation directly relies on the multi-target backward smoothing recursion (13), and is different from the derivation of the backward GLMB smoothing [32] which uses the backward corrector recursion [18] as an intermediate process to avoid the set integral of the quotient of two GLMBs.
Proposition 1.
Given that the multi-target posterior $\boldsymbol{\pi}_{k-1|k-1}(\mathbf{X})$ and the predicted multi-target density $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ are both LMBs, and that the backward smoothed density $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ from $t$ to $k$ ($k \leq t$) is an LMB, then the backward smoothed density $\boldsymbol{\pi}_{k-1|t}(\mathbf{X})$ from $t$ to $k-1$ is also an LMB, which can be written as
$$\boldsymbol{\pi}_{k-1|t} = \{(r_{k-1|t}^{(\ell)}, p_{k-1|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:k-1}} \tag{32}$$
where
$$r_{k-1|t}^{(\ell)} = 1 - \frac{\big(1 - r_{k-1|k-1}^{(\ell)}\big)\big(1 - r_{k|t}^{(\ell)}\big)}{1 - r_{k|k-1}^{(\ell)}} \tag{33}$$
$$p_{k-1|t}(x, \ell) = \frac{p_{k-1|k-1}(x, \ell) \left[\alpha_{s,k|t}(x, \ell) + \beta_{s,k|t}(x, \ell) \int \dfrac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right]}{\displaystyle\int p_{k-1|k-1}(x, \ell) \left[\alpha_{s,k|t}(x, \ell) + \beta_{s,k|t}(x, \ell) \int \dfrac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right] dx} \tag{34}$$
$$\alpha_{s,k|t}(x, \ell) \triangleq \frac{\big(1 - r_{k|t}^{(\ell)}\big)\big(1 - p_{s,k|k-1}(x, \ell)\big)}{1 - r_{k|k-1}^{(\ell)}} \tag{35}$$
$$\beta_{s,k|t}(x, \ell) \triangleq \frac{r_{k|t}^{(\ell)}\, p_{s,k|k-1}(x, \ell)}{r_{k|k-1}^{(\ell)}} \tag{36}$$
The proof of Proposition 1 is given in Appendix B. It is observed that the smoothed density $(r_{k-1|t}^{(\ell)}, p_{k-1|t}^{(\ell)})$ of track $\ell$ has the same form as the smoothed Bernoulli density when the newborn target is not taken into account [20]. Therefore, the proposed LMB smoother can be regarded as an extension of the Bernoulli smoother [20] to multiple targets. From (33), the existence probability of track $\ell$ depends on $r_{k-1|k-1}^{(\ell)}$, $r_{k|k-1}^{(\ell)}$ and $r_{k|t}^{(\ell)}$. From (34), the density of track $\ell$ contains two terms: one depends only on $p_{k-1|k-1}(x, \ell)$ and $\alpha_{s,k|t}(x, \ell)$, which preserves the forward filtering state at $k-1$, and the other accounts for the backward smoothing.
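As a minimal numerical illustration of (33) (the example probabilities are ours, not from the paper), the smoothed existence probability of a track is a one-line combination of the filtered, predicted and previously smoothed probabilities:

```python
def smooth_existence(r_filt_prev, r_pred, r_smooth):
    """Backward-smoothed existence probability, Equation (33):
    r_{k-1|t} = 1 - (1 - r_{k-1|k-1}) (1 - r_{k|t}) / (1 - r_{k|k-1})."""
    return 1.0 - (1.0 - r_filt_prev) * (1.0 - r_smooth) / (1.0 - r_pred)

# Example: a track filtered with r = 0.6 at k-1, predicted with r = 0.55 at k,
# and smoothed back from t with r = 0.9 at k gains confidence at k-1:
print(smooth_existence(0.6, 0.55, 0.9))   # approximately 0.911
```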
Remark 1.
The computational efficiency of the proposed LMB smoother stems from two properties. First, the newborn tracks are uncorrelated with the backward smoothing, since a newborn track cannot be alive prior to its birth time, which keeps the label space simple. Second, the existence probabilities and probability densities of different tracks in an LMB RFS are uncorrelated, so each track component can be smoothed separately; consequently, the computational complexity of the backward smoothing is linear in the number of tracks.
Remark 2.
The proposed LMB smoother is also approximately Bayes-optimal, because the LMB family is closed under the prediction operation and the backward smoothing operation. Although the LMB family is not closed under the update operation, the first-order moment approximation within the LMB class minimizes the KLD relative to the posterior density.

4. SMC Implementation and Algorithm Analysis

This section presents the SMC implementation and the state extraction, and analyzes the algorithm complexity of the proposed LMB smoother.

4.1. SMC Implementation

Prediction: Suppose the multi-target posterior at $t-1$ is $\boldsymbol{\pi}_{t-1|t-1} = \{(r_{t-1|t-1}^{(\ell)}, p_{t-1|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t-1}}$. In the SMC implementation, $p_{t-1|t-1}^{(\ell)}$ is represented by a set of weighted particles $\{\omega_{t-1|t-1}^{(i)}(\ell), x_{t-1|t-1}^{(i)}(\ell)\}_{i=1}^{J_{t-1|t-1}^{(\ell)}}$. The density of the newborn targets at $t$ is $\boldsymbol{\pi}_{B,t} = \{(r_{B,t}^{(\ell)}, p_{B,t}^{(\ell)})\}_{\ell \in \mathbb{L}_t}$, where $p_{B,t}^{(\ell)}$ is likewise represented by weighted particles $\{\omega_{B,t}^{(i)}(\ell), x_{B,t}^{(i)}(\ell)\}_{i=1}^{J_{B,t}^{(\ell)}}$. Through the prediction (18)–(22), the predicted multi-target density at $t$ is $\boldsymbol{\pi}_{t|t-1} = \{(r_{S,t|t-1}^{(\ell)}, p_{S,t|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t-1}} \cup \{(r_{B,t}^{(\ell)}, p_{B,t}^{(\ell)})\}_{\ell \in \mathbb{L}_t}$, where $r_{S,t|t-1}^{(\ell)}$ is calculated by (20) and $p_{S,t|t-1}^{(\ell)}$ is represented by $\{\omega_{S,t|t-1}^{(i)}(\ell), x_{S,t|t-1}^{(i)}(\ell)\}_{i=1}^{J_{t-1|t-1}^{(\ell)}}$. That is,
$$\omega_{S,t|t-1}^{(i)}(\ell) = \frac{p_{s,t|t-1}\big(x_{t-1|t-1}^{(i)}(\ell), \ell\big)\, \omega_{t-1|t-1}^{(i)}(\ell)}{\eta_{S,t|t-1}^{(\ell)}} \tag{37}$$
$$\eta_{S,t|t-1}^{(\ell)} = \sum_{i=1}^{J_{t-1|t-1}^{(\ell)}} p_{s,t|t-1}\big(x_{t-1|t-1}^{(i)}(\ell), \ell\big)\, \omega_{t-1|t-1}^{(i)}(\ell) \tag{38}$$
$$x_{S,t|t-1}^{(i)}(\ell) \sim f_{t|t-1}\big(\cdot\, |\, x_{t-1|t-1}^{(i)}(\ell), \ell\big) \tag{39}$$
where $f_{t|t-1}$ is chosen as the importance sampling density.
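A minimal per-track sketch of this prediction step could look as follows (the function names are ours; the survival probability, transition sampler and particle arrays are placeholders for the user's motion model):

```python
import numpy as np

def smc_predict_track(x, w, r, p_survive, sample_transition):
    """SMC prediction (20), (37)-(39) for a single Bernoulli track.

    x : (J, d) particle states,  w : (J,) normalized weights,  r : existence prob.
    p_survive(x)         -> (J,) survival probabilities p_{s,t|t-1}
    sample_transition(x) -> (J, d) samples from f_{t|t-1}(. | x), used as proposal
    """
    ps = p_survive(x)
    eta_s = np.sum(ps * w)                  # (38)
    r_pred = eta_s * r                      # (20)
    w_pred = ps * w / eta_s                 # (37)
    x_pred = sample_transition(x)           # (39)
    return x_pred, w_pred, r_pred
```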
Update: Denote the predicted density by $\boldsymbol{\pi}_{t|t-1} = \{(r_{t|t-1}^{(\ell)}, p_{t|t-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t}}$; it can be written in the form of (6), with weights $\omega_{t|t-1}(I_+)$ given by (7) and $I_+ = \mathcal{L}(\mathbf{X})$. Each $p_{t|t-1}^{(\ell)}$ is represented by a set of weighted particles $\{\omega_{t|t-1}^{(i)}(\ell), x_{t|t-1}^{(i)}(\ell)\}_{i=1}^{J_{t|t-1}^{(\ell)}}$. The K-shortest-paths algorithm [14] is used to truncate the predicted multi-target density in order to reduce the number of hypotheses. The multi-target posterior density (24) is parameterized by $\{(\omega_{t|t}^{(I_+, \theta)}(Z), p_{t|t}^{(\theta)})\}_{(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}}$. We compute $p_{t|t}^{(\theta)}(x, \ell)$ and $\omega_{t|t}^{(I_+, \theta)}$ as follows.

$p_{t|t}^{(\theta)}(x, \ell)$ is represented by the particle set $\{\omega_{t|t}^{(i, \theta)}(\ell), x_{t|t}^{(i, \theta)}(\ell)\}_{i=1}^{J_{t|t-1}^{(\ell)}}$, where
$$\omega_{t|t}^{(i, \theta)}(\ell) = \frac{\omega_{t|t-1}^{(i)}(\ell)\, \psi_Z\big(x_{t|t-1}^{(i)}(\ell), \ell; \theta\big)}{\eta_Z^{(\theta)}(\ell)} \tag{40}$$
$$x_{t|t}^{(i, \theta)}(\ell) = x_{t|t-1}^{(i)}(\ell) \tag{41}$$
and
$$\eta_Z^{(\theta)}(\ell) = \sum_{i=1}^{J_{t|t-1}^{(\ell)}} \omega_{t|t-1}^{(i)}(\ell)\, \psi_Z\big(x_{t|t-1}^{(i)}(\ell), \ell; \theta\big) \tag{42}$$
$$\psi_Z\big(x_{t|t-1}^{(i)}(\ell), \ell; \theta\big) = \begin{cases} \dfrac{p_D\big(x_{t|t-1}^{(i)}(\ell), \ell\big)\, g\big(z_{\theta(\ell)} | x_{t|t-1}^{(i)}(\ell), \ell\big)}{\kappa(z_{\theta(\ell)})}, & \theta(\ell) > 0, \\[2mm] 1 - p_D\big(x_{t|t-1}^{(i)}(\ell), \ell\big), & \theta(\ell) = 0. \end{cases} \tag{43}$$
The weight $\omega_{t|t}^{(I_+, \theta)}$ of hypothesis $(I_+, \theta)$ is given by
$$\omega_{t|t}^{(I_+, \theta)}(Z) = \frac{\tilde{\omega}_{t|t}^{(I_+, \theta)}(Z)}{\sum_{(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}} \tilde{\omega}_{t|t}^{(I_+, \theta)}(Z)} \tag{44}$$
$$\tilde{\omega}_{t|t}^{(I_+, \theta)}(Z) = \omega_{t|t-1}(I_+)\, \big[\eta_Z^{(\theta)}\big]^{I_+} \tag{45}$$
where $\theta \in \Theta_{I_+}$ is an association function and $(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}$ is a hypothesis. To avoid computing all hypotheses and their weights, only a specified number $T_h$ of the largest weights is retained, and the ranked optimal assignment algorithm [14] is applied to truncate the multi-target posterior density.

The multi-target posterior density of the forward LMB filtering matches the first-order moment of (24), giving an LMB with parameter set $\boldsymbol{\pi}_{t|t} \approx \{(r_{t|t}^{(\ell)}, p_{t|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:t}}$, where $r_{t|t}^{(\ell)}$ is given by (30) and $p_{t|t}^{(\ell)}$ is represented by the particle set $\{\omega_{t|t}^{(j)}(\ell), x_{t|t}^{(j)}(\ell)\}_{j=1}^{J_{t|t}^{(\ell)}}$ with
$$\omega_{t|t}^{(j)}(\ell) = \frac{\omega_{t|t}^{(I_+, \theta)}(Z)\, 1_{I_+}(\ell)\, \omega_{t|t}^{(i, \theta)}(\ell)}{r_{t|t}^{(\ell)}} \tag{46}$$
$$x_{t|t}^{(j)}(\ell) = x_{t|t}^{(i, \theta)}(\ell) \tag{47}$$
where $(I_+, \theta) \in \mathcal{F}(\mathbb{L}_{1:t}) \times \Theta_{I_+}$. The number of particles representing $p_{t|t}^{(\ell)}$ grows rapidly, so resampling [39] is needed.
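The following sketch illustrates the per-association particle update and the first-moment merge for a single track. The helper names are ours, and the merge exploits that all association-conditioned particle sets of a track share the same states, cf. (41) and (47); this is a simplified illustration, not the paper's implementation:

```python
import numpy as np

def update_track_for_association(w, psi_vals):
    """Per-track, per-association SMC update (40)-(42).

    w        : (J,) predicted particle weights of one track
    psi_vals : (J,) values psi_Z(x_i, l; theta) from (43), i.e. the detection
               term for an assigned measurement or 1 - p_D for a missed detection
    Returns the normalizing constant eta_Z^theta(l) and the updated weights.
    """
    eta = np.sum(w * psi_vals)              # (42)
    w_upd = w * psi_vals / eta              # (40); particle states unchanged (41)
    return eta, w_upd

def merge_posterior_track(hyp_weights, in_hyp, per_hyp_particle_weights):
    """First-moment merge (30), (46) for one track l.

    hyp_weights              : (H,) normalized hypothesis weights omega^(I+,theta)
    in_hyp                   : (H,) indicators 1_{I+}(l) for each hypothesis
    per_hyp_particle_weights : (H, J) association-conditioned particle weights
    """
    r = np.sum(hyp_weights * in_hyp)                              # (30)
    w = (hyp_weights * in_hyp) @ per_hyp_particle_weights / r     # (46)
    return r, w          # the merged particle set reuses the shared states (47)
```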
Backward smoothing: From the forward LMB filtering, the multi-target posterior density at $k-1$ is $\boldsymbol{\pi}_{k-1|k-1} = \{(r_{k-1|k-1}^{(\ell)}, p_{k-1|k-1}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:k-1}}$, where $p_{k-1|k-1}^{(\ell)}$ is approximated by $\{\omega_{k-1|k-1}^{(i)}(\ell), x_{k-1|k-1}^{(i)}(\ell)\}_{i=1}^{J^{(\ell)}}$. The predicted multi-target density from $k-1$ to $k$ is an LMB, and the existence probability of track $\ell$ is $r_{k|k-1}^{(\ell)}$, $\ell \in \mathbb{L}_{1:k}$. The multi-target backward smoothed density from $t$ to $k$ ($k \leq t$) is $\boldsymbol{\pi}_{k|t} = \{(r_{k|t}^{(\ell)}, p_{k|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:k}}$, where $p_{k|t}^{(\ell)}$ is represented by the weighted particles $\{\omega_{k|t}^{(j)}(\ell), y_{k|t}^{(j)}(\ell)\}_{j=1}^{Q^{(\ell)}}$.

Proposition 1 implies that the multi-target backward smoothed density from $t$ to $k-1$ is also an LMB, $\boldsymbol{\pi}_{k-1|t} = \{(r_{k-1|t}^{(\ell)}, p_{k-1|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{1:k-1}}$, where $r_{k-1|t}^{(\ell)}$ is calculated by (33) and $p_{k-1|t}^{(\ell)}$ is represented by the weighted particles $\{\omega_{k-1|t}^{(i)}(\ell), x_{k-1|t}^{(i)}(\ell)\}_{i=1}^{J^{(\ell)}}$, given by
$$\omega_{k-1|t}^{(i)}(\ell) = \frac{\tilde{\omega}_{k-1|t}^{(i)}(\ell)}{\sum_{i=1}^{J^{(\ell)}} \tilde{\omega}_{k-1|t}^{(i)}(\ell)} \tag{48}$$
$$\tilde{\omega}_{k-1|t}^{(i)}(\ell) = \frac{\big(1 - r_{k|t}^{(\ell)}\big)\big(1 - p_{s,k|k-1}\big(x_{k-1|k-1}^{(i)}(\ell), \ell\big)\big)\, \omega_{k-1|k-1}^{(i)}(\ell)}{1 - r_{k|k-1}^{(\ell)}} + \frac{r_{k|t}^{(\ell)}\, p_{s,k|k-1}\big(x_{k-1|k-1}^{(i)}(\ell), \ell\big)\, \omega_{k-1|k-1}^{(i)}(\ell)}{r_{k|k-1}^{(\ell)}} \times \sum_{j=1}^{Q^{(\ell)}} \frac{f_{k|k-1}\big(y_{k|t}^{(j)}(\ell)\, |\, x_{k-1|k-1}^{(i)}(\ell), \ell\big)\, \omega_{k|t}^{(j)}(\ell)}{p_{k|k-1}\big(y_{k|t}^{(j)}(\ell), \ell\big)} \tag{49}$$
$$x_{k-1|t}^{(i)}(\ell) = x_{k-1|k-1}^{(i)}(\ell) \tag{50}$$
where $i = 1, \cdots, J^{(\ell)}$ and
$$p_{k|k-1}\big(y_{k|t}^{(j)}(\ell), \ell\big) = \frac{r_{k-1|k-1}^{(\ell)}}{r_{k|k-1}^{(\ell)}} \sum_{i=1}^{J^{(\ell)}} \omega_{k-1|k-1}^{(i)}(\ell)\, p_{s,k|k-1}\big(x_{k-1|k-1}^{(i)}(\ell), \ell\big)\, f_{k|k-1}\big(y_{k|t}^{(j)}(\ell)\, |\, x_{k-1|k-1}^{(i)}(\ell), \ell\big) \tag{51}$$
Note that the predicted density (22) cannot be used directly as $p_{k|k-1}(\cdot, \ell)$ in (49) for smoothing. Because the forward LMB filtering performs resampling [39] at each filtering step, the particles of $p_{k|t}(\cdot, \ell)$, which descend from $p_{t|t}(\cdot, \ell)$ (the initial smoothed density), differ from those of $p_{k|k-1}(\cdot, \ell)$. We therefore estimate $p_{k|k-1}\big(y_{k|t}^{(j)}(\ell), \ell\big)$ at each particle $y_{k|t}^{(j)}(\ell)$, $j = 1, \cdots, Q^{(\ell)}$, as in (51).
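A sketch of one backward smoothing step for a single track, under our own callback conventions (the function names, array layouts and the vectorized transition-density evaluation are ours), could be:

```python
import numpy as np

def smc_backward_smooth_track(x_f, w_f, r_f,    # filtered at k-1: particles, weights, r_{k-1|k-1}
                              r_pred,            # predicted existence r_{k|k-1}
                              y_s, w_s, r_s,     # smoothed at k: particles, weights, r_{k|t}
                              p_survive, trans_pdf):
    """One backward LMB smoothing step (33), (48)-(51) for a single track.

    p_survive(x)    -> (J,)   survival probabilities p_{s,k|k-1}(x, l)
    trans_pdf(y, x) -> (Q, J) transition densities f_{k|k-1}(y_j | x_i, l)
    """
    ps = p_survive(x_f)                                          # (J,)
    F = trans_pdf(y_s, x_f)                                      # (Q, J)

    # (51): predicted density evaluated at the smoothed particles y_j
    p_pred_at_y = (r_f / r_pred) * (F @ (w_f * ps))              # (Q,)

    # (49): unnormalized smoothed weights
    alpha = (1.0 - r_s) * (1.0 - ps) / (1.0 - r_pred)            # (35) at the particles
    beta = r_s * ps / r_pred                                     # (36) at the particles
    backward_sum = (w_s / p_pred_at_y) @ F                       # (J,)
    w_tilde = w_f * (alpha + beta * backward_sum)

    w_smooth = w_tilde / np.sum(w_tilde)                         # (48)
    r_smooth = 1.0 - (1.0 - r_f) * (1.0 - r_s) / (1.0 - r_pred)  # (33)
    return x_f, w_smooth, r_smooth                               # (50): states unchanged
```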

4.2. Backward Smoothing and State Extraction

The pseudo-code of the proposed backward smoothing algorithm is given in Algorithm 1. The forward filtering runs up to time $t$ and the lag of the backward smoothing is $L_d$. We need to store $L_d + 1$ multi-target posterior densities from $t - L_d$ to $t$ and $L_d$ multi-target predicted densities from $t - L_d + 1$ to $t$ for the $L_d$-step backward smoothing recursions. The backward smoothed density $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ is initialized with $\boldsymbol{\pi}_{t|t}(\mathbf{Y})$. In the SMC implementation, pruning, truncation and track cleanup are required and the label set varies with time, so $\mathbb{L}_{1:i}$ is replaced with $\mathbb{L}_{i|j}$, where $\mathbb{L}_{i|j}$ denotes the label set of the corresponding density $\boldsymbol{\pi}_{i|j}(\cdot)$ and $\mathbb{L}_{i|j} \subseteq \mathbb{L}_{1:i}$. Accordingly, the label set of the backward smoothing at $k-1$ is $\mathbb{L}_{k-1|t} = \mathbb{L}_{k-1|k-1} \cap \mathbb{L}_{k|t}$, which excludes the labels of tracks newly born at $k$ as well as the labels of tracks pruned in the forward filtering.

Note that, in our approach, resampling [39] can be applied either as the final step of the forward filtering or after the backward smoothing, which leads to an insignificant difference in performance. Target states are extracted from the output $\{(r_{t-L_d|t}^{(\ell)}, p_{t-L_d|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{t-L_d|t}}$. That is, the target number $N$ is first estimated by
$$\hat{N} = \arg\max_n\, p(n) \tag{52}$$
where $p(n)$ is the cardinality distribution (9). Then the $\hat{N}$ tracks with the largest existence probabilities are extracted, and the target states are taken as the means of the corresponding probability densities.
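A sketch of this extraction step, under our own track-tuple representation (the function name and data layout are assumptions, not from the paper), is:

```python
import numpy as np

def extract_states(tracks):
    """MAP cardinality (52) followed by top-N track extraction.

    tracks : list of (label, r, x_particles, w_particles) for the smoothed LMB,
             with x_particles of shape (J, d) and w_particles of shape (J,).
    Returns a list of (label, mean_state) estimates.
    """
    r = np.array([trk[1] for trk in tracks])
    card = np.array([1.0])
    for r_ell in r:                                   # cardinality distribution (9)
        card = np.convolve(card, [1.0 - r_ell, r_ell])
    n_hat = int(np.argmax(card))                      # (52)
    best = np.argsort(r)[::-1][:n_hat]                # tracks with the largest r
    return [(tracks[i][0], tracks[i][3] @ tracks[i][2]) for i in best]
```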
Algorithm 1: The proposed backward LMB smoothing algorithm.
Input: lag $L_d$ at time $t$; $\{(r_{k|k}^{(\ell)}, p_{k|k}^{(\ell)})\}_{\ell \in \mathbb{L}_{k|k}}$ for $k = \max(t - L_d, 1), \ldots, t$; $\{r_{k|k-1}^{(\ell)}\}_{\ell \in \mathbb{L}_{k|k-1}}$ for $k = \max(t - L_d, 1) + 1, \ldots, t$;
 initialize $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ with $\boldsymbol{\pi}_{t|t}(\mathbf{Y})$;
 for $k = t$ down to $\max(t - L_d, 1) + 1$
        $\mathbb{L}_{k-1|t} = \mathbb{L}_{k-1|k-1} \cap \mathbb{L}_{k|t}$;
       for $q = 1$ to $|\mathbb{L}_{k-1|t}|$
             compute $r_{k-1|t}^{(\ell_q)}$ according to (33);
             for $j = 1$ to $Q^{(\ell_q)}$
                   estimate $p_{k|k-1}\big(y_{k|t}^{(j)}(\ell_q), \ell_q\big)$ according to (51);
             end
             for $i = 1$ to $J^{(\ell_q)}$
                   compute $\omega_{k-1|t}^{(i)}(\ell_q)$ according to (48)–(49);
                    $x_{k-1|t}^{(i)}(\ell_q) = x_{k-1|k-1}^{(i)}(\ell_q)$;
             end
       end
 end
Output: $\{(r_{k-1|t}^{(\ell)}, p_{k-1|t}^{(\ell)})\}_{\ell \in \mathbb{L}_{k-1|t}}$ for $k = \max(t - L_d, 1) + 1, \ldots, t$.

4.3. Algorithm Complexity

The major computational cost of the LMB smoother lies in the backward LMB smoothing. Its computational complexity is $O(L_d N L^2)$, which follows from the four nested for-loops of Algorithm 1, where $N$ is the number of tracks (or targets) and $L$ is the number of particles per track. The two innermost (parallel) for-loops have complexity $O(L^2)$, and multiplying by the two outermost for-loops gives the overall complexity $O(L_d N L^2)$ of the backward LMB smoothing.
The backward smoothing structures of the PHD smoother [17] and the MeMBer smoother [27] are similar to that of the LMB smoother; the difference is that the PHD and MeMBer smoothers use the particles of all tracks jointly in the backward smoothing, whereas the LMB smoother uses the particles of each track only for its own backward smoothing. The major computational costs of the PHD smoother (here we consider the classic SMC implementation [17] rather than the fast SMC implementation [23]) and the MeMBer smoother are therefore both approximately $O(L_d N^2 L^2)$. Consequently, the computational complexity of the proposed LMB smoother is lower than those of the PHD smoother [17] and the MeMBer smoother [27].

5. Simulation Result

A nearly constant turn (NCT) model with varying turn rate is considered, together with noisy range and azimuth measurements [12,13]. The state of the target with label $\ell$ at time $t$ is $\mathbf{x}_t = (x_t, \ell)$, where $x_t = [\tilde{x}_t^T, w_t]^T$, $\tilde{x}_t = [p_{x,t}, \dot{p}_{x,t}, p_{y,t}, \dot{p}_{y,t}]^T$ denotes the planar position and velocity of the target, and $w_t$ denotes the turn rate. The NCT model can be written as
$$\tilde{x}_t = F(w_{t-1})\, \tilde{x}_{t-1} + G\, W_{t-1} \tag{53}$$
$$w_t = w_{t-1} + u_{t-1} T \tag{54}$$
where $W_{t-1}$ and $u_{t-1}$ are the process noises of the velocity and the turn rate, respectively, with $W_{t-1} \sim \mathcal{N}(W; 0, \sigma_W^2 I_2)$ and $u_{t-1} \sim \mathcal{N}(u; 0, \sigma_u^2 I_1)$, where $\mathcal{N}(\cdot; m_N, P_N)$ denotes a Gaussian density with mean $m_N$ and covariance $P_N$, and $I_i$ denotes the $i \times i$ identity matrix. The velocity noise uses $\sigma_W = 5$ m/s$^2$ and the turn-rate noise uses $\sigma_u = \pi/180$ rad/s$^2$. The state transition matrix and the noise transition matrix are, respectively,
$$F(w) = \begin{bmatrix} 1 & \frac{\sin wT}{w} & 0 & -\frac{1 - \cos wT}{w} \\ 0 & \cos wT & 0 & -\sin wT \\ 0 & \frac{1 - \cos wT}{w} & 1 & \frac{\sin wT}{w} \\ 0 & \sin wT & 0 & \cos wT \end{bmatrix}, \qquad G = \begin{bmatrix} \frac{T^2}{2} & 0 \\ T & 0 \\ 0 & \frac{T^2}{2} \\ 0 & T \end{bmatrix} \tag{55}$$
The state transition model is given by (53) and (54). The sampling interval is $T = 1$ s. The target survival probability is $P_s = 0.99$. The newborn targets at each time step form an LMB RFS $\boldsymbol{\pi}_{B,t} = \{(r_{B,t}^{(\ell)}, p_{B,t}^{(\ell)})\}_{\ell \in \mathbb{B}}$, where $\mathbb{B} = \{1, 2\}$, $r_{B,t}^{(1)} = 0.02$, $r_{B,t}^{(2)} = 0.03$, and $p_{B,t}^{(i)} = \mathcal{N}(\cdot; m_{B,t}^{(i)}, P_B)$. The mean of track 1 is $m_{B,t}^{(1)} = [1500, 0, 250, 0, 0]^T$, the mean of track 2 is $m_{B,t}^{(2)} = [1000, 0, 1500, 0, 0]^T$, and the common covariance is $P_B = \mathrm{diag}([50, 50, 50, 50, 6\pi/180])^2$. The units of the positions $p_x, p_y$ are m, of the velocities $\dot{p}_x, \dot{p}_y$ m/s, and of the turn rate $w$ rad/s.
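A sketch of this transition model as it could be used to propagate particles in the SMC implementation (the function name and the zero-turn-rate guard are ours, not from the paper):

```python
import numpy as np

def nct_transition(x, T=1.0, sigma_w=5.0, sigma_u=np.pi / 180):
    """Sample one step of the NCT model (53)-(55) for a state
    x = [px, vx, py, vy, w].  Parameter values follow the simulation setup."""
    w = x[4] if abs(x[4]) > 1e-10 else 1e-10        # guard against division by zero
    s, c = np.sin(w * T), np.cos(w * T)
    F = np.array([[1, s / w,       0, -(1 - c) / w],
                  [0, c,           0, -s],
                  [0, (1 - c) / w, 1, s / w],
                  [0, s,           0, c]])
    G = np.array([[T**2 / 2, 0],
                  [T,        0],
                  [0, T**2 / 2],
                  [0,        T]])
    x_next = np.empty(5)
    x_next[:4] = F @ x[:4] + G @ np.random.normal(0, sigma_w, 2)   # (53)
    x_next[4] = x[4] + np.random.normal(0, sigma_u) * T            # (54)
    return x_next
```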
The measurement equation is
$$z_t = \begin{bmatrix} \sqrt{p_{x,t}^2 + p_{y,t}^2} \\ \arctan\big(p_{y,t} / p_{x,t}\big) \end{bmatrix} + \varepsilon_t \tag{56}$$
where the measurement noise is $\varepsilon_t \sim \mathcal{N}(\cdot; 0, P_\varepsilon)$ with $P_\varepsilon = \mathrm{diag}(\sigma_r^2, \sigma_\theta^2)$, $\sigma_r = 10$ m and $\sigma_\theta = 2\pi/180$ rad. The observation region is $[0, 2000]$ m $\times$ $[0, \pi]$ rad with detection probability $p_D = 0.98$. The Poisson clutter intensity is $\kappa(\cdot) = 10/(2000\pi)$, i.e., an average of 10 clutter measurements per scan.
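For completeness, the single-target measurement likelihood $g(z | x, \ell)$ implied by (56) can be evaluated as follows (the function name is ours, and `arctan2` is used in place of $\arctan(p_y / p_x)$ to keep the bearing well defined; this is our choice, not from the paper):

```python
import numpy as np

def range_bearing_likelihood(z, x, sigma_r=10.0, sigma_theta=2 * np.pi / 180):
    """Likelihood g(z | x, l) of a range-bearing measurement under (56)
    for a particle x = [px, vx, py, vy, w]."""
    r_pred = np.hypot(x[0], x[2])                      # predicted range
    theta_pred = np.arctan2(x[2], x[0])                # predicted bearing
    dz = np.array([z[0] - r_pred, z[1] - theta_pred])
    var = np.array([sigma_r**2, sigma_theta**2])
    return np.exp(-0.5 * np.sum(dz**2 / var)) / (2 * np.pi * sigma_r * sigma_theta)
```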
The number of particles per hypothesized track is set to 1000. We prune the tracks with a weight smaller than $P_T = 10^{-4}$. The lag of the smoother is $L_d = 3$. The OSPA metric [40] with cut-off parameter $c = 100$ m and order parameter $p = 1$ is used. We compare the proposed LMB smoother with the LMB filter [15], the PHD filter [9], the PHD smoother [17], the CBMeMBer filter [12] and the MeMBer smoother [27] over 100 Monte Carlo trials.
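For reference, a compact sketch of the OSPA distance [40] with the parameters used here, restricted to planar position estimates (our own implementation choice, relying on SciPy's assignment solver), is:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ospa(X, Y, c=100.0, p=1):
    """OSPA distance [40] between estimated positions X (m, 2) and true
    positions Y (n, 2), with cut-off c = 100 m and order p = 1."""
    m, n = len(X), len(Y)
    if m == 0 and n == 0:
        return 0.0
    if m == 0 or n == 0:
        return c
    if m > n:                                  # enforce m <= n
        X, Y, m, n = Y, X, n, m
    D = np.minimum(np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2), c) ** p
    row, col = linear_sum_assignment(D)        # optimal sub-pattern assignment
    loc = D[row, col].sum()                    # localization part
    card = (n - m) * c ** p                    # cardinality penalty
    return ((loc + card) / n) ** (1.0 / p)
```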
Figure 3 and Figure 4 show the results of the LMB smoother in one trial, in the plane and in the x and y coordinates over time, respectively. The number of targets changes over time due to target births and deaths, and there are at most five targets in the scenario. The track of each target is smooth.
More specifically, Figure 5, Figure 6 and Figure 7 show the cardinality estimation of the different methods over 100 Monte Carlo trials. Figure 5 shows that the estimated cardinality mean converges to the true cardinality most of the time for all methods. Figure 6 shows the error of the cardinality mean, i.e., the estimated cardinality mean minus the true cardinality. At k = 1 s, 10 s, 40 s and 60 s, one or two targets are born and the cardinality errors of the LMB filter are negative, because of newborn-target detection delays. At k = 67 s, 81 s and 91 s, one target disappears and the cardinality errors of the LMB filter are positive, because of the delayed detection of the target death. At k = 64–66 s, 78–80 s and 88–90 s, the cardinality mean errors of the PHD and MeMBer smoothers are negative, because of premature target deaths. Figure 7 shows the standard deviation of the cardinality estimate for the different methods. The standard deviations of the PHD and MeMBer smoothers are larger than those of the LMB filter and the LMB smoother. In short, the LMB smoother estimates the cardinality accurately (except for a detection delay at k = 10 s) and yields the best accuracy.
Figure 8 shows the average OSPA errors of the PHD smoother, the MeMBer smoother, the LMB filter and the LMB smoother over 100 Monte Carlo trials. The OSPA of the LMB smoother is lower than those of the other methods almost all the time. The average OSPA errors of the different methods are also given in Table 1. All three smoothers effectively reduce the OSPA localization component compared with their corresponding filters. However, the PHD smoother and the MeMBer smoother do not necessarily reduce the OSPA cardinality component, whereas the proposed LMB smoother improves the cardinality estimation significantly.
Figure 9 shows the average execution time of the different methods over 100 Monte Carlo trials. Summing the time over all time steps, the execution times per simulation for the PHD smoother, the MeMBer smoother, the LMB filter and the LMB smoother are 1341 s, 1537 s, 156 s and 356 s, respectively. The proposed LMB smoother has a higher computational cost than the LMB filter, but a clearly lower one than the PHD smoother and the MeMBer smoother. These results comply with our theoretical analysis.

6. Conclusions

This paper derives a computationally efficient forward–backward LMB smoother that is closed under the backward smoothing operation and maintains the independence of different tracks while providing track outputs. Both the theoretical analysis and the simulation results demonstrate that the proposed LMB smoother effectively improves the tracking performance compared with the PHD smoother, the MeMBer smoother and the LMB filter, and has a lower computational complexity than the PHD smoother and the MeMBer smoother. We should point out that our smoother cannot solve the problem of track fragmentation [34], which occurs when the label of a track changes before the track ends. Investigating curve/track fitting approaches [19,35,36] to improve the continuity and smoothness of the estimated tracks is left for future work.

Author Contributions

Methodology, software and draft, R.L.; methodology, review and revision, H.F.; methodology, review and revision, T.L.; methodology, supervision and validation, H.X.

Funding

This work was supported in part by the Fundamental Research Funds for the Central Universities under grant 3102019ZDHQD08 and by National Natural Science Foundation of China under grant 51975482.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Proof of Lemma 1.
If $\mathbf{X}_a$ and $\mathbf{X}_b$ are both LMB RFSs and $\mathbb{L}_a \cap \mathbb{L}_b = \emptyset$, then $\mathbf{X} = \mathbf{X}_a \cup \mathbf{X}_b$ is an LMB RFS. The conclusion follows from the fundamental convolution theorem [2] and also from the independence of the existence probabilities and the densities of different tracks for LMB RFSs [31].
The product of $\boldsymbol{\pi}_a(\mathbf{X}_a)$ and $\boldsymbol{\pi}_b(\mathbf{X}_b)$ can be written as
$$\boldsymbol{\pi}_a(\mathbf{X}_a)\, \boldsymbol{\pi}_b(\mathbf{X}_b) = \Delta(\mathbf{X}_a \cup \mathbf{X}_b) \prod_{\ell' \in \mathbb{L}_a \cup \mathbb{L}_b} \big(1 - r^{(\ell')}\big) \left[\frac{1_{\mathbb{L}_a \cup \mathbb{L}_b}(\ell)\, r^{(\ell)}\, p(x, \ell)}{1 - r^{(\ell)}}\right]^{\mathbf{X}_a \cup \mathbf{X}_b} = \Delta(\mathbf{X}) \prod_{\ell' \in \mathbb{L}} \big(1 - r^{(\ell')}\big) \left[\frac{1_{\mathbb{L}}(\ell)\, r^{(\ell)}\, p(x, \ell)}{1 - r^{(\ell)}}\right]^{\mathbf{X}} = \boldsymbol{\pi}(\mathbf{X}) \tag{A1}$$
where $\mathbb{L} = \mathbb{L}_a \cup \mathbb{L}_b$ represents the label space of $\mathbf{X}$. Note that the computation in (A1) is reversible. That is, the union of multiple LMB RFSs on disjoint subspaces composes a single LMB RFS on the joint space, and vice versa. □

Appendix B

Proof of Proposition 1.
The proof of Proposition 1 can be summarized as follows:
Firstly, the set integral of the backward smoothing recursion (13) is decomposed into two parts, as in (A4): one relates to the surviving targets and the other to the newborn targets.
Then, the second part, which concerns the newborn targets, is shown to equal 1, as in (A7). This reflects the fact that newborn targets cannot be transmitted backward to the time before their birth, so the set integral of (13) reduces to the set integral of the first part, for the surviving targets.
Finally, we prove that the first part, for the surviving targets, yields an LMB.
From the assumptions, $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$, $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ and $\boldsymbol{\pi}_{k-1|k-1}(\mathbf{X})$ are all LMB distributions. $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ is written as
$$\boldsymbol{\pi}_{k|t}(\mathbf{Y}) = \Delta(\mathbf{Y}) \prod_{\ell' \in \mathbb{L}_{1:k}} \big(1 - r_{k|t}^{(\ell')}\big) \left[\frac{1_{\mathbb{L}_{1:k}}(\ell)\, r_{k|t}^{(\ell)}\, p_{k|t}(y, \ell)}{1 - r_{k|t}^{(\ell)}}\right]^{\mathbf{Y}} \tag{A2}$$
The label space of $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ is $\mathbb{L}_{1:k}$. $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ is initialized with $\boldsymbol{\pi}_{t|t}(\mathbf{Y})$, whose label space is $\mathbb{L}_{1:k}$ when $k = t$, and $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ keeps the same label space as $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ at all other times, as explained after (A7). The newborn targets are described by an LMB RFS $\boldsymbol{\pi}_{B,k|k-1} = \{(r_{B,k|k-1}^{(\ell)}, p_{B,k|k-1}^{(\ell)})\}_{\ell \in \mathbb{L}_k}$. The multi-target transition density $f_{k|k-1}(\mathbf{Y} | \mathbf{X})$ is given by (16).
Let $\mathbf{Y} = \mathbf{Y}_+ \cup \mathbf{Y}_-$, where $\mathbf{Y}_+$ denotes the targets newly born at $k$, with $\mathcal{L}(\mathbf{Y}_+) \subseteq \mathbb{L}_k$, and $\mathbf{Y}_-$ denotes the targets surviving from $k-1$ to $k$, with $\mathcal{L}(\mathbf{Y}_-) \subseteq \mathbb{L}_{1:k-1}$. The backward smoothing recursion (13) can be reformulated as
$$\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) = \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \int \frac{f_{B,k|k-1}(\mathbf{Y}_+)\, f_{s,k|k-1}(\mathbf{Y}_- | \mathbf{X})\, \boldsymbol{\pi}_{k|t}(\mathbf{Y})}{\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})}\, \delta \mathbf{Y} \tag{A3}$$
$$= \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \underbrace{\int \frac{f_{s,k|k-1}(\mathbf{Y}_- | \mathbf{X})\, \boldsymbol{\pi}_{k|t}^-(\mathbf{Y}_-)}{\boldsymbol{\pi}_{k|k-1}^-(\mathbf{Y}_-)}\, \delta \mathbf{Y}_-}_{\text{Surviving targets}}\ \underbrace{\int \frac{f_{B,k|k-1}(\mathbf{Y}_+)\, \boldsymbol{\pi}_{k|t}^+(\mathbf{Y}_+)}{\boldsymbol{\pi}_{k|k-1}^+(\mathbf{Y}_+)}\, \delta \mathbf{Y}_+}_{\text{Born targets}} \tag{A4}$$
where
$$\boldsymbol{\pi}_{k|t}(\mathbf{Y}) = \boldsymbol{\pi}_{k|t}^-(\mathbf{Y}_-)\, \boldsymbol{\pi}_{k|t}^+(\mathbf{Y}_+) \tag{A5}$$
$$\boldsymbol{\pi}_{k|k-1}(\mathbf{Y}) = \boldsymbol{\pi}_{k|k-1}^-(\mathbf{Y}_-)\, \boldsymbol{\pi}_{k|k-1}^+(\mathbf{Y}_+) \tag{A6}$$
The decompositions of $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ in (A5) and of $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ in (A6) follow Lemma 1. The derivation from (A3) to (A4) also applies the proposition in Section 3.5.3 of [2], namely that a single set integral over a joint space can be written as a multiple set integral over the disjoint subspaces. Formula (A4) consists of two parts: one involves $\mathbf{X}$ and corresponds to the smoothing of the surviving targets, and the other is the smoothing of the newborn targets. Since $f_{B,k|k-1}(\mathbf{Y}_+) = \boldsymbol{\pi}_{k|k-1}^+(\mathbf{Y}_+)$, the second part of (A4) equals 1:
$$\int \frac{f_{B,k|k-1}(\mathbf{Y}_+)\, \boldsymbol{\pi}_{k|t}^+(\mathbf{Y}_+)}{\boldsymbol{\pi}_{k|k-1}^+(\mathbf{Y}_+)}\, \delta \mathbf{Y}_+ = \int \boldsymbol{\pi}_{k|t}^+(\mathbf{Y}_+)\, \delta \mathbf{Y}_+ = 1 \tag{A7}$$
Formula (A7) can be explained as follows: newborn targets cannot be transmitted backward to the time before their birth, i.e., a target born at $k$ cannot be alive at $k-1$ via backward LMB smoothing. Furthermore, $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ cannot contain targets born after $k$, so $\boldsymbol{\pi}_{k|t}(\mathbf{Y})$ and $\boldsymbol{\pi}_{k|k-1}(\mathbf{Y})$ share the same label space $\mathbb{L}_{1:k}$, which is the union of the label space at $k-1$ and the label space of the targets newly born at $k$.
Combining (35)–(36) with (A7), Formula (A4) can be further written as
$$\begin{aligned}
\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) ={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \int \frac{f_{s,k|k-1}(\mathbf{Y}_- | \mathbf{X})\, \boldsymbol{\pi}_{k|t}^-(\mathbf{Y}_-)}{\boldsymbol{\pi}_{k|k-1}^-(\mathbf{Y}_-)}\, \delta \mathbf{Y}_- \\
={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X})\, \Delta(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1}} \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}}\, \big[1 - p_{s,k|k-1}(x, \ell)\big]^{\mathbf{X}} \\
&\times \int \Delta(\mathbf{Y}_-)\, 1_{\mathcal{L}(\mathbf{X})}(\mathcal{L}(\mathbf{Y}_-)) \left[\frac{\big(1 - r_{k|k-1}^{(\ell)}\big)\, p_{s,k|k-1}(x, \ell)\, r_{k|t}^{(\ell)}\, f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{\big(1 - p_{s,k|k-1}(x, \ell)\big)\big(1 - r_{k|t}^{(\ell)}\big)\, r_{k|k-1}^{(\ell)}\, p_{k|k-1}(y, \ell)}\right]^{\mathbf{Y}_-} \delta \mathbf{Y}_- \\
={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}}\, \big[\alpha_{s,k|t}(x, \ell)\big]^{\mathbf{X}} \\
&\times \int \Delta(\mathbf{Y}_-)\, 1_{\mathcal{L}(\mathbf{X})}(\mathcal{L}(\mathbf{Y}_-)) \left[\frac{\beta_{s,k|t}(x, \ell)\, f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{\alpha_{s,k|t}(x, \ell)\, p_{k|k-1}(y, \ell)}\right]^{\mathbf{Y}_-} \delta \mathbf{Y}_-
\end{aligned} \tag{A8}$$
Applying Lemma 3 of [13] (or Lemma 1 in Section 15.5.1 of [2]) and the power-functional identity in Section 3.7 of [2], we have
$$\begin{aligned}
\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) ={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}}\, \big[\alpha_{s,k|t}(x, \ell)\big]^{\mathbf{X}} \sum_{\mathcal{L}(\mathbf{Y}_-) \subseteq \mathcal{L}(\mathbf{X})}\ \prod_{\ell \in \mathcal{L}(\mathbf{Y}_-)} \int \frac{\beta_{s,k|t}(x, \ell)\, f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{\alpha_{s,k|t}(x, \ell)\, p_{k|k-1}(y, \ell)}\, dy \\
={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}}\, \big[\alpha_{s,k|t}(x, \ell)\big]^{\mathbf{X}} \prod_{\ell \in \mathcal{L}(\mathbf{X})} \left(1 + \frac{\beta_{s,k|t}(x, \ell)}{\alpha_{s,k|t}(x, \ell)} \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right) \\
={}& \boldsymbol{\pi}_{k-1|k-1}(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}} \prod_{\ell \in \mathcal{L}(\mathbf{X})} \left(\alpha_{s,k|t}(x, \ell) + \beta_{s,k|t}(x, \ell) \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right)
\end{aligned} \tag{A9}$$
Substituting $\boldsymbol{\pi}_{k-1|k-1}(\mathbf{X})$ into (A9) yields
$$\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) = \Delta(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \frac{\big(1 - r_{k-1|k-1}^{(\ell)}\big)\big(1 - r_{k|t}^{(\ell)}\big)}{1 - r_{k|k-1}^{(\ell)}} \prod_{\ell \in \mathcal{L}(\mathbf{X})} 1_{\mathbb{L}_{1:k-1}}(\ell)\, r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell) \left[\alpha_{s,k|t}(x, \ell) + \beta_{s,k|t}(x, \ell) \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right] \tag{A10}$$
Let
$$\pi_{k-1|t}(x, \ell) \triangleq r_{k-1|t}^{(\ell)}\, p_{k-1|t}(x, \ell) = r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell) \left[\alpha_{s,k|t}(x, \ell) + \beta_{s,k|t}(x, \ell) \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\right] \tag{A11}$$
We can then obtain
$$r_{k-1|t}^{(\ell)} = \int \pi_{k-1|t}(x, \ell)\, dx \tag{A12}$$
$$p_{k-1|t}(x, \ell) = \frac{\pi_{k-1|t}(x, \ell)}{r_{k-1|t}^{(\ell)}} = \frac{\pi_{k-1|t}(x, \ell)}{\int \pi_{k-1|t}(x, \ell)\, dx} \tag{A13}$$
From (A12) we obtain (33); the detailed derivation is given in Appendix C. We also obtain (34) by substituting (A11) into (A13).
Substituting (A12) and (A13) (or, equivalently, (33) and (34)) into (A10) yields
$$\boldsymbol{\pi}_{k-1|t}(\mathbf{X}) = \Delta(\mathbf{X}) \prod_{\ell \in \mathbb{L}_{1:k-1} \setminus \mathcal{L}(\mathbf{X})} \big(1 - r_{k-1|t}^{(\ell)}\big) \prod_{\ell \in \mathcal{L}(\mathbf{X})} 1_{\mathbb{L}_{1:k-1}}(\ell)\, r_{k-1|t}^{(\ell)}\, p_{k-1|t}(x, \ell) = \Delta(\mathbf{X}) \prod_{\ell' \in \mathbb{L}_{1:k-1}} \big(1 - r_{k-1|t}^{(\ell')}\big) \left[\frac{1_{\mathbb{L}_{1:k-1}}(\ell)\, r_{k-1|t}^{(\ell)}\, p_{k-1|t}(x, \ell)}{1 - r_{k-1|t}^{(\ell)}}\right]^{\mathbf{X}} \tag{A14}$$
From (A14) we recover the parameter set (32), which shows that $\boldsymbol{\pi}_{k-1|t}(\mathbf{X})$ is an LMB with $r_{k-1|t}^{(\ell)}$ and $p_{k-1|t}(x, \ell)$ given by (33) and (34), respectively. □

Appendix C

Proof (from (A12) to (33)).
First, note that the target $(x, \ell)$ at $k$ appearing in (A12) is a surviving target, i.e., $\ell \in \mathbb{L}_{1:k-1}$ and $\ell \notin \mathbb{L}_k$, because newborn targets cannot be transmitted backward to the time before their birth via smoothing, as shown by (A7) (that is, $\boldsymbol{\pi}_{k-1|t}(\mathbf{X})$ cannot contain targets born after $k-1$). Therefore, we can apply the labeled single-target predicted density formulas (the counterparts of (20) and (21) for a surviving target):
$$r_{k|k-1}^{(\ell)} = r_{k-1|k-1}^{(\ell)} \int p_{s,k|k-1}(x, \ell)\, p_{k-1|k-1}(x, \ell)\, dx \tag{A15}$$
$$p_{k|k-1}(y, \ell) = \frac{r_{k-1|k-1}^{(\ell)} \int p_{k-1|k-1}(x, \ell)\, p_{s,k|k-1}(x, \ell)\, f_{k|k-1}(y | x, \ell)\, dx}{r_{k|k-1}^{(\ell)}} \tag{A16}$$
Then, from the definitions (A11) and (A12), we obtain
$$r_{k-1|t}^{(\ell)} = \int \pi_{k-1|t}(x, \ell)\, dx = \int \alpha_{s,k|t}(x, \ell)\, r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell)\, dx + \int r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell)\, \beta_{s,k|t}(x, \ell) \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\, dx \tag{A17}$$
The first term on the right-hand side of (A17) can be written as
$$\int \alpha_{s,k|t}(x, \ell)\, r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell)\, dx = \frac{\big(1 - r_{k|t}^{(\ell)}\big)\, r_{k-1|k-1}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}} \int \big(1 - p_{s,k|k-1}(x, \ell)\big)\, p_{k-1|k-1}(x, \ell)\, dx = \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}} \big(r_{k-1|k-1}^{(\ell)} - r_{k|k-1}^{(\ell)}\big) \tag{A18}$$
The second term on the right-hand side of (A17) can be written as
$$\int r_{k-1|k-1}^{(\ell)}\, p_{k-1|k-1}(x, \ell)\, \beta_{s,k|t}(x, \ell) \int \frac{f_{k|k-1}(y | x, \ell)\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy\, dx = \int \frac{r_{k-1|k-1}^{(\ell)} \int p_{k-1|k-1}(x, \ell)\, p_{s,k|k-1}(x, \ell)\, f_{k|k-1}(y | x, \ell)\, dx}{r_{k|k-1}^{(\ell)}} \cdot \frac{r_{k|t}^{(\ell)}\, p_{k|t}(y, \ell)}{p_{k|k-1}(y, \ell)}\, dy = r_{k|t}^{(\ell)} \int p_{k|t}(y, \ell)\, dy = r_{k|t}^{(\ell)} \tag{A19}$$
Finally, combining (A17)–(A19) yields
$$r_{k-1|t}^{(\ell)} = \frac{1 - r_{k|t}^{(\ell)}}{1 - r_{k|k-1}^{(\ell)}} \big(r_{k-1|k-1}^{(\ell)} - r_{k|k-1}^{(\ell)}\big) + r_{k|t}^{(\ell)} = 1 - \frac{\big(1 - r_{k-1|k-1}^{(\ell)}\big)\big(1 - r_{k|t}^{(\ell)}\big)}{1 - r_{k|k-1}^{(\ell)}} \tag{A20}$$
 □

References

  1. Bar-Shalom, Y.; Willett, P.; Tian, X. Tracking and Data Fusion: A Handbook of Algorithms; YBS Publishing: Storrs, CT, USA, 2001. [Google Scholar]
  2. Mahler, R.P.S. Advances in Statistical Multisource-Multitarget Information Fusion; Artech House: Norwood, MA, USA, 2014. [Google Scholar]
  3. Vo, B.N.; Mallick, M.; Bar-shalom, Y.; Coraluppi, S.; Osborne, R.; Mahler, R.P.S.; Vo, B.T. Multitarget Tracking. In Wiley Encyclopedia of Electrical and Electronics Engineering; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015. [Google Scholar]
  4. Wang, X.; Li, T.; Sun, S.; Corchado, J.M. A Survey of Recent Advances in Particle Filters and Remaining Challenges for Multitarget Tracking. Sensors 2017, 17, 2707. [Google Scholar] [CrossRef] [PubMed]
  5. Meyer, F.; Kropfreiter, T.; Williams, J.L.; Lau, R.; Hlawatsch, F.; Braca, P.; Win, M.Z. Message Passing Algorithms for Scalable Multitarget Tracking. Proc. IEEE 2018, 106, 221–259. [Google Scholar] [CrossRef]
  6. Blackman, S.S.; Popoli, R. Design and Analysis of Modern Tracking Systems; Artech House: Norwood, MA, USA, 1999. [Google Scholar]
  7. Willett, P.; Ruan, Y.; Streit, R. PMHT: Problems and some solutions. IEEE Trans. Aerosp. Electron. Syst. 2002, 38, 738–754. [Google Scholar] [CrossRef]
  8. Mahler, R.P.S. ”Statistics 103” for Multitarget Tracking. Sensors 2019, 19, 202. [Google Scholar] [CrossRef] [PubMed]
  9. Mahler, R.P.S. Multitarget Bayes Filtering via First-Order Multitarget Moments. IEEE Trans. Aerosp. Electron. Syst. 2003, 39, 1152–1178. [Google Scholar] [CrossRef]
  10. Yang, F.; Tang, W.; Liang, Y. A novel track initialization algorithm based on random sample consensus in dense clutter. Int. J. Adv. Robot. Syst. 2018, 15. [Google Scholar] [CrossRef] [Green Version]
  11. Vo, B.T.; Vo, B.N.; Cantoni, A. Analytic Implementations of the Cardinalized Probability Hypothesis Density Filter. IEEE Trans. Signal Process. 2007, 55, 3553–3567. [Google Scholar] [CrossRef]
  12. Vo, B.T.; Vo, B.N.; Cantoni, A. The Cardinality Balanced Multi-Target Multi-Bernoulli Filter and Its Implementations. IEEE Trans. Signal Process. 2009, 57, 409–423. [Google Scholar]
  13. Vo, B.T.; Vo, B.N. Labeled Random Finite Sets and Multi-Object Conjugate Priors. IEEE Trans. Signal Process. 2013, 61, 3460–3475. [Google Scholar] [CrossRef]
  14. Vo, B.N.; Vo, B.T.; Phung, D. Labeled Random Finite Sets and the Bayes Multi-Target Tracking Filter. IEEE Trans. Signal Process. 2014, 62, 6554–6567. [Google Scholar] [CrossRef] [Green Version]
  15. Reuter, S.; Vo, B.T.; Vo, B.N.; Dietmayer, K. The Labeled Multi-Bernoulli Filter. IEEE Trans. Signal Process. 2014, 62, 3246–3260. [Google Scholar]
  16. Doucet, A.; Johansen, A.M. A Tutorial on Particle Filtering and Smoothing: Fifteen years later. In The Oxford Handbook of Nonlinear Filtering; Crisan, D., Rozovskii, B., Eds.; Oxford University Press: Oxford, UK, 2008; pp. 656–704. [Google Scholar]
  17. Mahler, R.P.S.; Vo, B.T. Forward-Backward Probability Hypothesis Density Smoothing. IEEE Trans. Aerosp. Electron. Syst. 2012, 48, 707–728. [Google Scholar] [CrossRef]
  18. Vo, B.N.; Vo, B.T.; Mahler, R.P.S. Closed-form solutions to forward-backward smoothing. IEEE Trans. Signal Process. 2011, 60, 2–17. [Google Scholar] [CrossRef]
  19. Li, T.; Chen, H.; Sun, S.; Corchado, J.M. Joint Smoothing and Tracking Based on Continuous-Time Target Trajectory Function Fitting. IEEE Trans. Autom. Sci. Eng. 2019, 16, 1476–1483. [Google Scholar] [CrossRef]
  20. Vo, B.T.; Clark, D.; Vo, B.N.; Ristic, B. Bernoulli Forward-Backward Smoothing for Joint Target Detection and Tracking. IEEE Trans. Signal Process. 2011, 59, 4473–4477. [Google Scholar] [CrossRef]
  21. Wong, S.; Vo, B.T.; Papi, F. Bernoulli Forward-Backward Smoothing for Track-Before-Detect. IEEE Signal Process. Lett. 2014, 21, 727–731. [Google Scholar] [CrossRef]
  22. Nadarajah, N.; Kirubarajan, T.L.T. Multitarget Tracking using Probability Hypothesis Density Smoothing. IEEE Trans. Aerosp. Electron. Syst. 2011, 47, 2344–2360. [Google Scholar] [CrossRef]
  23. Nagappa, S.; Clark, D.E. Fast Sequential Monte Carlo PHD Smoothing. In Proceedings of the International Conference on Information Fusion, Chicago, IL, USA, 5–8 July 2011; pp. 1819–1825. [Google Scholar]
  24. He, X.; Liu, G. Improved Gaussian mixture probability hypothesis density smoother. Signal Process. 2016, 120, 56–63. [Google Scholar] [CrossRef]
  25. Nagappa, S.; Delande, E.D.; Clark, D.E.; Houssineau, J. A Tractable Forward-Backward CPHD Smoother. IEEE Trans. Aerosp. Electron. Syst. 2017, 53, 201–217. [Google Scholar] [CrossRef]
  26. Clark, D.E. First-moment multi-object forward-backward smoothing. In Proceedings of the International Conference on Information Fusion, Edinburgh, UK, 26–29 July 2010; pp. 1–6. [Google Scholar]
  27. Dong, L.; Hou, C.; Yi, D. Multi-Bernoulli smoother for multi-target tracking. Aerosp. Sci. Technol. 2016, 48, 234–245. [Google Scholar]
  28. Vo, B.T.; Vo, B.N.; Gia, H. An Efficient Implementation of the Generalized Labeled Multi-Bernoulli Filter. IEEE Trans. Signal Process. 2017, 65, 1975–1987. [Google Scholar] [CrossRef]
  29. Papi, F.; Vo, B.T.; Vo, B.N.; Fantacci, C.; Beard, M. Generalized Labeled Multi-Bernoulli approximation of Multi-object densities. IEEE Trans. Signal Process. 2015, 63, 5487–5497. [Google Scholar] [CrossRef]
  30. Beard, M.; Vo, B.T.; Vo, B.N.; Arulampalam, S. Void probabilities and Cauchy-Schwarz divergence for Generalized Labeled Multi-Bernoulli models. IEEE Trans. Signal Process. 2017, 65, 5047–5061. [Google Scholar] [CrossRef]
  31. Li, S.; Wei, Y.; Hoseinnezhad, R.; Wang, B.; Kong, L.J. Multi-object Tracking for Generic Observation Model Using Labeled Random Finite Sets. IEEE Trans. Signal Process. 2018, 66, 368–383. [Google Scholar] [CrossRef]
  32. Beard, M.; Vo, B.T.; Vo, B.N. Generalised labelled multi-Bernoulli forward-backward smoothing. In Proceedings of the International Conference on Information Fusion, Heidelberg, Germany, 5–8 July 2016; pp. 1–7. [Google Scholar]
  33. Chen, L. From labels to tracks: It’s complicated. Proc. SPIE 2018, 10646. [Google Scholar] [CrossRef]
  34. Vo, B.N.; Vo, B.T. A Multi-Scan Labeled Random Finite Set Model for Multi-Object State Estimation. IEEE Trans. Signal Process. 2019, 67, 4948–4963. [Google Scholar] [CrossRef] [Green Version]
  35. Li, T. Single-Road-Constrained Positioning Based on Deterministic Trajectory Geometry. IEEE Commun. Lett. 2019, 23, 80–83. [Google Scholar] [CrossRef]
  36. Li, T.; Wang, X.; Liang, Y.; Yan, J.; Fan, H. A Track-oriented Approach to Target Tracking with Random Finite Set Observations. In Proceedings of the ICCAIS 2019, Chengdu, China, 23–26 October 2019. [Google Scholar]
  37. Liu, R.; Fan, H.; Xiao, H. A forward-backward labeled Multi-Bernoulli smoother. In Proceedings of the International Conference on Distributed Computing and Artificial Intelligence, Avila, Spain, 26–28 June 2019; pp. 253–261. [Google Scholar]
  38. Li, T.; Su, J.; Liu, W.; Corchado, J.M. Approximate Gaussian conjugacy: Recursive parametric filtering under nonlinearity, multimodality, uncertainty, and constraint, and beyond. Front. Inf. Technol. Electron. Eng. 2017, 18, 1913–1939. [Google Scholar] [CrossRef]
  39. Li, T.; Bolic, M.; Djuric, P. Resampling methods for particle filtering: Classification, Implementation, and Strategies. IEEE Signal Process. Mag. 2015, 32, 70–86. [Google Scholar] [CrossRef]
  40. Schuhmacher, D.; Vo, B.T.; Vo, B.N. A consistent metric for performance evaluation of multi-object filters. IEEE Trans. Signal Process. 2008, 56, 3447–3457. [Google Scholar] [CrossRef]
Figure 1. The recursion of the multi-target Bayes forward–backward smoother.
Figure 2. The proposed LMB smoother framework.
Figure 3. The true and estimated trajectories of targets. Different trajectories estimated by the LMB smoother are denoted by different color dots, where ‘∘’ denotes the initiations and ‘Δ’ denotes the terminations of the trajectories.
Figure 4. The true and estimated trajectories of targets in x and y coordinates, respectively.
Figure 5. The true and estimated cardinalities of the PHD smoother, the MeMBer smoother, the LMB filter and the proposed LMB smoother.
Figure 6. The estimated cardinality errors over time.
Figure 7. The estimated standard deviations of the cardinality over time.
Figure 8. OSPA errors yielded by different methods.
Figure 9. The average execution time of each time step for different methods.
Table 1. Average OSPA miss distances of different methods over all time steps.

Method              Total OSPA (m)    Localization Component (m)    Cardinality Component (m)
PHD filter          34.3821           26.4094                       7.9727
PHD smoother        27.3199           18.6526                       8.6673
CBMeMBer filter     30.2842           22.3566                       7.9276
MeMBer smoother     22.0721           14.6664                       7.4056
LMB filter          25.6800           22.6574                       3.0226
LMB smoother        15.1762           14.4932                       0.6830
