Article

Forklift Tracking: Industry 4.0 Implementation in Large-Scale Warehouses through UWB Sensor Fusion

1 Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
2 Department of Energy, Systems, Territory and Constructions Engineering, University of Pisa, 56122 Pisa, Italy
3 Institute of Electronics, Computer and Telecommunication Engineering (IEIIT), Italian National Research Council (CNR), 10129 Turin, Italy
* Author to whom correspondence should be addressed. Current address: Department of Information Engineering, University of Pisa, 56122 Pisa, Italy.
Appl. Sci. 2021, 11(22), 10607; https://doi.org/10.3390/app112210607
Submission received: 19 October 2021 / Revised: 4 November 2021 / Accepted: 6 November 2021 / Published: 11 November 2021
(This article belongs to the Special Issue Advanced Sensors and Sensing Technologies for Indoor Localization)

Abstract:
This article addresses the problem of determining the location of pallets carried by forklifts inside a warehouse, which are recognized thanks to an onboard Radio Frequency IDentification (RFID) system at the ultra-high-frequency (UHF) band. By reconstructing the forklift trajectory and orientation, the location of the pallets can be associated with the forklift position at the time of unloading events. The localization task is accomplished by means of an easy-to-deploy combination of onboard sensors, i.e., an inertial measurement unit (IMU) and an optical flow sensor (OFS), with a commercial ultra-wideband (UWB) system through an Unscented Kalman Filter (UKF) algorithm, which estimates the forklift pose over time. The proposed sensor fusion approach contributes to localization error mitigation by preventing drifts in the trajectory reconstruction. The designed method was first evaluated by means of a simulation framework and then through an experimental analysis conducted in a large warehouse with a size of about 4000 m².

1. Introduction

Locating robots and other vehicles has become a key topic in recent years in the Industry 4.0 framework [1]. Locating industrial vehicles, such as forklifts or laser-guided vehicles (LGVs), is critical to improve the management of large warehouses [2]. It was observed that the sequence in which products are picked and transported by forklifts may account for 55% of the total cost of warehouse operations [3,4]. Therefore, optimizing this process through a continuous monitoring of forklift routes can contribute significantly to the plant efficiency. Additionally, localization of forklifts or other vehicles is the first necessary requirement to implement autonomous vehicles [5].
As field practitioners are now aware, the lack of a reliable Global Navigation Satellite System (GNSS) signal in indoor environments, such as industrial plants and warehouses, makes it necessary to develop localization solutions based on alternative technologies [6]. In many applications, the vehicle coordinates, together with the vehicle orientation, i.e., the vehicle pose, must be estimated [7]. A vehicle can be theoretically localized through proprioceptive sensors that measure its ego-motion parameters, such as inertial measurement units (IMUs) [8], rotary encoders [9], or optical flow sensors [10,11], through a dead reckoning process [12]. However, such a solution is not applicable in the long term, as the nature of these sensors leads to an accumulation of localization error and a consequent drift of the estimated trajectory. Therefore, to bound the localization uncertainty, proprioceptive sensor data are usually merged with exteroceptive sensor data, which provide information about the surrounding environment [13,14].
Several technologies exist that can be adopted as providers of external information in the sound [15,16,17], optical [18,19,20,21,22,23], and electromagnetic domains [24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56].
The sound domain is particularly suitable for performing vehicle localization through rotating sonars mounted on the vehicle [17], but it suffers from several problems, such as noise pollution, which can be significant in industrial environments.
Particularly interesting is the optical domain, which is widely represented in the state of the art [19,20,21,22,23]. It covers solutions based on 2D infrared [19] and 3D rotating Light Detection And Ranging (LiDAR) [20] sensors, as well as computer vision (CV) techniques in the visible-light domain [21,22,23]. The first type of solution can achieve accurate localization performance thanks to the recent improvements in localization and simultaneous localization and mapping (SLAM) algorithms for mobile robots [19]. Rotating LiDARs can reconstruct the environment map and measure the vehicle trajectory at the same time, but they suffer from moving obstacles in their field of view, which may cause the map-creation process to fail; a crowded or dynamic environment may therefore be unsuitable for these systems. CV techniques based on visible-light cameras are robust, but suffer from similar issues. Moreover, their computational burden is high; poor light conditions may degrade the quality of the gathered images, and privacy issues can arise if the camera captures images/videos of people.
Some technologies in the electromagnetic domain, instead, are more robust to environmental effects. For example, if the area to be monitored is small, magnetic systems can be used [24,25,26]. Magnetic fields are also useful in vehicle localization through the magnetometers contained in IMUs, which are used to calibrate gyroscope biases and to improve tracking performance [27].
Radio frequency (RF) technologies lend themselves to several distance-measurement strategies, depending on the available bandwidth. For solving the localization task, most widespread radio wave technologies require an infrastructure of devices installed at known locations, namely anchors. Onboard equipment then exchanges a signal with the anchors to measure the vehicle–anchor distance, which is exploited to compute the vehicle location through multilateration algorithms [28]. The vehicle–anchor distance can be measured by leveraging the received signal power [29], or the Time Of Flight (TOF) if the available communication bandwidth is sufficient [30]. Wi-Fi signals at 2.4 GHz are very widespread, as they can usually take advantage of a pre-existing infrastructure for data exchange inside the plant/warehouse. The Wi-Fi service bandwidth is limited to 85 MHz, which is not sufficient to perform TOF measurements; therefore, distance measurements are made by fitting electromagnetic path loss models, which link the received power to the distance between two devices [31,32]. Unfortunately, received power measurements are strongly affected by multipath propagation phenomena. Wi-Fi fingerprinting techniques can circumvent this problem by avoiding distance measurements based on path loss models, at the expense of long calibration procedures, which are hard to carry out in large industrial environments [33].
Recently, Radio Frequency IDentification (RFID) technology has become popular for mobile robot localization [34], especially passive RFID technology at the ultra-high frequency (UHF) band. Such systems have the advantage of not requiring any power supply to the anchors, but their reading range is limited to a few meters; therefore, it could be difficult to implement such a system in a large industrial environment, unless a low-density deployment of RFID anchors is achieved [35].
Ultra-wideband (UWB) systems were expensive in the past; however, commercial solutions now exist at a reduced cost, effectively making this technology one of the most promising for radio wave-based ranging measurements [36,37]. The large bandwidth allows for precise and accurate ranging measurements based on the time of arrival (TOA) and time difference of arrival (TDOA) schemes, with an error of a few cm and a signal coverage of 20–30 m for each anchor. Therefore, by deploying a fine-grained infrastructure of UWB anchors, the vehicle location can be estimated through a multilateration algorithm, which leverages the estimated distances from an onboard UWB tag [38,39,40]. With a pair of UWB tags installed onboard, the vehicle orientation can be measured, too, at the expense of more complex signal processing [41].
UWB systems suffer from multipath phenomena, and they are sensitive to the environment [42]. However, thanks to machine learning and other techniques [43,44,45,46,47], the effect of detrimental multipath or non-line-of-sight (NLOS) conditions can be mitigated. Alternatively, multi-sensor data fusion, also known as sensor fusion, can be used to assist the UWB system in the localization task, as proposed in this paper [48,49,50,51,52,53,54,55,56].
Generally speaking, sensor fusion localization schemes combine the sensed data from different technologies to obtain a more accurate and reliable location estimate. The fused sensors can be of different natures, and it is common to find in the literature solutions that combine several exteroceptive sensors [49,50,51,52,53,54]. However, the installation of multiple exteroceptive sensor systems can be wasteful in terms of time, cost, and computational burden. Therefore, the fusion of proprioceptive sensors with a single exteroceptive sensor is preferable [55,56]. Most of these schemes rely on recursive dynamic state estimators, such as the Kalman Filter (KF) and its variants [57], or Monte Carlo estimators, such as the Particle Filter (PF) and its variants [58]. These methods foresee two stages: (i) the prediction step, where the proprioceptive sensors are used to compute a first hypothesis on the vehicle motion; (ii) the update step, where the exteroceptive sensors are used to correct this hypothesis and to reduce the error by preventing trajectory drifts and unbounded growth of the localization uncertainty.
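As a concrete illustration, the two-stage recursion common to these estimators can be sketched as the following loop (a minimal Python sketch; the `predict`/`update` callables are placeholders for the estimator-specific steps, not part of any particular filter):

```python
def track(s0, P0, controls, measurements, predict, update):
    """Generic two-stage recursive state estimation loop.

    `predict` propagates state and covariance with proprioceptive data
    (stage i); `update` corrects them with exteroceptive data (stage ii).
    A measurement of None models a missing exteroceptive fix.
    """
    s, P = s0, P0
    history = []
    for u, z in zip(controls, measurements):
        s, P = predict(s, P, u)      # hypothesis from proprioceptive sensors
        if z is not None:            # exteroceptive fix may be intermittent
            s, P = update(s, P, z)   # correction bounds the uncertainty
        history.append(s)
    return history
```

With a KF, UKF, or PF plugged in as `predict`/`update`, the same skeleton applies; only the internal representation of the state distribution changes.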
In this paper, we present an RFID Smart Forklift designed to operate within a Warehouse 4.0 for tissue paper storage. The products are organized in pallets, which are tagged with passive UHF-RFID tags that allow their identification. The forklift is equipped with a series of sensors that enable its self-localization thanks to an on-site processing unit. In particular, the smart forklift is equipped with a UWB tag, which receives the signal from a set of UWB anchors installed at the warehouse ceiling, an IMU, an optical flow sensor (OFS) [10,11], and four short-range time-of-flight laser-ranging distance sensors used to monitor the height above ground of the optical sensor. The OFS consists of a small infrared camera pointing at the floor to measure the forklift ego-motion, so it is not affected by environmental issues, such as poor light conditions or moving people and obstacles. In addition, the forklift is equipped with a UHF-RFID reader connected to four UHF-RFID antennas for pallet recognition, and an ultrasonic sensor to verify the forklift loaded/unloaded status. Through a sensor fusion tracking algorithm based on the Unscented Kalman Filter (UKF) [59], the trajectory and orientation of the forklift are reconstructed by fusing the data acquired by the UWB system and the onboard sensors. The position of the pallets is then simply associated with that of the forklift at the time of unloading events. All loading/unloading events and the position of the forklift over time are sent via a Wi-Fi connection to a central server for operations management. In this way, it is possible to trace the movements of all the goods inside the warehouse. The focus of the work is to show how the proposed tracking algorithm ensures good localization accuracy and a correct estimation of the vehicle orientation, which is essential to accurately determine the exact position of the unloaded pallets.
The advantages of the proposed tracking method are as follows: (i) a single UWB tag and the onboard kinematic sensors allow the forklift orientation to be correctly reconstructed; (ii) the optical flow sensor, considered here as a proprioceptive sensor, is used in combination with the UWB technology to obtain high-accuracy tracking performance; (iii) the sensor fusion scheme only foresees a single exteroceptive sensor technology for providing external environmental data (i.e., the UWB system), so it is simple and easy to install; (iv) the computational burden is low, and the method can be fruitfully exploited for real-time tracking.
The paper is structured as follows: Section 2 provides an algebraic description of the proposed localization method, describing the UKF design and the UWB localization technique. Section 3 presents a simulated analysis to verify the capability of the localization algorithm. Section 4 presents a detailed description of the experimental equipment, the performed trials, and the experimental results. Finally, Section 5 draws conclusions and outlines future work.

2. UWB Forklift Localization Method

2.1. Forklift Motion Model

In this section, we present the forklift 2D tracking method based on the integration of IMU, OFS, and UWB sensors in a sensor-fusion algorithm. The tracking method is based on dynamic system state estimation theory and relies on a UKF estimator. A forklift is usually a heavy vehicle with relatively high inertia, and can be modeled as a rigid body of size L × W × H, where L, W, and H are the length, width, and height of the vehicle, respectively. In this analysis, we suppose that the IMU and OFS are placed at the same location, whereas the onboard UWB tag is at a different one. This configuration will also be adopted in the experimental analysis in Section 4.
First, we define a right-handed local reference frame N = { O n , X n , Y n , Z n } as the reference frame of the target area, which could be a plant or a warehouse (we will refer to N as the plant/warehouse reference frame). O n indicates the origin of the reference frame and X n , Y n , and Z n are the x-, y- and z-axes, respectively.
We define at each timestep k a body reference frame B(k) = { O_b(k), X_b(k), Y_b(k), Z_b(k) }, where O_b(k) is the origin of the reference frame, which is centered on the IMU/OFS location, whereas X_b(k), Y_b(k), and Z_b(k) are the local x-, y- and z-axes at timestep k, respectively. The IMU and OFS sensor reference frames are considered oriented as the forklift body frame. The local forklift reference frame and the plant/warehouse reference frame are then linked by a rigid roto-translational transformation, which depends on the location [ x_k, y_k ]^T and orientation θ_k of the forklift at each timestep k, as depicted in Figure 1, where [ x_k, y_k ]^T corresponds to O_b(k), and θ_k is the angle between X_b(k) and X_n. The UWB tag has a constant displacement in the body frame with respect to O_b(k), which can be defined as Δp_UWB = [ Δx_UWB, Δy_UWB, Δz_UWB ]^T. This offset must be translated into the plant/warehouse frame according to the estimated orientation θ_k.
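For illustration, translating the constant body-frame tag offset into the plant/warehouse frame amounts to a planar rotation by the heading θ_k (a Python sketch; the function name is hypothetical):

```python
import numpy as np

def uwb_offset_in_plant_frame(delta_p_uwb, theta_k):
    """Rotate the constant body-frame tag offset [dx, dy, dz] by the
    forklift heading theta_k, expressing it in the plant/warehouse frame.
    The z-component is unaffected by a rotation about the z-axis."""
    dx, dy, dz = delta_p_uwb
    c, s = np.cos(theta_k), np.sin(theta_k)
    return np.array([c * dx - s * dy, s * dx + c * dy, dz])
```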
The UWB tag location is estimated through a multilateration algorithm, which is supposed to involve a maximum of four UWB anchors at once, as in the experimental analysis. Without loss of generality, we suppose that the UWB system measures each UWB tag coordinate with an additive white Gaussian error with zero mean and standard deviation σ_UWB, namely ν_{x_k} ∼ N(0, σ_UWB²) and ν_{y_k} ∼ N(0, σ_UWB²) for the x- and y-coordinates, respectively. The noise standard deviation σ_UWB increases as the tag–anchor distance increases, due to a lower signal quality. However, as will be shown in the experimental-analysis section, the anchors are attached to the ceiling, and the forklift is constrained to travel in such a way that the distances to the four closest UWB anchors are always similar; consequently, the noise parameters can be considered constant.
The onboard sensors measure kinematic quantities with respect to the body frame. In particular, since the forklift is constrained to move on the ground, we consider the 2D acceleration a_k = [ a_{x_k}, a_{y_k} ]^T, the angular speed ω_k, and the 2D forklift velocity v_k = [ v_{x_k}, v_{y_k} ]^T. The accelerations are measured through the IMU and are considered affected by zero-mean additive white Gaussian noise, denoted ν_{a_{x_k}} ∼ N(0, σ_a²) and ν_{a_{y_k}} ∼ N(0, σ_a²), where σ_a is the acceleration-noise standard deviation. Similarly, the angular speed noise is ν_{ω_k} ∼ N(0, σ_ω²), where σ_ω is the angular-speed-noise standard deviation. Note that, since the forklift is a rigid body, the angular speed can be measured at any point of the vehicle, and it is not mandatory to measure it at the rotation center. The OFS measures the displacement of the forklift between two consecutive timesteps, thanks to a computer vision algorithm that analyzes floor images acquired at different timesteps. The velocity measurements can be computed by dividing the displacement by the sampling time Δt. For the OFS, the quantization noise is preponderant with respect to other noise sources, so the velocity measurements can be considered affected by two uniformly distributed random variables, q_{v_{x_k}} ∼ U([ −r/(2Δt), r/(2Δt) ]) and q_{v_{y_k}} ∼ U([ −r/(2Δt), r/(2Δt) ]), where r is the sensor spatial resolution and Δt is the sampling time.
The dynamic state vector is s k = [ x k , y k , v x k , v y k , a x k , a y k , ω k , θ k ] T R 8 × 1 . The dynamic state vector at timestep k + 1 , s k + 1 , can be written as a non-linear function of s k through the following state transition model:
\[
\mathbf{s}_{k+1} = f(\mathbf{s}_k) =
\begin{bmatrix}
x_{k+1} \\ y_{k+1} \\ v_{x_{k+1}} \\ v_{y_{k+1}} \\ a_{x_{k+1}} \\ a_{y_{k+1}} \\ \omega_{k+1} \\ \theta_{k+1}
\end{bmatrix} =
\begin{bmatrix}
x_k + \Delta t \,\left( v_{x_k} \cos\theta_k - v_{y_k} \sin\theta_k \right) \\
y_k + \Delta t \,\left( v_{x_k} \sin\theta_k + v_{y_k} \cos\theta_k \right) \\
v_{x_k} + \Delta t \, a_{x_k} \\
v_{y_k} + \Delta t \, a_{y_k} \\
a_{x_k} \\
a_{y_k} \\
\omega_k \\
\theta_k + \Delta t \, \omega_k
\end{bmatrix} \tag{1}
\]
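Assuming the state ordering defined above, the noise-free transition model of (1) can be sketched in Python as:

```python
import numpy as np

def f(s, dt):
    """Noise-free state transition of Equation (1).
    s = [x, y, vx, vy, ax, ay, w, theta], with body-frame velocities
    rotated by theta into the plant/warehouse frame."""
    x, y, vx, vy, ax, ay, w, th = s
    return np.array([
        x + dt * (vx * np.cos(th) - vy * np.sin(th)),
        y + dt * (vx * np.sin(th) + vy * np.cos(th)),
        vx + dt * ax,
        vy + dt * ay,
        ax,
        ay,
        w,
        th + dt * w,
    ])
```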
The high inertia of the forklift makes the assumption of a quasi-constant-velocity motion model between two consecutive timesteps reasonable. Therefore, we can suppose that the process noise only applies to the acceleration quantities and the angular speed. Without loss of generality, we will assume that the process noise has the same statistics as the measurement noise for the same variable. The process noise is not additive. In (2), we make explicit the presence of the state noise vector w_k = [ ν̂_{a_{x_k}}, ν̂_{a_{y_k}}, ν̂_{ω_k}, q̂_{v_{x_k}}, q̂_{v_{y_k}} ]^T in the state transition model:
\[
\mathbf{s}_{k+1} = f(\mathbf{s}_k, \mathbf{w}_k) =
\begin{bmatrix}
x_{k+1} \\ y_{k+1} \\ v_{x_{k+1}} \\ v_{y_{k+1}} \\ a_{x_{k+1}} \\ a_{y_{k+1}} \\ \omega_{k+1} \\ \theta_{k+1}
\end{bmatrix} =
\begin{bmatrix}
x_k + \Delta t \left[ \left( v_{x_k} + \hat{q}_{v_{x_k}} \right) \cos\theta_k - \left( v_{y_k} + \hat{q}_{v_{y_k}} \right) \sin\theta_k \right] \\
y_k + \Delta t \left[ \left( v_{x_k} + \hat{q}_{v_{x_k}} \right) \sin\theta_k + \left( v_{y_k} + \hat{q}_{v_{y_k}} \right) \cos\theta_k \right] \\
v_{x_k} + \hat{q}_{v_{x_k}} + \Delta t \left( a_{x_k} + \hat{\nu}_{a_{x_k}} \right) \\
v_{y_k} + \hat{q}_{v_{y_k}} + \Delta t \left( a_{y_k} + \hat{\nu}_{a_{y_k}} \right) \\
a_{x_k} + \hat{\nu}_{a_{x_k}} \\
a_{y_k} + \hat{\nu}_{a_{y_k}} \\
\omega_k + \hat{\nu}_{\omega_k} \\
\theta_k + \Delta t \left( \omega_k + \hat{\nu}_{\omega_k} \right)
\end{bmatrix} \tag{2}
\]
where the symbol "$\hat{\cdot}$" indicates the corresponding process noise variable. At each timestep k, the following set of measurements z_k = h(s_k, n_k) ∈ R^{7×1} is available, where n_k represents the set of measurement noises:
\[
\mathbf{z}_k = h(\mathbf{s}_k, \mathbf{n}_k) =
\begin{bmatrix}
\tilde{x}_k \\ \tilde{y}_k \\ \tilde{v}_{x_k} \\ \tilde{v}_{y_k} \\ \tilde{a}_{x_k} \\ \tilde{a}_{y_k} \\ \tilde{\omega}_k
\end{bmatrix} =
\begin{bmatrix}
x_k^{UWB} - \Delta x_{UWB} \cos\theta_k + \Delta y_{UWB} \sin\theta_k + \nu_{x_k} \\
y_k^{UWB} - \Delta x_{UWB} \sin\theta_k - \Delta y_{UWB} \cos\theta_k + \nu_{y_k} \\
v_{x_k} + q_{v_{x_k}} \\
v_{y_k} + q_{v_{y_k}} \\
a_{x_k} + \nu_{a_{x_k}} \\
a_{y_k} + \nu_{a_{y_k}} \\
\omega_k + \nu_{\omega_k}
\end{bmatrix} \tag{3}
\]
where x_k^{UWB} and y_k^{UWB} are the UWB tag coordinates measured in the plant/warehouse frame through the UWB system, which will be described in the next section. Thanks to Equations (2) and (3), a UKF can be adopted to estimate the forklift location in the plant/warehouse reference frame and its orientation. The UKF requires several parameters to be set before running, namely α, K, and β [59]: α and K size the spread of the sigma-points, whereas β must be chosen according to the data distribution. A typical recommendation is α = 10⁻³, K = 0, and β = 2 [60]. The initial covariance matrix P_0 ∈ R^{8×8} is set to the following:
\[
\mathbf{P}_0 = \mathrm{diag}\!\left( \sigma_{UWB}^2,\; \sigma_{UWB}^2,\; \frac{r^2}{12\,\Delta t^2},\; \frac{r^2}{12\,\Delta t^2},\; \sigma_a^2,\; \sigma_a^2,\; \sigma_\omega^2,\; \sigma_\theta^2 \right) \tag{4}
\]
where σ θ is the standard deviation of the forklift initial orientation error.
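For reference, the diagonal initial covariance above can be assembled directly from the sensor parameters (an illustrative Python sketch; the numerical values are those reported later in Section 3):

```python
import numpy as np

# Noise parameters: UWB coordinate noise (m), acceleration noise (m/s^2),
# angular speed noise (rad/s), OFS resolution (m), sampling time (s),
# initial orientation uncertainty (rad).
sigma_uwb, sigma_a, sigma_w = 0.15, 16.7e-3, 2.4e-3
r, dt, sigma_theta = 0.023, 0.01, 0.1

# Variance of the OFS quantization noise mapped onto velocity:
# a uniform noise on [-r/(2*dt), r/(2*dt)] has variance r^2 / (12*dt^2).
q_v = r**2 / (12 * dt**2)

# Initial state covariance, following the diagonal structure of P_0.
P0 = np.diag([sigma_uwb**2, sigma_uwb**2, q_v, q_v,
              sigma_a**2, sigma_a**2, sigma_w**2, sigma_theta**2])
```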

2.2. UWB Positioning

Several UWB anchors are deployed in the target area to perform the forklift localization through a simple multilateration algorithm. Since the commercial system adopted in the experimental analysis can use up to four anchors per timestep, the following method is described by considering only four anchors, even if more are detected.
Let us consider a set of N U W B UWB anchors deployed in the warehouse at known locations p u , j = [ x u , j , y u , j , z u , j ] T , j [ 1 , , N U W B ] , as outlined in Figure 2. At each timestep k, we can write the vector r k of N U W B elements, in which the j-th element r k ( j ) { 0 , 1 } is 1 if the j-th UWB anchor data are available at the timestep k, and it is 0 otherwise. Let us assume that, at each timestep, at least four anchors are detected by the onboard UWB tag, and a distance measurement d i is available for each anchor, where i represents the index of an anchor in the set of available anchors at the timestep k. For simplicity, let us assume that the set of indexes i = [ 1 , 2 , 3 , 4 ] is always chosen for each timestep.
At each timestep, the following equations system can be written:
\[
\begin{cases}
\left( x_{u,1} - x_k^{UWB} \right)^2 + \left( y_{u,1} - y_k^{UWB} \right)^2 + \Delta z^2 = d_1^2 \\
\left( x_{u,2} - x_k^{UWB} \right)^2 + \left( y_{u,2} - y_k^{UWB} \right)^2 + \Delta z^2 = d_2^2 \\
\left( x_{u,3} - x_k^{UWB} \right)^2 + \left( y_{u,3} - y_k^{UWB} \right)^2 + \Delta z^2 = d_3^2 \\
\left( x_{u,4} - x_k^{UWB} \right)^2 + \left( y_{u,4} - y_k^{UWB} \right)^2 + \Delta z^2 = d_4^2
\end{cases} \tag{5}
\]
where Δz = z_{u,j} − z_k^{UWB} is considered constant for every timestep and every anchor. The system can be linearized by subtracting the first row from all the others; this operation cancels all of the quadratic unknown terms, as well as the constant Δz² term. After some algebraic manipulation, we obtain:
\[
\begin{bmatrix}
x_{u,2} - x_{u,1} & y_{u,2} - y_{u,1} \\
x_{u,3} - x_{u,1} & y_{u,3} - y_{u,1} \\
x_{u,4} - x_{u,1} & y_{u,4} - y_{u,1}
\end{bmatrix}
\begin{bmatrix} x_k^{UWB} \\ y_k^{UWB} \end{bmatrix}
= \frac{1}{2}
\begin{bmatrix}
\left( d_1^2 - d_2^2 \right) + \left( x_{u,2}^2 + y_{u,2}^2 \right) - \left( x_{u,1}^2 + y_{u,1}^2 \right) \\
\left( d_1^2 - d_3^2 \right) + \left( x_{u,3}^2 + y_{u,3}^2 \right) - \left( x_{u,1}^2 + y_{u,1}^2 \right) \\
\left( d_1^2 - d_4^2 \right) + \left( x_{u,4}^2 + y_{u,4}^2 \right) - \left( x_{u,1}^2 + y_{u,1}^2 \right)
\end{bmatrix} \tag{6}
\]
which is an over-determined linear system of the form
\[
\mathbf{A}\mathbf{x} = \mathbf{b} \tag{7}
\]
which can be solved by computing the Moore–Penrose pseudo-inverse of \(\mathbf{A}\), denoted \(\mathbf{A}^{\dagger}\), to get a measurement of [ x_k^{UWB}, y_k^{UWB} ]^T:
\[
\begin{bmatrix} x_k^{UWB} \\ y_k^{UWB} \end{bmatrix} = \mathbf{x} = \left( \mathbf{A}^{T}\mathbf{A} \right)^{-1} \mathbf{A}^{T} \mathbf{b} = \mathbf{A}^{\dagger}\mathbf{b} \tag{8}
\]
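The whole linearization and pseudo-inverse solution fits in a few lines of Python (an illustrative sketch; note the factor 1/2 that arises when the first equation is subtracted from the others, and that the constant Δz² term cancels in the subtraction, so the raw measured ranges can be used directly):

```python
import numpy as np

def multilaterate_xy(anchors_xy, d):
    """Least-squares 2D multilateration of a UWB tag.

    anchors_xy : (N, 2) known planar anchor coordinates, N >= 3
    d          : (N,) measured tag-anchor distances (3D ranges are fine,
                 since a constant height offset cancels in the subtraction)
    """
    A = anchors_xy[1:] - anchors_xy[0]          # rows: [x_uj - x_u1, y_uj - y_u1]
    sq = np.sum(anchors_xy**2, axis=1)          # x_uj^2 + y_uj^2 per anchor
    b = 0.5 * ((d[0]**2 - d[1:]**2) + sq[1:] - sq[0])
    return np.linalg.pinv(A) @ b                # Moore-Penrose pseudo-inverse
```

With four anchors, the system has three equations in two unknowns, and the pseudo-inverse yields the least-squares solution.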

3. Numerical Analysis

The performance of the proposed UKF algorithm is demonstrated here through a numerical analysis. We consider an indoor environment where N_UWB = 35 UWB anchors are deployed according to a regular 7 × 5 grid in a 40 m × 25 m area at the ceiling, at a height h = 6 m. The anchor spacing is about 6.6 m along the x-coordinate and 6.25 m along the y-coordinate. The forklift is here considered as a single point, with all of the sensor payloads (IMU, OFS, and UWB tag) placed at the same point; therefore, Δx_UWB = Δy_UWB = 0 m. The height of the UWB tag is 2 m, so Δz = 4 m. The sampling time Δt is set to 10 ms. At each timestep k, the algorithm exploits the measured distances between the UWB tag and the four closest UWB anchors to perform the multilateration. At the same time, the IMU and the OFS generate acceleration, angular speed, and velocity data. In accordance with the datasheets of the sensors that will be used for the experimental analysis, we set σ_a = 16.7 mm/s², σ_ω = 2.4 mrad/s, r = 2.3 cm, σ_UWB = 0.15 m, and σ_θ = 0.1 rad. The forklift starts a "square-wave-shaped" path from location [ x_0, y_0 ]^T = [ 0, 0 ]^T m with an orientation θ_0 = π/2 rad, and travels at a speed of 0.75 m/s until the location [ x_{N_c}, y_{N_c} ]^T = [ 32.6, 15 ]^T m after N_c = 10,000 algorithm steps. The path length is approximately 75 m. The UKF is run with α = 10⁻³, β = 2, and K = 0.
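The anchor grid and the noisy range generation used by such a simulation can be sketched as follows (an illustrative Python sketch under the stated assumptions; the names and the exact grid construction are ours, not the authors' simulation code):

```python
import numpy as np

rng = np.random.default_rng(0)

# 7 x 5 grid of ceiling anchors over the 40 m x 25 m area (h = 6 m):
# ~6.6 m spacing along x, 6.25 m spacing along y.
xs = np.linspace(0.0, 40.0, 7)
ys = np.linspace(0.0, 25.0, 5)
anchors = np.array([[x, y] for x in xs for y in ys])   # 35 anchors

def closest_four(tag_xy):
    """Indices of the four anchors nearest to the tag (planar distance)."""
    d = np.linalg.norm(anchors - tag_xy, axis=1)
    return np.argsort(d)[:4]

def noisy_ranges(tag_xy, idx, dz=4.0, sigma=0.15):
    """Simulated tag-anchor 3D distances with additive Gaussian noise."""
    planar = np.linalg.norm(anchors[idx] - tag_xy, axis=1)
    return np.sqrt(planar**2 + dz**2) + rng.normal(0.0, sigma, size=len(idx))
```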
A top view of the performed simulated trajectory is depicted in Figure 3. The UWB anchors are the gray squared markers, and the ground truth trajectory is depicted as a continuous gray line. We represent, with a continuous blue line, the trajectory estimated with the UWB system only. The output of the dead-reckoning process using only the IMU and OFS is instead depicted as a dashed red line, whereas the output of the proposed UKF-based algorithm is the dotted green line.
As can be seen, the UWB system alone is not affected by drifts, but the output trajectory is not smooth; therefore, it is not easy to accurately estimate the vehicle orientation. The output of the dead-reckoning process (dashed red line) is a smooth curve, but since only proprioceptive sensors are involved, it is affected by a detrimental drift, mainly caused by the errors on the measured angular speed, which have a great impact on the vehicle orientation estimation. The output of the UKF (dotted green line) is a smooth trajectory without drifts, meaning that the filter converges. As expected, the estimated trajectory moves away from the ground truth along the curves, but the algorithm is able to recover the track in a few steps. In Figure 4, we depict the localization errors along the x-coordinate, ϵ_x (Figure 4a), along the y-coordinate, ϵ_y (Figure 4b), and the distance error ϵ_d = √(ϵ_x² + ϵ_y²) (Figure 4c). An initial transient stage is clearly observable in the figures, during which the UKF must achieve convergence, which is indeed reached in a few seconds.
To better investigate the method performance, a Monte Carlo analysis with M = 100 test cases with features similar to the one represented in Figure 3 was conducted. The cumulative distribution function of the distance error ϵ_d for the different tracking methods is represented in Figure 5. As is apparent, the error of the UKF is bounded by 0.5 m. Figure 6, instead, shows the histogram of the orientation error ϵ_θ. It can be noticed that the orientation error is lower for the UKF. In particular, the global mean orientation errors are ϵ̄_θ^UWB = 0.31 rad, ϵ̄_θ^{IMU+OFS} = 0.26 rad, and ϵ̄_θ^UKF = 0.01 rad, whereas the standard deviations are σ_θ^UWB = 2.11 rad, σ_θ^{IMU+OFS} = 0.11 rad, and σ_θ^UKF = 0.05 rad. This means that the accuracy of the UWB system alone, with a single tag, is not enough to correctly retrieve the forklift orientation.
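Statistics of this kind can be computed from estimated and ground-truth trajectories with a short helper (an illustrative Python sketch; the wrap-to-(−π, π] step applied to the orientation error is our assumption on how angular errors are handled):

```python
import numpy as np

def error_stats(est_xy, true_xy, est_theta, true_theta):
    """Distance-error samples (sorted, ready for an empirical CDF) and
    orientation-error mean/standard deviation for one trajectory."""
    eps = est_xy - true_xy
    eps_d = np.hypot(eps[:, 0], eps[:, 1])      # distance error per step
    # wrap the orientation error to (-pi, pi] before averaging
    eps_th = np.angle(np.exp(1j * (est_theta - true_theta)))
    return np.sort(eps_d), eps_th.mean(), eps_th.std()
```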

3.1. Effect of Forklift Speed

The forklift speed might impact the UKF performance. To check this, we simulated the same forklift trajectory as that depicted in Figure 3, but varied the forklift speed over the values v = [ 0.75, 1.11, 1.67, 2.22, 2.75 ] m/s. The number of samples for each trajectory was resized in order to keep the path length unchanged at 75 m for all trials. For each speed value, we performed 100 test cases and computed the mean value of ϵ_d, denoted here as ϵ̄_d. The results are shown in Figure 7. The UWB performance is unchanged here due to the assumed simulation model. Instead, the mean distance error of the dead-reckoning process (circular red markers) increases with the forklift speed, and so does the mean distance error of the UKF, since the sensor fusion approach relies on both UWB and proprioceptive sensor data. On the other hand, the UKF bounds the increase of the mean distance error, which remains lower than the error committed by the UWB system, even when the forklift travels at 2.75 m/s. Given that forklifts usually cannot travel faster than 2–3 m/s for safety reasons, we can conclude that the UKF is robust with respect to an increase of the forklift speed.

3.2. Effect of Initial Uncertainty

Tracking algorithms should be robust with respect to errors in the filter initialization. Positioning systems such as UWB are not affected by this problem; dead reckoning, instead, is invalidated by a large initialization error. A sensor fusion algorithm based on sequential estimators, such as the UKF, should converge even if an initial error on the dynamic state estimate is present. For this purpose, we analyzed the method performance when the matrix P_0 changes. We considered different values of σ_UWB and σ_θ to run the method with 11 different configurations of the initial error standard deviation. The values of the changed parameters in each configuration are shown in Table 1. For each configuration, we ran the UKF 100 times and observed the mean distance error ϵ̄_d.
The results are depicted in Figure 8 for a forklift speed of v = 0.75 m/s. As can be noticed, the UKF performance is not influenced by the initial uncertainty, since the mean localization error does not change with different initial conditions.

4. Experimental Analysis

4.1. The RFID Smart Forklift

This section describes the sensors installed on the forklift. Forklifts are electrically driven vehicles and exist in three- and four-wheel versions. The employed forklift is a four-wheel OM Still RX 20-18P/Li-Ion. It has front-wheel drive, so traction is provided by the two front wheels, which can also rotate to steer the vehicle.
The odometry data were not available; therefore, we could not adopt the unicycle or bicycle vehicle model. However, thanks to the presence of the IMU and the OFS, it is still possible to reconstruct the dynamic state model of the vehicle.
The sizes of the vehicle are L = 2.78 m, W = 1.14 m, H = 2.08 m. The onboard hardware is the following:
  • a Decawave DWM1001 UWB module [61] configured as a tag;
  • a box containing the kinematic proprioceptive sensors: one Agilent ADNS3080 optical flow sensor [62], four STMicroelectronics VL53L0X Time-of-Flight laser-ranging distance sensors [63], and one STMicroelectronics LSMDS3 inertial measurement unit platform [64];
  • a RENESAS SK-S7G2 microcontroller [65].
The forklift also has the RFID hardware for the detection of the loaded pallets installed onboard:
  • a UHF-RFID reader Impinj Speedway Revolution R420 [66];
  • two UHF-RFID Keonn Advantenna-p11 antennas for loaded pallet recognition [67];
  • two UHF-RFID Alien A0501 antennas for stocked pallet detection and localization [68].
A picture of the RFID Smart Forklift is shown in Figure 9. The box containing the OFS, the four Time-of-Flight laser-ranging distance sensors, and the IMU was installed close to the right-front wheel of the forklift at a height of 15.5 cm, in such a way that the OFS camera pointed towards the floor. The box is assumed as the origin of the body reference frame O b ( k ) . The OFS and the IMU were jointly used for the sensor fusion localization algorithm together with the UWB data, whereas the Time-of-Flight laser-ranging distance sensors were used to accurately measure the height of the OFS from the floor and keep track of possible height variations of the sensor. Such measurements are necessary to calibrate the sensor measurements, which turns out to have a spatial resolution of r = 2.3 cm when the sensor is placed at a 15.5 cm height. According to the datasheet of the LSMDS3 IMU, the acceleration noise has a standard deviation σ a = 16.7 mm/s2, whereas the angular speed noise exhibits a standard deviation σ ω = 2.4 mrad/s.
Since the UWB anchors are installed at the room ceiling, the UWB tag is positioned on the roof of the forklift to ensure its visibility. The smart forklift is equipped with a Decawave DWM1001 UWB module of the MDEK1001 development kit, placed on the forklift roof with a displacement of Δ p UWB = [ Δ x U W B , Δ y U W B , Δ z U W B ] T = [ 0.61 , 0.56 , 2.04 ] T m with respect to the box containing the IMU and the OFS. Due to structural constraints of the forklift, it was not possible to place the UWB tag directly above the IMU and OFS, so it is also necessary to account for the fact that the 2D position measured by the UWB system does not correspond to that of the other two sensors. All onboard sensors are time-synchronized through the RENESAS SK-S7G2 microcontroller, which sets the sampling time to Δ t = 10 ms.
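Accounting for this offset amounts to a planar lever-arm correction, which can be sketched as follows (an illustrative rigid-body sketch; the function and constant names are hypothetical):

```python
import numpy as np

# Planar part of the measured tag displacement Delta_p_UWB = [0.61, 0.56] m.
DP_XY = np.array([0.61, 0.56])

def uwb_to_body_origin(p_uwb_xy, theta):
    """Map the 2D position reported for the UWB tag back to the body-frame
    origin O_b by removing the heading-rotated planar lever arm."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])  # rotation from body to warehouse frame
    return np.asarray(p_uwb_xy, dtype=float) - R @ DP_XY
```

Note that the correction depends on the heading θ, so within the filter the UWB measurement is coupled to the orientation state rather than being a direct observation of the body-frame origin.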

4.2. UWB Anchors

During the project, a demonstrator of the system was developed in one of the large warehouses of Sofidel S.p.A., by selecting a target area of 4560 m2 divided into two corridors. A total of 30 Decawave DWM1001 modules configured to work as UWB anchors were installed at the ceiling of the designated area, by taking advantage of the presence of six metallic bars at a height of around 6 m, oriented along the y-direction of the warehouse and spaced about 10 m apart along the x-direction (Figure 10a). The map of their deployment is shown in Figure 10b, where each UWB anchor is denoted by a blue asterisk. The location of the UWB anchors was measured through the Leica Flexline TS03 manual total station [69], a device equipped with lasers and a high-precision orientation meter that can measure the three-dimensional coordinates of a point in space with millimeter accuracy.

4.3. Results

To test the effectiveness of the UKF localization for the RFID Smart Forklift, tests were carried out in the warehouse. An adhesive tape was placed on the ground to create a reference path for the forklift driver during the trials. By analyzing the estimated trajectories and comparing them with the reference path, it was possible to evaluate the localization performance of the UWB system alone, of the dead-reckoning process (IMU + OFS), and of the UKF. It must be highlighted that the method used by the Decawave UWB system to estimate the tag location is not disclosed to users, but it is known that no more than four UWB anchors are employed at the same time (this is why the simulations in Section 3 considered only four anchors per timestep when computing the multilateration).
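For reference, a generic linearized least-squares multilateration over four anchors, of the kind adopted in the simulations, can be sketched as follows (illustrative only; the solver embedded in the commercial system may differ):

```python
import numpy as np

def multilaterate_2d(anchors, ranges):
    """Linearized least-squares 2D multilateration: subtracting the squared
    range equation of the first anchor from the others cancels the quadratic
    terms and yields a linear system in the unknown position."""
    anchors = np.asarray(anchors, dtype=float)  # shape (4, 2)
    ranges = np.asarray(ranges, dtype=float)    # shape (4,)
    p0, d0 = anchors[0], ranges[0]
    # For each remaining anchor i: 2 (p_i - p_0) . p = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
    A = 2.0 * (anchors[1:] - p0)
    b = (d0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - p0 @ p0)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With noisy ranges the least-squares solution oscillates around the true position, which is consistent with the jitter observed in the raw UWB trajectories.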
Figure 11 shows a photo of a part of the reference path indicated by the adhesive tape stuck on the ground. To make sure that the forklift was following the reference path accurately, the OFS mounted on the forklift was used. In fact, this sensor projects on the ground a red-light beam footprint, which can be used to check if the reference points marked with the tape are correctly crossed during the forklift motion.
Nine forklift trajectories were performed at different speeds. The main features of the test-case trajectories are summarized in Table 2 in terms of forklift forward speed v, path length L, path shape, and starting location and orientation.
Figure 12 shows two test-case trajectories, corresponding to cases IV (Figure 12a) and IX (Figure 12b), which have comparable forklift speeds (1.2 m/s and 1.6 m/s, respectively) but different path lengths (∼100 m and ∼289 m, respectively). The ground truth trajectory (continuous gray line) is the pre-determined reference path marked with the adhesive tape. The path measured with the UWB is depicted as a continuous blue line; it is the output of the Decawave commercial system after a moving-median filter that rejects outliers. The trajectory measured with the IMU and OFS only is the dashed red line, and the trajectory estimated by the UKF is the dotted green line. As can be seen, the dead-reckoning and UKF localization methods, thanks to the use of the onboard kinematic sensors, output very smooth trajectories, which are realistic given the high inertia of heavy industrial vehicles such as forklifts. On the contrary, the UWB localization system based on multilateration can exhibit large oscillations around the reference path, as, for example, in the straight portions of the trajectory represented in Figure 12b. However, for long trajectories, and consequently during the regular running of forklift operations in the warehouse, where the localization system must never be interrupted, the dead-reckoning suffers from drifts. An example of this phenomenon can be seen in Figure 12b: the forklift performs a closed-loop path, but the dashed red line does not return to its starting point.
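A moving-median outlier filter of the kind mentioned above can be sketched as follows (the actual post-processing of the commercial system is undocumented; the window length here is an arbitrary choice):

```python
import numpy as np

def moving_median(xy, window=5):
    """Sliding-window median applied per coordinate: a simple way to
    suppress isolated outliers in a sequence of 2D position fixes."""
    xy = np.asarray(xy, dtype=float)  # shape (N, 2)
    half = window // 2
    out = np.empty_like(xy)
    for k in range(len(xy)):
        lo, hi = max(0, k - half), min(len(xy), k + half + 1)
        out[k] = np.median(xy[lo:hi], axis=0)  # median over the window
    return out
```

Unlike a moving average, the median discards an isolated outlier entirely instead of smearing it over neighboring samples.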
To better highlight the tracking performance improvement given by the proposed UKF algorithm in these two test cases, Figure 13 shows the boxplots of the errors along the x-coordinate (Figure 13a for test case IV and Figure 13b for test case IX), the errors along the y-coordinate (Figure 13c for test case IV and Figure 13d for test case IX), the distance error (Figure 13e for test case IV and Figure 13f for test case IX), and the orientation error ϵ θ (Figure 13g for test case IV and Figure 13h for test case IX).
From the boxplots, it is possible to retrieve the mean localization errors in terms of distance error; such errors are reported in Table 3. As can be seen, the UKF is able to reduce both the localization and the orientation errors.

4.4. Effect of Forklift Speed

As expected, the forklift speed might impact the UKF performance. Therefore, an error analysis with varying forklift speed was also conducted via experimental tests. Considering test-case trajectories I–VI, the forklift speed assumes the values 0.75 m/s (test cases I and II), 1.6 m/s (test cases III and IV), and 2.75 m/s (test cases V and VI), with a path length of around 100 m. Figure 14 shows the mean distance error ϵ ¯ d as the forklift speed varies. Unlike in the analysis conducted with the numerical simulations (Figure 7), the error given by the UWB system varies with the speed: it decreases until the forklift speed reaches 1.6 m/s and then rises again. This effect could be explained by considering that at low speeds the forklift spends more time performing maneuvers and curves, in which the UWB system does not perform well, whereas at high speeds, as in test cases V and VI, the UWB system again performs worse. The onboard kinematic sensors, instead, perform better for low-speed trajectories, and their error increases with the forklift speed. Hence, by looking at Figure 14, it can be concluded that low forklift speeds are preferable for the proposed system, which in any case reduces the localization error to values always lower than those guaranteed by the UWB system alone or by the kinematic sensors (IMU + OFS). The minimum error obtained with the UKF is around 1 m for test cases I and II.

4.5. Effect of Initial Uncertainty

This section investigates whether the proposed UKF-based tracking algorithm is robust with respect to errors in the initialization of the filter. We analyzed the method performance by artificially introducing an error on the initial state s k , namely by changing the initial error covariance matrix P 0 according to the same values reported in Table 1. For each configuration, we ran the UKF and observed the mean distance error ϵ ¯ d when the forklift performed test-case trajectory IX. The results are depicted in Figure 15. As can be noticed, also in real conditions, the UKF performance is not influenced by the initial uncertainty. The trajectory reconstruction with the onboard sensors only, instead, is greatly influenced by the initial uncertainty, as is typical of dead-reckoning approaches.

4.6. Global Performance

Figure 16 shows the cumulative distribution function of the distance error ϵ d for the different tracking methods (UWB, IMU + OFS, UKF), considering all the test cases I–IX. As can be noticed, the error of the UKF is bounded by 4 m, which is acceptable for a tracking method in an industrial scenario of 4560 m2, also considering the forklift size. The forklift orientation, moreover, is better reconstructed with the UKF, as shown in the histogram in Figure 17. In particular, the global mean orientation errors are ϵ ¯ θ U W B = 0.1 rad, ϵ ¯ θ I M U + O F S = 0.18 rad, and ϵ ¯ θ U K F = 0.014 rad, whereas the standard deviations are σ θ U W B = 0.85 rad, σ θ I M U + O F S = 1.12 rad, and σ θ U K F = 0.22 rad. In accordance with what was observed in the simulated analysis, the UWB system with a single tag is still not able to correctly retrieve the forklift orientation, although it behaves better than in the simulations, thanks to the data processing implemented by the UWB system. The small orientation error committed by the UKF allows determining the forklift orientation at the time of pallet unloading and, therefore, better recognizing the location in which the pallet is unloaded.
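The empirical CDF used for this comparison can be computed in a few lines (an illustrative sketch; the helper name is not from the paper):

```python
import numpy as np

def empirical_cdf(errors):
    """Empirical CDF of the distance-error samples: returns the sorted
    error values and, for each, the fraction of samples at or below it."""
    e = np.sort(np.asarray(errors, dtype=float))
    p = np.arange(1, e.size + 1) / e.size
    return e, p
```

Plotting p against e for each method (UWB, IMU + OFS, UKF) then directly exposes the error bound: the UKF curve reaching 1 at 4 m corresponds to the statement that its error never exceeds 4 m.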

4.7. Computational Burden

Finally, before installing the localization algorithm on the onboard microcontroller, we verified the relatively low computational burden of the proposed method. For this analysis, the elaboration time of test cases I–IX was measured by processing the data on a laptop with an Intel(R) Core(TM) i7-7700HQ CPU @ 2.80 GHz and 16 GB of RAM through MATLAB code. The trajectory processing times are reported in Table 4. Considering the number of UKF timesteps, the average processing time per timestep is 0.82 ms. Given a sampling time of Δ t = 10 ms, we can therefore conclude that the method is suitable for real-time applications, even with less powerful CPUs. Indeed, the proposed system might also be tested in the future for safety applications, to prevent forklift-to-forklift and forklift-to-human collisions [70].

5. Conclusions

In this paper, a system consisting of an RFID-equipped forklift able to recognize the loaded pallets and to self-localize was addressed for a tissue paper factory warehouse. Thanks to the reconstruction of the forklift trajectory and orientation, the location of the pallets can be associated with the forklift position at the time of unloading events. The forklift self-localization was performed through a sensor fusion algorithm that combines data acquired from an onboard inertial measurement unit (IMU) and an onboard optical flow sensor (OFS) with a commercial ultra-wideband (UWB) system through an unscented Kalman filter (UKF) estimator. The proposed method is relevant to the state of the art because of the usage of the optical flow sensor as an onboard proprioceptive sensor to measure the forklift ego-motion, which allows employing an optical-domain sensor without the disadvantage of depending on environmental light conditions. After a first numerical investigation of the method performance, the whole system was evaluated in a 4560 m2 warehouse where thirty Decawave commercial UWB anchors were installed at the ceiling, at known locations. Nine different test-case trajectories were performed with different paths and forklift speeds in order to investigate the system robustness to different operating conditions. The experimental analysis showed that the performance of the onboard kinematic sensors and of the UWB system decreases with increasing forklift speed, an effect that can be mitigated by the UKF. Indeed, the proposed method achieved a global experimental mean localization error of 1 m, with a maximum error of 4 m, and the forklift orientation was reconstructed with a mean error of 0.014 rad. Moreover, the UKF proved robust with respect to different initialization conditions, even for long paths (almost 300 m). An analysis of the UKF computational burden confirms the real-time feasibility of the proposed method.
The achieved localization performance, together with the limited computational burden, makes the proposed low-cost, robust, and easy-to-deploy system suitable not only for real-time monitoring of the warehouse status, but also as a potential candidate to be tested in collision avoidance systems.

Author Contributions

Conceptualization, A.M.; software, A.M.; methodology, A.M., A.B., P.N.; validation, A.M., A.B.; resources, P.N.; writing, A.M., A.B., P.N.; review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by Regione Toscana (POR FESR 2014-2020—Line 1 Research and Development Strategic Projects) through the Project IREAD4.0 under grant CUP 7165.24052017.112000028, and in part by the Italian Ministry of Education and Research (MIUR) in the framework of the CrossLab project (Departments of Excellence). Info: Paolo Nepa (e-mail: [email protected]).

Data Availability Statement

Not applicable.

Acknowledgments

The authors wish to thank the company Sofidel S.p.A., Porcari, Italy, in particular Mario Pesi and Antonio Congi. A grateful “thank you” also goes to ISE s.r.l. (Ingegneria dei Sistemi Elettronici), Vecchiano, Italy, for their precious technical support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Laoudias, C.; Moreira, A.; Kim, S.; Lee, S.; Wirola, L.; Fischione, C. A Survey of Enabling Technologies for Network Localization, Tracking, and Navigation. IEEE Commun. Surv. Tutor. 2018, 20, 3607–3644. [Google Scholar] [CrossRef] [Green Version]
  2. Nepa, P.; Motroni, A.; Congi, A.; Ferro, E.M.; Pesi, M.; Giorgi, G.; Buffi, A.; Lazzarotti, M.; Bellucci, J.; Galigani, S.; et al. I-READ 4.0: Internet-of-READers for an efficient asset management in large warehouses with high stock rotation index. In Proceedings of the 2019 IEEE 5th International forum on Research and Technology for Society and Industry (RTSI), Firenze, Italy, 9–12 September 2019; pp. 67–72. [Google Scholar] [CrossRef]
  3. Bottani, E.; Montanari, R.; Rinaldi, M.; Vignali, G. Intelligent Algorithms for Warehouse Management. In Intelligent Techniques in Engineering Management: Theory and Applications; Kahraman, C., Çevik Onar, S., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 645–667. [Google Scholar] [CrossRef]
  4. Zhao, K.; Zhu, M.; Xiao, B.; Yang, X.; Gong, C.; Wu, J. Joint RFID and UWB Technologies in Intelligent Warehousing Management System. IEEE Internet Things J. 2020, 7, 11640–11655. [Google Scholar] [CrossRef]
  5. Draganjac, I.; Miklić, D.; Kovačić, Z.; Vasiljević, G.; Bogdan, S. Decentralized Control of Multi-AGV Systems in Autonomous Warehousing Applications. IEEE Trans. Autom. Sci. Eng. 2016, 13, 1433–1447. [Google Scholar] [CrossRef]
  6. Yassin, A.; Nasser, Y.; Awad, M.; Al-Dubai, A.; Liu, R.; Yuen, C.; Raulefs, R.; Aboutanios, E. Recent Advances in Indoor Localization: A Survey on Theoretical Approaches and Applications. IEEE Commun. Surv. Tutor. 2017, 19, 1327–1346. [Google Scholar] [CrossRef] [Green Version]
  7. Magnago, V.; Palopoli, L.; Buffi, A.; Tellini, B.; Motroni, A.; Nepa, P.; Macii, D.; Fontanelli, D. Ranging-Free UHF-RFID Robot Positioning Through Phase Measurements of Passive Tags. IEEE Trans. Instrum. Meas. 2020, 69, 2408–2418. [Google Scholar] [CrossRef]
  8. Lou, L.; Xu, X.; Cao, J.; Chen, Z.; Xu, Y. Sensor fusion-based attitude estimation using low-cost MEMS-IMU for mobile robot navigation. In Proceedings of the 2011 6th IEEE Joint International Information Technology and Artificial Intelligence Conference, Chongqing, China, 20–22 August 2011; Volume 2, pp. 465–468. [Google Scholar] [CrossRef]
  9. Dorigoni, D.; Fontanelli, D. An Uncertainty-driven Analysis for Delayed Mapping SLAM. In Proceedings of the 2021 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Glasgow, UK, 17–20 May 2021; pp. 1–6. [Google Scholar] [CrossRef]
  10. Lee, S.; Song, J.B. Robust mobile robot localization using optical flow sensors and encoders. In Proceedings of the IEEE International Conference on Robotics and Automation, ICRA ’04. 2004, New Orleans, LA, USA, 26 April–1 May 2004; Volume 1, pp. 1039–1044. [Google Scholar] [CrossRef]
  11. Ursel, T.; Olinski, M. Displacement Estimation Based on Optical and Inertial Sensor Fusion. Sensors 2021, 21, 1390. [Google Scholar] [CrossRef]
  12. Yu, J.X.; Cai, Z.X.; Duan, Z.H. Dead reckoning of mobile robot in complex terrain based on proprioceptive sensors. In Proceedings of the 2008 International Conference on Machine Learning and Cybernetics, Kunming, China, 12–15 July 2008; Volume 4, pp. 1930–1935. [Google Scholar] [CrossRef]
  13. Jetto, L.; Longhi, S.; Venturini, G. Development and experimental validation of an adaptive extended Kalman filter for the localization of mobile robots. IEEE Trans. Robot. Autom. 1999, 15, 219–229. [Google Scholar] [CrossRef]
  14. Yin, M.T.; Lian, F.L. Robot localization and mapping by matching the environmental features from proprioceptive and exteroceptive sensors. In Proceedings of the SICE Annual Conference 2010, Taipei, Taiwan, 18–21 August 2010; pp. 191–196. [Google Scholar]
  15. De Angelis, A.; Moschitta, A.; Carbone, P.; Calderini, M.; Neri, S.; Borgna, R.; Peppucci, M. Design and Characterization of a Portable Ultrasonic Indoor 3-D Positioning System. IEEE Trans. Instrum. Meas. 2015, 64, 2616–2625. [Google Scholar] [CrossRef]
  16. Comuniello, A.; De Angelis, A.; Moschitta, A. A low-cost TDoA-based ultrasonic positioning system. In Proceedings of the 2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Dubrovnik, Croatia, 25–28 May 2020; pp. 1–6. [Google Scholar] [CrossRef]
  17. Paral, P.; Chatterjee, A.; Rakshit, A. Human Position Estimation Based on Filtered Sonar Scan Matching: A Novel Localization Approach Using DENCLUE. IEEE Sens. J. 2021, 21, 8055–8064. [Google Scholar] [CrossRef]
  18. Scaramuzza, D.; Fraundorfer, F. Visual Odometry [Tutorial]. IEEE Robot. Autom. Mag. 2011, 18, 80–92. [Google Scholar] [CrossRef]
  19. Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278. [Google Scholar] [CrossRef]
  20. Behrje, U.; Himstedt, M.; Maehle, E. An Autonomous Forklift with 3D Time-of-Flight Camera-Based Localization and Navigation. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Singapore, 18–21 November 2018; pp. 1739–1746. [Google Scholar] [CrossRef]
  21. Heo, S.; Park, C.G. Consistent EKF-Based Visual-Inertial Odometry on Matrix Lie Group. IEEE Sens. J. 2018, 18, 3780–3788. [Google Scholar] [CrossRef]
  22. Flueratoru, L.; Simona Lohan, E.; Nurmi, J.; Niculescu, D. HTC Vive as a Ground-Truth System for Anchor-Based Indoor Localization. In Proceedings of the 2020 12th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT), Brno, Czech Republic, 5–7 October 2020; pp. 214–221. [Google Scholar] [CrossRef]
  23. Wang, F.; Lü, E.; Wang, Y.; Qiu, G.; Lu, H. Efficient Stereo Visual Simultaneous Localization and Mapping for an Autonomous Unmanned Forklift in an Unstructured Warehouse. Appl. Sci. 2020, 10, 698. [Google Scholar] [CrossRef] [Green Version]
  24. De Angelis, G.; Pasku, V.; De Angelis, A.; Dionigi, M.; Mongiardo, M.; Moschitta, A.; Carbone, P. An Indoor AC Magnetic Positioning System. IEEE Trans. Instrum. Meas. 2015, 64, 1267–1275. [Google Scholar] [CrossRef]
  25. Pasku, V.; De Angelis, A.; Dionigi, M.; Moschitta, A.; De Angelis, G.; Carbone, P. Analysis of Nonideal Effects and Performance in Magnetic Positioning Systems. IEEE Trans. Instrum. Meas. 2016, 65, 2816–2827. [Google Scholar] [CrossRef]
  26. Santoni, F.; De Angelis, A.; Moschitta, A.; Carbone, P. MagIK: A Hand-Tracking Magnetic Positioning System Based on a Kinematic Model of the Hand. IEEE Trans. Instrum. Meas. 2021, 70, 1–13. [Google Scholar] [CrossRef]
  27. Chung, H.Y.; Hou, C.C.; Chen, Y.S. Indoor Intelligent Mobile Robot Localization Using Fuzzy Compensation and Kalman Filter to Fuse the Data of Gyroscope and Magnetometer. IEEE Trans. Ind. Electron. 2015, 62, 6436–6447. [Google Scholar] [CrossRef]
  28. Rezazadeh, J.; Moradi, M.; Ismail, A.S.; Dutkiewicz, E. Superior Path Planning Mechanism for Mobile Beacon-Assisted Localization in Wireless Sensor Networks. IEEE Sens. J. 2014, 14, 3052–3064. [Google Scholar] [CrossRef]
  29. Mazuelas, S.; Bahillo, A.; Lorenzo, R.M.; Fernandez, P.; Lago, F.A.; Garcia, E.; Blas, J.; Abril, E.J. Robust Indoor Positioning Provided by Real-Time RSSI Values in Unmodified WLAN Networks. IEEE J. Sel. Top. Signal Process. 2009, 3, 821–831. [Google Scholar] [CrossRef]
  30. Arthaber, H.; Faseth, T.; Galler, F. Spread-Spectrum Based Ranging of Passive UHF EPC RFID Tags. IEEE Commun. Lett. 2015, 19, 1734–1737. [Google Scholar] [CrossRef]
  31. Koo, J.; Cha, H. Localizing WiFi Access Points Using Signal Strength. IEEE Commun. Lett. 2011, 15, 187–189. [Google Scholar] [CrossRef]
  32. Preusser, K.; Schmeink, A. Robust Channel Modelling of 2.4 GHz and 5 GHz Indoor Measurements: Empirical, Ray Tracing and Artificial Neural Network Models. IEEE Trans. Antennas Propag. 2021. [Google Scholar] [CrossRef]
  33. Biswas, J.; Veloso, M. WiFi localization and navigation for autonomous indoor mobile robots. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–8 May 2010; pp. 4379–4384. [Google Scholar] [CrossRef] [Green Version]
  34. Motroni, A.; Buffi, A.; Nepa, P. A Survey on Indoor Vehicle Localization Through RFID Technology. IEEE Access 2021, 9, 17921–17942. [Google Scholar] [CrossRef]
  35. Motroni, A.; Buffi, A.; Nepa, P.; Tellini, B. Sensor Fusion and Tracking Method for Indoor Vehicles With Low-Density UHF-RFID Tags. IEEE Trans. Instrum. Meas. 2021, 70, 1–14. [Google Scholar] [CrossRef]
  36. Jiménez Ruiz, A.R.; Seco Granja, F. Comparing Ubisense, BeSpoon, and DecaWave UWB Location Systems: Indoor Performance Analysis. IEEE Trans. Instrum. Meas. 2017, 66, 2106–2117. [Google Scholar] [CrossRef]
  37. Barbieri, L.; Brambilla, M.; Trabattoni, A.; Mervic, S.; Nicoli, M. UWB Localization in a Smart Factory: Augmentation Methods and Experimental Assessment. IEEE Trans. Instrum. Meas. 2021, 70, 1–18. [Google Scholar] [CrossRef]
  38. De Angelis, A.; Dionigi, M.; Moschitta, A.; Giglietti, R.; Carbone, P. Characterization and Modeling of an Experimental UWB Pulse-Based Distance Measurement System. IEEE Trans. Instrum. Meas. 2009, 58, 1479–1486. [Google Scholar] [CrossRef]
  39. De Angelis, A.; Dionigi, M.; Moschitta, A.; Carbone, P. A Low-Cost Ultra-Wideband Indoor Ranging System. IEEE Trans. Instrum. Meas. 2009, 58, 3935–3942. [Google Scholar] [CrossRef]
  40. Lazzari, F.; Buffi, A.; Nepa, P.; Lazzari, S. Numerical Investigation of an UWB Localization Technique for Unmanned Aerial Vehicles in Outdoor Scenarios. IEEE Sens. J. 2017, 17, 2896–2903. [Google Scholar] [CrossRef]
  41. Ghanem, E.; O’Keefe, K.; Klukas, R. Testing Vehicle-to-Vehicle Relative Position and Attitude Estimation using Multiple UWB Ranging. In Proceedings of the 2020 IEEE 92nd Vehicular Technology Conference (VTC2020-Fall), Victoria, BC, Canada, 18 November–16 December 2020; pp. 1–5. [Google Scholar] [CrossRef]
  42. Rath, M.; Kulmer, J.; Leitinger, E.; Witrisal, K. Single-Anchor Positioning: Multipath Processing with Non-Coherent Directional Measurements. IEEE Access 2020, 8, 88115–88132. [Google Scholar] [CrossRef]
  43. Leitinger, E.; Meyer, F.; Hlawatsch, F.; Witrisal, K.; Tufvesson, F.; Win, M.Z. A Belief Propagation Algorithm for Multipath-Based SLAM. IEEE Trans. Wirel. Commun. 2019, 18, 5613–5629. [Google Scholar] [CrossRef] [Green Version]
  44. Xiao, Z.; Wen, H.; Markham, A.; Trigoni, N.; Blunsom, P.; Frolik, J. Non-Line-of-Sight Identification and Mitigation Using Received Signal Strength. IEEE Trans. Wirel. Commun. 2015, 14, 1689–1702. [Google Scholar] [CrossRef]
  45. Yu, K.; Wen, K.; Li, Y.; Zhang, S.; Zhang, K. A Novel NLOS Mitigation Algorithm for UWB Localization in Harsh Indoor Environments. IEEE Trans. Veh. Technol. 2019, 68, 686–699. [Google Scholar] [CrossRef]
  46. Wang, T.; Hu, K.; Li, Z.; Lin, K.; Wang, J.; Shen, Y. A Semi-Supervised Learning Approach for UWB Ranging Error Mitigation. IEEE Wirel. Commun. Lett. 2021, 10, 688–691. [Google Scholar] [CrossRef]
  47. Wymeersch, H.; Marano, S.; Gifford, W.M.; Win, M.Z. A Machine Learning Approach to Ranging Error Mitigation for UWB Localization. IEEE Trans. Commun. 2012, 60, 1719–1728. [Google Scholar] [CrossRef] [Green Version]
  48. Zhang, H.; Zhang, Z.; Zhao, R.; Lu, J.; Wang, Y.; Jia, P. Review on UWB-based and multi-sensor fusion positioning algorithms in indoor environment. In Proceedings of the 2021 IEEE 5th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 12–14 March 2021; Volume 5, pp. 1594–1598. [Google Scholar] [CrossRef]
  49. Kolakowski, M.; Djaja-Josko, V.; Kolakowski, J. Static LiDAR Assisted UWB Anchor Nodes Localization. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  50. Nguyen, T.H.; Nguyen, T.M.; Xie, L. Range-Focused Fusion of Camera-IMU-UWB for Accurate and Drift-Reduced Localization. IEEE Robot. Autom. Lett. 2021, 6, 1678–1685. [Google Scholar] [CrossRef]
  51. Sadruddin, H.; Mahmoud, A.; Atia, M. An Indoor Navigation System using Stereo Vision, IMU and UWB Sensor Fusion. In Proceedings of the 2019 IEEE SENSORS, Montreal, QC, Canada, 27–30 October 2019; pp. 1–4. [Google Scholar] [CrossRef]
  52. Xu, Y.; Shmaliy, Y.S.; Ahn, C.K.; Shen, T.; Zhuang, Y. Tightly Coupled Integration of INS and UWB Using Fixed-Lag Extended UFIR Smoothing for Quadrotor Localization. IEEE Internet Things J. 2021, 8, 1716–1727. [Google Scholar] [CrossRef]
  53. Zhou, H.; Yao, Z.; Lu, M. Lidar/UWB Fusion Based SLAM With Anti-Degeneration Capability. IEEE Trans. Veh. Technol. 2021, 70, 820–830. [Google Scholar] [CrossRef]
  54. Zhou, H.; Yao, Z.; Lu, M. UWB/Lidar Coordinate Matching Method with Anti-Degeneration Capability. IEEE Sens. J. 2021, 21, 3344–3352. [Google Scholar] [CrossRef]
  55. Magnago, V.; Corbalán, P.; Picco, G.P.; Palopoli, L.; Fontanelli, D. Robot Localization via Odometry-assisted Ultra-wideband Ranging with Stochastic Guarantees. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 1607–1613. [Google Scholar] [CrossRef]
  56. Fontanelli, D.; Shamsfakhr, F.; Macii, D.; Palopoli, L. An Uncertainty-Driven and Observability-Based State Estimator for Nonholonomic Robots. IEEE Trans. Instrum. Meas. 2021, 70, 1–12. [Google Scholar] [CrossRef]
  57. Feng, D.; Wang, C.; He, C.; Zhuang, Y.; Xia, X.G. Kalman-Filter-Based Integration of IMU and UWB for High-Accuracy Indoor Positioning and Navigation. IEEE Internet Things J. 2020, 7, 3133–3146. [Google Scholar] [CrossRef]
  58. Arulampalam, M.; Maskell, S.; Gordon, N.; Clapp, T. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Trans. Signal Process. 2002, 50, 174–188. [Google Scholar] [CrossRef] [Green Version]
  59. Julier, S.; Uhlmann, J. Unscented filtering and nonlinear estimation. Proc. IEEE 2004, 92, 401–422. [Google Scholar] [CrossRef] [Green Version]
  60. Wan, E.; Van Der Merwe, R. The unscented Kalman filter for nonlinear estimation. In Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium (Cat. No.00EX373), Lake Louise, AB, Canada, 4 October 2000; pp. 153–158. [Google Scholar] [CrossRef]
  61. Decawave MDEK1001 Evaluation Kit. Available online: https://www.decawave.com/product/mdek1001-deployment-kit/ (accessed on 9 October 2021).
  62. Avago ADNS-3080 Optical Flow Sensor. Available online: https://datasheet.octopart.com/ADNS-3080-Avago-datasheet-10310392.pdf/ (accessed on 9 October 2021).
  63. STMicroelectronics VL53L0X Laser-Ranging Distance Sensors. Available online: https://www.st.com/en/imaging-and-photonics-solutions/vl53l0x.html (accessed on 9 October 2021).
  64. STMicroelectronics LSM6DS3 Inertial Measurement Unit Platform. Available online: https://www.st.com/en/mems-and-sensors/lsm6ds3.html/ (accessed on 9 October 2021).
  65. RENESAS SK-S7G2 Microcontroller. Available online: https://www.renesas.com/us/en/products/microcontrollers-microprocessors/renesas-synergy-platform-mcus/yssks7g2e30-sk-s7g2-starter-kit (accessed on 9 October 2021).
  66. UHF-RFID Reader Impinj Speedway Revolution R420. Available online: https://support.impinj.com/hc/en-us/articles/202755388-Impinj-Speedway-RAIN-RFID-Reader-Family-Product-Brief-Datasheet (accessed on 9 October 2021).
  67. UHF-RFID Keonn Advantenna-p11 Antenna. Available online: https://keonn.com/components-product/advantenna-p11/ (accessed on 9 October 2021).
  68. UHF-RFID Alien A0501 Antenna. Available online: https://www.alientechnology.com/products/antennas/alr-a0501/ (accessed on 9 October 2021).
  69. Leica Flexline TS03 Manual Total Station. Available online: https://leica-geosystems.com/it-it/products/total-stations/manual-total-stations/leica-flexline-ts03 (accessed on 9 October 2021).
  70. Salvador, C.; Zani, F.; Biffi Gentili, G. RFID and sensor network technologies for safety managing in hazardous environments. In Proceedings of the 2011 IEEE International Conference on RFID-Technologies and Applications, Sitges, Spain, 15–16 September 2011; pp. 68–72. [Google Scholar] [CrossRef]
Figure 1. Schematic representation of the RFID Smart Forklift equipped with IMU, OFS, and UWB tag along with the plant/warehouse reference frame and the local body frame.
Figure 2. RFID Smart Forklift in the Smart Warehouse, along with part of the UWB infrastructure for its localization.
Figure 3. Top view of the simulated trajectory along with the UWB anchors (gray squared markers). Ground truth trajectory (continuous gray line), UWB (continuous blue line), IMU + OFS (dashed red line), and UKF (dotted green line) estimated trajectories of the forklift are also represented.
Figure 4. Localization errors for the different considered localization methods for the simulated trajectory of Figure 3: (a) x-coordinate, (b) y-coordinate, (c) distance error.
Figure 5. Cumulative distribution function of the distance error ϵ d for M = 100 test cases with similar trajectories as the one represented in Figure 3.
Figure 6. Histogram of the orientation error ϵ_θ for M = 100 test cases with trajectories similar to the one represented in Figure 3.
Figure 7. Mean distance error ϵ̄_d as a function of the forklift speed v. For each value of v, 100 test cases were conducted with trajectories similar to the one represented in Figure 3.
Figure 8. Mean distance error ϵ̄_d as a function of the initial uncertainty matrix P_0. For each configuration described in Table 1, 100 test cases with trajectories similar to the one represented in Figure 3 were conducted.
Figure 9. The RFID Smart Forklift with all the equipped sensors. The local body reference frame, centered on the box containing the IMU and OFS, is also depicted.
Figure 10. (a) A UWB anchor installed on a metallic bar. (b) Positions of the UWB anchors in the warehouse area (blue asterisks).
Figure 11. Part of the reference path for the forklift, marked by adhesive tape (left), and the red light beam projected by the OFS (right), used to verify that the forklift was actually following the path.
Figure 12. Top view of the estimated trajectories along with the UWB anchors (gray square markers). The ground-truth trajectory (solid gray line) and the UWB (solid blue line), IMU + OFS (dashed red line), and UKF (dotted green line) measured trajectories of the forklift are also represented: (a) Test Case IV, (b) Test Case IX, by referring to Table 2.
Figure 13. Boxplots of the errors for the three tracking systems (UWB, IMU + OFS, and UKF): (a) x-coordinate for test case IV, (b) x-coordinate for test case IX, (c) y-coordinate for test case IV, (d) y-coordinate for test case IX, (e) distance error for test case IV, (f) distance error for test case IX, (g) orientation error for test case IV, and (h) orientation error for test case IX.
Figure 14. Mean distance error ϵ̄_d as a function of the forklift speed v, obtained by analyzing the test case trajectories I–VI.
Figure 15. Mean distance error ϵ ¯ d as a function of the initial uncertainty matrix P 0 computed for test case IX.
Figure 16. Cumulative distribution function of the distance error ϵ d for the test cases I–IX.
Figure 17. Histogram of the orientation error ϵ θ for the test cases I–IX.
Table 1. Values of σ U W B and σ θ for the analysis of the UKF performance.
Trial   σ_UWB [m]   σ_θ [rad]
A       0.1         0.10
B       0.2         0.10
C       0.4         0.14
D       0.6         0.21
E       0.8         0.28
F       1.0         0.35
G       1.2         0.42
H       1.4         0.49
I       1.6         0.56
J       1.8         0.63
K       2.0         0.70
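As an illustration of how the trials above can parameterize the initial uncertainty matrix P_0 swept in Figure 8, the sketch below builds a diagonal covariance over a planar pose state [x, y, θ]; note that the three-state pose and the diagonal structure are assumptions, not details stated in this table:

```python
import numpy as np

# (sigma_UWB [m], sigma_theta [rad]) pairs from Table 1
TRIALS = {
    "A": (0.1, 0.10), "B": (0.2, 0.10), "C": (0.4, 0.14),
    "D": (0.6, 0.21), "E": (0.8, 0.28), "F": (1.0, 0.35),
    "G": (1.2, 0.42), "H": (1.4, 0.49), "I": (1.6, 0.56),
    "J": (1.8, 0.63), "K": (2.0, 0.70),
}

def initial_uncertainty(sigma_uwb, sigma_theta):
    """Diagonal covariance for an assumed planar pose state [x, y, theta]."""
    return np.diag([sigma_uwb**2, sigma_uwb**2, sigma_theta**2])

P0 = initial_uncertainty(*TRIALS["C"])  # trial C: sigma_UWB = 0.4 m
```

A small P_0 asserts high confidence in the initial pose guess, while larger values let the filter weight the first UWB measurements more heavily.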
Table 2. Main features of the performed test case trajectories.
Test Case   Forklift Speed v [m/s]   Path Length L [m]   Path Shape                     [x_0, y_0]^T [m]   θ_0 [rad]
I           0.75                     ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
II          0.75                     ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
III         1.6                      ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
IV          1.6                      ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
V           2.75                     ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
VI          2.75                     ∼100                Closed-loop path               [11.5, 74.4]^T     π/2
VII         1.2                      ∼137                Rectilinear path with U-turn   [41.7, 74.4]^T     π/2
VIII        1.2                      ∼137                Rectilinear path with U-turn   [41.7, 74.4]^T     π/2
IX          1.28                     ∼289                Closed-loop path               [11.5, 74.4]^T     π/2
Table 3. Mean distance and orientation errors for test cases IV and IX in Table 2.
Test Case   ϵ̄_d UWB [m]   ϵ̄_d IMU+OFS [m]   ϵ̄_d UKF [m]   ϵ̄_θ UWB [rad]   ϵ̄_θ IMU+OFS [rad]   ϵ̄_θ UKF [rad]
IV          3.84          2.81              2.13          −0.04           −0.04               0.02
IX          1.77          2.43              1.17          0.06            0.06                0.01
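The mean distance and orientation errors reported above can be computed from estimated and ground-truth pose sequences as sketched below; wrapping the angular error to [−π, π) is an assumed convention, and the toy data are illustrative:

```python
import numpy as np

def mean_errors(est_xy, gt_xy, est_theta, gt_theta):
    """Mean distance error [m] and mean wrapped orientation error [rad]."""
    d = np.linalg.norm(np.asarray(est_xy) - np.asarray(gt_xy), axis=1)
    dtheta = np.asarray(est_theta) - np.asarray(gt_theta)
    dtheta = (dtheta + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)
    return d.mean(), dtheta.mean()

# Toy example with two pose samples
e_d, e_theta = mean_errors(
    est_xy=[[1.0, 0.0], [0.0, 2.0]],
    gt_xy=[[0.0, 0.0], [0.0, 0.0]],
    est_theta=[0.1, -0.1],
    gt_theta=[0.0, 0.0],
)  # e_d = 1.5, e_theta = 0.0
```

Averaging the signed (wrapped) orientation error, as the small ϵ̄_θ values in the table suggest, lets positive and negative deviations cancel, so it measures bias rather than spread.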
Table 4. Elaboration time compared to trial duration for the test cases I–IX.
Test Case   Trial Duration [s]   Total Elaboration Time [s]   Number of Samples   Elaboration Time per Sample [ms]
I           136                  10.81                        13,641              0.79
II          132                  10.05                        13,265              0.75
III         63                   5.23                         6369                0.82
IV          64                   5.18                         6432                0.80
V           36                   3.43                         3606                0.95
VI          36                   3.34                         3611                0.92
VII         114                  9.28                         11,346              0.81
VIII        110                  9.05                         11,038              0.82
IX          225                  17.09                        22,563              0.75
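The last column of Table 4 is simply the ratio of the total elaboration time to the number of samples; a quick check on test case I:

```python
def per_sample_ms(total_elaboration_s, n_samples):
    """Average elaboration time per sample, in milliseconds."""
    return total_elaboration_s / n_samples * 1000.0

# Test case I: 10.81 s over 13,641 samples -> about 0.79 ms per sample
t_ms = per_sample_ms(10.81, 13641)
```

Note also that dividing the number of samples by the trial duration gives a sampling rate of roughly 100 Hz, so a per-sample elaboration time well under 10 ms confirms the processing can keep up with the data stream in real time.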
Motroni, A.; Buffi, A.; Nepa, P. Forklift Tracking: Industry 4.0 Implementation in Large-Scale Warehouses through UWB Sensor Fusion. Appl. Sci. 2021, 11, 10607. https://doi.org/10.3390/app112210607
