# Curvature-Based Environment Description for Robot Navigation Using Laser Range Sensors

## Abstract


## 1. Introduction

## 2. Laser Scan Data Segmentation Algorithms

#### 2.1. Problem Statement

A laser scan is a set of range readings ${\{(r,\varphi {)}_{i}\}}_{i=1\dots {N}_{R}}$, where ${r}_{i}$ and ${\varphi}_{i}$ are the polar coordinates of the ith range reading (${r}_{i}$ is the measured distance from the sensor rotating axis to an obstacle in the direction ${\varphi}_{i}$), and ${N}_{R}$ is the number of range readings. Figure 2(a) represents all these variables. It can be assumed that the noise on both measurements, range and bearing, follows a Gaussian distribution with zero mean and variances ${\sigma}_{r}^{2}$ and ${\sigma}_{\phi}^{2}$, respectively. The aim of segmenting a laser scan is to divide it into clusters of range readings associated with the different surfaces, planar or curved, of the environment. There are two main problems in laser scan segmentation:

- How many segments are there?
- Which range readings belong to which segment?
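To make the stated noise model concrete, the following sketch generates a synthetic scan of a flat wall corrupted with the zero-mean Gaussian range and bearing noise described above. All names and parameter values here are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical sketch: generate a synthetic laser scan of a flat wall and
# corrupt it with the zero-mean Gaussian noise model assumed in the text
# (standard deviations sigma_r on range, sigma_phi on bearing).

def synthetic_wall_scan(n_readings=181, wall_distance=2.0,
                        sigma_r=0.005, sigma_phi=np.deg2rad(0.1), seed=0):
    """Return noisy (r_i, phi_i) readings of a wall at y = wall_distance."""
    rng = np.random.default_rng(seed)
    # Ideal bearings spanning 45..135 degrees so every ray hits the wall.
    phi = np.linspace(np.deg2rad(45), np.deg2rad(135), n_readings)
    r = wall_distance / np.sin(phi)          # exact range to the wall
    r_noisy = r + rng.normal(0.0, sigma_r, n_readings)
    phi_noisy = phi + rng.normal(0.0, sigma_phi, n_readings)
    return r_noisy, phi_noisy

r, phi = synthetic_wall_scan()
x, y = r * np.cos(phi), r * np.sin(phi)      # Cartesian view of the scan
```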

#### 2.2. Polygonal Based Methods

#### 2.3. Curvature-Based Methods

- Interpolation-based curvature estimators. These methods interpolate the plane curve coordinates and then differentiate the interpolation curves. Thus, Mokhtarian et al. [25] propose to filter the curve with a one-dimensional Gaussian filter. This filtering removes the plane curve noise.
- Angle-based curvature estimators. These methods propose an alternative curvature measure based on angles between vectors, which are defined as a function of the discrete curve items. Thus, curve filtering and curvature estimation are mixed by Agam et al. [27], who define the curvature at a given point as the difference between the slopes of the curve segments on the right and left sides of the point, where the slopes are taken from a look-up table. The size of both curve segments is fixed. Liu et al. [28] compute the curvature function by estimating the edge gradient at each plane curve point, which is equal to the arctangent of its Sobel difference in a 3×3 neighborhood. Arrebola et al. [29] define the curvature at a given point as the correlation of the forward and backward histograms in the k-vicinity of the point, where the resulting value is modified to include concavity and convexity information.
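The common core of these angle-based estimators can be illustrated with a minimal sketch that measures, at each contour point, the angle between a forward and a backward vector over a fixed k-vicinity. This is a generic illustration of the family, not any one of the cited estimators; all names are ours.

```python
import numpy as np

# Minimal angle-based curvature sketch: the curvature measure at point i is
# the angle between the forward and backward vectors over a fixed k-vicinity,
# in the spirit of the fixed-support methods discussed above.

def angle_curvature(points, k=3):
    """points: (N, 2) contour coordinates; returns angles in radians."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    kappa = np.zeros(n)
    for i in range(k, n - k):
        f = pts[i + k] - pts[i]              # forward vector
        b = pts[i - k] - pts[i]              # backward vector
        cos_a = np.dot(f, b) / (np.linalg.norm(f) * np.linalg.norm(b))
        # pi on a straight line, smaller at a sharp corner
        kappa[i] = np.arccos(np.clip(cos_a, -1.0, 1.0))
    return kappa

# A right-angle corner: straight segment, 90-degree turn, straight segment.
corner = [(i, 0) for i in range(10)] + [(9, j) for j in range(1, 10)]
kappa = angle_curvature(corner, k=3)
```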

Multi-scale schemes of this kind typically detect a dominant point at the coarsest scale ${\sigma}_{max}$, but localize its position at the finest scale ${\sigma}_{min}$. In order to avoid a slow iterative estimation of the curvature, an adaptive algorithm was employed by Núñez et al. [16] to extract corners, line and curve segments from the laser scan data.

## 3. CUrvature-BAsed Environment Description Framework

#### 3.1. Segmentation Algorithm

The first segmentation stage is an adaptive breakpoint detector, which marks a scan discontinuity whenever the distance between two consecutive range readings, ${(r,\varphi )}_{n}$ and ${(r,\varphi )}_{n-1}$, exceeds an adaptively estimated threshold. This algorithm makes it possible to reject isolated range readings, but it leads to an under-segmentation of the laser scan, i.e., segments extracted between breakpoints typically group two or more different structures (see Figure 4). In order to avoid this problem, once the whole laser scan has been divided into sets of consecutive range readings, a second segmentation criterion is applied to each set. This stage focuses on the correct selection of the dominant points present in a part of the scan bounded by two consecutive breakpoints, and it is based on the curvature associated with each range reading: consecutive range readings belong to the same segment as long as their curvature values are similar. To perform this segmentation task, the adaptive curvature function associated with each segment of the laser scan is computed [16], and this information is then employed to segment the laser scan into clusters of homogeneous curvature. The whole process is described in detail in [16]. Figure 5(a) shows a real environment used to illustrate the CUBA framework. The scan data provided by the sensor and the curvature estimated by the segmentation stage are drawn in Figures 5(b) and 5(c), respectively.
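The two-stage scheme (breakpoints first, then finer clustering) can be sketched as follows. For brevity, a fixed Euclidean gap threshold stands in for the adaptive breakpoint rule of [16]; all names and values are illustrative.

```python
import numpy as np

# Sketch of the first segmentation stage: scan the readings in order and
# open a new segment whenever consecutive readings are further apart than a
# threshold. A fixed Euclidean threshold stands in for the adaptive
# breakpoint rule of the paper.

def split_at_breakpoints(r, phi, threshold=0.25, min_size=10):
    """Return a list of index arrays, one per detected scan segment."""
    x, y = r * np.cos(phi), r * np.sin(phi)
    gaps = np.hypot(np.diff(x), np.diff(y))
    cut_after = np.flatnonzero(gaps > threshold)      # breakpoint positions
    segments = np.split(np.arange(len(r)), cut_after + 1)
    # As in Figure 4, segments with fewer than ten readings are discarded.
    return [s for s in segments if len(s) >= min_size]

# Two walls separated by a large jump in range.
phi = np.linspace(0.0, np.pi / 2, 60)
r = np.where(phi < np.pi / 4, 1.0, 3.0)
segments = split_at_breakpoints(r, phi)
```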

#### 3.2. Natural Feature Extraction and Characterization

- **Line segments**: In order to provide precise feature estimation, it is essential to represent uncertainties and to propagate them from the single range reading measurements through all stages of the feature estimation process. As previously mentioned, the methods try to fit parametric curves to each segmented set of data. An approach for line fitting is to minimize the sum of squared perpendicular distances of the range readings to the line. This yields a nonlinear regression problem which can be solved in polar coordinates [33]. The line in the laser range finder's polar coordinate system is represented as$$r=\frac{d}{\text{cos}(\theta -\varphi )}$$or, equivalently, in Cartesian coordinates,$$x\text{cos}\theta +y\text{sin}\theta =d$$The orthogonal distance ${d}_{i}$ of a range reading ${(r,\varphi )}_{i}$ to this line is$${d}_{i}={r}_{i}\text{cos}(\theta -{\varphi}_{i})-d$$Under the assumption of known uncertainties, a weight can be determined for each measurement point and the line fitted in the generalized least squares sense, whose solution is (see [14] for further details)$$\begin{array}{r}\hfill \theta =\frac{1}{2}\text{arctan}\left(\frac{{\sum}_{i}{r}_{i}^{2}\text{sin}2{\varphi}_{i}-\frac{2}{n}{\sum}_{i}{\sum}_{j}{r}_{i}{r}_{j}\text{cos}{\varphi}_{i}\text{sin}{\varphi}_{j}}{{\sum}_{i}{r}_{i}^{2}\text{cos}2{\varphi}_{i}-\frac{1}{n}{\sum}_{i}{\sum}_{j}{r}_{i}{r}_{j}\text{cos}({\varphi}_{i}+{\varphi}_{j})}\right)\\ \hfill d=\frac{{\sum}_{i}{r}_{i}\text{cos}({\varphi}_{i}-\theta )}{n}\end{array}$$Figure 5(d) presents the detected landmarks corresponding to the scan data acquired by the sensor in Figure 5(b). In particular, it shows the line segments extracted using the described approach (end-points of the line segments are drawn as squares). These end-points are determined by the intersection of the fitted line with the two lines which are perpendicular to it and pass through the first and last range readings.
- **Curve segments**: Although many circle fitting methods have been proposed, a common choice is to minimize the mean square distance from the data points to the fitting circle. Basically, the Least Squares Fit (LSF) assumes that each data point is a noisy version of the closest model point. This assumption is valid when the data points are not contaminated with strong noise. Let the data points be ${\{{x}_{i},{y}_{i}\}}_{i=1\dots m}$ (m > 3), with an uncertainty ellipse specified in terms of the standard deviations ${p}_{i}$ and ${q}_{i}$ and the correlations ${r}_{i}$. The problem is to obtain the center (${x}_{c}$, ${y}_{c}$) and the radius ρ of the circle C which yields the best fit to these data, and to determine the variance matrix associated with the circle parameters. This problem is stated as the minimization of the difference between the set of points {${x}_{i}$, ${y}_{i}$} and the corresponding points {${x}_{c}$ + ρ cos ${\varphi}_{i}$, ${y}_{c}$ + ρ sin ${\varphi}_{i}$} which lie on C. This difference is summarized by the 2m-element error vector ε:$$\begin{array}{c}\epsilon =({x}_{1}-({x}_{c}+\rho \text{cos}{\varphi}_{1}),{y}_{1}-({y}_{c}+\rho \text{sin}{\varphi}_{1}),\dots ,\\ {x}_{m}-({x}_{c}+\rho \text{cos}{\varphi}_{m}),{y}_{m}-({y}_{c}+\rho \text{sin}{\varphi}_{m}){)}^{T}\\ ={(\mathrm{\Delta}{x}_{1},\mathrm{\Delta}{y}_{1},\dots \mathrm{\Delta}{x}_{m},\mathrm{\Delta}{y}_{m})}^{T}\end{array}$$This error vector has the known 2m×2m block diagonal variance matrix V = diag(${V}_{1}$...${V}_{m}$), where$${V}_{i}=\left[\begin{array}{cc}{p}_{i}^{2}& {r}_{i}\\ {r}_{i}& {q}_{i}^{2}\end{array}\right]$$Then, assuming that the errors are normally distributed, the maximum likelihood (ML) problem consists of minimizing$$\mathit{minimize}\hspace{1em}{\epsilon}^{T}{V}^{-1}\epsilon $$over the parameter vector b = (${\varphi}_{1}$, ..., ${\varphi}_{m}$, ${x}_{c}$, ${y}_{c}$, ρ)${}^{T}$. In order to solve this minimization problem, the classical Gauss-Newton algorithm with the Levenberg-Marquardt correction [34, 35] is used. This algorithm finds the vector b which minimizes Equation 8 iteratively, approximating the objective function by the squared norm of a linear function. Thus, at each iteration, the linear least-squares problem$${\mathit{min}}_{\delta b}{||{{\epsilon}^{\prime}}^{(k)}-\nabla {{\epsilon}^{\prime}}^{(k)}\cdot \delta b\mathit{||}}_{2}$$is solved, where $\nabla {{\epsilon}^{\prime}}^{(k)}$ is the Jacobian matrix of first partial derivatives of ε′ with respect to b, and ${{\epsilon}^{\prime}}^{(k)}$ is ε′, both evaluated at ${b}^{(k)}$. A detailed description of the Levenberg-Marquardt algorithm can be found in [35]. In this case, the starting estimate for the centre coordinates and radius is obtained using Taubin's approximation to the gradient-weighted algebraic circle fitting approach [34]. Finally, to obtain the variance matrix of the center coordinates and radius, an estimate of the variance matrix of the vector b must be obtained. Further details about the fitting problem are given in [15]. Figure 5(d) draws the circle segment extracted using the described approach for the scan data provided by the sensor in Figure 5(b).
- **Real and virtual corners**: As pointed out by Madhavan and Durrant-Whyte [32], one of the main problems of a localization algorithm based only on corner detection is that the set of natural landmarks detected at each time step can be very limited, especially in semi-structured environments. This generates a small observation vector that does not provide enough information to estimate the robot pose. To address this problem, the description algorithm presented in this paper uses both real and virtual corners as natural landmarks of the robot environment. Real corners are due to a change of the surface being scanned or a change in the orientation of the scanned surface; thus, they are not associated with laser scan discontinuities. Virtual corners, on the other hand, are defined as the intersections of extended line segments which have not previously been identified as real corners. In order to obtain the corner location, it must be taken into account that failing to identify the correct corner point in the data can lead to large errors that increase with the distance to the detected corner (see Figure 6). Therefore, it is usually not a good option to locate the corner at one of the scan range readings. A better choice is to extract the corner location as the intersection of the two associated lines. Thus, the corner can be detected as the farthest point from the line defined by the two non-touching endpoints of the lines, or by finding the point in the neighborhood of the initial corner point which gives the minimum sum of error variances of both lines [36]. The existence of a corner can be determined from the curvature function [16], but its characterization (estimation of the mean pose and uncertainty) is conducted using the two lines that generate the corner [14]. Figure 5(d) illustrates the virtual corner detected by the algorithm (triangle) for the real scene described in Figure 5(a), together with its associated covariance matrix (ellipse).
- **Edges**: The adaptive breakpoint detector searches for large discontinuities in the laser scan data, and the range readings that define such a discontinuity are marked as breakpoints. Edges are defined as breakpoints associated with the end-points of plane surfaces [37]. To satisfy this condition, the portion of the environment where the breakpoint is located must be a line segment and must not be occluded by any other obstacle. This last condition holds if the breakpoint is closer to the robot than the other breakpoint defined by the same discontinuity (see Figure 7). It must also be noted that, when the laser range finder does not work with a scanning angle of 360°, the first and last breakpoints are not considered edges, because it is impossible to know whether they define the end-point of a surface. Edges are characterized by the Cartesian position (x, y) of the breakpoint and by the orientation θ of the plane surface described by the line segment [14].
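The closed-form polar line fit above can be sketched directly. This version assumes uniform weights for brevity (the paper's generalized least squares uses per-reading weights) and uses `atan2` on the centered second moments to resolve the quadrant ambiguity of a plain arctangent; all names are illustrative.

```python
import numpy as np

# Sketch of the polar line fit: the normal angle theta and distance d
# minimize the sum of squared orthogonal distances r_i*cos(theta-phi_i) - d.
# Uniform weights are assumed; atan2 on centered second moments resolves
# the quadrant ambiguity of the printed arctan formula.

def fit_line_polar(r, phi):
    x, y = r * np.cos(phi), r * np.sin(phi)
    sxx = np.sum((x - x.mean()) ** 2)
    syy = np.sum((y - y.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    theta = 0.5 * np.arctan2(-2.0 * sxy, syy - sxx)
    d = np.mean(r * np.cos(theta - phi))
    return theta, d

# Readings lying exactly on the horizontal line y = 1.
xs = np.linspace(-1.0, 1.0, 11)
r = np.hypot(xs, 1.0)
phi = np.arctan2(1.0, xs)
# theta comes out as +/- pi/2 (normal to the wall); in either case
# x*cos(theta) + y*sin(theta) = d recovers the line y = 1.
theta, d = fit_line_polar(r, phi)
```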

## 4. Affine-invariant Laser Scan Segmentation

#### 4.1. Adaptive Estimation of the Region-of-support

The maximum lengths of contour presenting no significant discontinuity on the right and left sides of range reading i are denoted ${t}_{f}[i]$ and ${t}_{b}[i]$, respectively. To estimate the ${t}_{f}[i]$ value, the algorithm first computes two sets of triangles, ${\{{t}_{j}^{a}\}}_{j=i}^{i+{t}_{f}[i]-1}$ and ${\{{t}_{j}^{c}\}}_{j=i}^{i+{t}_{f}[i]-1}$. The area of the triangle ${t}_{j}^{a}$ is defined as$$|{t}_{j}^{a}|=\frac{1}{2}\left|\begin{array}{ccc}{x}_{j}& {y}_{j}& 1\\ {x}_{j+1}& {y}_{j+1}& 1\\ {x}_{c}& {y}_{c}& 1\end{array}\right|$$where (${x}_{j}$, ${y}_{j}$) and (${x}_{j+1}$, ${y}_{j+1}$) are the Cartesian coordinates of the arc range readings j and j + 1 and (${x}_{c}$, ${y}_{c}$) is the robot position **x**${}_{c}$. The triangles ${t}_{j}^{c}$ are defined in the same way, but using the projections of the readings (${x}_{j}$, ${y}_{j}$) on the chord that joins the range readings i and i + ${t}_{f}[i]$. The value of ${t}_{f}[i]$ is then defined as the largest one for which the area delimited by the arc and the chord remains below a threshold ${U}_{t}$,$${\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{a}|-\left({\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{a}|\cap {\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{c}|\right)\le {U}_{t}$$${t}_{b}[i]$ is set according to the same scheme, but using i − ${t}_{b}[i]$ instead of i + ${t}_{f}[i]$.

The choice of the ${U}_{t}$ value is very important. If ${U}_{t}$ is large, ${t}_{f}[i]$ and ${t}_{b}[i]$ tend to be large and contour details may be missed; if it is small, ${t}_{f}[i]$ and ${t}_{b}[i]$ are always very small and the resulting function is noisy. In order to set it correctly, a set of real plane surfaces was scanned at different distances from the sensor; on such surfaces, the value must be fixed so that no local peak is detected. This simple experiment provided a ${U}_{t}$ value of 25.0 cm², which has been successfully employed in all the experiments.
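The region-of-support rule can be sketched as follows. The arc/chord area is accumulated here from simple signed triangle areas anchored at the window start, an illustrative simplification of the exact arc/chord construction; note that ${U}_{t}$ = 25.0 cm² is 0.0025 m². All names are ours.

```python
import numpy as np

# Illustrative sketch of the adaptive region-of-support rule: starting from
# range reading i, grow the forward support t_f[i] while the area enclosed
# between the scanned arc and the chord joining its end points stays below
# the threshold u_t (0.0025 m^2 = 25 cm^2, the U_t value quoted above).

def forward_support(points, i, u_t=0.0025, t_max=20):
    """points: (N, 2) Cartesian readings; returns t_f for reading i."""
    pts = np.asarray(points, dtype=float)
    best = 1
    for t in range(2, min(t_max, len(pts) - 1 - i) + 1):
        a = pts[i]
        area = 0.0
        for j in range(i, i + t):
            # Signed area of triangle (a, p_j, p_{j+1}); the sum telescopes
            # into the area between the polyline and the chord a..p_{i+t}.
            u, v = pts[j] - a, pts[j + 1] - a
            area += 0.5 * (u[0] * v[1] - u[1] * v[0])
        if abs(area) > u_t:
            break
        best = t
    return best

# Straight wall followed by a sharp corner: the support stops growing
# once the corner enters the window.
wall = [(0.1 * k, 0.0) for k in range(10)] + [(0.9, 0.1 * k) for k in range(1, 10)]
t_f = forward_support(wall, i=0)
```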

#### 4.2. Affine-invariant Laser Scan Segment Descriptor

- Calculation of the local vectors ${\overrightarrow{f}}_{i}$ and ${\overrightarrow{b}}_{i}$ associated with each range reading i. These vectors represent the variation along the X and Y axes between range readings i and i + ${t}_{f}[i]$, and between i and i − ${t}_{b}[i]$. If (${x}_{i}$, ${y}_{i}$) are the Cartesian coordinates of the range reading i, the local vectors associated with i are defined as$$\begin{array}{l}{\overrightarrow{f}}_{i}=({x}_{i+{t}_{f}[i]}-{x}_{i},{y}_{i+{t}_{f}[i]}-{y}_{i})=({f}_{{x}_{i}},{f}_{{y}_{i}})\\ {\overrightarrow{b}}_{i}=({x}_{i-{t}_{b}[i]}-{x}_{i},{y}_{i-{t}_{b}[i]}-{y}_{i})=({b}_{{x}_{i}},{b}_{{y}_{i}})\end{array}$$
- Calculation of the TAR associated to each range reading. The signed area of the triangle at contour point i is given by [39]:$${\kappa}_{i}=\frac{1}{2}\left|\begin{array}{ccc}{b}_{{x}_{i}}& {b}_{{y}_{i}}& 1\\ 0& 0& 1\\ {f}_{{x}_{i}}& {f}_{{y}_{i}}& 1\end{array}\right|$$
- TAR Normalization. The TAR of the whole laser scan segment, ${\{{\kappa}_{i}\}}_{i=1}^{N}$, is normalized by dividing it by its absolute maximum value.
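The three TAR steps above can be sketched as follows, with a fixed region of support t for clarity (the paper uses the adaptive ${t}_{f}[i]$ and ${t}_{b}[i]$); the signed area is the 3×3 determinant given in the text, which reduces to a 2D cross product.

```python
import numpy as np

# Sketch of the triangle-area representation (TAR): local vectors, signed
# triangle area per reading, then normalization by the absolute maximum.
# A fixed support t replaces the adaptive t_f[i]/t_b[i] for brevity.

def tar(points, t=1):
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    kappa = np.zeros(n)
    for i in range(t, n - t):
        fx, fy = pts[i + t] - pts[i]       # forward local vector
        bx, by = pts[i - t] - pts[i]       # backward local vector
        # The 3x3 determinant from the text equals b_y*f_x - b_x*f_y.
        kappa[i] = 0.5 * (by * fx - bx * fy)
    m = np.max(np.abs(kappa))
    return kappa / m if m > 0 else kappa   # normalize by absolute maximum

# Contour of a square: corners give |kappa| = 1, straight stretches 0.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
k = tar(square, t=1)
```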

#### 4.3. Laser Scan Descriptor Under General Affine Transformations

Under a general affine transformation, the Cartesian coordinates of the range readings, ${\{{x}_{i},{y}_{i}\}}_{i=1}^{N}$, become$$\begin{array}{l}{\widehat{x}}_{i}=a{x}_{i}+b{y}_{i}+{t}_{1}\\ {\widehat{y}}_{i}=c{x}_{i}+d{y}_{i}+{t}_{2}\end{array}$$where a, b, c and d define the linear part of the transformation and ${t}_{1}$ and ${t}_{2}$ represent the translation. By substituting the expressions for ${\{{\widehat{x}}_{i},{\widehat{y}}_{i}\}}_{i=1}^{N}$ into Equation 13, we obtain ${\widehat{\kappa}}_{i}$ = (ad − bc)${\kappa}_{i}$, where ${\widehat{\kappa}}_{i}$ is the affine-transformed version of ${\kappa}_{i}$. As the TAR is normalized by its maximum absolute value, the representation is invariant to affine transformations.
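The relation κ̂ᵢ = (ad − bc)κᵢ, and hence the invariance of the normalized TAR, can be checked numerically. Fixed support and an arbitrary transform are used; all values are illustrative.

```python
import numpy as np

# Numerical check of the invariance claim: under an affine map
# (x, y) -> (a x + b y + t1, c x + d y + t2), each signed triangle area is
# scaled by the constant factor (a d - b c), so the max-normalized TAR is
# unchanged (for a positive determinant).

def signed_areas(pts, t=1):
    pts = np.asarray(pts, dtype=float)
    out = []
    for i in range(t, len(pts) - t):
        fx, fy = pts[i + t] - pts[i]
        bx, by = pts[i - t] - pts[i]
        out.append(0.5 * (by * fx - bx * fy))
    return np.array(out)

rng = np.random.default_rng(1)
pts = rng.normal(size=(12, 2))                 # arbitrary "scan segment"
A = np.array([[1.3, 0.4], [-0.2, 0.8]])        # linear part (a b; c d)
t_vec = np.array([5.0, -2.0])                  # translation (t1, t2)
mapped = pts @ A.T + t_vec

k1 = signed_areas(pts)
k2 = signed_areas(mapped)
scale = np.linalg.det(A)                       # equals a*d - b*c
```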

#### 4.4. Experimental Results

Each measured range ${r}_{i}$ is perturbed by a systematic error and a statistical error, the latter usually assumed to follow a Gaussian distribution with zero mean. In order to compensate for the systematic error, it has been approximated by a sixth-order polynomial which fits the differences between the measured distance and the true obstacle distance in the least-squares sense (see [14] for further details).
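The compensation step can be sketched as follows; the calibration data below is synthetic and the bias model is illustrative, not the sensor characterization from [14].

```python
import numpy as np

# Sketch of the systematic-error compensation described above: the offset
# between measured and true distance is modeled as a sixth-order polynomial
# in the measured range and subtracted from new readings.

# Hypothetical calibration set: true distances and systematically biased
# measurements (here a 1% scale error plus a 5 mm offset, purely synthetic).
true_d = np.linspace(0.5, 8.0, 200)
measured = 1.01 * true_d + 0.005

# Least-squares fit of the measurement error as a function of measured range.
coeffs = np.polyfit(measured, measured - true_d, deg=6)

def compensate(r):
    """Remove the fitted systematic error from a measured range."""
    return r - np.polyval(coeffs, r)
```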

## 5. Comparative Study

The simulated range readings are corrupted with zero-mean Gaussian noise of standard deviations ${\sigma}_{r}$ = 5 mm and ${\sigma}_{\varphi}$ = 0.1 degrees. Each test scan consists of 360 range readings and represents several line and curve segments.

Extracted features are validated against the ground truth using a ${\chi}^{2}$-test with a matching gate value of 2.77 (75% confidence interval). Then, the extracted segments are matched to true segments using a nearest-neighbor algorithm. Experimental results are shown in Table 1. The correctness of the methods is measured by the rates of true and false positives [13], and their precision by the variances of the errors in the estimated parameters, where ${d}_{i}$ and ${\alpha}_{i}$ are the line parameters of the corresponding matched line and, for curve segments, ${x}_{c}$ and ${y}_{c}$ define the center of the circle and ρ its radius; the error distributions are assumed to be Gaussian. From Table 1, it can be noted that the IEPF and SM algorithms are the fastest. The proposed method is faster than the CUBA, CSS and HT approaches. Besides, the CSS, CUBA and proposed algorithms are the only methods that do not split curve segments into short straight-line segments; therefore, they achieve the best scores in terms of correctness and precision for curve segments.
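The χ²-gated nearest-neighbor matching used in this evaluation can be sketched as follows; the covariance and parameter values are illustrative, and 2.77 is the 75% gate for two degrees of freedom.

```python
import numpy as np

# Sketch of the validation step: an extracted segment is matched to the
# nearest ground-truth segment whose Mahalanobis distance (squared) falls
# inside the chi-square gate (2.77 = 75% confidence, 2 degrees of freedom).

GATE = 2.77

def gated_match(extracted, truths, cov):
    """Return index of the nearest truth inside the gate, or None."""
    cov_inv = np.linalg.inv(cov)
    best, best_d2 = None, GATE
    for k, t in enumerate(truths):
        diff = extracted - t
        d2 = float(diff @ cov_inv @ diff)   # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = k, d2
    return best

cov = np.diag([0.01**2, np.deg2rad(0.6)**2])       # (d [m], alpha [rad])
truths = np.array([[1.0, 0.0], [2.0, np.pi / 2]])  # true (d, alpha) lines
hit = gated_match(np.array([1.005, 0.002]), truths, cov)   # inside the gate
miss = gated_match(np.array([1.5, 0.3]), truths, cov)      # outside -> None
```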

## 6. Conclusions and Future Works

## 7. Glossary

- Odometry: the use of data from proprioceptive sensors (actuators) to estimate the change in the robot pose over time. Current robots use odometry to estimate (not determine) their pose relative to an initial location.
- Localization: the knowledge of the position and orientation of the robot in the working environment at every instant of time. From a local point of view, given a map of the environment and an initial pose (x-y position and θ orientation), the localization task consists of tracking the mobile agent around the environment.
- Mapping: the problem of acquiring a spatial model of a robot's environment. Usually, mapping algorithms obtain an instantaneous local representation of the scene from the current sensor reading, including static and dynamic objects; a global map is then built with the static objects only.
- SLAM: a technique used by autonomous mobile robots to build a map of an unknown environment while simultaneously keeping track of their current position within it.
- Natural landmarks: features which are determined by the system and detected according to some criteria. Natural landmarks are selected directly in the natural scene on the basis of their geometrical or photometrical features.
- Segmentation: the process of classifying the scan data into several groups, each of which is possibly associated with a different structure of the environment.
- Breakpoints: scan discontinuities due to a change of the surface being scanned by the laser sensor.

## Acknowledgments

## References and Notes

1. Roumeliotis, S.; Bekey, G.B. SEGMENTS: A layered, dual-Kalman filter algorithm for indoor feature extraction. Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, Takamatsu, Japan, October 30 – November 5, 2000; pp. 454–461.
2. Tardós, J.; Neira, J.; Newman, P.; Leonard, J. Robust mapping and localization in indoor environments using sonar data. Int. J. Robot. Res. **2002**, *40*, 311–330.
3. Crowley, J. World modeling and position estimation for a mobile robot using ultrasonic ranging. Proceedings of the 1989 IEEE International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 14–19, 1989; pp. 674–680.
4. Gutmann, J.; Schlegel, C. AMOS: Comparison of scan matching approaches for self-localization in indoor environments. Proceedings of the 1st Euromicro Workshop on Advanced Mobile Robots, Kaiserslautern, Germany, October 9–11, 1996; pp. 61–67.
5. Yagub, M.T.; Katupitaya, J. Line segment based scan matching for concurrent mapping and localization of a mobile robot. Proceedings of the 2006 IEEE International Conference on Control, Automation, Robotics and Vision, Singapore, December 5–8, 2006; pp. 1–6.
6. Lingemann, K.; Nüchter, A.; Hertzberg, J.; Surmann, H. High-speed laser localization for mobile robots. Robot. Auton. Syst. **2005**, *51*, 275–296.
7. Kosaka, A.; Kak, A.C. Fast vision-guided mobile robot navigation using model-based reasoning and prediction of uncertainties. Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, USA, July 7–10, 1992; pp. 2177–2186.
8. Ayache, N.; Faugeras, O. Maintaining representations of the environment of a mobile robot. IEEE Trans. Robot. Autom. **1989**, *5*, 804–819.
9. Atiya, S.; Hager, G. Real-time vision-based robot localization. IEEE Trans. Robot. Autom. **1993**, *9*, 785–800.
10. Se, S.; Lowe, D.; Little, J. Vision-based mobile robot localization and mapping using scale-invariant features. Proceedings of the 2001 IEEE International Conference on Robotics and Automation, Seoul, Korea, May 21–26, 2001; pp. 2051–2058.
11. Martin, M. Evolving visual sonar: Depth from monocular images. Pattern Recogn. Lett. **2006**, *27*, 1174–1180.
12. Xu, K.; Luger, G. The model for optimal design of robot vision systems based on kinematic error correction. Image Vis. Comput. **2007**, *25*, 1185–1193.
13. Nguyen, V.; Gächter, S.; Martinelli, A.; Tomatis, N.; Siegwart, R. A comparison of line extraction algorithms using 2D range data for indoor mobile robotics. Auton. Robots **2007**, *23*, 97–111.
14. Núñez, P.; Vázquez-Martín, R.; del Toro, J.; Bandera, A.; Sandoval, F. Natural landmark extraction for mobile robot navigation based on an adaptive curvature estimation. Robot. Auton. Syst. **2008**, *56*, 247–264.
15. Núñez, P.; Vázquez-Martín, R.; Bandera, A.; Sandoval, F. An algorithm for fitting 2-D data on the circle: applications to mobile robotics. IEEE Signal Process. Lett. **2008**, *15*, 127–130.
16. Núñez, P.; Vázquez-Martín, R.; del Toro, J.; Bandera, A.; Sandoval, F. Feature extraction from laser scan data based on curvature estimation for mobile robotics. Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, FL, USA, May 15–19, 2006; pp. 1167–1172.
17. Castellanos, J.; Tardós, J. Laser-based segmentation and localization for a mobile robot. In Robotics and Manufacturing: Recent Trends in Research and Applications 6; Jamshidi, M., Pin, F., Dauchez, P., Eds.; ASME Press: New York, NY, USA, 1996.
18. Zhang, L.; Ghosh, B.K. Line segment based map building and localization using 2D laser rangefinder. Proceedings of the 2000 IEEE International Conference on Robotics and Automation, San Francisco, CA, USA, April 24–28, 2000; pp. 2538–2543.
19. Borges, G.; Aldon, M. Line extraction in 2D range images for mobile robotics. J. Intell. Robot. Syst. **2004**, *40*, 267–297.
20. Nguyen, V.; Martinelli, A.; Tomatis, N.; Siegwart, R. A comparison of line extraction algorithms using 2D laser rangefinder for indoor mobile robotics. Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, Alberta, Canada, August 2–5, 2005; pp. 1929–1934.
21. Iocchi, L.; Nardi, D. Hough localization for mobile robots in polygonal environments. Robot. Auton. Syst. **2002**, *40*, 43–58.
22. Pfister, S.; Roumeliotis, S.; Burdick, J. Weighted line fitting algorithms for mobile robot map building and efficient data representation. Proceedings of the 2003 IEEE International Conference on Robotics and Automation, Taipei, Taiwan, September 14–19, 2003; pp. 1304–1311.
23. Bandera, A.; Pérez-Lorenzo, J.; Bandera, J.; Sandoval, F. Mean shift based clustering of Hough domain for fast line segment detection. Pattern Recogn. Lett. **2006**, *27*, 578–586.
24. Martínez-Cantín, R.; Castellanos, J.; Tardós, J.; Montiel, J. Adaptive scale robust segmentation for 2D laser scanner. Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 9–15, 2006; pp. 796–801.
25. Mokhtarian, F.; Mackworth, A. Scale-based description and recognition of planar curves and two-dimensional shapes. IEEE Trans. Pattern Anal. Mach. Intell. **1986**, *8*, 34–43.
26. Fontoura, L.; Marcondes, R. Shape Analysis and Classification; CRC Press: Boca Raton, FL, USA, 2001.
27. Agam, G.; Dinstein, I. Geometric separation of partially overlapping nonrigid objects applied to automatic chromosome classification. IEEE Trans. Pattern Anal. Mach. Intell. **1997**, *11*, 1211–1222.
28. Liu, H.; Srinath, D. Partial shape classification using contour matching in distance transformation. IEEE Trans. Pattern Anal. Mach. Intell. **1990**, *11*, 1072–1079.
29. Arrebola, F.; Bandera, A.; Camacho, P.; Sandoval, F. Corner detection by local histograms of the contour chain code. Electron. Lett. **1997**, *33*, 1769–1771.
30. Bandera, A.; Urdiales, C.; Arrebola, F.; Sandoval, F. Corner detection by means of adaptively estimated curvature function. Electron. Lett. **2000**, *36*, 124–126.
31. Reche, P.; Urdiales, C.; Bandera, A.; Trazegnies, C.; Sandoval, F. Corner detection by means of contour local vectors. Electron. Lett. **2002**, *38*, 699–701.
32. Madhavan, R.; Durrant-Whyte, H. Natural landmark-based autonomous vehicle navigation. Robot. Auton. Syst. **2004**, *46*, 79–95.
33. Arras, K.; Siegwart, R. Feature extraction and scene interpretation for map based navigation and map building. Proceedings of SPIE Mobile Robotics XII, Vol. 3210, Pittsburgh, PA, USA, October 14–17, 1997; pp. 42–53.
34. Chernov, N.; Lesort, C. Least squares fitting of circles. J. Math. Imaging Vis. **2005**, *23*, 239–251.
35. Shakarji, C. Least-squares fitting algorithms of the NIST algorithm testing system. J. Res. Natl. Inst. Stand. Technol. **1998**, *103*, 633–641.
36. Diosi, A.; Kleeman, L. Uncertainty of Line Segments Extracted from Static SICK PLS Laser Scans; Technical Report MECSE-26-2003; Department of Electrical and Computer Systems Engineering, Monash University, 2003.
37. Zhang, S.; Xie, L.; Adams, M. Feature extraction for outdoor mobile robot navigation based on a modified Gauss-Newton optimization approach. Robot. Auton. Syst. **2006**, *54*, 277–287.
38. Teh, C.; Chin, R. On the detection of dominant points on digital curves. IEEE Trans. Pattern Anal. Mach. Intell. **1989**, *11*, 859–872.
39. Alajlan, N.; Rube, I.E.; Kamel, M.; Freeman, G. Shape retrieval using triangle-area representation and dynamic space warping. Pattern Recogn. **2007**, *40*, 1911–1920.
40. Howard, A.; Roy, N. The Robotics Data Set Repository (Radish). Available online: http://radish.sourceforge.net/.

**Figure 1.** (a) Two laser range sensors widely used in robotics: a SICK LMS200 and a Hokuyo URG-04LX. (b) Natural landmarks detected and characterized in this work: breakpoints, rupture points, line and curve segments, corners and edges.

**Figure 3.**(a) Segment of a single laser scan (⃞-breakpoints, o-corners). (b) Curvature function associated to (a).

**Figure 4.** (a)–(b) Laser scans and extracted breakpoints (squares). It must be noted that segments of the laser scan with fewer than ten range readings are not taken into account (they are drawn without boxes in the figures).

**Figure 5.**(a) A real environment where CUBA algorithm has been tested; (b) scan data acquired by the laser range sensor; (c) curvature function associated to (a) using the method proposed in [16]; and (d) natural landmarks (triangle-corners, square-end-points of line segments, o-circles) with their associated uncertainties (ellipses in the images).

**Figure 6.** A real corner is usually not located at one of the laser range readings (readings are marked as blue dots over the detected line segments).

**Figure 7.**An edge is defined as a breakpoint associated to the end-point of a plane surface which is not occluded by any other obstacle.

**Figure 8.** Calculation of the maximum length of contour presenting no significant discontinuity on the right side of range reading i (${t}_{f}[i]$): (a) part of the laser scan and point i; (b) scan data acquired by the laser range sensor; and (c) evolution of the area delimited by the arc and the chord ( ${\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{a}|-({\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{a}|\cap {\sum}_{j=i}^{i+{t}_{f}[i]-1}|{t}_{j}^{c}|)$ ). It can be noted that this area increases sharply when ${t}_{f}[i]$ ≥ 8. This change allows the correct ${t}_{f}[i]$ value to be estimated, and it is detected in our approach using Equation 12 (in this case, ${t}_{f}[i]$ = 8).

**Figure 9.**(a) Laser scan #1. (b) Adaptive TAR associated to scan segment A in (a). (c) Laser scan #2. (d) Adaptive TAR associated to scan segment A in (c).

**Figure 10.**(a) Dominant points detected from the TAR obtained using an adaptive triangle side length; and (b) dominant points detected from the TAR obtained using a t value equal to 3 (left image) or t value equal to 15 (right image).

**Figure 11.**(a) The first test area, an office-like environment sited at the Technology Park of Andalusia (Málaga); and (b) the map of a part of the Intel Jones Farms Campus in Hillsboro, Oregon (source: the Radish repository http://radish.sourceforge.net/).

**Figure 12.**(a) Laser scan #3. (b) Curvature functions associated to (a). (c) Laser scan #4. (d) Curvature functions associated to (c).

**Figure 14.** (a)–(b) Results of the proposed algorithm from different poses in the same indoor environment. The algorithm provides the same results from both robot poses (see details in figure (b)); (c)–(d) CUBA algorithm results for the same tests. The CUBA algorithm splits the curve segment into different parts because it is not invariant to the robot pose.

**Table 1.** Comparative results of the segmentation algorithms.

| Algorithm | SM | SMF | IEPF | CSS | HT | CUBA | Proposed |
|---|---|---|---|---|---|---|---|
| Execution time (ms) | 7.1 | 14.6 | 4.2 | 39.1 | 33.4 | 13.6 | 10.1 |
| TruePos | 0.77 | 0.79 | 0.78 | 0.93 | 0.73 | 0.92 | 0.93 |
| FalsePos | 0.23 | 0.23 | 0.26 | 0.03 | 0.28 | 0.02 | 0.02 |
| σ_{Δd} [mm] | 15.2 | 12.3 | 17.2 | 10.4 | 9.8 | 10.2 | 10.2 |
| σ_{Δα} [deg] | 0.82 | 0.63 | 0.79 | 0.58 | 0.58 | 0.60 | 0.59 |
| σ_{Δxc} [mm] | 14.1 | 13.1 | 13.9 | 10.1 | 13.0 | 9.8 | 9.7 |
| σ_{Δyc} [mm] | 12.6 | 12.9 | 13.1 | 9.8 | 12.7 | 9.9 | 9.6 |
| σ_{Δρ} [mm] | 8.3 | 7.9 | 8.5 | 7.9 | 8.5 | 6.5 | 6.1 |

© 2009 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

**MDPI and ACS Style**

Vázquez-Martín, R.; Núñez, P.; Bandera, A.; Sandoval, F.
Curvature-Based Environment Description for Robot Navigation Using Laser Range Sensors. *Sensors* **2009**, *9*, 5894-5918.
https://doi.org/10.3390/s90805894
