Abstract

Urban road environments with pavement and curbs are characterized as semistructured road environments, in which the curb provides useful information for robot navigation. In this paper, we present a practical localization method for outdoor mobile robots that uses curb features in semistructured road environments. The curb features are especially useful in urban environments, where GPS failures occur frequently. Curb extraction is performed on the basis of Kernel Fisher Discriminant Analysis (KFDA) to minimize false detections. We adopt the Extended Kalman Filter (EKF) to combine the curb information with odometry and Differential Global Positioning System (DGPS) measurements. The uncertainty models of the sensors are quantitatively analyzed to provide a practical solution.

1. Introduction

Outdoor environments have irregular shapes and exhibit changes in geometry and illumination due to weather conditions; consequently, environmental uncertainty is relatively high. Numerous studies have addressed autonomous navigation of mobile robots in outdoor environments. Typical examples are the autonomous vehicles developed through the DARPA Grand/Urban Challenges [1, 2]. Most of these vehicles were equipped with a variety of high-cost sensors in order to overcome the various uncertainties.

The aim of this work is to develop a practical localization method for outdoor mobile robots. In particular, this study focuses on surveillance robots in urban road environments. A localization method that uses a small number of sensors is proposed instead of relying on multiple high-cost sensors.

The fusion of a global positioning system (GPS) and an inertial measurement unit (IMU) has been widely used for the outdoor localization of mobile robots [3, 4]. However, it is difficult to ensure accurate pose estimation in dense urban environments, where the GPS signal is degraded by multipath errors and satellite blockage. Therefore, the use of environmental features has been studied to enhance the precision of the estimated robot pose in dense urban environments [5, 6]. In [7], a 3D representation of the local environment is used to detect obstruction of GPS signals by buildings. In [8, 9], extracted line features of buildings are used to estimate the robot position. However, the available information regarding buildings is sparse in many places, and the slow update rate limits localization performance. In [10], the road centerline is extracted to correct the lateral position of the vehicle on mountainous forested paths. However, the extracted road centerline does not allow correction of the heading error.

Generally, urban road environments are paved, and curbs act as the boundaries of the roads. Therefore, urban road environments are characterized as semistructured road environments, in which the curb provides useful information for robot navigation. Consequently, curb features have been widely used in navigation strategies and localization methods. In [11], Wijesoma et al. propose an EKF-based method for detection and tracking of curbs, in which the range and angle of the curbs are obtained from LRF measurements. However, a quantitative performance analysis of the curb detection was not clearly presented. In [12, 13], the curb on one side of the road is extracted using a vertical LRF for vehicle localization, and the lateral error of the vehicle is reduced by a map-matching approach using the extracted curb point. However, the correction of the heading angle is not considered, because the extracted curb point provides no angle information.

A method for traversable region detection using road features such as the road surface, curbs, and obstacles was proposed in our previous works [14–16]. Curb candidates are obtained from the geometric features of the road. In order to extract the correct curb among the curb candidates, a validation gate is derived from principal component analysis (PCA). The curb extraction is performed successfully in a road environment. However, PCA has a fundamental limitation: the validation gate does not consider the classification of curb and noncurb data. Consequently, false detections, in which noncurb data are extracted as curbs, may occur. The precision of the estimated pose decreases when falsely detected data are used for localization. Therefore, it is important to reduce the false detection rate.

The contribution of this paper can be summarized by two schemes. The first contribution is a robust curb extraction scheme using a single Laser Range Finder (LRF). In order to reduce the number of false curb detections, curb data and noncurb data are classified using Kernel Fisher Discriminant Analysis (KFDA) [17–19]. On the basis of our previous works, geometrical features of the curb are defined. The second contribution is an integrated localization scheme that uses curb features on the basis of quantitative sensor uncertainty models. In particular, the uncertainty model for the extracted curb is determined quantitatively from experiments. An Extended Kalman Filter (EKF) is exploited to combine the curb features with odometry and Differential Global Positioning System (DGPS) information. The extracted curbs enable accurate estimation of the robot pose even when temporary GPS blackouts take place.

The remainder of this paper is organized as follows. A method for curb extraction is presented in Section 2. Section 3 describes the localization method for an outdoor mobile robot using the extracted curb. It also introduces the uncertainty models for the sensor measurements. The experimental results of the proposed method are presented in Section 4. Finally, the conclusion of this research is presented in Section 5.

2. Road Feature Extraction

2.1. System Configuration

Figure 1 shows the configuration of the mobile robot and the LRF coordinate frame. A single onboard LRF, tilted toward the ground at a nominal tilt angle, is used to extract road features such as the road surface and curbs. The following nomenclature is used for road feature extraction; all variables are described with respect to the robot coordinate frame:
- road width;
- nominal tilt angle of the LRF;
- angle between the detected road surface points and the lateral axis of the robot frame;
- horizontal look-ahead distance from the robot to the road surface points;
- coordinates of the right curb edge (point C);
- angle between the right curb and the lateral axis of the robot frame;
- coordinates of the left curb edge (point B);
- angle between the left curb and the lateral axis of the robot frame.

2.2. Road Feature Detection

A road feature detection scheme was proposed in our previous works [14, 15] and is briefly reviewed here. The first step of road feature detection is road surface extraction. In order to identify the road surface, the LRF measurements in the expected road region are selected as candidates for the road surface. Multiple line segments are constructed by combining consecutive candidate points. In the ideal case, the road surface is parallel to the lateral axis of the robot frame. Therefore, the range data are saved as road surface points if the angle of the corresponding line segment is within a given threshold. The road surface is then extracted as a line by the least squares method. The extracted road surface provides the road surface angle and the horizontal look-ahead distance from the robot. The overall procedure is summarized in Algorithm 1.

(1)    P ← the LRF measurement.
(2)  (x_i, y_i) ← coordinates of the i-th LRF measurement point.
(3)  N ← the number of LRF measurement points.
(4)  R ← expected region of the road surface points.
(5)  for i = 1 to N
(6)     if (x_i, y_i) ∈ R, then
(7)    save (x_i, y_i) as a candidate for the road surface.
(8)    θ_j ← angle of the line segment formed by consecutive candidates.
(9)    M ← the number of candidates for the road surface.
(10)  end if
(11)  end for
(12) for j = 1 to M
(13)  if |θ_j| < angle threshold, then
(14)     save the candidate as the road surface.
(15)  end if
(16) end for
(17) return (road surface points)
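
To make this procedure concrete, the following Python sketch implements the same idea under stated assumptions: it keeps only the scan points that fall inside an expected road region, filters consecutive-point segments by their angle with respect to the robot's lateral axis, and fits the surviving points with a least squares line. The function name, the region format, and the 5° threshold are illustrative choices, not values taken from the paper.

```python
import numpy as np

def extract_road_surface(points, region, angle_threshold_deg=5.0):
    # points: (N, 2) array of LRF points in the robot frame (x forward, y lateral).
    # region: (x_min, x_max, y_min, y_max) bounding the expected road-surface area.
    x_min, x_max, y_min, y_max = region

    # Step 1: keep candidates inside the expected road region.
    mask = ((points[:, 0] >= x_min) & (points[:, 0] <= x_max) &
            (points[:, 1] >= y_min) & (points[:, 1] <= y_max))
    candidates = points[mask]

    # Step 2: keep segments of consecutive candidates that are nearly parallel
    # to the lateral (y) axis, i.e. nearly flat road-surface segments.
    surface = []
    for p, q in zip(candidates[:-1], candidates[1:]):
        dx, dy = q[0] - p[0], q[1] - p[1]
        angle = np.degrees(np.arctan2(abs(dx), abs(dy)))  # 0 deg = parallel to y-axis
        if angle < angle_threshold_deg:
            surface.append(p)
            surface.append(q)
    if len(surface) < 2:
        return np.empty((0, 2)), None
    surface = np.unique(np.array(surface), axis=0)

    # Step 3: least squares line x = a*y + b through the road-surface points.
    a, b = np.polyfit(surface[:, 1], surface[:, 0], deg=1)
    theta_s = np.degrees(np.arctan(a))      # road-surface angle w.r.t. the lateral axis
    d_look = float(np.mean(surface[:, 0]))  # horizontal look-ahead distance to the surface
    return surface, (theta_s, d_look)
```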

Figure 2 shows the ideal model of a semistructured road environment. The geometrical features of the road can be used to extract the curb features. The curb has the following four attributes:
(Att_1) The angular difference between the curb orientation and the road surface angle is close to 90°.
(Att_2) The gap between the horizontal look-ahead distance of the road surface and the corresponding coordinate of the curb edge point (B or C) is close to 0.
(Att_3) The angular difference between the left curb and the right curb is close to 0.
(Att_4) The difference between the road width and the gap between the two curbs is close to 0. It is assumed that the road width is known.

It is commonly assumed that the robot navigates parallel to the curb. It is reasonable to assume that the robot is moving along the road without significant change of orientation in most cases. Moreover, the vertical surfaces of the curb are perpendicular to the road surface. When we scan the road environments using the LRF, the road features are composed of straight lines with different orientations. In order to distinguish different line segments that correspond to the road surface, the curbs, and the sidewalk, it is helpful to assume that the robot’s heading direction points forward along the road. The road width is assumed to be known by given map information.

The first attribute (Att_1) is used to select the curb candidates: a line segment is selected as a curb candidate if its angular difference with respect to the extracted road surface lies within a given tolerance of 90°.

Once the curb candidates are determined, attributes Att_2, Att_3, and Att_4 are used to extract the correct curb from the curb candidates. The attribute values corresponding to Att_2, Att_3, and Att_4 are numerically calculated for each pair of right and left curb candidates; the value for Att_4 is obtained from the gap between the right and left curb points. When a curb exists on only one side, two of the attribute values are computed from the curb candidate on that side, and the remaining value is assumed to be 0. The calculated attribute values form the data vector of the curb candidates.
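
The sketch below illustrates one way to assemble such a data vector for a candidate pair, given the extracted road surface and the known road width. The dictionary keys, the averaging over both sides in the Att_2 term, and the handling of the one-sided case are assumptions made for illustration; they are not the paper's exact definitions.

```python
import numpy as np

def attribute_vector(right, left, d_look, road_width):
    # right/left: dicts with 'x', 'y' (curb edge point in the robot frame) and
    #             'angle' (curb orientation in degrees), or None if that side is missing.
    # d_look:     horizontal look-ahead distance of the extracted road surface.
    # road_width: known road width taken from the map.
    if right is not None and left is not None:
        v_att2 = 0.5 * (abs(right['x'] - d_look) + abs(left['x'] - d_look))
        v_att3 = abs(right['angle'] - left['angle'])
        gap = abs(right['y'] - left['y'])           # lateral gap between the two curb points
        v_att4 = abs(road_width - gap)
    else:
        curb = right if right is not None else left
        v_att2 = abs(curb['x'] - d_look)            # computable from a single side
        v_att3 = abs(curb['angle'])                 # deviation of the single curb's orientation
        v_att4 = 0.0                                # remaining value assumed to be 0
    return np.array([v_att2, v_att3, v_att4])       # data vector for the candidate pair
```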

2.3. Curb Extraction Using Kernel Fisher Discriminant Analysis (KFDA)

KFDA is applied to extract the correct curb from the curb candidates. KFDA aims to find a discriminant function for optimal data classification; this discriminant function can therefore be used to classify the curb candidates into curb and noncurb classes. The curb extraction is conducted by the following procedure. First, training data with known class labels are selected, and the discriminant function is derived offline from the training data. Once the discriminant function is obtained, classification of the curb candidates is carried out in real time.

The discriminant function is defined as the vector that maximizes the following objective function in the kernel feature space, namely, the ratio of the between-class variance to the within-class variance of the projected data.

The "between-class variance matrix" and the "within-class variance matrix" are defined in the kernel feature space from the mapped training data; their definitions involve the class means in the feature space, the total number of training samples, and the number of samples in each class.
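
The paper's equations are not reproduced in this text. For reference, in the standard two-class KFDA formulation, with kernel k(·,·), ℓ training samples in total, and ℓ_i samples in class i, the objective and the two matrices are usually written as follows (the authors' exact notation may differ):

\[
J(\boldsymbol{\alpha}) = \frac{\boldsymbol{\alpha}^{T} M \boldsymbol{\alpha}}{\boldsymbol{\alpha}^{T} N \boldsymbol{\alpha}},
\qquad
M = (\boldsymbol{\mu}_{1} - \boldsymbol{\mu}_{2})(\boldsymbol{\mu}_{1} - \boldsymbol{\mu}_{2})^{T},
\qquad
N = \sum_{i=1,2} K_{i}\bigl(I - \mathbf{1}_{\ell_{i}}\bigr)K_{i}^{T},
\]
\[
(\boldsymbol{\mu}_{i})_{j} = \frac{1}{\ell_{i}} \sum_{x \in \text{class } i} k(x_{j}, x), \quad j = 1, \ldots, \ell,
\]
where $K_{i}$ is the $\ell \times \ell_{i}$ kernel matrix between all training samples and the samples of class $i$, $I$ is the $\ell_{i} \times \ell_{i}$ identity matrix, and $\mathbf{1}_{\ell_{i}}$ is the $\ell_{i} \times \ell_{i}$ matrix whose entries are all $1/\ell_{i}$.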

The training data consist of samples with known class labels, and each sample belongs to one of the two classes, curb or noncurb. In order to give every attribute an equivalent effect on the classification, each component of a sample is normalized using the mean and standard deviation of the corresponding attribute.

Here, the mean and standard deviation of each attribute are computed over the training set. The training data mapped to the kernel feature space are represented by a kernel matrix, and the Gaussian kernel is used for mapping the data onto the kernel feature space.
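
The kernel expression itself is missing from this text; the commonly used form of the Gaussian kernel, with the width parameter playing the role of the control parameter discussed below, is

\[
k(\mathbf{x}_{i}, \mathbf{x}_{j}) = \exp\!\left(-\frac{\lVert \mathbf{x}_{i} - \mathbf{x}_{j} \rVert^{2}}{2\sigma^{2}}\right),
\]
where σ (the paper may use a different symbol or scaling) controls how quickly the similarity decays with the distance between two normalized attribute vectors.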

The kernel width is the control parameter that needs to be tuned to improve classification performance [19]. If it is too small, overfitting may take place: although the classification accuracy on the training data increases, the classification performance on the test data becomes poor. If it is too large, underfitting may take place, and the classification performance is unsatisfactory in all cases. Therefore, the control parameter should be carefully selected. In this paper, it is tuned manually based on the experimental classification performance.

The discriminant function that maximizes the objective function described above is derived by solving an eigenvalue problem. In order to project the training data onto a one-dimensional (1D) solution space, the eigenvector that corresponds to the largest eigenvalue is taken as the discriminant function for data classification. The discriminant function is a coefficient vector whose dimension equals the number of training samples.

The training data are classified by taking the inner product of the kernel data matrix and the discriminant function, which projects the training data onto the 1D solution space, where they form two class distributions.

The class of the test data, that is, the curb candidates, can be predicted using the discriminant function α. The test data are projected onto the same 1D solution space in the same manner.

The class statistics of the projected training data are used to predict the class of the test data. The Mahalanobis distance between the projected test data and each class distribution is computed from the mean and standard deviation of that distribution in the 1D solution space. The class of the test data is determined as the class with the smallest Mahalanobis distance, and when the test data are classified as the curb class, they are extracted as the curb.
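
The complete training and classification procedure can be summarized by the following Python sketch. It follows the standard KFDA recipe (Gaussian kernel matrix, regularized generalized eigenproblem, 1D projection, and Mahalanobis-distance assignment); all function names, the regularization constant, and the label convention (1 = curb, 0 = noncurb) are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    # Pairwise Gaussian kernel values between the rows of A and the rows of B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def train_kfda(X, labels, sigma, reg=1e-3):
    # X: (n, 3) normalized attribute vectors; labels: (n,) array with 0 = noncurb, 1 = curb.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    means, N = [], np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, labels == c]                                  # n x n_c kernel block
        n_c = Kc.shape[1]
        means.append(Kc.mean(axis=1))                           # class mean in feature space
        N += Kc @ (np.eye(n_c) - np.full((n_c, n_c), 1.0 / n_c)) @ Kc.T
    diff = means[1] - means[0]
    M = np.outer(diff, diff)                                    # between-class matrix
    # Discriminant coefficients: leading eigenvector of (N + reg*I)^-1 M.
    evals, evecs = np.linalg.eig(np.linalg.solve(N + reg * np.eye(n), M))
    alpha = np.real(evecs[:, np.argmax(np.real(evals))])
    proj = K @ alpha                                            # 1D projection of training data
    stats = {c: (proj[labels == c].mean(), proj[labels == c].std()) for c in (0, 1)}
    return alpha, stats

def classify(x_test, X_train, alpha, stats, sigma):
    # Project one test sample and pick the class with the smaller Mahalanobis distance.
    y = float(gaussian_kernel(x_test[None, :], X_train, sigma) @ alpha)
    dist = {c: abs(y - mean) / std for c, (mean, std) in stats.items()}
    return min(dist, key=dist.get)                              # 1 means "extracted as curb"
```

In this sketch the curb candidates from Section 2.2 would be normalized with the training-set mean and standard deviation of each attribute before being passed to classify.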

3. Outdoor Localization Using Curb Feature

3.1. System Design

This paper adopts the EKF to estimate the robot pose using curb features. The EKF is a well-known method for mobile robot localization and sensor fusion [20–22]. When the initial pose of the robot and adequate observations are provided, the pose of the robot can be estimated by correcting the odometry error. The EKF process consists of a prediction step and an update step at each sampling time. DGPS and an LRF are used to correct the odometry errors. Figure 3 shows a block diagram of the localization process.

3.2. Measurement Uncertainty Model

The GPS error mainly arises from two factors: pseudorange errors caused by systematic effects and the geometric constellation of the satellites. In this paper, DGPS is used to minimize the pseudorange errors. The uncertainty model for DGPS is computed by considering the "dilution of precision," which reflects the geometric constellation of the satellites, and the residual pseudorange error, the so-called "user-equivalent range error" [23].

The resulting error covariance of the DGPS measurement is constructed from these quantities, and the elements of the covariance matrix are provided by the DGPS receiver in real time.
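
The covariance expression is not reproduced in this text. A common way of assembling it, assuming the horizontal error is split into east and north components, multiplies the corresponding dilution-of-precision values by the user-equivalent range error:

\[
\sigma_{E} = \mathrm{EDOP}\cdot\sigma_{\mathrm{UERE}}, \qquad
\sigma_{N} = \mathrm{NDOP}\cdot\sigma_{\mathrm{UERE}}, \qquad
R_{\mathrm{DGPS}} = \begin{bmatrix} \sigma_{E}^{2} & 0 \\ 0 & \sigma_{N}^{2} \end{bmatrix}.
\]
Since the receiver used here reports the covariance elements directly, the actual matrix may contain nonzero off-diagonal terms.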

Several studies have proposed error covariance models for line features [24, 25]. However, these uncertainty models are appropriate for a static condition or low-speed driving on a flat surface, whereas outdoor mobile robots are usually driven over uneven terrain. The measurement noise of the extracted curb arises from the wobble of the robot. Therefore, the uncertainty model for the curb needs to be defined through experiments under the actual driving conditions.

In order to define the error covariance of the curb measurement, we consider the noise model of the extracted curb shown in Figure 4. The information of the extracted curb contains estimation errors in range and angle; the angle and range measurements of the extracted curb are therefore composed of the "true" angle and the "true" range plus estimation errors with random variances. The ground truth of the curb location can be measured by an additional LRF attached to the side of the robot. Because this additional LRF directly faces the curb, its range data provide reliable geometric information about the curb during the robot's movement. The curb measurement from the additional LRF is more accurate than that from the forward-pointing tilted LRF, because the number of range data points that correspond to the curb is much larger. The curb is represented as a straight line by applying the least squares method, and the ground truth provides the relative range and orientation of the curb. The estimation errors are then computed for the extracted curbs, and the error covariance of the curb is defined by the distribution of these estimation errors. Using a large number of measured range and angle estimation errors, the error covariance is calculated as a sample covariance.
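
The covariance equation is not reproduced in this text; a standard sample-covariance form over $n$ recorded range errors $\varepsilon_{r,k}$ and angle errors $\varepsilon_{\theta,k}$ (the paper may use a diagonal variant) is

\[
R_{\mathrm{curb}} = \begin{bmatrix} \sigma_{r}^{2} & \sigma_{r\theta} \\ \sigma_{r\theta} & \sigma_{\theta}^{2} \end{bmatrix},
\qquad
\sigma_{r}^{2} = \frac{1}{n}\sum_{k=1}^{n}\bigl(\varepsilon_{r,k}-\bar{\varepsilon}_{r}\bigr)^{2},
\quad
\sigma_{\theta}^{2} = \frac{1}{n}\sum_{k=1}^{n}\bigl(\varepsilon_{\theta,k}-\bar{\varepsilon}_{\theta}\bigr)^{2},
\quad
\sigma_{r\theta} = \frac{1}{n}\sum_{k=1}^{n}\bigl(\varepsilon_{r,k}-\bar{\varepsilon}_{r}\bigr)\bigl(\varepsilon_{\theta,k}-\bar{\varepsilon}_{\theta}\bigr).
\]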

The covariance matrices are experimentally defined as constant values for the left and right curbs.

3.3. Extended Kalman Filter Localization Using Curb Feature

The odometry data from the wheel encoders are used to predict the robot pose. Using the incremental travel distance and the incremental orientation obtained from the encoders, the predicted robot pose is propagated from the previous pose.
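
The prediction equation is not reproduced in this text; a widely used odometry model with incremental distance $\Delta d_{k}$ and incremental orientation $\Delta\theta_{k}$ (shown here as a representative form, not necessarily the paper's exact equation) is

\[
\begin{bmatrix} \hat{x}_{k} \\ \hat{y}_{k} \\ \hat{\theta}_{k} \end{bmatrix}
=
\begin{bmatrix}
x_{k-1} + \Delta d_{k}\cos\!\bigl(\theta_{k-1} + \tfrac{\Delta\theta_{k}}{2}\bigr) \\
y_{k-1} + \Delta d_{k}\sin\!\bigl(\theta_{k-1} + \tfrac{\Delta\theta_{k}}{2}\bigr) \\
\theta_{k-1} + \Delta\theta_{k}
\end{bmatrix}.
\]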

The state vector is the predicted robot pose at the current sampling time. Given the initial pose of the robot, the pose is represented by its two position coordinates and its heading angle in a global coordinate frame. The uncertainty of the predicted robot pose increases continuously because of the accumulated odometry errors.

In order to correct the odometry error, the first measurement correction is performed using DGPS measurements. The update frequency is set to 1 Hz on the basis of the DGPS measurement frequency. When a position measurement is available, the observation vector is given by the measured position in global coordinates.

The observation model relates the current state to the measured global position, and the measurement Jacobian matrix is obtained by differentiating this observation model with respect to the state.
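
Since the equations are not reproduced in this text, the standard reduction for a direct position measurement of the state $(x_{k}, y_{k}, \theta_{k})$ is noted here:

\[
h_{\mathrm{DGPS}}(x_{k}, y_{k}, \theta_{k}) = \begin{bmatrix} x_{k} \\ y_{k} \end{bmatrix},
\qquad
H_{\mathrm{DGPS}} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}.
\]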

The second measurement correction is conducted using the curb features. The observation vector for an extracted curb consists of its measured range and angle with respect to the robot. When curbs are extracted on both sides, there are two measurement vectors, one for the left curb and one for the right curb, as shown in Figure 5. The second correction is applied sequentially, first with the right curb and then with the left curb. The update rate is 5 Hz when the curbs are continuously extracted.

The robot pose is corrected by comparing the extracted curb with a curb map. The extracted curb is matched with the corresponding line of the curb map, and the observation model predicts the curb measurement from that map line and the robot pose at the current time.

The corresponding Jacobian matrix is obtained by differentiating this observation model with respect to the robot pose.
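
The paper's observation model is not reproduced in this text. A commonly used model for a map line stored in normal form $(\alpha_{i}, \rho_{i})$ in the global frame, observed from the robot pose $(x_{k}, y_{k}, \theta_{k})$ as a range-angle pair, together with its Jacobian, is

\[
h_{\mathrm{curb}} =
\begin{bmatrix}
\rho_{i} - \bigl(x_{k}\cos\alpha_{i} + y_{k}\sin\alpha_{i}\bigr) \\
\alpha_{i} - \theta_{k}
\end{bmatrix},
\qquad
H_{\mathrm{curb}} =
\begin{bmatrix}
-\cos\alpha_{i} & -\sin\alpha_{i} & 0 \\
0 & 0 & -1
\end{bmatrix}.
\]
Whether the paper uses exactly this parameterization of the curb lines is an assumption; the structure of the correction, however, is the same.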

The consistency of the EKF relies on the observation model: if an erroneous sensor observation is provided, the filter does not produce a consistent result. Therefore, outliers that lie outside the uncertainty bounds should be rejected. A normalized innovation squared (NIS) test is implemented in order to confirm the consistency of the filter. The NIS value follows a chi-square distribution whose degrees of freedom equal the dimension of the measurement, and it is computed from the innovation and the innovation covariance matrix. The NIS value of a valid measurement should lie within the chi-square threshold; measurements whose NIS value lies outside the threshold boundary are discarded as erroneous. A threshold value that corresponds to a 95% confidence region is used for outlier rejection.
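
A minimal sketch of this gate, assuming a two-dimensional curb measurement and the 95% chi-square threshold, is given below; the function name and interface are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def nis_gate(innovation, H, P, R, confidence=0.95):
    # innovation: z - h(x_pred), shape (m,)
    # H: measurement Jacobian (m x n), P: state covariance (n x n), R: measurement covariance (m x m)
    S = H @ P @ H.T + R                                   # innovation covariance
    nis = float(innovation @ np.linalg.solve(S, innovation))
    threshold = chi2.ppf(confidence, df=len(innovation))  # about 5.99 for df = 2 at 95%
    return nis <= threshold                               # True: accept, False: reject as outlier
```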

4. Experimental Results

4.1. Experimental Setup

Figure 6 shows the sensor system attached to the mobile robot. The outdoor mobile robot is a Pioneer P3-AT, a commercial outdoor platform from MobileRobots. The wheel encoders attached to the driving motors provide the odometry information. A SICK LMS-100 was used to detect the curb; this LRF was tilted by 5° toward the ground. A Novatel DGPS system was used to measure the global position: the robot was equipped with a rover antenna, and a base station was located on the roof of a nearby building. The ground truth for the curb was measured using an additional LRF (HOKUYO URG-04LX) attached to the side of the robot.

The experiment was performed at Korea University in Seoul, Korea, as shown in Figure 7. The robot was driven manually along the yellow dashed line from "S" to "G". The curbs along the experimental path are represented as solid green lines. The target environment has semistructured geometry composed of paved roads and curbs in most of the region. Furthermore, there are tall buildings and tunnels that degrade the DGPS precision. The travel distance along the experimental path was 775 m, and the average speed of the robot was 0.5 m/s.

4.2. Curb Extraction Results

The curb extraction process in a semistructured road environment is shown in Figure 8. Figure 8(a) shows the road environment where the experiment was performed; the red dotted line represents the scan area of the tilted LRF. Figure 8(b) shows the road surface extracted from the scanned LRF data. When the road surface is extracted, the line segments that satisfy attribute Att_1 in Section 2.2 are selected as the curb candidates. Figure 8(c) shows the line segments that correspond to the curb candidates; the first and second candidates for each side of the road are represented by blue and green lines, respectively. Since there are two candidates on each side, four pairs of candidates can be considered as curbs: (R1, L1), (R1, L2), (R2, L1), and (R2, L2). The class prediction results for each pair of curb candidates are listed in Table 1. The pairs (R1, L1) and (R1, L2) are classified as the curb class. Finally, the pair (R1, L1), which has the smallest Mahalanobis distance with respect to the curb class, is extracted as the curbs, as shown in Figure 8(d). As long as LRF data corresponding to the vertical curb surface are detected, the proposed method performs successfully even for small and less distinct curbs.

The curb extraction was performed while the robot navigated along the experimental path. Figure 9 shows the curb mapping result obtained from the curb extraction; the curb map is shown at the bottom right of Figure 9, and the extracted curbs are denoted by the magenta dots. The results indicate that most of the curbs along the experimental path were successfully extracted. In order to demonstrate the robustness of the proposed method, the classification performance of our previous method and that of the proposed method are compared in Table 2. The accuracy of the curb extraction with PCA was 91.3% for 4751 scanned datasets, and the accuracy was improved to 98.6% with the proposed KFDA. A confusion matrix is presented below. The conventional method and the proposed method show true positive rates of 97.1% and 98.2%, respectively, so the curb extraction is performed successfully by both methods. However, the most important factor in the curb extraction is the reduction of the false detection rate. A false detection means that noncurb data are misclassified as curb data, and the accuracy of the estimated pose can decrease when such false curbs are used for localization. The false detection rate was 27.4% with PCA and was reduced to 0.4% with the proposed KFDA. This result implies that most of the noncurb data were correctly classified as noncurb.

4.3. Uncertainty Measurement Results

The DGPS measurements are represented by red dots along the experimental path in Figure 10(a), and the areas with large DGPS errors are labeled A–F. Figure 10(b) shows the standard deviation of the DGPS measurements. The standard deviation of the DGPS is about 2 m in open areas. There were temporary blackouts in areas B, D, and E due to satellite blockage, and the uncertainty measurements in these areas were almost infinite, as shown in Figure 10(b). Furthermore, large DGPS errors occurred in areas A, C, and F due to degraded satellite signals. In particular, the measurement errors in area F were 2.1 m on average, and the standard deviation measured in area F was 6.4 m on average. These results indicate that the regional characteristics of the DGPS can be captured by measuring the DGPS uncertainty in real time. However, the precision of localization based only on DGPS decreases when the robot navigates near buildings.

The curb estimation errors are considered in order to define the error covariance. The most accurate quantitative estimate of the curb uncertainty is obtained by measuring the estimation errors in the experimental environment. The estimation errors were measured while the robot navigated along a road with curbs. The ground truth for the curb was provided by a line feature extracted from the data of an additional LRF attached to the side of the robot, as shown in Figure 11. These experiments were conducted to measure the estimation errors prior to the localization experiment. The following results show the estimation errors for the range and angle of the extracted curb.

Figure 12 shows the histograms of the curb estimation errors for each side. Approximately 5000 estimation error measurements were collected. The covariance matrix of the curb was computed from the distributions of the estimation errors, and the elements of the error covariance for each side of the curb are listed in Table 3.

The covariance representation for the extracted curbs is shown in Figure 13. The extracted curbs are represented by blue lines, and the error bounds for the curbs are represented by green dotted lines. The resultant curbs exist within the error bounds with 95% confidence.

4.4. Outdoor Localization Results

The localization results are shown in Figure 14. The proposed method was compared with the conventional framework for outdoor localization that combines DGPS measurements with odometry. The blue dash-dot line represents the localization results corrected only by the DGPS measurements; the estimated path deviates from the reference path (yellow dotted line) in areas A–F due to the DGPS errors. The magenta line represents the localization results corrected by DGPS and the extracted curbs. When the curb information is applied, the estimated robot position matches the reference path well, even where the DGPS errors are large or a blackout occurs (areas A–F). It is clear that the curb information plays a dominant role in these areas. The following part presents the localization errors in areas A–F shown in Figure 14, again compared with the conventional framework that combines DGPS measurements with odometry. The ground truth was computed using the additional LRF attached to the side of the robot.

Figure 15 shows the lateral errors of the localization results. The position errors obtained with the fusion of odometry and DGPS are shown in Figure 15(a); the lateral errors in each area were usually greater than 1 m, and the maximum error in area F was 5 m. In contrast, the localization result corrected by odometry, DGPS, and the curb information shows lateral errors within 0.6 m across the entire path, as shown in Figure 15(b). Therefore, the proposed method provides robust lateral position estimation despite the large DGPS errors. The heading errors of the localization results are shown in Figure 16. The result corrected only by the DGPS measurements is shown in Figure 16(a): the heading errors were greater than 1° on average, and the variance was larger than 2°. When the curb information is used for the correction, the heading errors decrease remarkably and rarely exceed 3° over the entire path. In these experiments, the proposed method showed precise and robust performance in terms of lateral and heading errors, despite the large DGPS errors and temporary blackouts.

5. Conclusion

This paper presents a localization method for outdoor robots using curb features in semistructured road environments. A reliable curb extraction scheme is proposed to classify the curb candidates into curb and noncurb classes, and most of the curbs along the experimental path are extracted with high accuracy. An EKF-based localization method is also proposed to combine the extracted curbs with odometry and DGPS measurements. The uncertainty models of the sensors are defined by experiments to provide a practical solution for localization. The robustness of the proposed method is demonstrated through real road experiments. The curb features significantly correct the lateral position and heading errors in dense areas, where the DGPS signal is degraded by buildings.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported in part by the MKE (The Ministry of Knowledge Economy), Korea, under the Human Resources Development Program for Convergence Robot Specialists Support Program supervised by the NIPA (National IT Industry Promotion Agency) (NIPA-2013-H1502-13-1001). This work was also supported by the National Research Foundation of Korea (NRF) Grant funded by Korea Government (MEST) (2013-029812).