Mathematical Problems in Engineering
Volume 2014, Article ID 368961, 12 pages
http://dx.doi.org/10.1155/2014/368961
Research Article

Localization of Outdoor Mobile Robots Using Curb Features in Urban Road Environments

1Department of Mechanical Engineering, Korea University, Anam-dong, Seongbuk-gu, Seoul 136-713, Republic of Korea
2Department of Control and Instrumentation, Korea University, Anam-dong, Seongbuk-gu, Seoul 136-713, Republic of Korea

Received 12 December 2013; Revised 13 February 2014; Accepted 13 February 2014; Published 8 April 2014

Academic Editor: Leo Chen

Copyright © 2014 Hyunsuk Lee et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Urban road environments that have pavement and curbs are characterized as semistructured road environments, in which the curb provides useful information for robot navigation. In this paper, we present a practical localization method for outdoor mobile robots that uses curb features in semistructured road environments. Curb features are especially useful in urban environments, where GPS failures occur frequently. Curb extraction is conducted on the basis of Kernel Fisher Discriminant Analysis (KFDA) to minimize false detections. We adopt the Extended Kalman Filter (EKF) to combine the curb information with odometry and Differential Global Positioning System (DGPS) measurements. The uncertainty models for the sensors are quantitatively analyzed to provide a practical solution.

1. Introduction

Outdoor environments have irregular shapes and exhibit changes in geometry and illumination due to weather conditions. Therefore, environmental uncertainty is relatively high. There are numerous studies on autonomous navigation of mobile robots in outdoor environments. Typical examples are the autonomous vehicles developed through the DARPA Grand/Urban Challenges [1, 2]. Most of these vehicles were equipped with a variety of high-cost sensors in order to overcome the various uncertainties.

The aim of this work is to develop a practical localization method for outdoor mobile robots. In particular, this study focuses on surveillance robots in urban road environments. A localization method using a small number of sensors is proposed instead of one relying on multiple high-cost sensors.

The fusion of a global positioning system (GPS) and an inertial measurement unit (IMU) has been widely used for outdoor localization of mobile robots [3, 4]. However, it is difficult to ensure accurate pose estimation in dense urban environments, where the GPS signal is degraded by multipath errors and satellite blockage. Therefore, the use of environmental features has been studied to enhance the precision of the estimated robot pose in dense urban environments [5, 6]. In [7], a 3D representation of the local environment is used to detect obstruction of the GPS signals by buildings. In [8, 9], extracted line features of buildings are used to estimate the robot position. However, the available information regarding buildings is sparse in many places, and the slow update rate limits localization performance. In [10], the road centerline is extracted to correct the lateral position of the vehicle on mountainous forested paths. However, the extracted road centerline does not allow correction of the heading error.

Generally, urban road environments are paved, and the curbs act as the boundaries of the roads. Therefore, urban road environments are characterized as semistructured road environments, in which the curb provides useful information for robot navigation. Consequently, curb features have been widely used in navigation strategies and localization methods. In [11], Wijesoma et al. propose an EKF-based method for detection and tracking of curbs, where the range and angle of the curbs are obtained from LRF measurements. However, a quantitative performance analysis of the curb detection was not clearly shown. In [12, 13], the curb on one side of the road is extracted using a vertical LRF for vehicle localization, and the lateral error of the vehicle is reduced by a map-matching approach using the extracted curb point. However, correction of the heading angle is not considered, because no angle information of the curb is available.

A method for traversable region detection using road features such as the road surface, curbs, and obstacles was proposed in our previous works [14–16]. Curb candidates are obtained from the geometric features of the road. In order to extract the correct curb among the candidates, a validation gate is derived from principal component analysis (PCA). The curb extraction is performed successfully in a road environment. However, PCA has a fundamental limitation: the validation gate does not consider the classification of curb and noncurb data. Consequently, false detections, in which noncurb data are extracted as curbs, may occur. The precision of the estimated pose decreases when falsely detected data are used for localization. Therefore, it is important to reduce the false detection rate.

The contribution of this paper can be summarized by two schemes. The first is a robust curb extraction scheme using a single Laser Range Finder (LRF). In order to reduce the number of false curb detections, curb and noncurb data are classified using Kernel Fisher Discriminant Analysis (KFDA) [17–19]. On the basis of our previous works, geometrical features of the curb are defined. The second is an integrated localization scheme using curb features on the basis of quantitative sensor uncertainty models. In particular, the uncertainty model for the extracted curb is quantitatively determined from experiments. An Extended Kalman Filter (EKF) is exploited to combine the curb features with odometry and Differential Global Positioning System (DGPS) information. The extracted curbs enable accurate estimation of the robot pose even when temporal GPS blackouts take place.

The remainder of this paper is organized as follows. A method for curb extraction is presented in Section 2. Section 3 describes the localization method for an outdoor mobile robot using the extracted curb. It also introduces the uncertainty models for the sensor measurements. The experimental results of the proposed method are presented in Section 4. Finally, the conclusion of this research is presented in Section 5.

2. Road Feature Extraction

2.1. System Configuration

Figure 1 shows the configuration of the mobile robot and the LRF coordinate frame. A single onboard LRF, mounted at a fixed tilt angle, is used to extract road features such as the road surface and curbs. The following nomenclature is used for road feature extraction; the variables are described with respect to the robot coordinate frame:
- road width;
- nominal tilt angle of the LRF;
- angle between the detected road-surface points and the lateral axis of the robot frame;
- horizontal look-ahead distance from the robot to the road-surface points;
- coordinates of the right curb edge (point C);
- angle between the right curb and the lateral axis;
- coordinates of the left curb edge (point B);
- angle between the left curb and the lateral axis.

Figure 1: The configuration of a robot and installation of a LRF.
2.2. Road Feature Detection

A road feature detection scheme was proposed in our previous works [14, 15] and is briefly reviewed here. The first step is road surface extraction. In order to identify the road surface, the LRF measurements in the expected road region are selected as candidates for the road surface. Multiple line segments are constructed by combining consecutive candidate data points. In the ideal case, the road surface is parallel to the lateral axis of the robot. Therefore, the range data are saved as road-surface points if the angle of the line segment is within a threshold. The road surface is then extracted as a line by the least-squares method. The extracted road surface provides the surface angle and the horizontal look-ahead distance from the robot. The overall algorithm is summarized in Algorithm 1.

Algorithm 1: Road surface extraction.
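As a rough illustration of this step, the following sketch filters consecutive LRF points by segment angle and fits one line through the survivors with a closed-form least-squares estimate. The function names, the point representation, and the 5° threshold are illustrative assumptions, not the paper's implementation.

```python
import math

def fit_line(points):
    # Closed-form least-squares fit y = a*x + b through (x, y) points.
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points); sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def extract_road_surface(points, angle_thresh_deg=5.0):
    """Keep consecutive point pairs whose segment angle is near 0 (parallel
    to the robot's lateral axis), then fit one line through the survivors."""
    surface = []
    for p, q in zip(points, points[1:]):
        ang = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
        if abs(ang) <= angle_thresh_deg:
            surface.extend([p, q])
    return fit_line(surface) if surface else None
```

Points belonging to a curb or sidewalk break the near-zero-angle condition and are excluded from the fit, which is the essence of the segment-angle test.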

Figure 2 shows the ideal model of a semistructured road environment. The geometrical features of the road can be used to extract the curb features. The curb has the following four attributes:
(Att_1) the angular difference between the curb orientation and the road-surface angle is close to 90°;
(Att_2) the gap between the horizontal look-ahead distance of the road surface and that of the curb point (B or C) is close to 0;
(Att_3) the angular difference between the left curb and the right curb is close to 0;
(Att_4) the difference between the road width and the gap between the two curbs is close to 0, where the road width is assumed to be known.

Figure 2: Ideal model of a semistructured environment (blue dotted line: LRF data, red line: extracted road surface).

It is commonly assumed that the robot navigates parallel to the curb; in most cases the robot moves along the road without significant changes of orientation. Moreover, the vertical surfaces of the curb are perpendicular to the road surface. When the road environment is scanned with the LRF, the road features appear as straight line segments with different orientations. In order to distinguish the line segments that correspond to the road surface, the curbs, and the sidewalk, it is helpful to assume that the robot's heading points forward along the road. The road width is assumed to be known from given map information.

The first attribute is used to select the curb candidates: the line segments whose orientation differs from the road-surface angle by approximately 90°, within a specified tolerance, are selected as curb candidates.

Once the curb candidates are determined, attributes 2, 3, and 4 are used to extract the correct curb from among the candidates. The attribute values corresponding to Att_2, Att_3, and Att_4 are numerically computed for each pair of right and left curb candidates, using the gap between the right and left curb points. When a curb exists on only one side, the values for Att_2 and Att_3 are computed from the candidates on that side, and the value for Att_4 is assumed to be 0. The attribute values are stacked into the data vector of the curb candidates.
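A minimal sketch of how the attribute values for Att_2–Att_4 might be assembled into a data vector. The symbol choices (an edge coordinate pair plus a segment angle per candidate, and the specific distance measures) are illustrative assumptions; the paper's exact formulas are not reproduced here.

```python
import math

def curb_attributes(road_x, road_width, right, left):
    """right, left: ((x, y), angle_deg) of the curb-edge candidates C and B,
    in the robot frame. Returns the data vector [v2, v3, v4]; the Att_1
    angle test is assumed to have been applied beforehand."""
    (rx, ry), r_ang = right
    (lx, ly), l_ang = left
    v2 = abs(road_x - rx) + abs(road_x - lx)   # Att_2: gap to road-surface look-ahead
    v3 = abs(r_ang - l_ang)                    # Att_3: left/right angle difference
    gap = math.hypot(rx - lx, ry - ly)         # distance between edge points B and C
    v4 = abs(road_width - gap)                 # Att_4: mismatch with known road width
    return [v2, v3, v4]
```

An ideal pair of candidates (perpendicular curbs at the road-surface look-ahead distance, separated by exactly the road width) yields a vector near zero.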

2.3. Curb Extraction Using Kernel Fisher Discriminant Analysis (KFDA)

KFDA is applied to extract the correct curb from the curb candidates. KFDA aims to find a discriminant function for optimal data classification; this discriminant function can be used to classify the curb candidates into the curb and noncurb classes. The curb extraction is conducted by the following procedure. First, training data with known class information are selected, and the discriminant function is derived from offline computation on the training data. Once the discriminant function is obtained, classification of the curb candidates is carried out in real time.

The discriminant function is defined as the vector α that maximizes the following objective function in the kernel feature space, the Rayleigh quotient of the between-class and within-class variance matrices:

J(α) = (α^T M α) / (α^T N α).

The “between-class variance matrix” M and the “within-class variance matrix” N take the standard KFDA form [17]: M = (m_1 − m_2)(m_1 − m_2)^T and N = Σ_{c=1,2} K_c (I − 1_{l_c}) K_c^T, where (m_c)_j = (1/l_c) Σ_k k(x_j, x_k^c), K_c is the kernel matrix between the training samples and the samples of class c, and 1_{l_c} is the l_c × l_c matrix with all entries 1/l_c. l is the total number of training samples, and l_c is the number of samples in class c.

The training data consist of samples with known class information; each sample belongs to either the curb class or the noncurb class. In order for each attribute to have an equivalent effect on classification, the components of each sample are normalized:

ṽ_i = (v_i − μ_i) / s_i.

Here, μ_i and s_i are the mean and standard deviation of each attribute, respectively. The training data mapped to the kernel feature space are represented by an l × l kernel matrix. The Gaussian kernel is used for mapping the data onto the kernel feature space:

k(x, y) = exp(−‖x − y‖² / σ²).

The kernel width σ is a control parameter that must be tuned to improve classification performance [19]. If σ is too small, overfitting may take place: although the classification accuracy on the training data increases, the performance on test data becomes poor. If σ is too large, underfitting may take place, and the classification performance is unsatisfactory in all cases. Therefore, σ should be carefully selected. In this paper, σ is tuned manually on the basis of experimental performance.
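The Gaussian kernel described above can be sketched as follows. The function name and the use of σ² (rather than 2σ²) in the denominator are assumptions for illustration.

```python
import math

def gaussian_kernel_matrix(X, sigma):
    """Kernel matrix K[i][j] = k(x_i, x_j) = exp(-||x_i - x_j||^2 / sigma^2).
    A small sigma tends to overfit; a large sigma tends to underfit."""
    def k(a, b):
        sq = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return math.exp(-sq / sigma ** 2)
    return [[k(a, b) for b in X] for a in X]
```

The matrix is symmetric with a unit diagonal, and off-diagonal entries decay with the squared distance between samples.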

The discriminant function that maximizes the objective function in (4) is obtained by solving a generalized eigenvalue problem. In order to project the training data onto a one-dimensional (1D) solution space, the eigenvector corresponding to the largest eigenvalue is taken as the discriminant function for data classification; it is an l × 1 vector.

The classification of the training data is conducted by an inner product of the kernel matrix and the discriminant function α, so that the projected training data are distributed in the 1D solution space.

The class of the test data, that is, of the curb candidates, can be predicted using the discriminant function α. A test sample x is projected onto the 1D solution space as y = Σ_i α_i k(x_i, x), where the sum runs over the training samples x_i.

The class properties of the training data are used to predict the class of the test data. The Mahalanobis distance between each class and the projected test sample is computed as d_c = |y − μ_c| / σ_c, where μ_c and σ_c denote the mean and standard deviation of the projected distribution of class c, respectively. The test sample is assigned to the class with the smallest Mahalanobis distance; when it is classified as the curb class, it is extracted as the curb.
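A compact sketch of this classification step, assuming the discriminant vector α and the per-class statistics (mean, standard deviation) of the projected training data are already available; names are illustrative.

```python
def project(alpha, kernel_row):
    # 1-D projection y = sum_i alpha_i * k(x_i, x_test) over the training set.
    return sum(a * k for a, k in zip(alpha, kernel_row))

def classify(y, classes):
    """classes: {name: (mean, std)} of the projected training data per class.
    Pick the class with the smallest 1-D Mahalanobis distance |y - mu| / sd."""
    return min(classes, key=lambda c: abs(y - classes[c][0]) / classes[c][1])
```

A candidate whose projection lands near the curb-class mean (in units of that class's spread) is extracted as the curb.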

3. Outdoor Localization Using Curb Feature

3.1. System Design

This paper adopts the EKF to estimate the robot pose using curb features. The EKF is a well-known method for mobile robot localization and sensor fusion [20–22]. When the initial pose of the robot and adequate observations are provided, the pose can be estimated by correcting the odometry error. The EKF process consists of a prediction step and an update step at each sampling time. DGPS and an LRF are used to correct odometry errors. Figure 3 shows a block diagram of the localization process.

Figure 3: A block diagram of localization process.
3.2. Measurement Uncertainty Model

GPS error mainly arises from two factors: pseudorange errors caused by systematic factors, and the geometric constellation of the satellites. In this paper, DGPS is used to minimize the pseudorange errors. The uncertainty model for DGPS is computed from the “dilution of precision” (DOP), which reflects the satellite constellation, and the residual pseudorange error, the so-called “user-equivalent range error” (UERE) [23]; the expected position standard deviation is their product, σ = DOP × UERE.

The resulting error covariance is given as (12); the elements of the covariance matrix are provided by the DGPS receiver in real time.
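Under the simplifying assumption that a single DOP value applies to both horizontal axes (in practice the receiver reports the covariance elements directly, as noted above), the DGPS position covariance might be formed as:

```python
def dgps_position_cov(dop, uere):
    """Expected position standard deviation sigma = DOP * UERE, returned as a
    diagonal 2x2 covariance for (x, y). Treating both axes identically is a
    simplification for illustration only."""
    sigma = dop * uere
    return [[sigma ** 2, 0.0], [0.0, sigma ** 2]]
```

A growing DOP (poor satellite geometry) inflates the covariance quadratically, which is what lets the filter automatically down-weight DGPS fixes near buildings.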

Several studies have proposed error covariances for line features [24, 25]. However, those uncertainty models are appropriate for a static condition or low-speed driving on a flat surface, whereas outdoor mobile robots usually drive on uneven terrain, where measurement noise of the extracted curb arises from the wobble of the robot. Therefore, the uncertainty model for the curb needs to be determined by experiments under the actual driving conditions.

In order to define the error covariance, we consider the noise model of the extracted curb shown in Figure 4. The extracted curb contains estimation errors in both range and angle: the measured angle and range are composed of the “true” angle and range plus estimation errors δθ and δr, which are treated as random variables with variances σ_θ² and σ_r², respectively. The ground truth of the curb locations is measured by an additional LRF attached to the side of the robot. Because this LRF directly faces the curb, its range data provide reliable geometric information of the curb during the robot's movement; its curb measurements are more accurate than those of the forward-pointing tilted LRF, because many more range data points fall on the curb. The curb is represented as a straight line by the least-squares method, and the ground truth gives the relative range and orientation of the curb. The estimation errors are then computed for the extracted curbs, and the error covariance of the curb is defined by the distribution of these errors over a large number of measurements of δr and δθ.

Figure 4: Noise model of the extracted curb.

The covariance matrices are experimentally defined as constant values for the left and right curbs.
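This experimental covariance amounts to a sample covariance over the collected error pairs (δr, δθ); a minimal sketch with hypothetical variable names:

```python
def sample_covariance(dr, dth):
    """2x2 sample covariance of range/angle estimation errors (delta_r,
    delta_theta), gathered against the side-facing LRF ground truth."""
    n = len(dr)
    mr = sum(dr) / n
    mt = sum(dth) / n
    c_rr = sum((a - mr) ** 2 for a in dr) / n
    c_tt = sum((b - mt) ** 2 for b in dth) / n
    c_rt = sum((a - mr) * (b - mt) for a, b in zip(dr, dth)) / n
    return [[c_rr, c_rt], [c_rt, c_tt]]
```

Running this once per side over the recorded error samples yields the constant left- and right-curb covariance matrices used by the filter.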

3.3. Extended Kalman Filter Localization Using Curb Feature

The odometry data from the wheel encoders are used to predict the robot pose. Using the incremental distance and orientation between samples, the predicted robot pose is given by the standard odometry motion model.

The state vector is the predicted robot pose at the current sampling time. Given the initial pose, the robot pose is represented by its position and heading in a global coordinate frame. The uncertainty of the predicted pose grows continually because of the accumulated odometry errors.
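A sketch of the dead-reckoning prediction; the midpoint-heading discretization used here is one common choice, not necessarily the paper's exact model, and the covariance propagation is omitted.

```python
import math

def predict_pose(x, y, th, dd, dth):
    """Predict the pose (x, y, th) from incremental distance dd and
    incremental rotation dth reported by the wheel encoders."""
    x += dd * math.cos(th + dth / 2.0)  # advance along the midpoint heading
    y += dd * math.sin(th + dth / 2.0)
    th += dth
    return x, y, th
```

Each call compounds the previous estimate, which is why the predicted-pose uncertainty grows without bound until a DGPS or curb correction arrives.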

In order to correct the odometry error, the first measurement correction is performed using DGPS measurements. The update frequency is set to 1 Hz on the basis of the DGPS measurement rate. When a valid position measurement is available, the observation vector is given in global coordinates.

The observation model for the current state returns the predicted position components of the state, h(X) = (x, y).

The corresponding measurement Jacobian simply selects the position components of the state:

H = [1 0 0; 0 1 0].
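With a position-only observation, the EKF update needs only 2×2 algebra. The following sketch updates the state under that assumption (the covariance update P ← (I − KH)P is omitted for brevity, and the function name is illustrative):

```python
def dgps_update(state, P, z, R):
    """EKF state update for a DGPS fix z = (zx, zy): h(X) = (x, y), so
    H = [[1,0,0],[0,1,0]] and S = H P H^T + R is the top-left 2x2 of P plus R."""
    x, y, th = state
    vx, vy = z[0] - x, z[1] - y                      # innovation
    s00 = P[0][0] + R[0][0]; s01 = P[0][1] + R[0][1]
    s10 = P[1][0] + R[1][0]; s11 = P[1][1] + R[1][1]
    det = s00 * s11 - s01 * s10                      # invert the 2x2 S
    i00, i01, i10, i11 = s11 / det, -s01 / det, -s10 / det, s00 / det
    # K = P H^T S^-1: a 3x2 gain built from the first two columns of P.
    K = [[P[r][0] * i00 + P[r][1] * i10, P[r][0] * i01 + P[r][1] * i11]
         for r in range(3)]
    return (x + K[0][0] * vx + K[0][1] * vy,
            y + K[1][0] * vx + K[1][1] * vy,
            th + K[2][0] * vx + K[2][1] * vy)
```

Note that the heading is corrected only through the cross-covariance terms of P; a position fix alone cannot observe heading directly, which is exactly the gap the curb correction fills.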

The second measurement correction uses the curb features. The observation vector for the extracted curb is given by (19). When curbs are extracted on both sides, there are two measurement vectors, one for the left and one for the right curb, as shown in Figure 5; the second correction processes the right curb first and then the left curb. The update rate is 5 Hz as long as curbs are continuously extracted.

Figure 5: Extracted curb features and a line map with respect to the global coordinate frame.

The robot pose is corrected by comparing the extracted curb with the map: the extracted curb is matched with the corresponding line of the curb map, and the observation model relates that map line to the robot pose at the current time.

The measurement Jacobian is obtained by linearizing this observation model about the predicted pose.

The consistency of the EKF relies on the observation model: if an erroneous sensor observation is provided, the filter does not produce a consistent result. Therefore, outliers that lie outside the uncertainty bounds should be rejected. A normalized innovation squared (NIS) test is implemented to confirm filter consistency. The NIS value, ν^T S^{-1} ν, follows a chi-square distribution with the degrees of freedom of the measurement, where ν is the innovation and S is the innovation covariance, S = H P H^T + R. Valid measurements yield an NIS value within the chi-square threshold, while measurements whose NIS value lies outside the threshold boundary are discarded. A threshold corresponding to a 95% confidence region is used for outlier rejection.
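The NIS gate can be sketched as follows for a two-dimensional curb measurement (range and angle); 5.991 is the 95% chi-square threshold for 2 degrees of freedom.

```python
CHI2_95_DOF2 = 5.991  # 95% chi-square quantile, 2 degrees of freedom

def nis_gate(innov, S_inv):
    """Accept a 2-D measurement if nu^T S^-1 nu falls below the chi-square bound;
    innov is the innovation vector, S_inv the inverse innovation covariance."""
    nis = sum(innov[i] * S_inv[i][j] * innov[j]
              for i in range(2) for j in range(2))
    return nis <= CHI2_95_DOF2
```

Measurements rejected by the gate are simply skipped for that cycle, so a single spurious curb match cannot corrupt the pose estimate.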

4. Experimental Results

4.1. Experimental Setup

Figure 6 shows the sensor system attached to the mobile robot. The outdoor mobile robot is a Pioneer P3-AT, a commercial outdoor platform from MobileRobots. Wheel encoders attached to the driving motors provide odometry information. A SICK LMS-100 was used to detect the curb; this LRF was tilted by 5° toward the ground. A NovAtel DGPS system was used to measure the global position: the robot was equipped with a rover antenna, and a base station was located on the roof of a nearby building. The ground truth for the curb was measured using an additional LRF (Hokuyo URG-04LX) attached to the side of the robot.

Figure 6: Sensor system attached to mobile robot.

The experiment was performed at Korea University in Seoul, Korea, as shown in Figure 7. The robot was driven manually along the yellow dashed line from “S” to “G”. The curbs along the experimental path are represented as solid green lines. The target environment has semistructured geometry composed of paved roads and curbs in most of the region. Furthermore, there are tall buildings and tunnels that degrade DGPS precision. The travel distance along the experimental path was 775 m, and the average speed of the robot was 0.5 m/s.

Figure 7: Experimental environment.
4.2. Curb Extraction Results

The curb extraction process in a semistructured road environment is shown in Figure 8. Figure 8(a) shows the road environment where the experiment was performed; the red dotted line represents the scan area of the tilted LRF. Figure 8(b) shows the road surface extracted from the scanned LRF data. Once the road surface is extracted, the line segments that satisfy attribute 1 in Section 2.2 are selected as curb candidates. Figure 8(c) shows the line segments that correspond to the curb candidates; the first and second candidates for each side are represented by blue and green lines, respectively. With two candidates on each side, four pairs of candidates can be extracted as curbs: (R1, L1), (R1, L2), (R2, L1), and (R2, L2). The class prediction results for each pair are listed in Table 1. The pairs (R1, L1) and (R1, L2) are classified as the curb class. Finally, pair (R1, L1), which has the smallest Mahalanobis distance with respect to the curb class, is extracted as the curbs, as shown in Figure 8(d). As long as LRF data corresponding to the vertical curb surface are detected, the proposed method performs successfully even for small and less distinct curbs.

Table 1: Class prediction for pairs of curb candidates.
Figure 8: Curb extraction results. (a) Road environment. (b) Road surface detection. (c) Curb feature candidates on both sides. (d) Extracted curb.

Curb extraction was performed while the robot navigated the experimental path. Figure 9 shows the resulting curb map (bottom right of Figure 9), in which the extracted curbs are denoted by magenta dots. The results indicate that most of the curbs along the experimental path were successfully extracted. To demonstrate the robustness of the proposed method, the classification performance of our previous method and the proposed method is compared in Table 2. The accuracy of curb extraction with PCA was 91.3% over 4751 scanned datasets; with the proposed KFDA it improved to 98.6%. A confusion matrix is included in Table 2. The conventional and proposed methods show true positive rates of 97.1% and 98.2%, respectively, so both perform curb extraction successfully. However, the most important factor in curb extraction is the false detection rate: a false detection occurs when noncurb data are misclassified as curb data, and the accuracy of the estimated pose can decrease when false curbs are used for localization. The false detection rate was 27.4% with PCA but was reduced to 0.4% with the proposed KFDA, implying that almost all noncurb data were correctly classified as noncurb.

Table 2: Classification results including confusion matrix.
Figure 9: Curb mapping result on experimental path.
4.3. Uncertainty Measurement Results

The DGPS measurements are represented by red dots along the experimental path in Figure 10(a); the areas with large DGPS errors are labeled A–F. Figure 10(b) shows the standard deviation of the DGPS measurements, which is about 2 m in open areas. There were temporal blackouts in areas B, D, and E due to satellite blockage, where the uncertainty measurements grew almost without bound, as shown in Figure 10(b). Furthermore, large DGPS errors occurred in areas A, C, and F due to degraded satellite signals; in particular, measurement errors in area F were 2.1 m on average, with an average measured standard deviation of 6.4 m. These results show that the regional properties of DGPS can be captured by measuring the DGPS uncertainty in real time, and that the precision of localization based only on DGPS degrades when the robot navigates near buildings.

Figure 10: (a) DGPS measurement and (b) DGPS uncertainty measurement ( bound).

The curb estimation errors are used to define the error covariance. The most accurate quantitative measure of curb uncertainty is obtained by measuring the estimation errors in the experimental environment. The estimation errors were measured while the robot navigated a road with curbs; ground truth for the curb was provided by the line feature extracted from an additional LRF attached to the side of the robot, as shown in Figure 11. These experiments were conducted prior to the localization experiment. The following results show the estimation errors for the range and angle of the extracted curb.

Figure 11: Additional LRF for ground truth of the curbs.

Figure 12 shows the histograms of the curb estimation errors for each side. Approximately 5000 estimation-error measurements were collected. The covariance matrix of the curb was computed from the distributions of the estimation errors, and the elements of the error covariance for each side of the curb are listed in Table 3.

Table 3: Error covariance of extracted curb.
Figure 12: Distribution of estimation errors for (a) left and (b) right curbs.

The covariance representation for the extracted curbs is shown in Figure 13. The extracted curbs are represented by blue lines, and the error bounds for the curbs are represented by green dotted lines. The resultant curbs exist within the error bounds with 95% confidence.

Figure 13: Extracted curbs and covariance representation.
4.4. Outdoor Localization Results

The localization results are shown in Figure 14. The proposed method was compared with the conventional framework for outdoor localization, which combines DGPS measurements with odometry. The blue dash-dot line represents the localization results corrected only by the DGPS measurements; the estimated path deviates from the reference path (yellow dotted line) in areas A–F due to the DGPS errors. The magenta line represents the localization results corrected by DGPS and the extracted curbs. When the curb information is applied, the estimated robot position matches the reference path well, even where the DGPS errors are large or blackouts occur (areas A–F); clearly, the curb information plays a dominant role there. The localization errors in areas A–F are analyzed below and compared with those of the conventional DGPS-plus-odometry framework. The ground truth was computed using the additional LRF attached to the side of the robot.

Figure 14: Localization results in semistructured road environment.

Figure 15 shows the lateral errors of the localization results. The position errors obtained by fusing odometry and DGPS are shown in Figure 15(a): the lateral errors in each area were usually greater than 1 m, with a maximum error of 5 m in area F. In contrast, the localization result corrected by odometry, DGPS, and curb information keeps the lateral errors within 0.6 m across the entire path, as shown in Figure 15(b). The proposed method therefore shows robust lateral position estimation despite the large DGPS errors. The heading errors are shown in Figure 16. With correction by DGPS only (Figure 16(a)), the heading errors were greater than 1° on average, with a variance larger than 2°. When the curb information is used for correction, the heading errors decrease markedly and rarely exceed 3° anywhere on the path. In the experiments, the proposed method showed precise and robust performance in both lateral and heading errors, despite the large DGPS errors and temporal blackouts.

Figure 15: Lateral errors (a) corrected by DGPS and (b) corrected by DGPS and extracted curb.
Figure 16: Heading errors (a) corrected by DGPS and (b) corrected by DGPS and extracted curb.

5. Conclusion

This paper presented a localization method for outdoor robots using curb features in semistructured road environments. A reliable curb extraction scheme was proposed to classify curb candidates into curb and noncurb classes; most of the curbs along the experimental path were extracted with high accuracy. An EKF-based localization was also proposed to combine the extracted curbs with odometry and DGPS measurements, with the sensor uncertainty models defined by experiments to provide a practical solution. The experimental results on real roads demonstrate the robustness of the proposed method: the curb features significantly correct the lateral position and heading errors in dense urban areas, where the DGPS signal is degraded by buildings.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported in part by the MKE (The Ministry of Knowledge Economy), Korea, under the Human Resources Development Program for Convergence Robot Specialists Support Program supervised by the NIPA (National IT Industry Promotion Agency) (NIPA-2013-H1502-13-1001). This work was also supported by the National Research Foundation of Korea (NRF) Grant funded by Korea Government (MEST) (2013-029812).

References

  1. S. Thrun, M. Montemerlo, H. Dahlkamp et al., “Stanley: the robot that won the DARPA Grand Challenge,” Journal of Field Robotics, vol. 23, no. 9, pp. 661–692, 2006.
  2. M. Buehler, K. Iagnemma, and S. Singh, The DARPA Urban Challenge: Autonomous Vehicles in City Traffic, Springer, New York, NY, USA, 2009.
  3. A. Soloviev, “Tight coupling of GPS, laser scanner, and inertial measurements for navigation in urban environments,” in Proceedings of the IEEE/ION Position, Location and Navigation Symposium, pp. 511–525, Monterey, Calif, USA, May 2008.
  4. S. Panzieri, F. Pascucci, and G. Ulivi, “An outdoor navigation system using GPS and inertial platform,” IEEE/ASME Transactions on Mechatronics, vol. 7, no. 2, pp. 134–142, 2002.
  5. E.-J. Jung and D.-J. Yi, “Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles,” Intelligent Service Robotics, vol. 6, no. 2, pp. 69–77, 2013.
  6. S. Kim, J. Kang, and M. Jin Chung, “Probabilistic voxel mapping using an adaptive confidence measure of stereo matching,” Intelligent Service Robotics, vol. 6, no. 2, pp. 89–99, 2013.
  7. M. Joerger and B. Pervan, “Range-domain integration of GPS and laser-scanner measurements for outdoor navigation,” in Proceedings of the 19th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS '06), pp. 1115–1123, Fort Worth, Tex, USA, September 2006.
  8. M. Hentschel, O. Wulf, and B. Wagner, “A GPS and laser-based localization for urban and non-urban outdoor environments,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 149–154, Nice, France, September 2008.
  9. A. Georgiev and P. K. Allen, “Localization methods for a mobile robot in urban environments,” IEEE Transactions on Robotics, vol. 20, no. 5, pp. 851–864, 2004.
  10. Y. Morales, T. Tsubouchi, and S. Yuta, “Vehicle localization in outdoor mountainous forested paths and extension of two-dimensional road centerline maps to three-dimensional maps,” Advanced Robotics, vol. 24, no. 4, pp. 489–513, 2010.
  11. W. S. Wijesoma, K. R. S. Kodagoda, and A. P. Balasuriya, “Road-boundary detection and tracking using ladar sensing,” IEEE Transactions on Robotics and Automation, vol. 20, no. 3, pp. 456–464, 2004.
  12. M. Jabbour and P. Bonnifait, “Global localization robust to GPS outages using a vertical ladar,” in Proceedings of the 9th International Conference on Control, Automation, Robotics and Vision, pp. 1–6, December 2006.
  13. P. Bonnifait, M. Jabbour, and V. Cherfaoui, “Autonomous navigation in urban areas using GIS-managed information,” International Journal of Vehicle Autonomous Systems, vol. 6, no. 5, pp. 83–103, 2008.
  14. Y. Shin, D. Kim, H. Lee, J. Park, and W. Chung, “Autonomous navigation of a surveillance robot in harsh outdoor environments,” Advances in Mechanical Engineering, vol. 2013, Article ID 837484, 15 pages, 2013. View at Publisher · View at Google Scholar
  15. Y. Shin, C. Jung, and W. Chung, “Drivable road region detection using a single laser range finder for outdoor patrol robots,” in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '10), pp. 877–882, San Diego, Calif, USA, June 2010. View at Publisher · View at Google Scholar · View at Scopus
  16. D. Kim and W. Chung, “Localization of outdoor mobile robots using road features,” in Proceedings of the 8th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI '11), pp. 363–365, Incheon, Republic of Korea, November 2011. View at Publisher · View at Google Scholar · View at Scopus
  17. S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K. Müller, “Fisher discriminant analysis with kernels,” Neural Networks for Signal Processing, vol. 9, pp. 41–48, 1999. View at Google Scholar
  18. S. Mika, G. Rätsch, and K. R. Müller, “A mathematical programming approach to the kernel fisher algorithm,” Advances in Neural Information Processing, vol. 13, pp. 591–597, 2001. View at Google Scholar
  19. G. Camps-Valls and L. Bruzzone, Kernel Methods for Remote Sensing Data Analysis, John Wiley & Sons, New York, NY, USA, 2009.
  20. R. E. Kalman, “A new approach to linear filtering and prediction problems,” Journal of Basic Engineering, vol. 82, no. 1, pp. 35–45, 1960. View at Publisher · View at Google Scholar
  21. R. E. Kalman and R. S. Bucy, “New results in linear filtering and prediction theory,” Journal of Basic Engineering, vol. 83, no. 3, pp. 95–108, 1961. View at Google Scholar · View at MathSciNet
  22. J. J. Leonard and H. F. Durrant-Whyte, Directed Sonar Sensing for Mobile Robot Navigation, Kluwer Academic, Dordrecht, The Netherlands, 1992.
  23. B. W. Parkinson and J. J. Spilker, Global Positioning System: Theory and Applications, The American Institute of Aeronautics and Astronautics, Washington, DC, USA, 1996.
  24. K. O. Arras and R. Y. Siegwart, “Feature extraction and scene interpretation for map-based navigation and map building,” in 12th Mobile Robots Conference, vol. 3210 of Proceedings of SPIE, pp. 42–53, Pittsburgh, Pa, USA, January 1998. View at Publisher · View at Google Scholar · View at Scopus
  25. S. T. Pfister, S. I. Roumeliotis, and J. W. Burdick, “Weighted line fitting algorithms for mobile robot map building and efficient data representation,” in Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1304–1311, September 2003. View at Scopus