Journal of Sensors
Volume 2016, Article ID 9497524, 13 pages
http://dx.doi.org/10.1155/2016/9497524
Research Article

Improving the Lane Reference Detection for Autonomous Road Vehicle Control

University Institute for Automobile Research (INSIA), Technical University of Madrid, Campus Sur UPM, Carretera de Valencia km 7, 28031 Madrid, Spain

Received 10 March 2016; Accepted 26 April 2016

Academic Editor: Guiyun Tian

Copyright © 2016 Felipe Jiménez et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Autonomous road vehicles are becoming increasingly important, and several techniques and sensors are being applied for vehicle control. This paper presents an alternative system for maintaining the position of autonomous vehicles without adding elements to the standard sensor architecture, by using a 3D laser scanner to continuously detect a reference element in situations in which the GNSS receiver fails or provides accuracy below the required level. Considering that the guidance variables are more accurately estimated when dealing with reference points in front of and behind the vehicle, an algorithm based on a mathematical model of the vehicle dynamics is proposed to extend the detected points in cases where the sensor is placed at the front of the vehicle. The algorithm has been tested when driving along a lane delimited by New Jersey barriers on both sides and the results show correct behaviour. The system is capable of estimating the reference element behind the vehicle with sufficient accuracy when the laser scanner is placed at the front of it, so the robustness of the estimation of the control input variables (lateral and angular errors) is improved, making it unnecessary to place the sensor on the vehicle roof or to introduce additional sensors.

1. Introduction

Autonomous road vehicles are becoming increasingly important, and all manufacturers are involved in research projects to develop semiautomated or fully automated solutions [1]. The recent presentations of Google's cars have raised public interest, but several research groups have been working in this field for over 20 years [2–4]. What these autonomous vehicles have in common is the ability to perform two tasks: identifying obstacles to avoid accidents and positioning themselves to track a route. These two problems have been under research since the initial developments, and there is still no general solution to both that meets the requirements of reliability, robustness, efficiency, and affordable cost.

The three most common methods for detecting long-range obstacles are radar [5–8], laser scanner [9–13], and computer vision [14, 15]. Each of them has advantages and disadvantages [16], and sensor fusion is commonly used to overcome the individual limitations [17–21]. Among the technologies for detecting the vehicle surroundings, which can also be used for autonomous guidance, computer vision and laser scanner stand out [22, 23].

Several techniques have been implemented to develop autonomous vehicle control, including navigation methods such as waypoints using Global Navigation Satellite Systems (GNSS) [24], tracking lines or other infrastructure elements [25], and following the preceding vehicle [26, 27]. In any case, vehicle positioning at lane-level accuracy is critical for ensuring safe and proper route guidance. The primary sensor that ensures this positioning is the GNSS receiver, which can provide positions with accuracy up to 1 cm if differential correction is used. However, it is common to find situations in which GNSS signal reception is poor or even lost, such as in city traffic and tunnels. For those cases, GNSS positioning is frequently supplemented with inertial systems that can maintain position with adequate precision for a short time [28]. To overcome this limitation, some autonomous vehicles are equipped with computer vision systems to detect road lines, which can assist the guidance system in keeping the vehicle in the lane, but with very precarious positioning that is subject to the detection problems of this technology under adverse lighting and environmental conditions.

Laser scanners are taking on a very important role in autonomous driving, especially since they have moved from 2D to 3D scanning. By using 3D data, the problems of 2D laser scanners, for example, measurement errors caused by partially occluded obstacles, are avoided, and the information obtained is considerably more complete and can be used for several purposes [29–31]. In addition, the information obtained by laser scanners to recognize the vehicle surroundings has driven progress on the Simultaneous Localization and Mapping (SLAM) problem [32–34]. Overall, SLAM works incrementally to build a virtual map of the environment through which the vehicle navigates, so that the vehicle can be located on this map. One relevant limitation is the fact that most of these algorithms are focused on mobile robots, whose dynamic behaviour and environment differ from those of a road vehicle.

This paper presents an alternative system for maintaining the position of autonomous vehicles without adding other elements to the standard sensor architecture by using the 3D laser scanner for both obstacle detection and positioning the vehicle in situations in which the GNSS receiver fails or provides accuracy below the required level. To this end, a representation of a reference element, such as a barrier, a tunnel wall, or a lane line (Figure 1), is established based on the data recorded by the sensor. Then, the angular and lateral errors of the vehicle related to the border of the road are calculated and used for the autonomous vehicle control.

Figure 1: Elements perceived by the laser scanner for autonomous vehicle guidance: (a) New Jersey barriers, (b) tunnel wall, and (c) lane lines.

For a reliable representation and in order to reduce the influence of possible erroneous points, it is recommended to acquire points in front of and behind the vehicle. However, this would involve placing the sensor on the top of the vehicle, which is not the most feasible solution for implementation in a vehicle, or using the information fusion from various sensors [35], which may have the drawback of the high total cost of the equipment. A method is proposed in which a single sensor is used at the front of the vehicle and processes the previous points to estimate the geometry which could have been detected by the sensor behind the vehicle. This estimation can be based on the reconstruction of the reference element in absolute coordinates from the vehicle positioning by GNSS. However, this solution would not be viable in environments such as tunnels and may be subject to high errors because of the uncertainty of this position. Moreover, the absence of elements that can serve as reference hinders this solution under normal driving on public roads. Therefore, the proposed solution is based on the knowledge of vehicle dynamics by acquiring the vehicle’s signals and its trajectory estimation using a mathematical model of vehicle dynamics.

2. Detection Method of the Reference Element

2.1. Sensor and Its Location

The estimation of the variables for autonomous vehicle driving is based on a laser scanner, a technology that has been widely used for these purposes. As a sensor for identifying obstacles and for autonomous navigation, a Velodyne VLP-16 3D LiDAR sensor has been used, which measures distances in real time by measuring the Time of Flight (TOF). This kind of laser scanner is used in automotive environments to detect objects that may appear in the surroundings of a vehicle. It can acquire data at 5 to 20 Hz, with a 360° horizontal field of view (FOV) and a 30° vertical one distributed in 16 layers, which provides up to 300,000 points per second. These sensors were originally developed for the DARPA (Defense Advanced Research Projects Agency) autonomous vehicle competitions [36] and are currently used in a wide variety of applications.

With this equipment, it is possible to detect clouds of three-dimensional points that represent obstacles, vehicles, pedestrians, and other road elements. However, the information generated by this sensor can also be used to detect the road boundaries and the separation between the lanes of the road. In Figure 2, the different colours of the points represent the reflectivity intensity. Thus, the lane lines are automatically highlighted and the lateral traffic barriers are also differentiated from the pavement (areas 1 and 2). Clustering the obstacles can also provide the barrier information (areas 3 and 4).

Figure 2: Raw information captured by a 3D laser scanner on a highway section.

Two possible locations for the laser scanner in the vehicle (at the front and on the top) have been considered (Figure 3), each of them with its advantages and disadvantages. The former facilitates sensor integration in the vehicle, while the latter provides a 360° view and obstacle detection at greater distances and higher speeds.

Figure 3: Possible sensor placements. (a) At the front of the vehicle. (b) On the vehicle roof.

Unlike [25, 33, 37], in this paper the sensor is intended to be placed at the front of the vehicle, but specific algorithms are required to extend the detected points of the reference element behind the vehicle position, resembling the behaviour of the system when the sensor is placed on the vehicle roof, in order to improve the robustness of the reference element detection. This placement was selected considering that many manufacturers are integrating their sensors in the bumper or the windscreen, whereas the roof location is only chosen in prototypes. Thus, the most unfavourable situation is considered in order to generate solutions that could be used in any case.

2.2. Fixed Barrier or Wall Detection

As mentioned in [16], the procedure followed for detecting obstacles is divided into two stages: (1) data segmentation and (2) obstacle tracking. In the first stage, the data obtained directly from the sensor are divided into clusters, associating obstacles with similar characteristics as far as possible [38, 39]. Then, tracking tries to correlate the clusters of one scan with the previous results.

Furthermore, to keep computational complexity low, when the characteristics of the object and its detection make its height irrelevant, the problem can be tackled by building a 2D model from the information provided by the 3D point cloud, as is done by other detection algorithms [40].

The detection algorithm for vehicle guidance is defined in 7 steps.

Step 1 (selection of the Region of Interest (ROI)). This region is defined as a function of the look-ahead distance following the suggestions developed in [41]; it depends on the vehicle speed and is quantified by (1), where d is the look-ahead distance and v is the speed of the vehicle, bounded between minimum and maximum values of distance (in metres) reached at the corresponding speed limits (in km/h). These coefficients have been calculated experimentally to guarantee the safe operation of the autonomous vehicle and the stability of its controllers. With this operation, the computation is reduced because the nonrelevant laser points are discarded.
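The speed-dependent, clamped look-ahead distance of Step 1 can be sketched as follows. This is an illustrative implementation, not the authors' code; the coefficient values (`d_min`, `d_max`, `v_min`, `v_max`) are hypothetical placeholders, since the experimental values are not reproduced here, and a linear interpolation between the two bounds is assumed.

```python
def look_ahead_distance(v_kmh, d_min=5.0, d_max=25.0, v_min=10.0, v_max=90.0):
    """Look-ahead distance (m) as a clamped function of speed (km/h).

    Below v_min the distance stays at d_min, above v_max it saturates
    at d_max, and in between it grows linearly, mirroring the
    speed-dependent ROI described in Step 1 (coefficients hypothetical).
    """
    if v_kmh <= v_min:
        return d_min
    if v_kmh >= v_max:
        return d_max
    return d_min + (d_max - d_min) * (v_kmh - v_min) / (v_max - v_min)

print(look_ahead_distance(50.0))  # 15.0
```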

Step 2 (Delaunay triangulation of the discrete mesh points (Figure 4(a))). The points are grouped into sets of three following a neighbourhood criterion, using Delaunay triangulation. This first clustering allows the representation of the detected objects as a set of surfaces, whose features will allow further analysis and classification.

Figure 4: (a) Delaunay triangulation mesh. (b) Normal vectors. (c) Detected obstacle. (d) Interest zone and calculated control parameters. The x-axis corresponds to the direction of the autonomous vehicle. Distances are in metres.

Step 3 (normal vector calculation (Figure 4(b))). Once the mesh has been triangulated, the equation and the normal vector of each triangle are calculated in order to consider the orientation of the surface. For each triangle, a normalized normal vector n = (v1 × v2)/‖v1 × v2‖ can be calculated considering two vectors v1 and v2 between its vertices (2). Then, the angles between this vector and the three coordinate axes are calculated by (3) in order to compare the angle γ with the vertical axis.

Step 4 (threshold filtering). With the information supplied by the normal vectors of each surface, a filtering operation is executed by removing the triangles whose normal vector direction is outside a predefined threshold based on the global vertical axis. In the case of the New Jersey vertical barriers, the boundaries of the threshold have been defined between 75° and 90° from the horizontal, considering the barrier geometry [42], the sensor accuracy and resolution, and possible misdetections. These thresholds have provided satisfactory results in preliminary experimental tests. If the condition is true, the coordinates of the three vertices of the triangle are included as new points in the candidate list.
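Steps 3 and 4 can be sketched together: the normal of each triangle is obtained as the cross product of two edge vectors, its angle γ against the global vertical axis is computed, and the triangle is kept only if γ falls inside the 75°–90° band used for near-vertical barrier faces. This is a minimal sketch, not the authors' implementation.

```python
import math

def keep_as_barrier(p1, p2, p3, lo_deg=75.0, hi_deg=90.0):
    """Return True if the triangle (p1, p2, p3) is a barrier candidate.

    The (unnormalized) surface normal is the cross product of two edge
    vectors; gamma is its angle against the global vertical axis, so a
    vertical wall patch gives gamma = 90 deg and a flat road patch
    gives gamma = 0 deg.
    """
    v1 = tuple(p2[i] - p1[i] for i in range(3))
    v2 = tuple(p3[i] - p1[i] for i in range(3))
    n = (v1[1] * v2[2] - v1[2] * v2[1],      # cross product
         v1[2] * v2[0] - v1[0] * v2[2],
         v1[0] * v2[1] - v1[1] * v2[0])
    norm = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    if norm == 0.0:
        return False  # degenerate (collinear) triangle
    gamma = math.degrees(math.acos(abs(n[2]) / norm))  # angle to vertical
    return lo_deg <= gamma <= hi_deg

# A vertical wall patch passes; a flat road patch is filtered out.
print(keep_as_barrier((0, 0, 0), (0, 1, 0), (0, 0, 1)))  # True
print(keep_as_barrier((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # False
```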

Step 5 (DBSCAN clustering (Figure 4(c))). The next step is the clustering of the different sets of points detected as candidates to represent a lateral barrier or wall. The calculations to determine the clustering and the shape of the barrier are made using bidimensional points, since the vertical coordinate is not relevant once the barrier has been detected. This clustering is based on the Euclidean distance between the points and their density, using the density-based spatial clustering of applications with noise (DBSCAN) method [43]. The set of points is thus segmented into the different candidates to represent a barrier. Once this classification is complete, and in order to be used for autonomous vehicle positioning and guidance, the selected set of points will be the one located laterally and longitudinally to the vehicle, considering a length of 20 metres ahead. If available, the same distance should be considered behind the vehicle.
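A minimal density-based clustering in the spirit of DBSCAN [43] can be sketched as below. This is an illustrative, pure-Python sketch only; a production system would use an optimized implementation with spatial indexing (for example, scikit-learn's `DBSCAN`).

```python
import math

def dbscan_2d(points, eps, min_pts):
    """Cluster 2D points by density; returns one label per point.

    Labels are cluster ids 0, 1, ... or -1 for noise. A point is a
    core point if at least min_pts points (itself included) lie
    within eps of it; clusters grow from core points.
    """
    n = len(points)
    labels = [None] * n

    def neighbours(i):
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reachable from a core point
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:
                queue.extend(nb)  # j is a core point: keep expanding
    return labels

# Two dense groups of barrier candidates plus one stray point
pts = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1),
       (5, 5), (5.1, 5), (5, 5.1), (10, 10)]
print(dbscan_2d(pts, eps=0.5, min_pts=3))  # [0, 0, 0, 0, 1, 1, 1, -1]
```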

Step 6 (lateral barrier equation calculation (Figure 4(d))). Finally, once the candidate set of points is selected, the mathematical function that defines the barrier structure is calculated using a quadratic regression, y = ax² + bx + c (4), where x and y are the coordinate axes relative to the actual vehicle position, the x-axis being the longitudinal one.
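The quadratic regression of Step 6 can be sketched without external libraries by solving the 3×3 normal equations directly (in practice `numpy.polyfit` would do the same). This is an illustrative sketch, not the authors' code.

```python
def fit_quadratic(points):
    """Least-squares fit of y = a*x**2 + b*x + c to 2D points.

    Builds the normal equations for the coefficient vector [c, b, a]
    and solves them by Gaussian elimination with partial pivoting.
    """
    sx = [sum(x ** k for x, _ in points) for k in range(5)]   # sums of x^k
    ty = [sum(y * x ** k for x, y in points) for k in range(3)]
    A = [[sx[i + j] for j in range(3)] for i in range(3)]
    t = list(ty)
    # Forward elimination
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        t[col], t[piv] = t[piv], t[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for k in range(col, 3):
                A[r][k] -= f * A[col][k]
            t[r] -= f * t[col]
    # Back substitution
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coef[r] = (t[r] - sum(A[r][k] * coef[k] for k in range(r + 1, 3))) / A[r][r]
    c, b, a = coef
    return a, b, c

# Points sampled from y = 2x^2 - 3x + 1 are recovered exactly.
a, b, c = fit_quadratic([(x, 2 * x * x - 3 * x + 1) for x in range(5)])
```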

Step 7 (vehicle guidance variables calculation). It is then possible to calculate the two main variables necessary to perform the autonomous vehicle control: the lateral and angular errors. In order to calculate both errors, a point (x₀, y₀) is first determined; this point must belong to the regression curve and its normal vector must pass through the sensor location. Considering the tangent line equation at (x₀, y₀), the perpendicular line is given by (5). Considering that this straight line must pass through the origin of the local coordinate system (the laser scanner location) and must verify (4), (x₀, y₀) is obtained and then the angular and lateral errors are provided by (6).
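Step 7 can be sketched as follows. Since the foot point whose normal passes through the sensor (the origin of the local frame) is the closest point on the curve, this sketch finds it by a dense scan rather than by the closed-form solution of (4)–(6); the lateral error is taken as the distance to that point and the angular error as the tangent direction there. Illustrative only, under those stated assumptions.

```python
import math

def guidance_errors(a, b, c, n_samples=2000, x_range=(-20.0, 20.0)):
    """Lateral and angular errors from the fitted barrier y = a*x^2 + b*x + c.

    Scans the curve for the point closest to the sensor at the origin
    (equivalent to the point whose normal passes through the sensor),
    then returns (distance to it, atan of the tangent slope there).
    """
    best = None
    for i in range(n_samples + 1):
        x = x_range[0] + (x_range[1] - x_range[0]) * i / n_samples
        y = a * x * x + b * x + c
        d = math.hypot(x, y)
        if best is None or d < best[0]:
            best = (d, x)
    d, x0 = best
    lateral_error = d
    angular_error = math.atan(2 * a * x0 + b)  # radians
    return lateral_error, angular_error

# Straight barrier 2 m to the side, parallel to the vehicle axis:
lat, ang = guidance_errors(0.0, 0.0, 2.0)
print(round(lat, 3), round(ang, 3))  # 2.0 0.0
```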

2.3. Alternative Solution for Outsider Elements Filtering

It should be noted that, in the case of an obstacle located near the barrier, such as a car or a traffic sign, the clustering step may include its points as part of the selected cluster. If the outsider element is far away from the car, the number of points may not be representative compared to the whole barrier, so the function calculation may not be affected. But if the element is near the vehicle, the number of points associated with this detection may be high and may affect the calculation of the reference element equation in a very significant way because of the high relative relevance of these points. This situation can be seen in Figure 4(d), in which more points associated with the barrier are detected near the vehicle than far away from it. In order to manage this case and homogenize their relative relevance, a weighting factor has been assigned to each point that increases as the distance between the point and the vehicle increases. Considering that the detected point density per length unit for each laser layer is inversely proportional to the distance to the scanner, the density of points belonging to the barrier used in Step 5, in which the vertical coordinate is not taken into account, can be fitted by an expression similar to (7). Then, assuming a sensor blind zone when the distance is lower than 2 metres, a weighting function is built as the difference between the density at this zone border and the density fitted by (7), as shown in Figure 5. This function represents the number of times each detected point should be repeated in order to homogenize the relevance of the points detected in the ROI, so that the effect of outsider elements is the same independently of the distance to the sensor, and sudden changes in the results of Step 6 of the algorithm are minimized.

Figure 5: Example of detected points’ density versus distance to the sensor and weighting function definition.
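The weighting scheme above can be sketched as follows. The density model ρ(d) = k/d follows the inverse-proportionality stated in the text, but the constant `k` and the exact form of the fitted expression (7) are assumptions of this sketch, not values from the paper.

```python
def point_weight(d, d_blind=2.0, k=100.0):
    """Weight of a detected point at distance d (m) from the sensor.

    Models the per-layer point density as rho(d) = k / d (k is a
    hypothetical sensor constant) and returns the difference between
    the density at the blind-zone border (2 m) and the density at d,
    i.e. how many extra repetitions a far point should receive.
    """
    if d <= d_blind:
        return 0.0
    return k / d_blind - k / d

# A point at the blind-zone border gets no extra repetitions;
# a far point gets many, homogenizing its relative relevance.
print(point_weight(2.0))   # 0.0
print(point_weight(10.0))  # 40.0
```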
2.4. Lane Line Detection

Road line detection is easier, since it is based on the identification of changes in the reflectivity measured at the points detected on the road surface, as shown in Figure 2. The main problem in this detection is the lack of robustness, because it is subject to the road conditions and only a limited set of points can be used as representative of the lines.

3. SLAM Problem

If the laser scanner is placed at the front of the vehicle, there is no information about the points belonging to the barrier or the wall behind the vehicle. The proposed method to obtain the necessary variables for autonomous vehicle control is based on adding, to the detected points of the reference element, the portion of the reference element that is behind the vehicle and is not detected by the laser scanner. Since the use of GNSS is not feasible, due to the inaccuracy of the measurement when differential corrections are not available [44] and to the possible signal losses in some environments, and since it is not possible to identify fixed landmarks in the infrastructure in an open and highly changing environment, the proposed method estimates the trajectory of the vehicle from the driving variables, such as steering wheel rotation and speed. Thus, the vehicle's previous positions can be determined relative to its current position, so the previous measurements of the laser scanner are re-referenced to this new position and integrated with the new detections.
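The re-referencing of stored points into the current vehicle frame can be sketched as a 2D rigid transform. This is an illustrative sketch under the stated assumption that (dx, dy) is the vehicle displacement and dpsi the yaw change between consecutive poses, both expressed in the previous vehicle frame.

```python
import math

def rereference(points_prev, dx, dy, dpsi):
    """Express points from the previous vehicle frame in the current one.

    Each stored point is translated to the new origin and rotated by
    -dpsi into the new heading, so past barrier detections can be
    integrated with the current scan.
    """
    c, s = math.cos(dpsi), math.sin(dpsi)
    out = []
    for x, y in points_prev:
        tx, ty = x - dx, y - dy              # translate to the new origin
        out.append((c * tx + s * ty,         # rotate by -dpsi
                    c * ty - s * tx))
    return out

# Vehicle moved 1 m forward with no rotation: a point that was 5 m
# ahead is now 4 m ahead in the current frame.
print(rereference([(5.0, 0.0)], 1.0, 0.0, 0.0))  # [(4.0, 0.0)]
```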

3.1. Vehicle Trajectory Estimation

In order to extrapolate the reference element to the area without data, so that the quadratic regression calculation is accurate and stable, the past vehicle trajectory should be reproduced to complete the SLAM problem. Different solutions could be proposed to estimate this trajectory: (1) using GNSS waypoints, although signal deterioration could be significant; (2) using inertial measurement systems, although cumulative errors limit their applicability over long distances [45, 46]; and (3) using speed and steering wheel angle and simulating the vehicle dynamics.

Considering the limitations of the first two methods discussed above, and although techniques such as Kalman filtering could provide accurate results by combining the first two approaches [47–49], these solutions are not suitable for road sections such as long tunnel stretches. For this reason, the current approach is based on the third alternative.

To estimate the vehicle's trajectory by simulating the vehicle dynamics, models of very different degrees of complexity could be used [50–54]. A widely used model for these purposes is the 14-degree-of-freedom (dof) model, which considers the movements and rotations of the sprung mass (6 dof), the vertical movement of the unsprung masses (4 dof), and wheel rotation (4 dof). This model is integrated in the software presented in [55].

Dynamic equations of the sprung mass movements have been written in a Cartesian coordinate system fixed to the sprung mass, in which the x-axis is longitudinal, the y-axis is transversal, and the z-axis is vertical. It should be noted that a noninertial coordinate system is used; therefore inertial forces appear. External forces include aerodynamic, tyre, and suspension forces.

Linear movements are described by (8) and angular movements by (9). Furthermore, the unsprung mass movements and the wheel rotation are calculated by the corresponding equations of the model.

Tyre forces are calculated using the Dugoff model and traction forces are obtained from a propulsion system model that includes engine and transmission [56, 57].

It should be noted that the implementation of this model does not have an excessively high computational cost, but it can be simplified to guarantee real-time execution. In this regard, instead of using simpler models, such as the traditional 3 dof two-wheel vehicle model [28] that does not take into account all relevant parameters of the dynamics, the solution of generating a look-up table has been chosen. The table is generated offline using the full model, where the input variables are the speed and the steering wheel rotation, both available in a vehicle. As output variable the yaw rate is obtained, from which it is possible to determine the evolution of the trajectory in local coordinates according to (10), where Δt is the time between two consecutive samples.
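The integration of the trajectory from speed and yaw rate, in the spirit of (10), can be sketched as simple dead reckoning. This is an illustrative sketch, not the authors' code; it assumes the yaw rate has already been obtained from the look-up table for each sample.

```python
import math

def dead_reckon(samples, dt=0.05):
    """Integrate a local trajectory from (speed, yaw rate) samples.

    At 20 Hz (dt = 0.05 s) the yaw rate (rad/s) is accumulated into
    the heading and the speed (m/s) into the position, all relative
    to the starting pose. Returns the list of (x, y) positions.
    """
    x = y = psi = 0.0
    path = [(x, y)]
    for v, yaw_rate in samples:
        psi += yaw_rate * dt
        x += v * dt * math.cos(psi)
        y += v * dt * math.sin(psi)
        path.append((x, y))
    return path

# 2 s of straight driving at 10 m/s covers 20 m with no lateral drift.
path = dead_reckon([(10.0, 0.0)] * 40)
print(path[-1])  # (20.0, 0.0)
```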

3.2. Road Superelevation Rate Influence

The above mathematical model does not take into account the influence of the road superelevation rate. However, this influence can be significant. In order not to increase the complexity of the two-input look-up table, it is proposed to correct the steering wheel angle input depending on the road superelevation rate. Thus, Figure 6 shows the two-wheel vehicle model [52, 56], in which the gravitational force component acting transversely on the vehicle because of the superelevation is F = P·sin(φ) (11), where P is the vehicle weight and φ is the superelevation angle, there being no other force applied in that direction at the centre of gravity.

Figure 6: Two-wheel vehicle model moving along a road section with nonzero superelevation rate.

This force generates the tyre efforts needed to achieve a force balance, and they are given by (12) and (13), where δ is the steering angle of the front wheels and l_f and l_r are the distances from the centre of gravity to the front and rear axles.

Assuming a linear relationship between the lateral forces in the tyres and their slip angles, with a constant transversal tyre stiffness, these slip angles can be calculated.

In order to maintain a straight trajectory, the vehicle slip angle should be null. Considering Figure 6, this condition is satisfied when (14) is fulfilled, that is to say, when the orthogonal projection of the centre of curvature falls on the centre of gravity. The required steering angle is then given by (15). Thus, with the road superelevation rate at each instant now known, it is possible to correct the measured steering wheel angle according to (16). Considering the relationship between the steering angle and the steering wheel angle, the look-up table provides the yaw rate and the trajectory is calculated according to (10). It should be noted that the superelevation rate should be obtained in real time, as should the speed and the steering wheel rotation. One possible solution is to use a detailed digital map and locate the absolute position of the vehicle on it. However, this solution requires the availability of such maps with the necessary degree of precision and detail, which is not common [58], as well as a precise absolute positioning, which cannot be guaranteed in several areas [59]. Alternatively, the superelevation angle can be approximately obtained from the measurement of a gyroscopic platform [45]. In this case, the gyroscopic platform cannot distinguish between the actual superelevation rate and the vehicle roll angle, and [46] proposes an iterative solution, which, however, is not feasible to implement in real time in the current method. Nevertheless, in normal driving conditions, the roll angle influence is estimated to be negligible and, in this case, the measurement is free of cumulative errors.

3.3. Validation Results

The mathematical model of vehicle dynamics has already been validated [55]. However, it is necessary to evaluate the accuracy achieved when computing the vehicle trajectory as described above. Figure 7 shows the path of the test vehicle on a 1.4 km long interurban road stretch. The acquisition system records the speed and the steering wheel angle from the internal communication bus, while the road superelevation angle is given by an RMS FES 33 gyroscopic platform and the position by a centimetre-accuracy Trimble R4 GPS receiver with differential correction, used for accurate comparison of results. As can be seen, not considering the superelevation angle leads to significant and inadmissible errors in the path generated from the dynamic variables of the vehicle, whereas incorporating this road information improves the outcome significantly.

Figure 7: Estimation of the trajectory using vehicle dynamics simulation.

Moreover, it should be noted that the estimates of the path are subject to cumulative errors, similar to those presented in [45, 46]. However, this error only becomes significant as the distance increases without any correction. For the proposed calculation of the reference element behind the vehicle, the trajectory over only a few metres (generally around 20 metres) is required. Additionally, although no absolute corrections are made, the estimate is computed from each relative position; thus any deviations accumulate only over such short distances. For the same route, estimations of 20-metre-long stretches have been computed and the results compared with accurate GPS positions. The algorithm calculations have been performed at the same rate as the perception system (20 Hz). Table 1 shows that the deviations of the results are quite small; therefore the proposed solution can be accepted.

Table 1: Errors in trajectory estimation on 20-metre-long road stretches.

4. Tests and Results

The algorithm has been tested using New Jersey barriers as the reference element for autonomous driving. Although the algorithm has been developed to place the sensor at the front of the vehicle, in these tests the laser scanner was placed on top of the vehicle, retrieving 360° measurements, in order to validate the results. The test site was the BUS-VAO independent lane along the A6 highway (Madrid, Spain), which has a length of 12 km and two lanes, except in specific sections that have only one. The lane is limited by New Jersey barriers on both sides, interrupted only at the entry points, which are located on one side only. These barriers were used as the reference for guiding the autonomous vehicle. The speed was set between 45 and 60 km/h and the acquisition rate was 20 Hz.

Firstly, the desirability of using the weighting factor for filtering outsider elements was analyzed. Figure 8 shows the lateral and angular errors along a road stretch both with and without the weighting function previously presented. When this function is applied, the behaviour is observed to be very sensitive to errors whenever detected points belong to the cluster but not to the barrier or the wall and are placed far away from the sensor, producing a ripple in the response; so, contrary to the intuitive idea, better results are obtained without this factor.

Figure 8: Algorithm response (a) without weighting function and (b) with weighting function.

Furthermore, Figure 9 corroborates that the response of the reference element estimation is considerably worse when dealing only with points detected in front of the vehicle, so the solution of placing the sensor on the roof would be optimal according to this criterion if no other corrections or algorithms were introduced.

Figure 9: Algorithm response for same frame number in two different configurations: (a) use of points detected in front of the vehicle and (b) use of points detected in front of and behind the vehicle.

The previous results justify the need for the method that extrapolates the detected reference element behind the vehicle, where the sensor is unable to perceive it. In order to check the reliability of the proposed method, Figure 10 shows an example of detection of the barriers at a point of the trajectory and compares the following cases:
(a) Estimation of the geometric characteristics of the reference element using only the points detected in front of the vehicle (the case in which the laser scanner is at the front of the vehicle).
(b) Estimation of the geometric characteristics of the reference element using points detected in front of and behind the vehicle (the case in which the laser scanner is on top of the vehicle).
(c) Estimation of the geometric characteristics of the reference element using the points detected in front of the vehicle and estimating the location of the points behind it.
As can be seen, the adjustment determined in case (a) is far from the real solution compared to that provided by case (b), since the information behind the vehicle is lost. However, the adjustment in case (c) achieves good results. Table 2 shows the deviations found between the results of cases (a)-(b) and (b)-(c) for the entire path. These results support the proper functioning of the proposed method, which significantly improves the estimations achievable with only the points detected in front of the vehicle and achieves results similar to those obtained when the complete set of points detected in front of and behind the vehicle is used. Thus, it may be concluded that the laser scanner can be placed at the front of the vehicle without reducing the quality of the calculation of the parameters used for autonomous driving, provided the proposed algorithm is used.

Table 2: Deviations between estimates of parameters for guiding the autonomous vehicle.
Figure 10: Estimation of the geometric characteristics of the barrier (a) using only the points detected ahead of the vehicle, (b) using points detected in front of and behind the vehicle, and (c) using the points detected ahead of the vehicle and estimating the location of the points behind it.

5. Conclusions

This paper has described the use of laser scanner technology to maintain the position of an autonomous vehicle even when the GNSS and inertial system signals are not sufficiently accurate. This system is complementary to conventional guidance systems and works in parallel without interfering with obstacle detection systems. A method to increase the robustness of the data needed for autonomous driving along a reference element has been presented. Specifically, the proposed method offers two major advances over other similar approaches:
(i) It does not require the sensor to have vision of the reference element behind the vehicle.
(ii) It does not require absolute positioning (e.g., GNSS) or landmarks in the infrastructure.
Since the environment is structured, it has been possible to base the system on a single sensor of relatively low cost compared to others available on the market (16 detection layers versus costly scanners with 64). Moreover, the implementation does not imply high computational complexity, so it can be executed in real time.

Finally, tests have shown the appropriate behaviour of the implemented algorithms and how they improve the solution in the case where only data ahead of the vehicle is considered.

Nomenclature

External forces
External moments
Linear velocities along the Cartesian axes of the reference system
Angular velocities around the Cartesian axes of the reference system
Vehicle mass
Moments of inertia of the vehicle around the three axes of the reference system
Moments of inertia of the vehicle
Vertical movement of the unsprung mass
Mass of the unsprung mass
Vertical stiffness of the tyre
Forces transmitted by the suspension spring
Forces transmitted by the suspension shock absorber
Rotation speed of the wheel
Moment of inertia of the wheel
Longitudinal force at the tyre
Moment at the tyre

Competing Interests

The authors declare that there are no competing interests regarding the publication of this paper.

Acknowledgments

This work has received the support of the Spanish Ministerio de Economía y Competitividad (TRA2013-48314-C3-2-R).
