Mobile Information Systems
Volume 2017, Article ID 4802159, 11 pages
Research Article

An Online Solution of LiDAR Scan Matching Aided Inertial Navigation System for Indoor Mobile Mapping

GNSS Research Centre, Wuhan University, 129 Luoyu Road, Wuhan 430079, China

Correspondence should be addressed to Jian Tang; tangjian@whu.edu.cn

Received 23 February 2017; Revised 17 May 2017; Accepted 30 May 2017; Published 3 July 2017

Academic Editor: Gonzalo Seco-Granados

Copyright © 2017 Xiaoji Niu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Multisensor (LiDAR/IMU/camera) integrated Simultaneous Location and Mapping (SLAM) technology is becoming a promising solution for navigation and mobile mapping in GNSS-denied environments, such as indoor areas, dense forests, or urban canyons. An online (real-time) version of such a system can greatly extend its applications, especially for indoor mobile mapping. However, the real-time response of multiple sensors is a big challenge for an online SLAM system, due to the different sampling frequencies and processing times of the different algorithms. In this paper, an online Extended Kalman Filter (EKF) integrated algorithm of LiDAR scan matching and IMU mechanization for an Unmanned Ground Vehicle (UGV) indoor navigation system is introduced. Since LiDAR scan matching is considerably more time consuming than IMU mechanization, the real-time synchronization issue is solved via a one-step-error-state-transition method in the EKF. Stationary and dynamic field tests were performed using a UGV platform along a typical corridor of an office building. Compared to the traditional sequential postprocessed EKF algorithm, the proposed method significantly mitigates the time delay of the navigation outputs while guaranteeing positioning accuracy, so it can serve as an online navigation solution for indoor mobile mapping.

1. Introduction

The establishment of highly efficient, accurate, and low-cost indoor mapping technology is becoming more and more necessary and urgent, due to the growing interest in and market for indoor Location Based Services (LBSs). Simultaneous Location and Mapping (SLAM) has become a popular and effective indoor mapping technology in recent years. The SLAM technique is the process of building a map of an unknown environment by traversing it with range sensors while simultaneously determining the system's location on that map; it has been explored in robotics and computer science for decades. LiDAR-based SLAM is one of the most successful technologies, as it can provide high-frequency and high-precision range measurements [1]. It combines positioning and mapping by utilizing two or more consecutive frames of scan points (called scan matching) with various algorithms [2–8]. However, LiDAR-based SLAM is heavily dependent on environmental features and performs poorly in featureless areas. Various data fusion solutions have long been researched in order to offset the poor performance of standalone sensors [9–13]. The LiDAR/IMU integrated method is a feasible way to solve such problems [1, 5–8, 14], because an inertial measurement unit- (IMU-) based inertial navigation system (INS) can offer accurate relative positions and attitudes over a short period using gyroscopes and accelerometers [5–8]. Thus, the LiDAR/IMU integrated method can provide sustainable and accurate position and attitude results for indoor mobile mapping.

Presently, the most popular SLAM algorithms fall into the following types: Kalman filter, particle filter, and graph-based. Hector SLAM estimates the 3D navigation state based on robust scan matching and an inertial navigation system via the Extended Kalman Filter (EKF) [15]. Gmapping is a Rao-Blackwellized Particle Filter SLAM approach which requires a large number of particles to obtain good results; therefore, an adaptive resampling technique had to be developed to solve the particle depletion problem [16]. Karto SLAM is a graph-based SLAM approach; the higher the number of landmarks, the more memory it requires [17]. The sensor system in this paper is composed only of LiDAR and IMU, and the Extended Kalman Filter method is adopted for the LiDAR/IMU integration in order to exploit the IMU's advantages.

However, most existing highly accurate SLAM systems are postprocessed [18–21]. An efficient online solution is still a challenge in the field of highly accurate mapping applications. For the LiDAR/IMU integrated system, although the IMU can accelerate the computation of LiDAR scan matching, the time cost of LiDAR scan matching is still much larger than that of IMU sampling and processing. Thereby, in an online application, IMU data received while a LiDAR scan is being matched have to wait in a buffer until the LiDAR scan matching process finishes under the traditional sequential postprocessed method. This delay prevents the current IMU observation from being obtained and processed in time, which is fatal for an online system [22].

Various methods for the time delay issue have been addressed in previous works; many efforts concentrate on optimizing the fusion filter [23–25]. Guivant et al. presented an optimal algorithm that significantly reduces the computational requirements by propagating the stored information in the local area to the rest of the global map in only one iteration [22]. Other methods include dividing large-scale maps into small amenable maps [24] or building a hierarchical submap structure [23]; these algorithms are suitable for large-scale maps with many landmarks. In 2016, Google's Cartographer provided a real-time solution for indoor mapping based on a graph-based SLAM algorithm; its time consumption for LiDAR scan matching is far less than that in this paper. However, Cartographer needs to achieve real-time loop closure to eliminate the accumulated errors, and its CPU occupancy is extremely high in order to accelerate computation [26]. In addition, optimizing the large number of variables in the SLAM problem by using two algorithms is another alternative for real-time mapping: one algorithm performs odometry at a high frequency but low fidelity to estimate the velocity of the LiDAR, while the other runs at a frequency an order of magnitude lower for fine matching and registration of the point cloud [27].

The details of the LiDAR/IMU fused method were introduced in our previous work [4, 5]. In this paper, we will introduce an efficient online solution based on the compensation method by state propagating for the LiDAR/IMU integrated inertial navigation system for indoor mobile mapping, which is an extension of our previous works. This state propagating method has been studied in a GNSS/IMU integrated system but not in a LiDAR/IMU integrated system. It has the advantages of low computation complexity, simple implementation, and high reliability [28].

The proposed LiDAR scan matching algorithm in this paper is an improved, probabilistically motivated Maximum Likelihood Estimation (IMLE) algorithm. It is a brute-force global search method that can obtain the globally optimal position for each scan matching. Aided by the IMU, its mapping precision can reach centimeter level with the current postprocessed Extended Kalman Filter (EKF) method [4]. The original sequential LiDAR scan matching and IMU mechanization workflow is divided into two independent and parallel workflows. Then, the time delay between the LiDAR scan matching and the IMU mechanization is synchronized via the one-step-error-state-transition method, which is applied to a standard EKF. Compared with existing online mapping solutions, this paper offers several major contributions: (a) it presents an improved method based on our previous works [4, 5], keeping the high-precision characteristics of the LiDAR/IMU fusion algorithm; (b) the parallel approach makes the LiDAR scan matching and IMU mechanization operate independently, accelerating the processing of the whole system; (c) the one-step-error-state-transition mathematical model applied in the EKF only records and propagates the error state at EKF prediction epochs and does not increase the dimension of the state matrix, which keeps the data fusion synchronized and reduces the computational complexity and processing time.

The rest of this paper is organized as follows: Section 2 gives the workflow of the online LiDAR/IMU integrated system and describes the mathematical principle of the proposed method; Section 3 presents the indoor field tests and discusses the results; conclusions are drawn in Section 4.

2. Online LiDAR/IMU Integrated System Modeling

2.1. Sequential IMU and LiDAR Fusion Modeling

The overview of the traditional sequential LiDAR/IMU integrated system mathematical model is shown in Figure 1. The sampling rates of the IMU and LiDAR are approximately 200 Hz and 10 Hz, respectively; the rate of the IMU is higher than that of the LiDAR. The IMU can calculate the position ($r^n$), velocity ($v^n$), and attitude ($\psi$) by using the mechanization algorithm when no LiDAR observation information is received. The local level frame of north, east, and down (NED), named the navigation frame (n-frame), is taken as the reference frame for the inertial navigation. The body frame (b-frame) is defined at the IMU's centre, with the axes pointing forward, right, and down, respectively. In fact, the IMU output contains errors, and these errors cause the navigation results to drift rapidly over time. Thus, an error propagation model must work alongside the system motion model to further correct and obtain better navigation results. The Phi-Angle error model is selected to describe the time-dependent error behavior [25, 26]. The error state vector is defined in the n-frame as follows:

$$\delta x = \begin{bmatrix} \delta r^n & \delta v^n & \phi & b_g & b_a \end{bmatrix}^T, \tag{1}$$

where the error state consists of the errors of position ($\delta r^n$), the errors of velocity ($\delta v^n$), the errors of attitude ($\phi$), the bias of the gyroscope ($b_g$), and the bias of the accelerometer ($b_a$), which together form a 15-dimensional vector.

Figure 1: The mathematical model of the LiDAR/IMU integrated system.
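As a concrete illustration, the 15-dimensional error state above can be laid out in code as follows (a minimal sketch; the slice names and the seeded bias value are illustrative, not from the paper):

```python
import numpy as np

# Index layout for the 15-dimensional error state: position errors (3),
# velocity errors (3), attitude errors (3), gyro biases (3), accel biases (3).
POS, VEL, ATT, BG, BA = (slice(0, 3), slice(3, 6), slice(6, 9),
                         slice(9, 12), slice(12, 15))

dx = np.zeros(15)      # error state vector, zero at start
dx[BG] = 1e-4          # e.g. seed a small gyroscope-bias error (rad/s)
```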

The biases of the gyroscope and accelerometer are modeled as first-order Gauss-Markov processes with correlation time $T$ and mean square value $\sigma^2$. The model is described by

$$\dot{b}(t) = -\frac{1}{T}\, b(t) + w(t), \tag{2}$$

where $w(t)$ is zero-mean white noise.
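A first-order Gauss-Markov bias of this kind is straightforward to simulate in discrete time. The sketch below (illustrative parameter values, not the paper's) uses the standard exact discretization $b_k = e^{-\Delta t/T} b_{k-1} + w_{k-1}$, choosing the driving-noise variance so that the stationary variance equals the specified mean square value:

```python
import numpy as np

def simulate_gauss_markov(T_corr, sigma2, dt, n, seed=0):
    """Discretized first-order Gauss-Markov process with correlation
    time T_corr (s) and mean square value sigma2."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / T_corr)         # discrete transition factor
    q = sigma2 * (1.0 - phi ** 2)      # keeps the stationary variance at sigma2
    b = np.empty(n)
    b[0] = 0.0
    for k in range(1, n):
        b[k] = phi * b[k - 1] + rng.normal(0.0, np.sqrt(q))
    return b

# 200 Hz samples of a bias with a 1 s correlation time (illustrative values):
bias = simulate_gauss_markov(T_corr=1.0, sigma2=1e-6, dt=0.005, n=200_000)
```

The sample variance of a long run should settle near the specified mean square value, which is a quick sanity check on the discretization.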

The INS error model with the sensor error models in continuous time can be expressed by

$$\delta \dot{x}(t) = F(t)\, \delta x(t) + G(t)\, w(t), \tag{3}$$

where $F(t)$ is the dynamic matrix, $G(t)$ is the noise-input mapping matrix, and $w(t)$ is the forcing vector of white noise, according to the system motion model; the concrete formation of $F(t)$ can be found in the works of Shin, 2001 and 2005 [29, 30].

The discrete form of (3) is

$$\delta x_k = \Phi_{k-1}\, \delta x_{k-1} + w_{k-1}, \tag{4}$$

where $\Phi_{k-1}$ is the state transition matrix and $w_{k-1}$ is the driven white noise.

$w_k$ is a sequence of zero-mean random variables, and the covariance matrix associated with $w_k$ is given by

$$Q_k = E\left[ w_k w_k^T \right]. \tag{5}$$

$Q_k$ is the covariance matrix, and $q$ is the spectral density matrix; the velocity random walk (VRW) and angular random walk (ARW) entering $q$ are given by the IMU user manual; $T_g$ and $T_a$ are the correlation times of the gyroscopes and accelerometers, respectively; $\sigma_g^2$ and $\sigma_a^2$ are the mean square values of the gyroscopes and accelerometers, respectively, as described in formula (2).
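Under a common first-order approximation (an assumption here, not stated in the text), the discrete state transition matrix and driven-noise covariance can be computed from the continuous-time matrices as $\Phi \approx I + F\,\Delta t$ and $Q \approx G\, q\, G^T\, \Delta t$:

```python
import numpy as np

def discretize(F, G, q, dt):
    """First-order discretization of the continuous error model:
    Phi ~ I + F*dt, Q ~ G q G^T * dt. F is the dynamic matrix, G the
    noise-input mapping matrix, q the spectral density matrix."""
    Phi = np.eye(F.shape[0]) + F * dt
    Q = G @ q @ G.T * dt
    return Phi, Q

# Toy 2-state (position, velocity) example, just to show the shapes:
F = np.array([[0.0, 1.0], [0.0, 0.0]])
G = np.array([[0.0], [1.0]])
q = np.array([[1e-4]])                 # illustrative spectral density
Phi, Q = discretize(F, G, q, dt=0.005)
```

The exact formation of the 15-state $F$ and $G$ follows Shin [29, 30] and is not reproduced here.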

The LiDAR and IMU measurements are fused by the EKF algorithm only at the epochs at which LiDAR scan information is obtained. The EKF observation function is given briefly by

$$z_k = \begin{bmatrix} \hat{r}_{INS} - r_{LiDAR} \\ \hat{\psi}_{INS} - \psi_{LiDAR} \end{bmatrix} = H_k\, \delta x_k + v_k, \tag{7}$$

where $z_k$ is a 4-dimensional measurement vector; $\hat{r}_{INS}$ is the predicted position from the IMU mechanization; $r_{LiDAR}$ is the observed position from the LiDAR; $\hat{\psi}_{INS}$ and $\psi_{LiDAR}$ are the predicted and observed heading angles, respectively, expressed as Euler angles. Together they make up the four observations. The LiDAR observations of 2-dimensional position and heading angle are obtained from the LiDAR scan matching; owing to the flatness of the building floor in an indoor environment, the height of the LiDAR observation is assumed constant and is set to zero in this paper. $H_k$ is the design matrix that describes the relation between the state vector and the measurements and is given in (10); $v_k$ is the driven response of the input white noise at time $t_k$.

The measurement covariance matrix is written as

$$R_k = \mathrm{diag}\left(\sigma_N^2,\; \sigma_E^2,\; \sigma_D^2,\; \sigma_\psi^2\right), \tag{11}$$

a 4-dimensional covariance matrix, where the position and heading error terms are approximate values based on the properties of the laser scanner device and on the angle and range searching intervals of the LiDAR scan matching algorithm.

The estimates of the EKF prediction functions are

$$\delta \hat{x}_{k|k-1} = \Phi_{k-1}\, \delta \hat{x}_{k-1|k-1}, \qquad P_{k|k-1} = \Phi_{k-1}\, P_{k-1|k-1}\, \Phi_{k-1}^T + Q_{k-1}. \tag{12}$$

The Kalman gain is

$$K_k = P_{k|k-1} H_k^T \left( H_k P_{k|k-1} H_k^T + R_k \right)^{-1}. \tag{13}$$

The state vector is updated as

$$\delta \hat{x}_{k|k} = \delta \hat{x}_{k|k-1} + K_k \left( z_k - H_k\, \delta \hat{x}_{k|k-1} \right), \qquad P_{k|k} = \left( I - K_k H_k \right) P_{k|k-1}, \tag{14}$$

where $\delta \hat{x}_{k|k-1}$ and $P_{k|k-1}$ are the prior estimate and its error covariance. The diagonal of the $P$ matrix gives the estimated variances of the estimated states; the initial value $P_0$ in the experiment is set as follows: the standard deviations of the initial position are given at centimeter level, which is the precision of positioning by LiDAR scan matching; the standard deviations of the initial velocity are small values of about 0.001 m/s, because the whole system starts from a stationary state [31]; the accuracies of the initial roll, pitch, and heading are empirical values of about 1 degree, 1 degree, and 5 degrees, respectively.
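The prediction, gain, and update steps above map directly onto a few lines of linear algebra. The sketch below is a generic error-state EKF step under the structure described here (the toy dimensions and values are illustrative, not the paper's 15-state configuration):

```python
import numpy as np

def ekf_predict(dx, P, Phi, Q):
    # Error-state and covariance prediction.
    return Phi @ dx, Phi @ P @ Phi.T + Q

def ekf_update(dx, P, z, H, R):
    # Measurement update with Kalman gain K.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    dx = dx + K @ (z - H @ dx)
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return dx, P

# Toy 2-state demo: observe only the first state.
dx, P = np.zeros(2), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[1.0]])
dx, P = ekf_update(dx, P, np.array([1.0]), H, R)
```

With unit prior and measurement variances, the update splits the difference: the observed state's estimate moves to 0.5 and its variance halves, while the unobserved state is untouched.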

Finally, the estimated error state is fed back to the INS mechanization to correct the final output of the navigation state ($r^n$, $v^n$, and $\psi$), which will also be the initial state for the LiDAR scan matching algorithm in the next epoch. Then, the next iteration continues.

2.2. Online Improvement

In the sequential IMU and LiDAR fusion model above, the next IMU mechanization must wait until the prior LiDAR scan matching has completed. For instance, with the configuration in this system, LiDAR scan matching costs approximately 70 ms, which means that the IMU data arriving within these 70 ms cannot be processed in time. Thus, the time delay of the IMU mechanization due to the processing time of LiDAR scan matching is a challenge for online processing.

To settle this issue, a parallel online method is utilized for the LiDAR scan matching and IMU mechanization processes. Figure 2 describes the processing sequence. When scan points are received at time $t_k$, LiDAR scan matching begins to work in the LiDAR thread; meanwhile, the IMU mechanization continues to run in the IMU thread. Normally, after the LiDAR scan matching finishes, the matching results would be fused with the IMU at epoch $t_k$ in the sequential process model. However, the integration finishing time, namely, the current time $t_j$, is not $t_k$. Only when the current error state estimate corrects the current IMU output can we obtain the real-time information. Therefore, the error state estimate and its covariance calculated at $t_k$ have to be propagated to the current time correctly by using the one-step-error-state-transition algorithm, which will be introduced in detail in Section 2.3. Then, the error state estimate at the current time can be fed back to correct the final output of navigation in time.
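The parallel scheme can be sketched with two threads and a queue: the IMU loop keeps producing navigation output every 5 ms while the slow scan matching runs in its own thread and posts its result when done. All names and timings below are illustrative stand-ins, not the NAVIS implementation:

```python
import queue
import threading
import time

scan_results = queue.Queue()

def slow_scan_matching(scan):
    time.sleep(0.07)                       # stands in for ~70 ms of matching
    return {"pos": (0.0, 0.0), "heading": 0.0, "t_scan": scan["t"]}

def lidar_thread(scan):
    scan_results.put(slow_scan_matching(scan))

t = threading.Thread(target=lidar_thread, args=({"t": 0.0},))
t.start()

imu_epochs, match = 0, None
while t.is_alive() or not scan_results.empty():
    time.sleep(0.005)                      # one 5 ms IMU mechanization epoch
    imu_epochs += 1                        # navigation output every epoch
    try:
        match = scan_results.get_nowait()  # fuse the match once it is ready
    except queue.Empty:
        pass
# Many IMU epochs were output while the scan matching ran in parallel.
```

The point of the design is visible in the loop: the IMU thread never blocks on the matcher, so output epochs keep flowing at the IMU rate.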

Figure 2: The parallel processing sequence diagram.
2.3. One-Step-Error-State-Transition

This section describes the mathematical derivation of propagating the error state estimate of the EKF from time $t_k$ to the current time $t_j$. The sampling frequency of the IMU is 200 Hz, which means there is one IMU sample every 5 ms. IMU mechanization is not costly and can be completed within these 5 ms, which implies that the EKF predictions can be performed at the proper times online. However, LiDAR scan matching takes much more time to produce the observation result. For example, suppose we receive IMU and LiDAR data at time $t_k$. The IMU mechanization yields the prediction immediately, while the LiDAR scan matching costs about 70 ms to produce the EKF observation of time $t_k$. After scan matching, the current time has shifted to $t_j = t_k + 70$ ms, so the EKF can only update the result at the moment $t_k$. By time $t_j$, the INS prediction has moved to $t_j$, and the $P$ matrix and the error state vector have already been predicted to $t_j$, but these predictions are not corrected by the LiDAR observation ($z_k$). Therefore, we only have the updated results $\delta \hat{x}_{k|k}$ and $P_{k|k}$ at time $t_k$. Thereby, $\delta \hat{x}_{k|k}$ and $P_{k|k}$ need to be propagated to $t_j$ in time for the online application.

State transition means using the state at time $t_k$ to estimate the state at time $t_j$. The Kalman filter used in the field of navigation is a minimum variance estimator that can be defined simply through the use of a conditional expectation [29]:

$$\hat{x}_{j|k} = E\left[ x_j \mid z_1, z_2, \ldots, z_k \right], \tag{16}$$

where $E[\cdot]$ is the expectation operator, $x_j$ is the state vector at $t_j$, and $z_1, \ldots, z_k$ represent the measurements from time $t_1$ to $t_k$.

Considering $t_j > t_k$, the state vector can be deduced from formula (4) as

$$x_j = \Phi_{j,k}\, x_k + \sum_{i=k}^{j-1} \Phi_{j,i+1}\, w_i, \qquad \Phi_{j,k} = \Phi_{j-1} \Phi_{j-2} \cdots \Phi_k. \tag{17}$$

Combining (16) and (17) yields

$$\hat{x}_{j|k} = E\left[ \Phi_{j,k}\, x_k + \sum_{i=k}^{j-1} \Phi_{j,i+1}\, w_i \;\middle|\; z_1, \ldots, z_k \right]. \tag{18}$$

According to formulas (4) and (8), only the noise up to time $t_k$ has an effect on the state vector $x_k$, so the noise terms $w_i$ ($i \geq k$) are irrelevant to the measurements $z_1, \ldots, z_k$. Additionally, $w_i$ is white noise, a sequence of zero-mean random variables that is uncorrelated timewise, so its conditional expectation is $E[w_i \mid z_1, \ldots, z_k] = 0$. Therefore, (18) can be simplified as follows:

$$\hat{x}_{j|k} = \Phi_{j,k}\, \hat{x}_{k|k}, \tag{19}$$

where $\hat{x}_{k|k}$ is the updated estimate of $x_k$.

Defining $e_{j|k}$ as the error between $x_j$ and $\hat{x}_{j|k}$,

$$e_{j|k} = x_j - \hat{x}_{j|k} = \Phi_{j,k}\left( x_k - \hat{x}_{k|k} \right) + \sum_{i=k}^{j-1} \Phi_{j,i+1}\, w_i. \tag{20}$$

The covariance matrix associated with $e_{j|k}$ is given by [23]

$$P_{j|k} = E\left[ e_{j|k}\, e_{j|k}^T \right]. \tag{21}$$

The covariance of $e_{j|k}$ can then be written as

$$P_{j|k} = \Phi_{j,k}\, P_{k|k}\, \Phi_{j,k}^T + \sum_{i=k}^{j-1} \Phi_{j,i+1}\, Q_i\, \Phi_{j,i+1}^T. \tag{22}$$

$Q_i$ is a positive-definite matrix, which satisfies $Q_i = E[w_i w_i^T]$.

Defining $\bar{Q}_{j,k} = \sum_{i=k}^{j-1} \Phi_{j,i+1}\, Q_i\, \Phi_{j,i+1}^T$ yields

$$P_{j|k} = \Phi_{j,k}\, P_{k|k}\, \Phi_{j,k}^T + \bar{Q}_{j,k}. \tag{23}$$

Combining the above equations yields

$$\hat{x}_{j|k} = \Phi_{j,k}\, \hat{x}_{k|k}, \qquad P_{j|k} = \Phi_{j,k}\, P_{k|k}\, \Phi_{j,k}^T + \sum_{i=k}^{j-1} \Phi_{j,i+1}\, Q_i\, \Phi_{j,i+1}^T, \tag{24}$$

which means that the error state estimate and its error covariance can be obtained by using the accumulated state transition matrices. The LiDAR sampling time is treated as time $t_k$ and the LiDAR/IMU integration finish time as $t_j$. The state estimate and its covariance at $t_k$ can be propagated to $t_j$ by using formula (24), and the propagated results can then be updated using function (14). Finally, the updated state estimate is fed back to the current IMU output to correct the final output of navigation in the online EKF.
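Formula (24) amounts to chaining the per-epoch transition matrices that the IMU thread recorded while scan matching was running. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def propagate_update(dx_upd, P_upd, Phi_list, Q_list):
    """Propagate the EKF update from t_k to the current time t_j by
    accumulating the recorded per-epoch transition matrices Phi_i and
    driven-noise covariances Q_i, as in formula (24)."""
    dx, P = dx_upd.copy(), P_upd.copy()
    for Phi, Q in zip(Phi_list, Q_list):
        dx = Phi @ dx                  # error-state transition
        P = Phi @ P @ Phi.T + Q        # covariance propagation
    return dx, P
```

Because only the already-computed $\Phi_i$ and $Q_i$ are reused, this step costs a handful of small matrix products and does not enlarge the state, which is why the delay it adds is negligible compared with the scan matching itself.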

3. Results and Discussion

3.1. System Overview

To verify the online performance of our proposed integrated system, a series of tests based on a UGV mobile mapping platform was designed. The hardware and software platform in this paper is the one used in our previous work, called NAVIS [5]. Detailed information on the LiDAR and IMU is listed in Table 1. The components are installed horizontally on the UGV platform. The whole system moves at a speed of about 0.9 m/s, and the total time spent on the experiments in this paper is approximately three minutes and forty-six seconds. The data processing was conducted on a pad with the Windows 8 operating system and a 1.6 GHz CPU, which is suitable for online processing on the UGV.

Table 1: The detailed parameters of LiDAR and IMU.

The field tests of the proposed LiDAR/IMU integrated system were conducted along the corridor of the Finnish Geospatial Research Institute (FGI) library (see Figure 3). To evaluate the effectiveness of our proposed online EKF algorithm, stationary and dynamic experiments were carried out, and the mapping results generated by the online EKF algorithm and the postprocessed EKF algorithm were compared with those generated by a high-precision laser scanner.

Figure 3: Top view of FGI library.
3.2. Accuracy Evaluation of Stationary Estimation

As shown in Figure 4, the stationary test was conducted from the beginning to the 50th second. Table 2 shows the numerical positioning error statistics of the two methods. The absolute position and the heading angle at the stationary moment were taken as references. The overall position results in this paper have been projected into the n-frame. The position RMS errors in the north direction with the postprocessed EKF method and the online EKF method were 13 mm and 33 mm; the position RMS errors in the east direction were 36 mm and 46 mm; the RMS errors of the heading were 0.2030 degrees and 0.2319 degrees, respectively. The differences in RMS between the two methods were approximately 20 mm, 10 mm, and 0.03 degrees. Considering that the range error of the LiDAR is approximately 2–4 cm and the angular resolution is about 0.25 degrees, such differences are acceptable. Thereby, in stationary positioning, the overall estimated accuracy of the position and attitude of the online EKF algorithm can be regarded as being in accordance with that of the postprocessed EKF algorithm.

Table 2: The static positioning error statistics.
Figure 4: (a) The positioning results with online EKF solution; (b) the positioning results with post-EKF solution; (c) the heading results with online EKF and post-EKF.
3.3. Accuracy Evaluation of Dynamic Estimation

Figure 5(a) is generated by LiDAR scan matching alone. Figure 5(b) is the likelihood map generated with the online EKF algorithm. Both are floor plans of the FGI library. Comparing Figures 5(a) and 5(b), it can be seen that the map results of the LiDAR standalone system are much noisier than those of the LiDAR/INS, and mismatching appears in the long corridor in Figure 5(a), where features are scarce. Therefore, the LiDAR/INS system can overcome the drawbacks of the LiDAR standalone system and achieve higher mapping accuracy.

Figure 5: (a) The map results of UGV platform with LiDAR scan matching standalone. (b) The map and trajectories result of UGV platform with post-EKF solution and online EKF solution.

The trajectories of the UGV platform in dynamic mode with the postprocessed EKF algorithm and the online algorithm are also shown in Figure 5(b). The initial position was taken as the origin. As shown in the plot, the green trajectory coincides well with the red trajectory, which implies that our online EKF method has the same positioning accuracy as the sequential postprocessed EKF method; the difference is not obvious from the plot.

The overall dynamic process lasted approximately 176 s. Figure 6 shows the positioning difference between the postprocessed EKF solution and the online EKF solution, and the statistics are listed in Table 3. The position RMS errors in the north and east directions are 0.0138 m and 0.0440 m, respectively; the RMS error of the heading is 0.1979 degrees. The position difference is still at centimeter level, and the heading difference is also below the angular resolution of the LiDAR. As a result, the accuracy of the position and attitude of the online EKF algorithm can be considered to be at the same level as that of the postprocessed EKF in dynamic mode.

Table 3: The dynamic positioning difference statistics between online EKF and post-EKF.
Figure 6: The comparison of positioning outputs by the post-EKF solution and online EKF solution.

In addition to the comparison of trajectory accuracy, the map quality also needs to be verified. Figures 7(a) and 7(b) show the likelihood maps generated by the two different solutions, the postprocessed EKF solution and the online EKF solution. They are compared with the reference map, which is presented in Figure 7(c) and was generated with a Terrestrial Laser Scanner (TLS, FARO Focus3D 330X). By comparing the zoomed-in images of each map, it is obvious that the line features in Figures 7(a) and 7(b) are remarkably noisier than those in Figure 7(c). The profiles are wider than their counterparts generated by the TLS, and some of the corners are too ambiguous to be detected. The reason is that the adopted Hokuyo laser scanner has a large footprint and centimeter-level ranging accuracy, whereas the TLS uses a small-footprint laser with millimeter-level accuracy.

Figure 7: (a) NAVIS map result with selected feature points with online EKF solution; (b) NAVIS map result with selected feature points with post-EKF solution; (c) TLS reference map and the selected feature points.

The unmovable corners of bookshelves and walls are selected as the main feature points for accuracy evaluation. In total, 67 corner feature points were picked out from the three maps for evaluation. The RMS errors of the selected feature points of the FGI library with the postprocessed EKF solution and the online EKF solution are listed in Table 4. The RMS error with the postprocessed EKF method is 0.0562 m, and the RMS error with the online EKF method is 0.0732 m. The difference between the RMS errors of the two methods is under 0.02 m. Considering the errors introduced by manual operation, the accuracy is reasonable.

Table 4: The comparison of accuracy results statistics for the selected feature points.

The positioning precision is reflected indirectly by the mapping precision in the indoor environment, due to the lack of a trajectory reference truth. Figure 8(a) shows the cumulative distributions of the mapping errors with the post-EKF and online EKF, respectively; the y-axis gives the percentage of feature points whose errors are less than the corresponding value on the x-axis. The mapping errors of the feature points with the post-EKF are all less than 0.11 m, and the errors of 70%–80% of the feature points are less than 0.06 m; the errors of the feature points with the online EKF are all less than 0.13 m, and the errors of 70%–80% of the feature points are less than 0.08 m. Figure 8(b) describes the drift of the mapping results with the online EKF. The errors of the LiDAR and IMU accumulate continuously over time if there is no loop closure or other method to correct them. The final drift is about 0.13 m.

Figure 8: (a) The cumulative distribution of the mapping errors with post-EKF and online EKF; (b) the mapping error drift of the online EKF.
3.4. Verification of Real-Time Performance

To evaluate the performance of the online EKF solution, the time delay of each output epoch is compared with that of the postprocessed EKF solution in Figure 9. Theoretically, as introduced in Figure 2, if no LiDAR data are received, the IMU mechanization results are output, and the output delay of one epoch is the time consumption of the IMU mechanization. When LiDAR data are received, for the traditional sequential EKF, the output delay of one epoch equals the total time it takes to complete the LiDAR scan matching, IMU mechanization, and Kalman update; for the proposed parallel online EKF, the output delay of each epoch is composed of only the IMU mechanization, Kalman update, and the one-step-error-state-transition process, which is much shorter.

Figure 9: The output delay using online EKF solution and post-EKF solution.

As shown in Figure 9, the time delay of each output epoch with the online EKF is much less than that with the postprocessed EKF. As observed in Table 5, the mean output delay using the postprocessed EKF is 1.69 ms, and the maximum is 88.65 ms; the mean output delay using the online EKF is 0.280 ms, and the maximum is 3.76 ms. The mean output delay of the online EKF is thus reduced by about 6 times, and the maximum value by about 23 times. This improvement arises because the online EKF takes advantage of parallel processing, that is, dual processing threads running on two CPU cores, and because the one-step-error-state-transition method makes it possible to use the previous update information (LiDAR scan matching) to correct the current IMU mechanization results, so that the IMU output does not have to wait until the LiDAR scan matching process finishes. Therefore, according to Figure 9, the improved EKF method in this paper reduces the output delay efficiently compared with the postprocessed EKF method and ensures better real-time performance for the online system.

Table 5: The output delay statistics with online-EKF solution and post-EKF solution.

4. Conclusions

An online solution for the LiDAR/IMU integrated system is proposed in this paper. The positioning results of the IMU and LiDAR scan matching are synchronized in real time using the proposed one-step-error-state-transition method in the EKF to improve the real-time response of the LiDAR/IMU integrated navigation. The accuracy and online improvement results prove that (1) the proposed online method can achieve the same positioning and mapping accuracy (centimeter level) as the sequential post-EKF method; (2) the online EKF solution can reduce the maximum output delay from 88.65 ms with the postprocessed EKF to 3.76 ms, which improves the real-time performance effectively. In conclusion, the online solution proposed in this paper can solve the time lag issue of LiDAR/IMU integration without loss of positioning accuracy. In future work, the proposed method will be integrated into an embedded hardware platform and applied to real-time 2D and 3D indoor mapping.

Conflicts of Interest

The authors declare no conflicts of interest.


Acknowledgments

This study was financially supported by the National Key Research and Development Program (nos. 2016YFB0502202 and 2016YFB0501803) and the Academy of Finland (project "Centre of Excellence in Laser Scanning Research," 272195). The authors thank Yuwei Chen, Professor Juha Hyyppä, and other colleagues at the Finnish Geospatial Research Institute, who provided the experimental data and useful feedback.


  1. J. Zhang and S. Singh, “LOAM: Lidar Odometry and Mapping in Real-time,” in Proceedings of the Robotics: Science and System Conference, 2014. View at Publisher · View at Google Scholar
  2. A. A. Aghamohammadi, H. D. Taghirad, A. H. Tamjidi et al., “Feature-based laser scan matching for accurate and high speed mobile robot localization,” in Proceedings of the European Conference on Mobile Robots EMCR '07, Freiburg, Germany, September 2007.
  3. Z. Duan and Z. Cai, “Robust simultaneous localization and mapping based on laser range finder with improved adaptive particle filter,” in Proceedings of the Chinese Control and Decision Conference CCDC '08, pp. 2820–2824, July 2008. View at Publisher · View at Google Scholar · View at Scopus
  4. J. Tang, Y. Chen, A. Jaakkola, J. Liu, J. Hyyppä, and H. Hyyppä, “NAVIS-An UGV indoor positioning system using laser scan matching for large-area real-time applications,” Sensors, vol. 14, no. 7, pp. 11805–11824, 2014. View at Publisher · View at Google Scholar · View at Scopus
  5. J. Tang, Y. Chen, X. Niu et al., “LiDAR scan matching aided inertial navigation system in GNSS-denied environments,” Sensors, vol. 15, no. 7, pp. 16710–16728, 2015. View at Publisher · View at Google Scholar · View at Scopus
  6. I. Klein and S. Filin, “Lidar and INS fusion in periods of GPS outages for mobile laser scanning mapping systems,” International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 3812, pp. 231–236, 2011. View at Publisher · View at Google Scholar
  7. A. Soloviev, “Tight coupling of GPS, laser scanner, and inertial measurements for navigation in urban environments,” in Proceedings of the IEEE/ION Position, Location and Navigation Symposium, pp. 511–525, IEEE, Monterey, Calif, USA, May 2008. View at Publisher · View at Google Scholar
  8. R. Li, J. Liu, L. Zhang, and Y. Hang, “LIDAR/MEMS IMU integrated navigation (SLAM) method for a small UAV in indoor environments,” in Proceedings of the 8th International Conference on DGON Inertial Sensors and Systems, ISS '14, pp. 1–5, 2015. View at Publisher · View at Google Scholar · View at Scopus
  9. L. Chen, S. Ali-Löytty, R. Piché, and L. Wu, “Mobile tracking in mixed line-of-sight/non-line-of-sight conditions: algorithm and theoretical lower bound,” Wireless Personal Communications, vol. 65, no. 4, pp. 753–771, 2012. View at Publisher · View at Google Scholar · View at Scopus
  10. N. Joshi, M. Baumann, A. Ehammer et al., “A review of the application of optical and radar remote sensing data fusion to land use mapping and monitoring,” Remote Sensing, vol. 8, no. 1, article no. 70, 2016. View at Publisher · View at Google Scholar · View at Scopus
  11. L. Chen and L. Wu, “Mobile positioning in mixed LOS/NLOS conditions using modified EKF banks and data fusion method,” IEICE Transactions on Communications, vol. 92, no. 4, pp. 1318–1325, 2009. View at Publisher · View at Google Scholar · View at Scopus
  12. H. B. Mitchell, “Multi-Sensor Data Fusion,” in Proceedings of the Intelligent Problem Solving, Methodologies and Approaches, International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, IEA/AIE '00, pp. 245–253, Louisiana, La, Usa, 2016.
  13. L. Chen, L. Pei, H. Kuusniemi, Y. Chen, T. Kröger, and R. Chen, “Bayesian fusion for indoor positioning using bluetooth fingerprints,” Wireless Personal Communications, vol. 70, no. 4, pp. 1735–1745, 2013.
  14. Y. Chen, J. Liu, A. Jaakkola et al., “Knowledge-based indoor positioning based on LiDAR aided multiple sensors system for UGVs,” in Proceedings of the 2014 IEEE/ION Position, Location and Navigation Symposium, PLANS '14, pp. 109–114, May 2014.
  15. S. Kohlbrecher, O. Von Stryk, J. Meyer, and U. Klingauf, “A flexible and scalable SLAM system with full 3D motion estimation,” in Proceedings of the 9th IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR '11), pp. 155–160, IEEE, Kyoto, Japan, November 2011.
  16. G. Grisetti, C. Stachniss, and W. Burgard, “Improved techniques for grid mapping with Rao-Blackwellized particle filters,” IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.
  17. R. Vincent, B. Limketkai, and M. Eriksen, “Comparison of indoor robot localization techniques in the absence of GPS,” in Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XV of Defense, Security, and Sensing Symposium, Proceedings of the SPIE, April 2010.
  18. S. Thrun and J. J. Leonard, “Simultaneous localization and mapping,” in Springer Handbook of Robotics, pp. 871–889, Springer Berlin Heidelberg, Berlin, Germany, 2008.
  19. R. Kaijaluoto, A. Kukko, and J. Hyyppä, “Precise indoor localization for mobile laser scanner,” in Proceedings of the ISPRS WG IV/7 and WG V/4 Joint Workshop on Indoor-Outdoor Seamless Modelling, Mapping and Navigation, pp. 1–6, May 2015.
  20. J. Jung, S. Yoon, S. Ju, and J. Heo, “Development of kinematic 3D laser scanning system for indoor mapping and as-built BIM using constrained SLAM,” Sensors, vol. 15, no. 10, pp. 26430–26456, 2015.
  21. J. Jung, S. Hong, S. Jeong et al., “Productive modeling for development of as-built BIM of existing indoor structures,” Automation in Construction, vol. 42, no. 2, pp. 68–77, 2014.
  22. A. Fernández, P. F. Silva, I. Colomina et al., “Real-time navigation and mapping with mobile mapping systems using LiDAR/Camera/INS/GNSS advanced hybridization algorithms: description and test results,” in Proceedings of the 27th International Technical Meeting of the Satellite Division of the Institute of Navigation, ION GNSS 2014, pp. 896–903, September 2014.
  23. J. E. Guivant and E. M. Nebot, “Optimization of the simultaneous localization and map-building algorithm for real-time implementation,” IEEE Transactions on Robotics and Automation, vol. 17, no. 3, pp. 242–257, 2001.
  24. S. B. Williams, G. Dissanayake, and H. Durrant-Whyte, “An efficient approach to the simultaneous localisation and mapping problem,” in Proceedings of the 2002 IEEE International Conference on Robotics and Automation, pp. 406–411, USA, May 2002.
  25. C. Estrada, J. Neira, and J. D. Tardós, “Hierarchical SLAM: real-time accurate mapping of large environments,” IEEE Transactions on Robotics, vol. 21, no. 4, pp. 588–596, 2005.
  26. W. Hess, D. Kohler, H. Rapp, and D. Andor, “Real-time loop closure in 2D LIDAR SLAM,” in Proceedings of the 2016 IEEE International Conference on Robotics and Automation, ICRA 2016, pp. 1271–1278, Stockholm, Sweden, May 2016.
  27. H. Durrant-Whyte and T. Bailey, “Simultaneous localization and mapping: part I, the essential algorithms,” IEEE Robotics & Automation Magazine, vol. 13, no. 2, pp. 99–110, 2006.
  28. Z. Hongping, L. Chang, W. Hongchen et al., “Improvement and verification of real-time performance of GNSS/INS tightly coupled integration in embedded platform,” Journal of Southeast University (Natural Science Edition), vol. 46, no. 4, pp. 695–701, 2016.
  29. E.-H. Shin and N. El-Sheimy, Accuracy Improvement of Low Cost INS/GPS for Land Applications, Department of Geomatics Engineering, University of Calgary, Calgary, Canada, 2001.
  30. E.-H. Shin, Estimation Techniques for Low-Cost Inertial Navigation, UCGE Report 20219, University of Calgary, Calgary, Canada, 2005.
  31. X. Dong and S. Zhang, Positioning and application of GPS/INS integrated navigation [M.S. thesis], National University of Defense Technology, Changsha, China, 1998.