Abstract

In the process of acquiring point cloud data with a 3D laser scanner, defects such as outliers, mixed points, and holes may appear in the target point cloud because of the external environment, the discreteness of the laser beam, and the occlusion of objects. In this paper, a point cloud quality optimization and enhancement algorithm is designed. A self-adaptive octree is established to rasterize the point cloud and calculate the density of each grid, and this density is combined with statistical filtering to remove outliers from the point cloud data. Then, a plane projection method is used to remove the confounding points from the point cloud data. Finally, the point cloud is triangulated, a priority value is set, and points are preferentially inserted where the priority value is the largest to repair the holes. Experiments show that the proposed algorithm removes outliers and confounding points while maintaining the detailed features of the point cloud, fills holes effectively, and thereby improves the quality of the point cloud.

1. Introduction

In recent years, laser 3D scanning technology has developed rapidly. As a new data format, the three-dimensional point cloud can accurately record the three-dimensional topography, geometric features, spatial coordinates, and other information about the surface of an object, and it has many advantages that two-dimensional data does not have [1]. In the process of acquiring point cloud data with a 3D laser scanner, outliers, mixed points, holes, and other defects may be generated in the target point cloud due to the external environment, the discreteness of the laser beam, and the occlusion of objects [2–5]. These defects degrade the quality of the point cloud and affect the subsequent reconstruction accuracy of the three-dimensional surface.

Pirotti et al. [6] assessed two commonly used outlier detection methods, the statistical outlier removal (SOR) filter and the local outlier factor (LOF) filter. Mustafa Zeybek [7] applied four different filtering algorithms to raw point cloud data from a UAV, and the developed methodology contributes to reducing the errors caused by data losses in various modelling studies [8]. Fleishman et al. [9] proposed the moving least squares algorithm, in which the noise points are projected onto an estimated plane, but it is sensitive to a large number of outliers. Based on the locally optimal projection (LOP) algorithm proposed by Lipman et al. [10], Huang et al. [11] proposed weighted locally optimal projection algorithms that can remove some noise points during the projection, but these algorithms have poor robustness and cannot preserve the sharp features of the point cloud well. Rusu et al. [12] proposed an algorithm based on neighborhood statistics, which requires two iterations and is time-consuming on large-scale point clouds; it is also difficult to guarantee accuracy while removing the noise. Ning et al. [13] proposed a simple and effective noisy point trimming method based on two geometric feature constraints, but it is only suitable for certain types of noise. Ester et al. [14] proposed the clustering algorithm DBSCAN, which can be used to remove redundant points, including noise points. Leal et al. [15] presented a two-step method for both normal estimation and point position update that measures the sparsity of sharp features while discriminating between noise and features, but its performance on models with more detailed features is mediocre. In addition to removing noisy points, the optimization and enhancement of point cloud quality also requires repairing the holes caused by object occlusion. Hole-repairing algorithms based on triangular meshes can be divided into two categories: surface-based methods and volume-based methods [16]. Generally, surface-based methods detect, repair, and refine the holes on the given triangular mesh [17, 18]. Volume-based methods convert the given mesh into a signed distance function on a volume mesh to fill the holes and then extract the complete mesh from the zero-level set of the distance function [19–21]. The hole-repairing algorithm proposed by Leong et al. [22] directly connects the boundary of the holes, but the repair is not effective since no new triangular surfaces are added. Liu Zhenghong and Yun [23] extracted the boundary of the holes and determined the direction of contraction by calculating the direction of the normal vector of the triangular surface related to each boundary point; in this way, a complete triangular patch is iteratively generated, but the number of additional points cannot be controlled. The point cloud quality optimization and enhancement algorithm designed in this paper can effectively remove the outliers and confounding points while maintaining the sharp features of the point cloud; at the same time, by setting the priority value, the number of new points can be controlled when repairing holes.

2. Theoretical Derivation

The implementation steps of the proposed method are as follows:
(1) Establishing a self-adaptive octree to rasterize the point cloud and setting the thresholds for defining the outliers, then deleting these points
(2) After removing the outliers, projecting the point cloud onto the local least squares fitting plane to remove the confounding points
(3) Establishing a priority value and inserting points at the positions with higher priority values until the hole filling is completed

2.1. Outlier Removal

Outliers are usually disorganized, far away from the main point cloud, sparse, and geometrically discontinuous, with inconsistent local point density. In contrast, the main point cloud is relatively concentrated and dense. The statistical filtering algorithm exploits the fact that the distance between an outlier and its neighboring points is large, whereas the distance between a point of the main cloud and its neighboring points is small; a statistical analysis of the neighborhood of each point is therefore used to remove the outliers [24]. In this paper, density is introduced on top of the statistical filtering algorithm. First, an octree is established by taking the minimum enclosing cube of the point cloud as the root node, and the cube is divided into eight subcubes of equal size. If a subcube contains points, it continues to be divided into eight equal subcubes until each point in the point cloud obtains a unique index coordinate. Then, the density of the point cloud in each small cube is calculated; the density of a subcube containing outliers is usually relatively small.

The specific implementation steps are as follows:
(1) Establishing a self-adaptive octree to rasterize the point cloud and calculating the density of the point cloud in each grid
(2) For any point of the point cloud in each grid, searching for its k-neighborhood points and calculating the distances between the point and its neighbors
(3) Defining the probability of the point being an outlier based on the computed density and neighborhood distances
(4) Establishing a threshold; if the outlier probability of a point is greater than the threshold, the point is considered an outlier and is deleted
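The following is a minimal Python sketch of this procedure, assuming an (n, 3) NumPy array of points. A uniform voxel grid stands in for the paper's self-adaptive octree, a SciPy KD-tree performs the k-neighborhood search, and the way the per-voxel density and the mean neighbor distance are combined into an outlier probability is an illustrative assumption rather than the exact formula used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, voxel_size=0.05, k=4, prob_threshold=0.7):
    """Remove outliers by combining per-voxel density with k-neighbor distances.

    A uniform voxel grid stands in for the self-adaptive octree, and the
    probability formula below is an illustrative assumption.
    """
    # Rasterize: assign each point to a voxel and count points per voxel.
    voxel_idx = np.floor((points - points.min(axis=0)) / voxel_size).astype(int)
    _, inverse, counts = np.unique(voxel_idx, axis=0,
                                   return_inverse=True, return_counts=True)
    density = counts[inverse].astype(float)      # per-point voxel occupancy

    # k-neighborhood search: mean distance from each point to its k neighbors.
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)       # first neighbor is the point itself
    mean_dist = dists[:, 1:].mean(axis=1)

    # Illustrative outlier probability: low density and large neighbor
    # distance both push the score toward 1.
    score = (mean_dist / mean_dist.max()) * (1.0 - density / density.max())
    prob = score / score.max()

    return points[prob <= prob_threshold]

# Usage (hypothetical file): cleaned = remove_outliers(np.loadtxt("bunny.xyz"))
```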

2.2. Confounding Point Removal

Confounding points cannot be easily distinguished from the real point cloud, and it is difficult for traditional algorithms to balance removing the confounding points against maintaining the detailed and sharp features of the point cloud. In this paper, we use the least squares method to fit the local plane and obtain its normal vector. The mixed points are projected onto the local fitting plane, which not only improves the quality of the point cloud but also maintains the sharp features.

The least squares method is used to fit the local plane and obtain its normal vector. First, we fit a local plane to the set of points composed of a point and its k-neighborhood points. The sum of squared deviations of these points from the fitting plane is formed and can also be written in matrix form. When this sum of squared deviations takes its minimum value, the fit is best; according to the least squares criterion, minimizing it yields the coefficients of the fitting plane. In this way, the fitting plane is obtained, and the normal vector at a given point is obtained from the fitted local plane.
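The display equations of this derivation were lost in extraction; the following is a standard least-squares formulation consistent with the description above, in which the plane parameterization z = ax + by + c and all symbols are assumptions rather than the paper's original notation.

```latex
% Assumed plane model: z = a x + b y + c, fitted to the k neighborhood points
\[
S(a,b,c) = \sum_{i=1}^{k} \bigl(a x_i + b y_i + c - z_i\bigr)^{2}
         = \lVert A\mathbf{u} - \mathbf{z} \rVert^{2},
\qquad
A = \begin{pmatrix}
x_1 & y_1 & 1\\
\vdots & \vdots & \vdots\\
x_k & y_k & 1
\end{pmatrix},
\quad
\mathbf{u} = \begin{pmatrix} a\\ b\\ c \end{pmatrix},
\quad
\mathbf{z} = \begin{pmatrix} z_1\\ \vdots\\ z_k \end{pmatrix}.
\]
% Minimizing S by the least squares criterion gives the normal equations
\[
A^{\mathsf T} A\,\mathbf{u} = A^{\mathsf T}\mathbf{z}
\;\;\Longrightarrow\;\;
\mathbf{u} = \bigl(A^{\mathsf T} A\bigr)^{-1} A^{\mathsf T}\mathbf{z},
\qquad
\mathbf{n} = \frac{(a,\,b,\,-1)^{\mathsf T}}{\lVert (a,\,b,\,-1) \rVert}.
\]
```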

The specific implementation steps are as follows:
(1) For each point in the target point cloud, searching for its k-neighborhood points and calculating the mean value of the neighborhood points
(2) Using the least squares method to fit a local plane and calculating its normal vector
(3) Calculating the projection, in the normal direction, of the vector from the neighborhood mean to the point to obtain the distance from the point to the local fitting plane
(4) Projecting the point onto the local fitting plane and obtaining a projection point
(5) Collecting the projection points to get the new point set
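A minimal Python sketch of these steps follows, assuming an (n, 3) NumPy array and a SciPy KD-tree for the k-neighborhood search; the plane model z = ax + by + c matches the least-squares sketch above and is an assumption, as is the choice of k.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_confounding_points(points, k=10):
    """Project every point onto the least-squares plane fitted to its
    k-neighborhood (assumed plane model: z = a*x + b*y + c)."""
    tree = cKDTree(points)
    _, neighbor_idx = tree.query(points, k=k + 1)   # neighbors include the point itself
    projected = np.empty_like(points)

    for i, idx in enumerate(neighbor_idx):
        nbrs = points[idx]
        # Step (2): least-squares fit of z = a*x + b*y + c to the neighborhood.
        # Note: a z = f(x, y) fit degrades on near-vertical patches; sketch only.
        A = np.c_[nbrs[:, 0], nbrs[:, 1], np.ones(len(nbrs))]
        (a, b, c), *_ = np.linalg.lstsq(A, nbrs[:, 2], rcond=None)

        # Normal vector of the plane a*x + b*y - z + c = 0.
        n = np.array([a, b, -1.0])
        n_len = np.linalg.norm(n)

        # Steps (3)-(4): signed distance to the plane, then project the point.
        p = points[i]
        dist = (np.dot(n, p) + c) / n_len
        projected[i] = p - dist * (n / n_len)

    return projected                                 # Step (5): the new point set
```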

2.3. Hole Filling

For hole repair based on scattered point clouds, the scattered point cloud data can first be triangulated, so that possible holes in the point cloud are transformed into holes in the mesh. When a triangulation algorithm is used to fit the surface of a point cloud, the triangles spanning a hole are usually relatively large, so in the process of repairing point cloud holes we can judge whether a hole exists from the area of the triangles. In this paper, a priority value is established so that the parts of the point cloud with larger hole areas are repaired first: a higher priority value means that the corresponding part of the point cloud contains a larger hole area. A point is first inserted in the part with the largest priority value; the inserted point is then merged with the original points, and the above operations are repeated until the holes are repaired.

The specific implementation steps are as follows:
(1) Triangulating the target point cloud using a triangulation algorithm
(2) Getting the corresponding three edges from the vertices of each triangular patch, using Heron's formula to calculate the area of each triangular patch, and then getting the priority value corresponding to each triangular patch
(3) Getting the largest priority value, that is, finding the triangular patch with the largest area and its vertices
(4) Inserting a point at the center of gravity of the triangle with the largest priority value and getting a new point set
(5) Repeating (1) to (4) until the hole is repaired completely
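A minimal Python sketch of the priority-driven insertion loop follows, assuming that the triangulation step is provided as a caller-supplied function returning an (m, 3) array of vertex indices; the area threshold used as a stopping criterion is an illustrative assumption for deciding when the hole is considered repaired.

```python
import numpy as np

def triangle_areas(points, triangles):
    """Heron's formula for the area of each triangular patch."""
    p0, p1, p2 = (points[triangles[:, i]] for i in range(3))
    a = np.linalg.norm(p1 - p0, axis=1)
    b = np.linalg.norm(p2 - p1, axis=1)
    c = np.linalg.norm(p0 - p2, axis=1)
    s = (a + b + c) / 2.0
    return np.sqrt(np.maximum(s * (s - a) * (s - b) * (s - c), 0.0))

def fill_holes(points, triangulate, area_threshold, max_insertions=1000):
    """Repeatedly insert a point at the centroid of the largest triangle.

    `triangulate` is a caller-supplied function returning an (m, 3) index
    array of triangles for the current point set (an assumption about how
    the triangulation step is provided).
    """
    points = np.asarray(points, dtype=float)
    for _ in range(max_insertions):
        triangles = triangulate(points)
        areas = triangle_areas(points, triangles)    # priority values
        largest = np.argmax(areas)
        if areas[largest] < area_threshold:          # hole considered repaired
            break
        centroid = points[triangles[largest]].mean(axis=0)
        points = np.vstack([points, centroid])       # merge the new point
    return points
```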

3. Experimental Results and Data Analysis

3.1. Algorithm Validation

The denoising accuracy and the denoising recall rate [25] are adopted to evaluate the effect of the algorithm on outlier removal and thereby verify the effectiveness of the proposed algorithm. Since the denoising accuracy and the denoising recall rate are inversely proportional, the F-score is introduced as the harmonic mean of the two parameters; a higher F-score means better performance of the algorithm. The chamfer distance [26] is used to measure the average nearest squared distance between two point clouds.

The denoising accuracy is computed from the number of noisy points removed and the whole number of noisy points, while the denoising recall rate is computed from the number of noisy points removed and the number of points that should be removed. The F-score is then obtained as the harmonic mean of the denoising accuracy and the denoising recall rate.
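The display formula for the F-score did not survive extraction; since the text defines it as the harmonic mean of the two measures, it takes the following form, with P, R, and F denoting the denoising accuracy, the denoising recall rate, and the F-score (the symbols themselves are assumed).

```latex
% F-score as the harmonic mean of denoising accuracy P and denoising recall R
\[
F = \frac{2\,P\,R}{P + R}.
\]
```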

We define the chamfer distance between two point clouds from the nearest squared distances between the points of one cloud and those of the other, averaged over both clouds.
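A minimal Python sketch of the chamfer distance follows, assuming the common symmetric form (the sum of the two directional averages of nearest-neighbor squared distances) and using a SciPy KD-tree for the nearest-neighbor queries; the exact normalization in [26] may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(p, q):
    """Symmetric chamfer distance between point clouds p and q:
    the average nearest squared distance from p to q plus that from q to p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    d_pq, _ = cKDTree(q).query(p)   # nearest-neighbor distances from p to q
    d_qp, _ = cKDTree(p).query(q)   # nearest-neighbor distances from q to p
    return np.mean(d_pq ** 2) + np.mean(d_qp ** 2)
```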

In the experiments, we added 10%, 20%, and 30% random noise to the point cloud, respectively. The noise points were generated as follows: some points are randomly selected from the original point cloud, and their coordinates are perturbed within a range of 2 to 8 times the mesh resolution, which produces both confounding points and outliers; these points are then added back into the original point cloud to obtain the noisy point cloud.
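A minimal Python sketch of this noise-injection procedure follows, assuming an (n, 3) NumPy array; the displacement directions are drawn at random as an assumption, and only the 2-to-8-times-mesh-resolution magnitude range is taken from the text.

```python
import numpy as np

def add_noise(points, ratio, mesh_resolution, rng=None):
    """Randomly pick `ratio` of the points, displace copies of them by 2 to 8
    times the mesh resolution, and append them to the original point cloud."""
    rng = np.random.default_rng() if rng is None else rng
    n_noise = int(len(points) * ratio)
    picked = points[rng.choice(len(points), size=n_noise, replace=False)]

    # Random displacement directions with magnitudes of 2-8 mesh resolutions.
    directions = rng.normal(size=picked.shape)
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    magnitudes = rng.uniform(2.0, 8.0, size=(n_noise, 1)) * mesh_resolution
    noisy_points = picked + directions * magnitudes

    return np.vstack([points, noisy_points])

# Usage (bunny_points is a hypothetical (n, 3) array):
# noisy = add_noise(bunny_points, ratio=0.20, mesh_resolution=0.001)
```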

To enhance the quality of the point cloud, we first removed the outliers using the proposed algorithm and compared it with statistical filtering and radius filtering. As shown in Figures 1–3 and Tables 1–3, four neighborhood points are selected for each algorithm, and the highest F-score of each algorithm together with the corresponding denoising accuracy and recall rate are recorded. In the second part, we removed the confounding points using the proposed method and, after processing the noisy point cloud, calculated the chamfer distance at each stage. Finally, the point cloud was triangulated, a priority value was set, and points were preferentially inserted where the priority value was the largest so as to repair the holes.

Figure 1 and Table 1 show that after the Bunny point cloud with 10% noise is processed by the proposed method, the surface of the point cloud is smooth, uniform, and almost noise-free, with the outliers and confounding points removed. The denoising recall rate of the radius filtering is higher than that of the statistical filtering and the proposed outlier removal algorithm, but the denoising accuracy and F-score of the proposed outlier removal algorithm are better than those of the statistical filtering and the radius filtering.

Figure 2 and Table 2 demonstrate that after the Bunny point cloud with 20% noise is processed by the proposed method, no noticeable noise points appear on the surface of the point cloud, and the outliers and confounding points are removed. The denoising recall rate of the radius filtering is better than that of the statistical filtering and the proposed outlier removal algorithm, yet the denoising accuracy and F-score of our outlier removal algorithm are better than those of the statistical filtering and the radius filtering.

In experiment 3, Figure 3 and Table 3 show that after the Bunny point cloud with 30% noise is processed by the proposed method, no obvious noise points exist on the surface of the point cloud, and the outliers and confounding points are removed. The denoising recall rate of the radius filtering is better than that of the statistical filtering and the proposed outlier removal algorithm, but the denoising accuracy and F-score of our outlier removal algorithm are better than those of the statistical filtering and the radius filtering.

The comparison of the three algorithms in removing outliers at different noise levels is shown in Figure 4. The proposed outlier removal algorithm shows an upward trend as the noise level increases, the radius filtering tends to be stable, and the statistical filtering shows a downward trend. The proposed algorithm consistently outperforms the other two algorithms.

Table 4 reports the evaluation indicators for the original Bunny point cloud, the noisy point cloud, the point cloud after the outliers are removed by the proposed outlier removal algorithm, and the point cloud after the confounding points are removed by the proposed confounding point removal algorithm. The chamfer distance of the fully processed point cloud is smaller than those of the noisy point cloud and of the point cloud with only the outliers removed, indicating that the proposed algorithm can effectively remove outliers and confounding points.

The Bunny point cloud with 20% added noise is taken as an example to verify the effect of hole filling. Both the noisy Bunny point cloud and the one processed by the proposed algorithm are triangulated, as shown in Figures 5(a) and 5(b). Figures 5(c) and 5(d) show the triangulation of the noisy point cloud and of the processed point cloud after the viewing angle is flipped, respectively. The noise on the surface of the noisy Bunny point cloud is completely removed, and the fitted surface is uniform and smooth without apparent loss of detail; the holes at the bottom of the point cloud are also effectively filled. The results show that the proposed algorithm preserves the sharp and detailed features of the point cloud while removing the outliers and confounding points and effectively repairs the holes.

3.2. Experimental Verification of Algorithm Performance

To further verify the effect of the proposed algorithm, the actual point cloud data of a locomotive pantograph slide, obtained by a line-structured-light three-dimensional laser scanner, is used as the experimental object. In the data collection, the camera type is Ranger3, the laser model is Lingyun (808 nm, 15 W), and the effective accuracy of the 3D scanner is 0.5 mm. The collected pantograph data contain 632,718 points with some obvious defects on the surface. The results are shown in Figures 6–8 and Tables 5–7.

Figure 6 and Table 5 show that after the pantograph point cloud with 10% noise is processed by the proposed method, no obvious noise points exist on the surface of the point cloud, and the outliers and confounding points are removed. The radius filtering has the highest denoising accuracy, and the denoising recall rate of the statistical filtering is better than those of the radius filtering and the proposed outlier removal algorithm. However, the denoising accuracy of the proposed outlier removal algorithm is close to that of the radius filtering, and its denoising recall rate is close to that of the statistical filtering. The F-score of the proposed outlier removal algorithm is better than those of the statistical filtering and the radius filtering, so the comprehensive performance of the proposed method is the best.

Figure 7 and Table 6 show that after the pantograph point cloud with 20% noise is processed by the proposed method, no noticeable noise points appear on the surface of the point cloud, and the outliers and confounding points are removed. The denoising accuracy, the denoising recall rate, and the F-score of the proposed outlier removal algorithm are all better than those of the statistical filtering and the radius filtering.

Figure 8 and Table 7 illustrate that after the pantograph point cloud with 30% noise is processed by the proposed method, no evident noise points remain on the surface of the point cloud, and the outliers and confounding points are removed. The denoising recall rate of the radius filtering is better than that of the statistical filtering and the proposed outlier removal algorithm, but the denoising accuracy and F-score of the proposed outlier removal algorithm are better than those of the statistical filtering and the radius filtering.

The comparison of the three algorithms in removing outliers at different noise levels is shown in Figure 9. The proposed outlier removal algorithm and the radius filtering show an upward trend as the noise level increases, while the statistical filtering shows a downward trend. The proposed algorithm consistently performs better than the other two algorithms.

Table 8 reports the evaluation indicators for the original pantograph point cloud, the noisy point cloud, the point cloud after the outliers are removed by the proposed outlier removal algorithm, and the point cloud after the confounding points are removed by the proposed confounding point removal algorithm. The indicators show that the proposed algorithm can remove both the outliers and the confounding points.

To show the hole filling effect, we compared the original pantograph point cloud with the one after hole filling, as shown in Figure 10. Figures 10(a) and 10(c) illustrate the original pantograph point cloud and the point cloud after hole filling, respectively, while Figures 10(b) and 10(d) show the local hole in the original pantograph point cloud and the same region after hole filling. The results show that the proposed algorithm effectively fills the holes.

4. Conclusions

In this paper, an algorithm for point cloud quality optimization and enhancement is designed and verified. First, an adaptive octree is established to rasterize the point cloud and calculate the density of each grid, and this density is combined with statistical filtering to remove outliers from the point cloud data. A plane projection method is then used to remove the confounding points. Finally, the point cloud is triangulated and a priority value is set, and points are inserted where the priority value is the largest so as to repair the holes. To verify the effectiveness of the proposed algorithm, we added 10%, 20%, and 30% noise to the Bunny point cloud and also tested the algorithm on actual point cloud data of a locomotive pantograph slider, comparing it with the statistical filtering algorithm and the radius filtering algorithm. Experimental results show that the proposed algorithm preserves the sharp and detailed features of the original point cloud surface while removing noise, effectively fills holes, and performs better overall than statistical filtering and radius filtering.

Data Availability

Data will be available on request.

Conflicts of Interest

The authors declare no competing financial interests.

Acknowledgments

This paper is supported by the National Natural Science Foundation of China (No. 61960206010).