Mathematical Problems in Engineering

Volume 2015 (2015), Article ID 646410, 12 pages

http://dx.doi.org/10.1155/2015/646410

## Resampling to Speed Up Consolidation of Point Clouds

^{1}College of Information Sciences and Technology, Donghua University, Shanghai 201620, China

^{2}College of Information and Engineering, Shanghai Open University, Shanghai 200233, China

^{3}Engineering Research Center of Digitized Textile & Fashion Technology, Ministry of Education, Donghua University, Shanghai 201620, China

Received 23 January 2015; Accepted 10 March 2015

Academic Editor: Carla Roque

Copyright © 2015 Huanyu Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Processing of large-scale scattered point clouds has become a hot topic in computer graphics research. The Weighted Locally Optimal Projection (WLOP) algorithm, a tool for producing a set of denoised, outlier-free, and evenly distributed particles over the original point cloud, has been used by many researchers for the consolidation of unorganized 3D point clouds. However, the algorithm is relatively inefficient, owing to the large amount of point cloud data and the iterative calculation. In this paper, we propose a resampling method applied to the point set of a 3D model, which significantly improves the computing speed of the WLOP algorithm. To measure the impact on accuracy of the error that accompanies the gain in efficiency, we define two quantitative indicators, namely, the projection error and the uniformity of distribution. The performance of our method is evaluated by both quantitative and qualitative analyses. Our experimental validation demonstrates that the method greatly improves computational efficiency, at the cost of slightly reduced projection accuracy in comparison to WLOP.

#### 1. Introduction

As a topic of growing interest in fundamental computer graphics research, the reconstruction of 3D models from unorganized point cloud data by reverse engineering has been considered by many authors, and various results on surface reconstruction from point clouds have been published in recent years. Proposed methods include RBF-based approaches [1–4], the integration of Voronoi diagrams with a variational method [5], the Poisson surface reconstruction technique [6], and the smooth signed distance method [7]. The local moving least squares (MLS) surface by Levin [8] and its variants have proven to be a powerful surface representation for point set data. Guennebaud and Gross proposed algebraic point set surfaces (APSS) [9], and Öztireli et al. proposed a robust implicit MLS (RIMLS) variant to project a point set (or a mesh) onto the MLS surface [10].

During the past two decades, the preprocessing of unorganized point cloud data has received considerable attention, by virtue of both its theoretical importance and its extensive applications. Many problems have yet to be addressed, however. In particular, the 3D surface represented by unorganized point clouds is typically noisy and contains holes, with high variations in point density caused by acquisition errors or misalignment of multiple scans. Preprocessing for surface reconstruction includes denoising, outlier removal, thinning, orientation, and redistribution of the input points (see, e.g., [11–15]), which resorts to resampling for point cloud consolidation. The WLOP (Weighted Locally Optimal Projection) operator [15] and the LOP (Locally Optimal Projection) operator [14] have shown better immunity against noise and outliers in raw scanned data, in addition to their advantage in creating evenly redistributed point clouds. However, these methods are limited in preserving the geometrical features of the model during projection (see [16, 17]). Such limitations need to be further examined with concrete quantitative evidence, which is obtainable only via computationally expensive reconstruction of large point set data. The open problem we address in this paper is the slow computing speed caused by the multiple iterations of previously proposed models, together with the limited results obtained when reconstructing large point set data owing to the mathematical complexity.

In this paper, we present a resampling method for point cloud processing, a new and more efficient organization of the data. We downsample the point set with the k-nearest neighbor algorithm and the normal vectorial angle, which reduces the amount of computation required, and upsample the computing result using WLOP. A numerical simulation of accuracy, uniformity of distribution, and computing time is given to illustrate the applicability. Our method reduces noise and creates evenly redistributed points, as WLOP does. Further experiments demonstrate that the time complexity is significantly reduced by our method.

Four main contributions of this paper can be summarized as follows.

(1) We review the application and progress of the WLOP algorithm for geometry reconstruction.

(2) We design an optimization process to reduce the computational complexity of WLOP. In the proposed algorithm, the surface points are classified by the k-nearest neighbor algorithm, and the normal vector is used to assign a point from every group to each subsampled point set. The WLOP operator is then applied to the point cloud of every subset. This approach speeds up the computation two to three times.

(3) We contribute to the scarce literature on the quantitative analysis of point cloud processing, defining projection error indicators with reference to the error estimation analysis of point cloud simplification algorithms, and uniform distribution indicators with reference to the point density method.

(4) Our proposed Resampling WLOP approach has been evaluated on models with various shape complexities and noise levels. The effectiveness and performance of our method are validated and illustrated through experimental results and comparison with the WLOP method.

The rest of this paper is organized as follows. Section 2 reviews the related studies in the consolidation of unorganized 3D point clouds from which the proposed method is derived. Section 3 provides a detailed description of the resampling method. The evaluation criteria are described in Section 4. The proposed method is implemented and evaluated against the WLOP algorithm in Section 5, followed by the concluding comments in Section 6.

#### 2. Related Work

In this section, we briefly review the application and research of the Weighted Locally Optimal Projection (WLOP) algorithm among the various methods hitherto developed for the processing of point clouds. Lipman et al. proposed a parameterization-free projection operator, namely, Locally Optimal Projection (LOP), for geometry reconstruction, reminiscent of the well-known multivariate median problem [14]. This method is robust to noise, outliers, and nonuniformities of raw scanned data. LOP operates well on raw data without relying on a local parameterization of the points or on their local orientation. Given an unorganized set of points $P = \{p_j\}_{j \in J} \subset \mathbb{R}^3$, LOP defines a set of projected points $Q = \{q_i\}_{i \in I}$ by a fixed-point iteration where, given the current iterate $Q^{k} = \{q_i^{k}\}_{i \in I}$, $k = 0, 1, \ldots$, the next iterate $Q^{k+1}$ is to minimize

$$\sum_{i \in I} \sum_{j \in J} \left\| q_i - p_j \right\| \theta\!\left( \left\| q_i^{k} - p_j \right\| \right) + \sum_{i \in I} \lambda_i \sum_{i' \in I \setminus \{i\}} \eta\!\left( \left\| q_i - q_{i'}^{k} \right\| \right) \theta\!\left( \left\| q_i^{k} - q_{i'}^{k} \right\| \right),$$

where $\theta(r) = e^{-r^2/(h/4)^2}$ is a rapidly decreasing smooth weight function and $\eta(r) = 1/(3r^3)$ is a repulsion function penalizing points that get too close to one another, with $h$ the compact support radius defining the size of the influence neighborhood and $\{\lambda_i\}_{i \in I}$ the terms balancing attraction and repulsion.
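To make the update concrete, the fixed-point iteration derived from this energy can be sketched as follows. This is a minimal dense implementation for illustration only, not the authors' code; for simplicity it uses the repulsion $\eta(r) = -r$ later adopted in the WLOP work [15] (so $|\eta'| = 1$), and all function names are our own.

```python
import numpy as np

def theta(r, h):
    # Rapidly decaying weight with support radius h.
    return np.exp(-(r / (h / 4.0)) ** 2)

def lop_iterate(X, P, h, mu=0.45, n_iter=20, eps=1e-12):
    """Dense O(|X||P|) sketch of the LOP fixed-point update.
    Repulsion uses eta(r) = -r, so |eta'(r)| = 1."""
    X = X.copy()
    for _ in range(n_iter):
        # Attraction: weighted median-like pull toward the data P.
        d_xp = np.linalg.norm(X[:, None, :] - P[None, :, :], axis=2) + eps
        alpha = theta(d_xp, h) / d_xp                       # alpha_ij
        E1 = (alpha @ P) / (alpha.sum(axis=1, keepdims=True) + eps)
        # Repulsion: push the projected points apart from each other.
        d_xx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) + eps
        beta = theta(d_xx, h) / d_xx                        # beta_ii'
        np.fill_diagonal(beta, 0.0)
        diff = X[:, None, :] - X[None, :, :]                # q_i - q_i'
        E2 = (beta[:, :, None] * diff).sum(axis=1) / (beta.sum(axis=1, keepdims=True) + eps)
        X = E1 + mu * E2
    return X
```

The parameter `mu` plays the role of the balancing term; values below 0.5 keep the iteration stable.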

However, LOP may not work well when the distribution of the input points is highly nonuniform. Huang et al. [15] improved this operator by incorporating locally adaptive density weights into LOP, to improve its ability to deal with nonuniformity. The improved operator named as WLOP (Weighted Locally Optimal Projection) is supported by a robust normal estimation method which they presented to assign consistent normal to each projected point.

The projection for point $q_i$ is

$$q_i^{k+1} = \sum_{j \in J} p_j \frac{\alpha_{ij}/v_j}{\sum_{j' \in J} \left( \alpha_{ij'}/v_{j'} \right)} + \mu \sum_{i' \in I \setminus \{i\}} \left( q_i^{k} - q_{i'}^{k} \right) \frac{w_{i'} \beta_{ii'}}{\sum_{i'' \in I \setminus \{i\}} w_{i''} \beta_{ii''}},$$

where $\alpha_{ij} = \theta\left( \| q_i^{k} - p_j \| \right) / \| q_i^{k} - p_j \|$ and $\beta_{ii'} = \theta\left( \| q_i^{k} - q_{i'}^{k} \| \right) \left| \eta'\left( \| q_i^{k} - q_{i'}^{k} \| \right) \right| / \| q_i^{k} - q_{i'}^{k} \|$, and where $v_j = 1 + \sum_{j' \in J \setminus \{j\}} \theta\left( \| p_j - p_{j'} \| \right)$ and $w_i = 1 + \sum_{i' \in I \setminus \{i\}} \theta\left( \| q_i^{k} - q_{i'}^{k} \| \right)$ are adaptive density weights.
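A minimal sketch of one such update with the adaptive density weights might look as follows; the function names are ours, the implementation is dense and purely illustrative, and it again assumes the repulsion $\eta(r) = -r$ so that $|\eta'| = 1$.

```python
import numpy as np

def theta(r, h):
    return np.exp(-(r / (h / 4.0)) ** 2)

def wlop_step(X, P, h, mu=0.45, eps=1e-12):
    """One WLOP fixed-point update with adaptive density weights:
    v_j downweights dense regions of the input P (attraction term),
    w_i' upweights dense regions of the projection X (repulsion term)."""
    d_pp = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    v = theta(d_pp, h).sum(axis=1)      # diagonal theta(0)=1 supplies the "1 +" term
    d_xx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    w = theta(d_xx, h).sum(axis=1)      # likewise for w_i

    # Attraction, with each p_j weighted by alpha_ij / v_j.
    d_xp = np.linalg.norm(X[:, None, :] - P[None, :, :], axis=2) + eps
    alpha = theta(d_xp, h) / d_xp / v[None, :]
    E1 = (alpha @ P) / (alpha.sum(axis=1, keepdims=True) + eps)

    # Repulsion, with each difference weighted by w_i' * beta_ii'.
    beta = theta(d_xx, h) / (d_xx + eps) * w[None, :]
    np.fill_diagonal(beta, 0.0)
    diff = X[:, None, :] - X[None, :, :]
    E2 = (beta[:, :, None] * diff).sum(axis=1) / (beta.sum(axis=1, keepdims=True) + eps)
    return E1 + mu * E2
```

Note that $v_j$ is fixed by the input and can be precomputed once, whereas $w_i$ changes with every iterate.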

Many surface reconstruction algorithms can benefit greatly from the output of this consolidation framework. Huang et al. presented a method to assign consistently oriented normal vectors to unorganized points with noise, nonuniformities, and thin sharp features as part of the consolidation approach [15]. They also presented a framework that redistributes points by inserting samples into sparse regions, using the WLOP operator, which performs downsampling and relaxation [18]. Huang et al. further altered the LOP operator to make it normal- or edge-aware, allowing for resampling away from edges [19].

Although LOP and WLOP are both considered effective approaches to reconstructing geometry, several drawbacks cannot be dismissed, for instance, their limitations in guaranteeing the smoothness of the projection and in reconstructing models that contain sharp features. Li et al. extended the WLOP operator by incorporating normal information into it [16]. In this modified approach, neighbors across sharp features are regarded as outliers in normal space; thus they are excluded from the projection to keep the features intact. Li et al. later suggested incorporating a normal mollification step into the extended WLOP operator to obtain a more accurate result. Liao et al. presented an efficient and feature-preserving Locally Optimal Projection operator (FLOP) for geometry reconstruction [17], based on a bilateral-weighted LOP attending to both spatial and geometric feature information. FLOP is greatly accelerated by random sampling from the Kernel Density Estimate (KDE) and has been extended for efficient and faithful reconstruction of time-varying surfaces. The FLOP method preserves features better than the LOP and WLOP operators. However, it fails to accurately compute normal vectors for sparsely sampled models with sharp features and heavy outliers.

Moving least squares (MLS) is a very attractive tool for designing effective meshless surface representations. Öztireli et al. revisited robust implicit MLS surfaces in terms of local kernel regression. The RIMLS representation can handle sparse sampling, generates a continuous surface with better-preserved fine details, and naturally deals with any kind of sharp feature with controllable sharpness [10]. In a similar vein, we validate our method for surface reconstruction using the RIMLS representation.
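As a point of reference, the simplest MLS-style projection, iterated projection of a query point onto a locally fitted plane obtained by weighted PCA, can be sketched as follows. This is only the basic plane-fit variant for illustration, not the robust kernel regression of RIMLS, and the names are ours.

```python
import numpy as np

def mls_project(q, P, h, n_iter=3):
    """Project q onto a locally fitted plane and iterate: the
    simplest MLS-style projection (not RIMLS)."""
    x = q.copy()
    for _ in range(n_iter):
        w = np.exp(-np.sum((P - x) ** 2, axis=1) / h ** 2)   # Gaussian weights
        c = (w[:, None] * P).sum(axis=0) / w.sum()           # weighted centroid
        C = ((P - c) * w[:, None]).T @ (P - c)               # weighted covariance
        n = np.linalg.eigh(C)[1][:, 0]   # smallest eigenvector = plane normal
        x = x - np.dot(x - c, n) * n     # project x onto the fitted plane
    return x
```

RIMLS replaces the plain Gaussian weights with robust kernel regression weights, which is what allows it to preserve sharp features that this basic projection smooths away.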

#### 3. Resampling WLOP

##### 3.1. Resampling WLOP Schema

The WLOP algorithm suffers from high computational costs, due to the large amount of point cloud data and the iterative calculation. Recently, more attention has been focused on how to speed up the computation of this algorithm. Liao et al. greatly accelerated the computation of FLOP [17]. Their subsampled point set, obtained by random sampling from the Kernel Density Estimate (KDE), is a feature-aware approximation of the original input point set. The idea is similar to that of this paper. However, the subsampled set is smaller than the input set, and the performance of their method, which is in effect data simplification for point clouds, depends upon the simplification ratio. By contrast, our method divides the input point set into two subsets without reducing it, so a direct comparison between our method and the approach in [17] cannot be made. Existing solutions for speeding up the processing of point clouds with a hierarchy-based method are demonstrated in [20]. Diez et al. proposed a novel technique named Hierarchical Normal Space Sampling (HNSS) to speed up coarse matching algorithms for point clouds. Two sets of points are grouped hierarchically according to the distribution of their normal vectors, and the data hierarchy is traversed with a RANSAC algorithm applied to point matching at every corresponding hierarchy level of the two point sets. Our resampling method involves only one set of points, so a direct comparison between our method and the approach in [20] cannot be made either.

In this section, we present the Resampling Weighted Locally Optimal Projection, a new and more efficient method of data organization. The computational complexity of the multivariate median problem, which remains the bottleneck of WLOP, is superlinear in the number of input points. The purpose of organizing the data is to reduce the number of input points in each calculation. After being downsampled, the data are traversed with the WLOP algorithm for every subset. The Resampling WLOP schema is shown in Figure 1 and described in detail below.
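Under our reading of the schema, the downsampling stage might be sketched as follows. The grouping rule (k-nearest-neighbor groups whose points are dealt alternately to the two subsets, ordered by the angle between each normal and the group's mean normal) and all names are illustrative assumptions rather than the exact published procedure; the subsequent stage would run WLOP on each subset independently before merging the results.

```python
import numpy as np

def split_by_normal_angle(P, N, k=8):
    """Partition point set P (with unit normals N) into two subsets.
    Within each k-nearest-neighbor group, points are sorted by the
    angle between their normal and the group's mean normal and then
    dealt alternately to subset 0 and subset 1."""
    D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    assigned = np.full(len(P), -1)
    for i in range(len(P)):
        if assigned[i] >= 0:
            continue
        idx = np.argsort(D[i])[:k]            # k nearest neighbors (incl. i)
        idx = idx[assigned[idx] < 0]          # keep only unassigned points
        mean_n = N[idx].mean(axis=0)
        mean_n /= np.linalg.norm(mean_n) + 1e-12
        ang = np.arccos(np.clip(N[idx] @ mean_n, -1.0, 1.0))
        order = idx[np.argsort(ang)]          # rank group members by angle
        assigned[order[0::2]] = 0             # deal alternately to the
        assigned[order[1::2]] = 1             # two subsampled point sets
    return P[assigned == 0], P[assigned == 1]
```

Each subset is then consolidated separately (e.g., by iterating a WLOP update over it), so each projection step works on roughly half the points, which is where the speedup comes from.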