Scientific Programming
Volume 2017, Article ID 1526706, 8 pages
https://doi.org/10.1155/2017/1526706
Research Article

Field Geometric Calibration Method for Line Structured Light Sensor Using Single Circular Target

1School of Electrical Engineering, Henan University of Technology, Zhengzhou, Henan 450001, China
2College of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou, Henan 450002, China
3Marine Engineering Institute, Jimei University, Xiamen, Fujian 361021, China

Correspondence should be addressed to Tianfei Chen; chen_tianfei@163.com

Received 17 October 2017; Accepted 11 December 2017; Published 31 December 2017

Academic Editor: Shangguang Wang

Copyright © 2017 Tianfei Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

To achieve fast calibration of a line structured light sensor, a geometric calibration approach based on a single circular calibration target is proposed. The proposed method uses the circular points to establish linear equations, and, according to the angle constraint, the camera intrinsic parameters can be calculated through optimization. Then, the light plane calibration is accomplished in two steps. First, once the vanishing lines of the target plane at various postures are obtained, the intersections between the vanishing lines and the laser stripe can be computed, and the normal vector of the light plane can be calibrated via line fitting using the intersection points. After that, the distance from the origin of the camera coordinate system to the light plane is derived based on the perspective-three-point model. Actual experimental results show that this calibration method has high accuracy: the average measurement deviation is 0.0451 mm, and the relative error is 0.2314%. In addition, the entire calibration process involves no complex operations; it is simple, convenient, and suitable for on-site calibration.

1. Introduction

As a typical representative of noncontact vision measurement technology, the line structured light sensor has wide application prospects in industry for dimensional analysis, on-line inspection, component quality control, and reverse engineering, owing to its simple structure, moderate accuracy, fast speed, large amount of information, and other advantages [1–6]. Generally, the sensor consists of one camera and one laser generator, with laser triangulation as the basic principle, so the laser generator is displaced relative to the camera in space. Once the structure is fixed, the task of the sensor is to acquire the three-dimensional (3D) characteristic information of the profile of the measured object. The acquisition process is as follows: the laser plane from the laser generator is modulated by the depth of the measured object, forming a laser stripe that contains 3D information of the measured object. The camera captures an image of the object containing the deformed laser stripe. Then, dense 3D world points are generated by sampling points on each light stripe in the CCD image, and the calculation of the 3D world points is based on the mathematical model of the sensor. When the line structured light sensor has been assembled, some unknown model parameters exist, which directly influence the calculation precision of the 3D world points for the measured object. Therefore, the calibration of the model parameters is one of the key links for a line structured light sensor.

Calibration of a line structured light sensor incorporates two stages: stage one is camera calibration, and the other is the calibration of the light plane equation. According to the dimension of the calibration target, existing calibration methods can be divided into the following categories: (1) 3D stereo target. The wire drawing calibration method [7] and the sawtooth calibration method [8] are the earliest proposed methods. It should be noted that camera calibration is not involved in these two methods, and some precise auxiliary equipment is still needed, which makes the calibration cost relatively high. Besides, the calibration process is tedious and the accuracy is limited. Building on this work, Huynh et al. [9] and Wei et al. [10] make full use of cross-ratio invariance and double cross-ratio invariance to compute calibration points, and the calibration accuracy is improved significantly. However, a 3D precision calibration target is still employed, and the calibration cost remains high. In addition, high quality images are difficult to capture, due to illumination occlusion by the 3D stereo target. (2) Planar target. Calibration methods using a free-moving planar target have become popular for line structured light sensors [11–15]. This approach can not only yield high quality calibration images but also reduce the calibration cost. However, repeated calculation of the pose from each position of the planar target to the camera is required, so the computational complexity is higher. (3) One-dimensional (1D) target. Wei et al. [16] and Han et al. [17] have proposed calibration methods based on a 1D target, which can expand the application field of line structured light sensors. (4) Self-calibration. An active vision calibration method for line structured light sensors has been proposed in [18], which can achieve automatic calibration without a target, but this method relies on the mechanical movement of the sensor, so it does not have universal applicability.

A field geometric calibration method for a line structured light sensor based on a single circular target is proposed in this paper. The proposed method makes full use of the geometric constraints of a single circle to calibrate the camera parameters and the light plane equation at the same time. The adopted single circular target has the advantages of simple structure and easy manufacture, and it can be placed freely to capture higher quality calibration images. In addition, the calculation process involves no complicated or repeated steps; the algorithm is simple, fast, flexible, convenient, and suitable for on-site calibration.

2. Mathematical Model of Line Structured Light Sensor

Figure 1 shows the mathematical model of the line structured light sensor. In Figure 1, O_c-x_c y_c z_c represents the camera coordinate system, and o-uv is the image plane coordinate system of the CCD. O_1-x_1 y_1 is defined as the normalized plane coordinate system, and its distance to the origin of the camera coordinate system is 1. The optical axis of the camera is perpendicular to the normalized plane and the CCD image plane. P is a point on the laser stripe, and p, p_1 are, respectively, its corresponding image points on the CCD image plane and the normalized plane.

Figure 1: Model of line structured light sensor.

Note that the homogeneous coordinates of point P in the camera coordinate system are denoted by P = (x_c, y_c, z_c, 1)^T, and the homogeneous coordinates of its corresponding image points p, p_1 in their coordinate systems are, respectively, p = (u, v, 1)^T and p_1 = (x_1, y_1, 1)^T. Based on the perspective projection transformation, the relationships are described as follows:

  z_c p_1 = [I 0] P,  s p = K p_1, (1)

where [I 0] contains the 3 × 3 unit matrix I, and the homogeneous point p_1 can be treated as the 3D coordinates of the image point in the camera coordinate system. K is the camera intrinsic parameter matrix; its form is

  K = [f_x γ u_0; 0 f_y v_0; 0 0 1], (2)

where f_x, f_y are, respectively, the scale factors of the CCD image plane in the u and v axis directions, (u_0, v_0) represents the principal point of the camera, and γ is the skew factor.
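To make the projection model concrete, the following Python sketch (not part of the original paper) applies a pinhole intrinsic matrix to a camera-frame point; the intrinsic values are purely illustrative, not the sensor's calibrated parameters.

```python
def project(K, X_c):
    """Project a camera-frame point X_c = (x, y, z) to pixel coordinates
    with the pinhole model: normalize by depth, then apply the intrinsic
    matrix K = [[fx, gamma, u0], [0, fy, v0], [0, 0, 1]]."""
    x, y, z = X_c
    xn, yn = x / z, y / z          # coordinates on the normalized plane (z = 1)
    fx, gamma, u0 = K[0]
    _, fy, v0 = K[1]
    u = fx * xn + gamma * yn + u0
    v = fy * yn + v0
    return u, v

# Illustrative intrinsics (hypothetical values):
K = [[800.0, 0.0, 320.0],
     [0.0, 800.0, 240.0],
     [0.0, 0.0, 1.0]]
print(project(K, (0.1, -0.05, 2.0)))  # → (360.0, 220.0)
```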

If the radial distortion of the camera lens is further considered, the first-order model is commonly chosen to handle the nonlinear distortion effects, since too many distortion parameters make the solution unstable [19]. Denoting the actual (distorted) image point corresponding to p_1 by p_d = (x_d, y_d, 1)^T, we have the relationship

  x_d = x_1 (1 + k_1 r^2),  y_d = y_1 (1 + k_1 r^2), (3)

where k_1 is the distortion parameter and r^2 = x_1^2 + y_1^2. Thus, the whole set of intrinsic camera parameters consists of the matrix K and the distortion parameter k_1.
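The first-order radial model and its inversion can be sketched as follows; this is an illustrative implementation, and the fixed-point iteration used to undistort is a common generic choice rather than a procedure stated in the paper.

```python
def distort(x, y, k1):
    """Apply the first-order radial model x_d = x*(1 + k1*r^2) to
    normalized image coordinates."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2
    return x * s, y * s

def undistort(xd, yd, k1, iters=20):
    """Invert the first-order model by fixed-point iteration; converges
    quickly for the small k1 typical of low-distortion lenses."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        x, y = xd / (1.0 + k1 * r2), yd / (1.0 + k1 * r2)
    return x, y

xd, yd = distort(0.3, -0.2, -0.05)
x, y = undistort(xd, yd, -0.05)  # recovers (0.3, -0.2) to high precision
```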

Assume that the light plane equation in the camera coordinate system is expressed as follows:

  a x_c + b y_c + c z_c + d = 0, (4)

where a, b, c, d are the light plane parameters of the sensor. P is a point on the light plane, so its coordinates also satisfy the constraint of (4). If the camera intrinsic parameters and the light plane parameters are known, the 3D coordinates of a point P on the light plane can be determined according to (1), (3), and (4).
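Once the intrinsics and the light plane are known, a stripe point is recovered by intersecting its back-projected ray with the plane, which is what combining (1), (3), and (4) amounts to. A minimal Python sketch, with hypothetical plane coefficients chosen only for illustration:

```python
def triangulate_on_plane(xn, yn, plane):
    """Intersect the back-projected ray (t*xn, t*yn, t) through the
    undistorted normalized image point (xn, yn) with the light plane
    a*x + b*y + c*z + d = 0, returning the 3D point in camera frame."""
    a, b, c, d = plane
    denom = a * xn + b * yn + c
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the light plane")
    t = -d / denom
    return (t * xn, t * yn, t)

# Hypothetical plane z = 500 (units arbitrary) and normalized point:
P = triangulate_on_plane(0.1, 0.2, (0.0, 0.0, 1.0, -500.0))
print(P)  # → (50.0, 100.0, 500.0)
```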

3. Calibration of Sensor Parameters

3.1. Geometric Theorem

A single circular calibration target is used in this paper. In order to describe the proposed algorithm conveniently, some related geometric theorems are firstly introduced. Based on the geometric constraints of single circle, camera parameters and light plane parameters can be calibrated.

Figure 2 shows a schematic diagram of the single circular target and its imaging. In Figure 2(a), O is the circle center, and A, B, C, D are the intersection points between the circle and the lines l_1, l_2 across the circle center O. The angle between l_1 and l_2 is θ. l_∞ indicates the line at infinity, which is the intersection line formed by the target plane and the plane at infinity. P_1, P_2 are, respectively, the intersection points between lines l_1, l_2 and l_∞. I, J are the circular points on l_∞. In Figure 2(b), the vanishing points v_1, v_2 are, respectively, the imaging points of P_1, P_2. The vanishing line l_v is the projection of l_∞ on the CCD. In the camera coordinate system, the direction of a space straight line is d, the normal direction of the target plane is n, and the homogeneous expressions of a vanishing point and the vanishing line are, respectively, v and l_v. Based on projective geometry theory [20], the following properties are satisfied.

Figure 2: Schematic diagram of target plane and its imaging. (a) Target plane. (b) Imaging of target.

(1) Since the circle center O is the midpoint of the chord AB, the cross-ratio of the collinear points satisfies (A, B; O, P_1) = −1, where P_1 is the point at infinity of the line, and the cross-ratio of its projection remains unchanged: (a, b; o, v_1) = −1. Similarly, (C, D; O, P_2) = −1 and (c, d; o, v_2) = −1.
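Property (1) gives a direct formula for a vanishing point: the image points a, o, b of a chord's endpoints and the center determine the vanishing point v as the harmonic conjugate of o with respect to a and b. A sketch in 1D coordinates (an illustrative parameterization along the image line, not notation from the paper):

```python
def harmonic_conjugate(a, o, b):
    """Vanishing point along an image line from 1D coordinates of the
    projected chord endpoints a, b and projected midpoint o, using the
    harmonic cross-ratio condition (a, b; o, v) = -1."""
    denom = 2 * o - a - b
    if abs(denom) < 1e-12:
        return float("inf")  # no foreshortening: vanishing point at infinity
    return ((o - a) * b + (o - b) * a) / denom

# o projected closer to b than to a, so the line recedes toward b's side:
print(harmonic_conjugate(0.0, 1.0, 1.5))  # → 3.0
```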

(2) Any circle on target plane intersects the line at circular points, and it is irrespective of the position and radius of the circle. Accordingly, the intersections between the imaging of circle and the imaging of the line at infinity are the imaging of circular points.

(3) The intersection point between the space straight line and the plane at infinity indicates the direction of the straight line, and the intersection line between the space plane and the plane at infinity indicates the normal direction of the space plane.

Based on the above properties, the relations can be represented as follows:

  d ∝ K^(−1) v, (5)
  n ∝ K^T l_v, (6)

where the symbol ∝ represents equality up to a constant factor.
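Relations (5) and (6) can be sketched as follows; `K_inv` is assumed to be a precomputed inverse of the intrinsic matrix, and the identity-intrinsics example is purely illustrative.

```python
import math

def matvec(M, x):
    """3x3 matrix-vector product."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def line_direction(K_inv, vp):
    """Relation (5): direction of a space line from its vanishing point,
    d proportional to K^-1 * vp (vp homogeneous)."""
    return normalize(matvec(K_inv, vp))

def plane_normal(K, l):
    """Relation (6): normal of a space plane from its vanishing line,
    n proportional to K^T * l (l homogeneous)."""
    Kt = [[K[j][i] for j in range(3)] for i in range(3)]
    return normalize(matvec(Kt, l))

# With identity intrinsics, the vanishing point at the image origin
# maps to the optical-axis direction:
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(line_direction(I3, [0.0, 0.0, 1.0]))  # → [0.0, 0.0, 1.0]
```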

3.2. Camera Calibration

The single circular target images under different postures are captured within the camera's field of view. For each image, the vanishing points of all straight lines across the circle center can be calculated according to property (1), and the vanishing line of the target plane under this posture is obtained by fitting a straight line to the vanishing points; then the imaging of the circular points can be computed according to property (2). If the imaging coordinates of a circular point at a certain position are m_I, its relationship with the camera intrinsic matrix K can be expressed as

  m_I^T K^(−T) K^(−1) m_I = 0. (7)

Linear equations in the entries of K^(−T)K^(−1) can be constructed from formula (7); the detailed procedure can be found in [21]. However, the above process does not take into account the lens distortion parameter k_1, and lens distortion cannot be neglected in the field of vision measurement. In this paper, an optimization objective function is established according to the known angle constraints of the straight lines on the target:

  min f(K, k_1) = Σ_{i=1}^{M} Σ_{j=1}^{N} (θ_ij − θ̂_ij(K, k_1))^2, (8)

where M is the number of target images and N is the number of angles between straight lines across the circle center on the target; θ_ij represents the known angle between straight lines, and θ̂_ij is the actual calculated value, which can be treated as a function of the camera intrinsic parameters K and k_1. First, the extracted image points are corrected by formula (3), and then the vanishing points are computed. Using formula (5), the directions of the lines through the center of the target are obtained in the camera coordinate system, and the angle between the lines is obtained by the cosine theorem. At this point, camera calibration including distortion is transformed into a nonlinear optimization problem, and the Nelder-Mead nonlinear simplex method is used to solve (8). The solution of the linear equations serves as the iterative initial value for the matrix K, the initial value of the distortion parameter k_1 is set to zero, and the iterative optimization is carried out; the values of K and k_1 that minimize the objective function are the intrinsic camera parameters.
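The angle-constraint objective can be sketched as a residual function; the paper minimizes it over the intrinsics with Nelder-Mead, which is omitted here (an off-the-shelf choice would be SciPy's `scipy.optimize.minimize(..., method="Nelder-Mead")`). The data below are illustrative, not measured values.

```python
import math

def angle_between(d1, d2):
    """Angle between two unit direction vectors via the cosine theorem."""
    dot = sum(a * b for a, b in zip(d1, d2))
    return math.acos(max(-1.0, min(1.0, dot)))

def angle_residual(known, directions):
    """Objective of Eq. (8): squared deviations between known target
    angles and angles recomputed from estimated line directions. The
    outer Nelder-Mead search varies the intrinsics, recomputes the
    directions via relation (5), and minimizes this value."""
    return sum((theta - angle_between(directions[i], directions[j])) ** 2
               for (i, j), theta in known.items())

# Two perpendicular directions reproduce a known 90-degree angle,
# so the residual is (numerically) zero:
known = {(0, 1): math.pi / 2}
dirs = {0: (1.0, 0.0, 0.0), 1: (0.0, 1.0, 0.0)}
print(angle_residual(known, dirs))
```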

3.3. Light Plane Calibration

From formula (4), a, b, c, d are the light plane parameters of the sensor. At present, the general methods [7–11] calibrate the light plane by plane fitting after seeking calibration points on the light plane. Considering the geometric meaning of the light plane, (a, b, c) indicates the normal vector of the light plane, and d encodes the distance from the origin of the camera coordinate system to the light plane; accordingly, the single circular target is used to complete the calibration in two steps in this paper.

Figure 3 is a schematic diagram of the calibration of the normal vector of the laser plane. l_s is the laser stripe line on the CCD image plane, formed by the intersection of the light plane and the target plane. The Steger method [22] is adopted to extract the center coordinates of the laser stripe line; this method is stable and robust, and its accuracy reaches the subpixel level. v_i are the vanishing points of the straight lines across the circle center on the target, and their image coordinates can be determined by property (1). A_i, B_i are defined as the intersection points between the circle and the straight lines across the center O, and o is the image coordinate of the circle center. Since O is the midpoint of A_iB_i, the harmonic relation (a_i, b_i; o, v_i) = −1 holds, and the vanishing point is computed coordinate-wise from the image points a_i, b_i, o as

  v_i = ((o − a_i) b_i + (o − b_i) a_i) / (2o − a_i − b_i). (9)

l_v represents the corresponding vanishing line of the target plane, and its equation can be obtained by line fitting using the vanishing points. Note that the homogeneous coordinate expression of v_i on the CCD image plane is (x_i, y_i, 1)^T; the fitting is given by

  min Σ_i (a x_i + b y_i + c)^2, subject to a^2 + b^2 = 1, (10)

where x_i, y_i, respectively, indicate the x and y components of vanishing point v_i. v_s is the intersection point between the line l_v and the laser stripe line l_s, and it represents the vanishing point of the laser stripe line under this posture. Similarly, the target plane's posture can be changed repeatedly, vanishing points indicating different directions of laser stripe lines on the light plane can be obtained, and the vanishing line of the light plane can be calculated by line fitting, similar to formula (10). Then, the normal vector of the light plane in the camera coordinate system can be computed by formula (6).
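In homogeneous coordinates, intersecting the target-plane vanishing line with the laser stripe line is a cross product; the line coefficients below are hypothetical, chosen only to illustrate the operation.

```python
def cross(p, q):
    """Cross product of homogeneous 3-vectors: gives the intersection
    point of two lines, or the line through two points."""
    return (p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])

# Hypothetical coefficients (a, b, c) of lines a*x + b*y + c = 0:
l_vanishing = (0.0, 1.0, -400.0)  # target-plane vanishing line: y = 400
l_stripe = (1.0, -1.0, 0.0)       # laser stripe line: y = x
x, y, w = cross(l_vanishing, l_stripe)
print(x / w, y / w)  # → 400.0 400.0  (stripe vanishing point)
```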

Figure 3: Schematic diagram of normal vector of laser plane.

Note that it is necessary to incline the target plane relative to the CCD image plane; that is, a certain angle between them should exist when calibrating the camera and the light plane. If the two planes are parallel, the projection of the target plane degenerates into an affine transformation, and the vanishing line of the target plane cannot be determined.

Figure 4 shows the schematic diagram of the calibration of the distance information. O_c is the origin of the camera coordinate system, and d_0 is the distance from the origin to the light plane. n represents the normal vector of the light plane in the camera coordinate system. On the target plane, the laser stripe line intersects the circle at point A, and the line through point A and the center O intersects the circle at point B; their corresponding image points on the CCD plane are, respectively, a, o, b. At this moment, the principle of perspective-three-point among the points A, O, B and the viewpoint O_c is used: if the distances AO, OB and the viewing angles between the corresponding rays are known, the coordinates of points A, O, B can be calculated [23]. On the CCD image plane, the coordinates of points a, o, b can be extracted by an image processing algorithm. Since the camera intrinsic parameters have been calibrated, the viewing angles can be computed from the coordinates of the imaging points, while the distances AO and OB are equal to the radius of the circle. The coordinates of point A on the light plane, that is, the vector O_cA, can then be calculated based on the perspective-three-point model. Thus, the distance from the origin of the camera coordinate system to the light plane is expressed as follows:

  d_0 = |n · O_cA| / ‖n‖. (11)
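Once the perspective-three-point step has recovered a point on the plane, the distance step reduces to a point-to-plane computation, averaged over poses as in the following section. The values below are illustrative, not calibration data.

```python
import math

def origin_to_plane_distance(n, A):
    """Distance from the camera origin to the light plane, given the
    plane normal n and one recovered point A on the plane (e.g. from
    the perspective-three-point solution): d0 = |n . A| / ||n||."""
    dot = abs(sum(ni * ai for ni, ai in zip(n, A)))
    return dot / math.sqrt(sum(c * c for c in n))

def average_distance(samples):
    """Average the per-pose distances over repeated target placements;
    `samples` is a list of (normal, point) pairs."""
    ds = [origin_to_plane_distance(n, A) for n, A in samples]
    return sum(ds) / len(ds)
```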

Figure 4: Schematic diagram of distance information.

In order to reduce the influence of noise and other factors, the position of the target plane can be changed repeatedly and the results averaged:

  d_0 = (1/N) Σ_{i=1}^{N} d_i, (12)

where d_i represents the distance information at the i-th position and N is the total number of positions. The distance parameter d in (4) is then recovered from

  |d| = d_0 ‖(a, b, c)‖. (13)

At this point, the model parameters of the sensor have been completely calibrated.

4. Experiment Results and Accuracy Analysis

As shown in Figure 5(a), the line structured light sensor is composed of a WAT-5352EX2 camera (resolution: ), a fixed-focus lens made by Computar, and a laser line generator of 650 nm wavelength. Figure 5(b) shows the single circular target used in this paper; its radius is 20 mm. In order to extract the intersection points conveniently, a target checkered in black and white is adopted, and each angle in the target is 30 degrees.

Figure 5: Line structured light measurement system and calibration target. (a) Line structured light sensor. (b) The single circular calibration target.
4.1. Calibration Experiment

The laser generator is turned on, and the single circular target is placed freely at different positions within the measurement space of the sensor to capture calibration images. Figure 6 shows a set of calibration images. In the calibration images, the intersection points between the circle and the lines across the center can be extracted using OpenCV functions based on the Harris corner detection principle, whose extraction accuracy can reach 0.1 pixels. In addition, the method of [21] is used to locate the circle center in the calibration images.

Figure 6: One group of calibration images.

First of all, camera calibration is carried out according to Section 3.2, and the results are shown in Table 1.

Table 1: Camera intrinsic parameters.

Figure 7 shows the circle center locations influenced by lens distortion in a calibration image. The solid points are the intersection points of every two straight lines through the circle center, and the circle indicates the located center. A low distortion lens is used in this experiment, so its distortion is relatively small. Before correction, the solid points are basically distributed within one pixel cell, and the located center lies among the solid points. After correction, the distribution of the solid points is further narrowed along one axis direction, which demonstrates that the accuracy of circle center location can be further improved by distortion correction.

Figure 7: Effect of distortion on circle center. (a) Before correction. (b) After correction.

The laser stripe line is extracted, the intersection points between the laser stripe line and the vanishing line of the target are calculated, and a straight line is fitted; the fitted line is the vanishing line of the light plane in the CCD. In Figure 8, the circles are the intersection points between the laser stripe line and the vanishing line of the target plane, and the fitted line is the vanishing line of the light plane. The camera intrinsic parameters have already been calibrated, so the normal vector of the light plane in the camera coordinate system can be calculated by formula (6). The result is computed as follows:

Figure 8: Vanishing line of light plane.

Figure 9 is a laser stripe image which satisfies the perspective-three-point model. The star points and the cross points are the intersection points of the laser stripe line and the fitted ellipse, and the circle represents the center. Then the perspective-three-point model is constructed, and the distance information of the light plane is derived according to Section 3.3. In order to reduce the influence of noise and other factors, more stripe images such as the one shown in Figure 9 should be captured, and the average is taken after calculating the distance information for each laser stripe image. By formula (13), the distance information of the light plane is determined as . Finally, the calibration result of the light plane equation is expressed as follows:

Figure 9: Laser stripe image of perspective-three-point.
4.2. Accuracy Analysis

In order to verify the accuracy of the proposed calibration method, a high precision planar target with solid circles is used. As shown in Figure 10, the laser generator is turned on and the target image is captured. In the local world coordinate system defined by the planar target, the coordinates of the feature points on the stripe line can be computed using the principle of cross-ratio invariance, and then reference distance values can be calculated among the feature points. According to the calibration results of the line structured light sensor, the distances among the feature points can also be calculated and compared with the reference distance values. The comparative results are shown in Table 2.

Table 2: Measurement data by line structured light sensor.
Figure 10: Solid circle target for accuracy analysis.

From the analysis of the data in Table 2, we can see that the proposed method has high calibration accuracy: the average absolute measurement deviation is 0.0451 mm, and the average relative measurement error is 0.2314%, which meets the precision requirements of most application areas.

5. Conclusion

In this paper, a geometric calibration method for line structured light sensors is presented. The method uses a single circular target to construct geometric constraints, and both the camera intrinsic parameters and the light plane equation can be calibrated at the same time. The single circular target is simple in structure and easy to manufacture. The proposed algorithm has no complex or repeated calculation process; it is simple, fast, flexible, and suitable for field calibration. The experimental results demonstrate that the method has high calibration accuracy, with an average measurement deviation of 0.0451 mm and a relative error of 0.2314%, which meets the detection accuracy requirements of most application areas.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (U1604151, 61302118), Outstanding Talent Project of Science and Technology Innovation in Henan Province (174200510008), Program for Scientific and Technological Innovation Team in Universities of Henan Province (16IRTSTHN029), Program for Science and Technology Innovation Talents in Universities of Henan Province (17HASTIT022), the Funding Scheme of Young Key Teacher of Henan Province Universities (2016GGJS-087), and the Fundamental Research Funds of Henan University of Technology (2015QNJH13, 2016XTCX06).

References

  1. H. Bian and K. Liu, “Robustly decoding multiple-line-structured light in temporal Fourier domain for fast and accurate three-dimensional reconstruction,” Optical Engineering, vol. 55, no. 9, Article ID 093110, 2016.
  2. X. Hui-Yuan, X. You, and Z. Zhi-Jian, “Accurate extrinsic calibration method of a line structured-light sensor based on a standard ball,” IET Image Processing, vol. 5, no. 5, pp. 369–374, 2011.
  3. D. Wu, T. Chen, and A. Li, “A high precision approach to calibrate a structured light vision sensor in a robot-based three-dimensional measurement system,” Sensors, vol. 16, no. 9, article no. 1388, 2016.
  4. F. J. Brosed, J. J. Aguilar, D. Guillomïa, and J. Santolaria, “3D geometrical inspection of complex geometry parts using a novel laser triangulation sensor and a robot,” Sensors, vol. 11, no. 1, pp. 90–110, 2011.
  5. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Advances in Optics and Photonics, vol. 3, no. 2, pp. 128–160, 2011.
  6. B. Wu, T. Xue, T. Zhang et al., “A novel method for round steel measurement with a multi-line structured light vision sensor,” Measurement Science & Technology, vol. 21, no. 2, pp. 283–293, 2010.
  7. R. Dewar, “Self-generated targets for spatial calibration of structured light optical sectioning sensors with respect to an external coordinate system,” in Proceedings of the Robots and Vision ’88 Conference, pp. 5–13, 1985.
  8. D. Fajie, L. Fengmei, and Y. Shenghua, “A new accurate method for the calibration of line structured light sensor,” Chinese Journal of Scientific Instrument, vol. 21, no. 1, pp. 108–110, 2000.
  9. D. Q. Huynh, R. A. Owens, and P. E. Hartmann, “Calibrating a structured light stripe system: a novel approach,” International Journal of Computer Vision, vol. 33, no. 1, pp. 73–86, 1999.
  10. Z. Wei, G. Zhang, and Y. Xu, “Calibration approach for structured-light-stripe vision sensor based on the invariance of double cross-ratio,” Optical Engineering, vol. 42, no. 10, pp. 2956–2966, 2003.
  11. F. Zhou and G. Zhang, “Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations,” Image and Vision Computing, vol. 23, no. 1, pp. 59–67, 2005.
  12. X. Zexiao, Z. Weitong, Z. Zhiwei, and J. Ming, “A novel approach for the field calibration of line structured-light sensors,” Measurement, vol. 43, no. 2, pp. 190–196, 2010.
  13. S. Wang, L. Qi, Y. Zhang et al., “Planar-target-based structured light calibration method for flexible large-scale 3D vision measurement,” Sensors & Materials, vol. 25, no. 7, pp. 501–508, 2013.
  14. L. Xu and Z.-J. Zhang, “Error propagation analysis of structured light system,” Optics and Precision Engineering, vol. 17, no. 2, pp. 306–313, 2009.
  15. T. Chen, J. Zhao, and X. Wu, “New calibration method for line structured light sensor based on planar target,” Acta Optica Sinica, vol. 35, no. 1, pp. 180–188, 2015.
  16. Z. Wei, L. Cao, and G. Zhang, “A novel 1D target-based calibration method with unknown orientation for structured light vision sensor,” Optics & Laser Technology, vol. 42, no. 4, pp. 570–574, 2010.
  17. J.-D. Han, N.-G. Lü, M.-L. Dong, and X.-P. Lou, “Fast method to calibrate structure parameters of line structured light vision sensor,” Optics and Precision Engineering, vol. 17, no. 5, pp. 958–963, 2009.
  18. T.-F. Chen, Z. Ma, and X. Wu, “Calibration of light plane in line structured light sensor based on active vision,” Optics and Precision Engineering, vol. 20, no. 2, pp. 256–263, 2012.
  19. M. Ahmed and A. Farag, “Nonmetric calibration of camera lens distortion: differential methods and robust estimation,” IEEE Transactions on Image Processing, vol. 14, no. 8, pp. 1215–1230, 2005.
  20. J. G. Semple and G. T. Kneebone, Algebraic Projective Geometry, Clarendon Press, Oxford, UK, 1952.
  21. X. Meng and Z. Hu, “A new easy camera calibration technique based on circular points,” Pattern Recognition, vol. 36, no. 5, pp. 1155–1164, 2003.
  22. C. Steger, “An unbiased detector of curvilinear structures,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 2, pp. 113–125, 1998.
  23. H. Fengshan, Study on the Key Technique of Single Camera 3D Coordinate Vision Measurement System Using a Light Pen, Tianjin University, Tianjin, China, 2005.