Advances in Mechanical Engineering
Volume 2013 (2013), Article ID 580417, 8 pages
http://dx.doi.org/10.1155/2013/580417
Research Article

A Sphere-Based Calibration Method for Line Structured Light Vision Sensor

Key Laboratory of Precision Opto-Mechatronics Technology, Beihang University, Ministry of Education, Beijing 100191, China

Received 9 July 2013; Revised 15 September 2013; Accepted 28 October 2013

Academic Editor: Liang-Chia Chen

Copyright © 2013 Zhenzhong Wei et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A major difficulty in calibrating a line structured light vision sensor is how to obtain enough calibration feature points, because known world points on the calibration target rarely fall exactly on the light stripe plane. This paper presents a calibration method using a target, consisting of one sphere and a reference board, to obtain feature points on the light stripe, which is the intersection of the structured-light plane and the reference board. By moving the sphere into several positions on the fixed reference board, we can get a plane parallel to the reference board. From the acquired right-circular projection cone and the known radius of the sphere, the equation of the reference board can be deduced. By moving the target randomly into different positions, enough points lying on the structured-light plane can be obtained. Experimental results show that the accuracy of the proposed calibration method can reach 0.102 mm within a view field of 200 mm * 200 mm; the method is also robust and easy to operate.

1. Introduction

The Line Structured Light Vision Sensor (LSLVS), consisting of one camera and one light-plane projector, is widely used in industrial measurement owing to its wide measurement range, high precision, real-time ability, easy feature extraction, and so forth. Calibration, whose purpose is to establish the equation of the structured-light plane under the camera coordinate system, is one of the basic tasks of LSLVS measurement [1].

Heretofore, the calibration methods of LSLVS can be classified into three categories: 3D target based methods, planar target based methods, and 1D target based methods. However, the 3D target based method [2–4] is not accurate enough because of the limited number of feature points and the mutual occlusion between different planes of the target. Additionally, the target, normally a cube with some special accessories, is difficult to manufacture precisely and is cumbersome for onsite calibration.

The method using a planar target is more practical for LSLVS calibration than the 3D target based method. Typically, the method based on the invariance of the double cross-ratio uses a planar checkerboard-pattern target to calibrate the LSLVS [5–7]. The intersections of the light stripe plane and the checkerboard squares of accurately known size can be located in the image coordinate system; accordingly, feature points on the structured-light plane can be obtained from the invariance of the double cross-ratio. Another calibration method represents the light stripe on a planar target with a Plücker matrix [8]. Combining the Plücker matrices of the light stripes in different positions, the equation of the structured-light plane under the camera coordinate system can be solved. The method based on the vanishing point and the vanishing line [9] is also well known: with a planar rectangular target, the normal vector of the structured-light plane can be calculated from its vanishing line, and as the size of the target is accurately known, the equation of the structured-light plane can be deduced.

The 1D target based method [10, 11] has been proposed for its convenient operation. One feature point, the intersection of the light stripe and the 1D target, is obtained each time based on the invariance of the double cross-ratio. By moving the target randomly into more positions, enough feature points can be obtained.

Xu et al. [12] use four spheres and a so-called standard board to obtain the equation of the plane through the centers of the spheres and then derive the equation of the standard board, as the radius of each sphere is known. By moving the standard board into several positions, the height matrix, which gives the relationship between matching differences and height, can be deduced by successive geometrical calculation. According to the calibration model and the obtained height matrix, the structured light sensor for 3D measurement can be calibrated. Inspired by this method, a calibration method for the line structured light vision sensor using only a white sphere and a black reference board is presented (see Figure 1).

Figure 1: The principle of the calibration.

By moving the sphere into several positions on the fixed reference board, we can get a plane through the sphere centers in the different positions. The center of each sphere under the camera coordinate system can be recovered from its projection on the image plane. Then the plane through the sphere centers (a virtual plane introduced to express the reference board conveniently, as illustrated in Figure 1) can be obtained by plane fitting. As the distance from each sphere center to the reference board is a known constant (the sphere radius) and the normal vector of the plane through the sphere centers has been deduced, the equation of the reference board under the camera coordinate system can be solved. Then a group of collinear feature points is obtained from the light stripe on the reference board.

Moving the target randomly into more than two different positions, we can easily get enough feature points of the structured-light plane. In this method, all feature points projected on the reference board are used to fit the structured-light plane; meanwhile, the sphere's projection is independent of its orientation, which makes the calibration more accurate and robust.

2. Measurement Model of LSLVS

The location relationship between the camera in the LSLVS and the structured-light plane projector remains unchanged during calibration and measurement. So the structured-light plane can be expressed as a fixed equation under the camera coordinate system, defined as

$$a x + b y + c z + d = 0, \tag{1}$$

where $a$, $b$, $c$, and $d$ are the parameters of the structured-light plane's equation.

The measurement model of the LSLVS is illustrated in Figure 2. $O_c\text{-}x_c y_c z_c$ is the Camera Coordinate System (CCS), while $O\text{-}uv$ is the Image Coordinate System (ICS). Under the CCS, the center of projection of the camera is at the origin and the optical axis points in the positive $z_c$ direction. A spatial point $Q = (x, y, z)^T$ is projected onto the plane $z_c = f$, referred to as the image plane under the CCS, where $f$ is the effective focal length (EFL). Suppose the point $q = (x_u, y_u)^T$ is the projection of $Q$ on the image plane. Under the undistorted model of the camera, namely, the ideal pinhole imaging model, $Q$, $q$, and the center of projection are collinear. This fact can be expressed by the following equation:

$$x_u = f\,\frac{x}{z}, \qquad y_u = f\,\frac{y}{z}. \tag{2}$$

Figure 2: The measurement model of the LSLVS.

Practically, the radial distortion and the tangential distortion of the lens are inevitable. When considering the radial distortion, we have the following equations:

$$x_d = x_u\left(1 + k_1 r^2 + k_2 r^4\right), \qquad y_d = y_u\left(1 + k_1 r^2 + k_2 r^4\right), \qquad r^2 = x_u^2 + y_u^2, \tag{3}$$

where $(x_d, y_d)$ is the distorted image coordinate, $(x_u, y_u)$ is the idealized one, and $k_1$, $k_2$ are the radial distortion coefficients of the lens.
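For concreteness, the following is a minimal sketch of this camera model, assuming the reconstructed forms of (2) and (3) above; the function names and example values are ours, not the paper's.

```python
import numpy as np

def project_pinhole(point_ccs, f):
    """Ideal pinhole projection, equation (2): a CCS point (x, y, z) maps to
    (f*x/z, f*y/z) on the image plane z = f."""
    x, y, z = point_ccs
    return np.array([f * x / z, f * y / z])

def apply_radial_distortion(q_ideal, k1, k2):
    """Radial distortion, equation (3): map an idealized image point to its
    distorted position with coefficients k1, k2."""
    xu, yu = q_ideal
    r2 = xu * xu + yu * yu
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return np.array([xu * scale, yu * scale])

# Example: a point 500 mm in front of a camera with f = 12 mm (values ours).
q = project_pinhole(np.array([40.0, -25.0, 500.0]), f=12.0)
qd = apply_radial_distortion(q, k1=1e-4, k2=1e-7)
```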

3. Calibration of the LSLVS

In our method, the calibration is carried out in three key steps: (1) work out the coordinates of the sphere centers in different positions under the CCS, (2) compute the equation of the reference board from these sphere centers, and (3) obtain enough feature points on the structured-light plane.

3.1. Calculation of the Sphere Center in 3D Space

The projection of a sphere on the image plane is an ellipse (see Figure 3), which can be expressed in matrix form as

$$\mathbf{q}^T \mathbf{C} \mathbf{q} = 0, \qquad \mathbf{C} = \begin{pmatrix} c_1 & c_2 & c_4 \\ c_2 & c_3 & c_5 \\ c_4 & c_5 & c_6 \end{pmatrix}, \tag{4}$$

where $\mathbf{q} = (u, v, 1)^T$ is the homogeneous coordinate of a projective point on the image plane and $c_1, \ldots, c_6$ are the parameters of the elliptical equation. By fitting the ellipse on the image plane, the equation of the ellipse under the ICS can be obtained.

Figure 3: The projection relationship of a sphere.

According to (2) and (4), we obtain the matrix representation of the right-circular cone (through the sphere contour and the center of projection) under the CCS as

$$\mathbf{x}^T \mathbf{Q} \mathbf{x} = 0, \tag{5}$$

where $\mathbf{x} = (x, y, z)^T$ and the matrix $\mathbf{Q}$ is defined as

$$\mathbf{Q} = \mathbf{K}^T \mathbf{C} \mathbf{K}, \tag{6}$$

with the related definition, following from the fact that a CCS point $(x, y, z)^T$ projects to the homogeneous image point $(fx, fy, z)^T$, shown as

$$\mathbf{K} = \operatorname{diag}(f, f, 1). \tag{7}$$

Let $\lambda_1$, $\lambda_2$, and $\lambda_3$ be the eigenvalues of the matrix $\mathbf{Q}$, where $\lambda_1$ and $\lambda_2$ must have the same sign while $\lambda_3$ must have the different one (when $\mathbf{Q}$ is the cone matrix of a sphere, we have $\lambda_1 = \lambda_2$), and let $\mathbf{e}_3$ be the unit eigenvector corresponding to $\lambda_3$. The coordinate of the sphere center $\mathbf{o}$ under the CCS can then be expressed by the following equation [13, 14]:

$$\mathbf{o} = \pm r \sqrt{\frac{\lambda_3 - \lambda_1}{\lambda_3}}\; \mathbf{e}_3, \tag{8}$$

where $r$ is the radius of the sphere and the sign is chosen so that the sphere lies in front of the camera.
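As a concrete illustration, here is a short sketch of the sphere-center recovery of (5)–(8), under the assumption that our reconstruction of those equations above is faithful to the original; the function name is ours.

```python
import numpy as np

def sphere_center_from_conic(C, f, r):
    """Recover the sphere center under the CCS from the ellipse conic C of
    equation (4), the effective focal length f, and the known sphere radius
    r, via the eigen-decomposition of the projection cone (equations
    (5)-(8), after [13, 14])."""
    # Cone through the ellipse: a CCS point (x, y, z) images to
    # (f*x, f*y, z) homogeneously, so Q = K^T C K with K = diag(f, f, 1).
    K = np.diag([f, f, 1.0])
    Q = K.T @ C @ K
    w, V = np.linalg.eigh(Q)
    # Two eigenvalues share one sign; the third (the cone axis) differs.
    i3 = int(np.argmax(w)) if (w > 0).sum() == 1 else int(np.argmin(w))
    lam3, e3 = w[i3], V[:, i3]
    lam1 = w[[i for i in range(3) if i != i3]].mean()  # repeated eigenvalue
    # Distance to the center is r / sin(theta), where
    # sin^2(theta) = lam3 / (lam3 - lam1) for the right-circular cone.
    center = r * np.sqrt((lam3 - lam1) / lam3) * e3
    # Choose the sign so that the sphere lies in front of the camera.
    return center if center[2] > 0 else -center
```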

3.2. Determination of the Reference Board

As the sphere with known radius is moved on the reference board, which is fixed in one position, we can get a plane through the centers of the spheres located in the different positions. Define the plane through the sphere centers as

$$\mathbf{n}^T \mathbf{x} + d_0 = 0. \tag{9}$$

When the sphere centers in three or more different positions have been obtained, the unit normal vector of the plane through the sphere centers under the CCS, which is expressed as

$$\mathbf{n} = (n_x, n_y, n_z)^T, \qquad \|\mathbf{n}\| = 1, \tag{10}$$

and the parameter $d_0$ can be solved by least-squares fitting.

As the plane through the sphere centers is parallel to the reference board and the distance between the two planes is the known radius $r$, according to (9) and (10), the reference board can be expressed as

$$\mathbf{n}^T \mathbf{x} + d_0 + \varepsilon r = 0, \qquad \varepsilon = \pm 1. \tag{11}$$

The parameter $\varepsilon$ can be calculated as follows: defining the coordinate of a sphere center under the CCS as $\mathbf{o} = (x_o, y_o, z_o)^T$, the direction vector from the origin to the sphere center is $\mathbf{v} = \mathbf{o}/\|\mathbf{o}\|$. As the target is located in the positive $z_c$ direction under the CCS and the reference board lies on the far side of the sphere centers from the camera, the location relationship is as illustrated in Figure 4, and $\varepsilon$ can be deduced from the following equation:

$$\varepsilon = \begin{cases} -1, & \mathbf{n}^T \mathbf{v} > 0, \\ +1, & \mathbf{n}^T \mathbf{v} < 0. \end{cases} \tag{12}$$

Figure 4: The sphere in different positions and the reference board under CCS.
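A compact sketch of this step, assuming the reconstructed equations (9)–(12) above; the SVD-based fit is one common realization of the least-squares plane fit, and the function name is ours.

```python
import numpy as np

def reference_board_plane(centers, r):
    """Fit the plane through the sphere centers (equation (9)) and offset it
    by the radius r away from the camera to obtain the reference board
    (equations (11)-(12)). `centers` is an (N, 3) array with N >= 3."""
    centers = np.asarray(centers, dtype=float)
    centroid = centers.mean(axis=0)
    # Unit normal n: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(centers - centroid)
    n = vt[-1]
    d0 = -n @ centroid            # plane through the centers: n.x + d0 = 0
    # The board lies r beyond the centers on the far side from the origin
    # (the camera), i.e., epsilon = -sign(n . v) as in equation (12).
    eps = -np.sign(n @ centroid)
    return n, d0 + eps * r        # reference board: n.x + (d0 + eps*r) = 0
```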
3.3. The Structured-Light Plane

Combining (2) and (11), the 3D coordinates of the light stripe on the reference board under the CCS can be worked out. Moving the reference board randomly into different positions and repeating the procedure of Section 3.2, we can get enough feature points on the structured-light plane. Fitting the structured-light plane by the linear least-squares method, its equation (1) under the CCS can be solved.
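The following sketch shows both steps under our reconstructed notation: each undistorted stripe pixel is back-projected through the center of projection and intersected with the board plane of (11), and the accumulated points are fitted with an SVD-based least-squares plane (one common realization of the linear least-squares fit named above). All function names are ours.

```python
import numpy as np

def stripe_points_on_board(stripe_uv, f, n, d):
    """Back-project undistorted stripe image points (xu, yu) through the
    center of projection and intersect each viewing ray with the reference
    board n.x + d = 0 (combining equations (2) and (11))."""
    points = []
    for xu, yu in stripe_uv:
        ray = np.array([xu / f, yu / f, 1.0])  # direction of the viewing ray
        t = -d / (n @ ray)                     # solves n.(t*ray) + d = 0
        points.append(t * ray)
    return np.array(points)

def fit_light_plane(points):
    """Least-squares plane a*x + b*y + c*z + d = 0 through all accumulated
    feature points (equation (1)), via an SVD of the centered coordinates."""
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return np.append(normal, -normal @ centroid)   # (a, b, c, d)
```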

4. Simulations and Discussions

In this section, simulations have been conducted to evaluate the impacts of some factors on the proposed calibration method. The intrinsic parameters of the camera used in simulations are listed in Table 1.

Table 1: The intrinsic parameters of the camera.

The simulation results are detailed below.

4.1. The Influence of the Sphere Projections and the Light Stripe

In this method, feature points of the structured-light plane are obtained from the intersection of the reference board and the structured-light plane. Gaussian noise with mean 0 is added to perturb both the sphere projections, which determine the precision of the reference board, and the light stripe on the image plane, to evaluate the proposed method (a simplified sketch of this test is given at the end of this subsection).

(1) The assumed radius of the sphere is 20 mm. Gaussian noise with standard deviations varying from 0 to 0.5 pixels is added to both coordinates of the image points to generate the perturbed image points of the sphere projections in different positions. The Root-Mean-Square (RMS) error and the mean absolute error (MAE) are illustrated in Figure 5. These errors are computed from the intersection angle between two normal vectors: that of the idealized structured-light plane and that of the perturbed one. Each point in Figure 5 represents the result averaged over 100 uniformly distributed rotations.

(2) Gaussian noise with standard deviations varying from 0 to 1.0 pixel is added to both coordinates of the image points to generate the perturbed image points of the light stripe on the reference board. The RMS error and the MAE are illustrated in Figure 6, calculated in the same way as for Figure 5. Each point in Figure 6 represents the result averaged over 100 uniformly distributed rotations.

Figure 5: The error as a function of the Gaussian noise added to the sphere projections.
Figure 6: The error as a function of the Gaussian noise added to the light stripe.

From Figures 5 and 6 we can see that the calibration error, including the RMS error and the MAE, increases with the noise level of the image points on the sphere projections and the light stripe.

As we have moved the sphere into only 4 positions while the light stripe is fitted by as many as 1000 image points, the proposed method is affected much more significantly by the noise of the sphere projections.
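A deliberately simplified Monte-Carlo sketch of this kind of test: instead of perturbing image coordinates and rerunning the full pipeline, it perturbs the recovered 3D feature points directly and reports the RMS error and MAE of the normal-vector angle, the error measure used above. The noise model and numbers are ours, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def plane_angle_error(ideal_pts, sigma, trials=100):
    """Perturb the feature points with Gaussian noise of std `sigma`, refit
    the plane, and return the RMS error and MAE of the angle (in degrees)
    between the ideal and the perturbed plane normals."""
    c0 = ideal_pts.mean(axis=0)
    n0 = np.linalg.svd(ideal_pts - c0)[2][-1]
    angles = []
    for _ in range(trials):
        noisy = ideal_pts + rng.normal(0.0, sigma, ideal_pts.shape)
        n1 = np.linalg.svd(noisy - noisy.mean(axis=0))[2][-1]
        cos_a = np.clip(abs(n0 @ n1), 0.0, 1.0)
        angles.append(np.degrees(np.arccos(cos_a)))
    a = np.array(angles)
    return np.sqrt((a ** 2).mean()), a.mean()      # RMS error, MAE

# Example: 1000 points on the plane z = 0.5*x + 100 (mm), sigma = 0.05 mm.
xy = rng.uniform(-100.0, 100.0, (1000, 2))
pts = np.column_stack([xy[:, 0], xy[:, 1], 0.5 * xy[:, 0] + 100.0])
rms, mae = plane_angle_error(pts, sigma=0.05)
```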

4.2. The Influence of Other Factors

As the reference board is deduced from the plane through the sphere centers, the indirect factors affecting the calibration method include the radius of the sphere, the number of positions it is moved into, and the locations of those positions. The following simulations have been conducted to evaluate the impacts of these three factors.

(1) A sphere with radius varying from 10 mm to 100 mm is assumed in the simulations, moved into four different positions in each run. Gaussian noise with mean 0 and standard deviation 0.1 pixels is added to perturb the projections of the sphere in the different positions on the image plane. The influence is illustrated in Figure 7(a).

(2) A sphere with a constant radius is moved into 3–8 different positions to fit the plane through the sphere centers. Gaussian noise with mean 0 and standard deviation 0.1 pixels is added to both coordinates of the image points to generate the perturbed image points of the sphere projections. The effect of the noise on the calibration result for the different numbers of positions is shown in Figure 7(b).

(3) A sphere with a constant radius is moved into 4 different positions, located at the four vertices of a square with side length varying from 10 mm to 70 mm. Gaussian noise with mean 0 and standard deviation 0.1 pixels is added to perturb the projections of the sphere in the different positions. The effect is illustrated in Figure 7(c).

Figure 7: (a) The error as a function of the radius of the sphere. (b) The error as a function of the number of positions. (c) The error as a function of the locations of the positions.

In these figures, both the Root-Mean-Square (RMS) error and the mean absolute error (MAE) are illustrated. These errors are computed from the intersection angle between the normal vectors of the idealized structured-light plane and the perturbed one, as in Section 4.1. Each point in Figure 7 represents the result averaged over 100 uniformly distributed rotations.

From Figures 7(a) and 7(b), we can see that the calibration result improves when the radius is greater or the sphere is moved into more positions. Figure 7(c) also clearly shows that the result of the proposed calibration method improves as the relative distance between the positions increases.

5. Experiments and Discussions

The camera used in our calibration experiment is an AVT Stingray F-504B with a resolution of 2452 * 2056 pixels, and the view field is about 200 mm * 200 mm. The intrinsic parameters of the camera are listed in Table 2; they are defined as in Table 1, while $k_1$ and $k_2$ are the radial distortion coefficients of the lens.

Table 2: The intrinsic parameters of the camera.

The radius of the sphere used in our experiment is 10 mm, with an accuracy of 10 μm, while the size of the black reference board is 150 mm * 150 mm, with a machining accuracy of 30 μm (see Figure 8). The white sphere is moved into four positions on the black reference board. The light stripe projected on the reference board is clearly visible, as illustrated in Figure 8.

Figure 8: The images of the real target.

We first compensated for camera distortion by rectifying all real images. To improve the accuracy of extracting the image points, including the contour of the sphere projection and the light stripe on the image plane, the Hessian-matrix algorithm is applied, which extracts the image points with subpixel accuracy [15] (Figure 9).

Figure 9: The extraction of a sphere projection.
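A minimal sketch of Hessian-based subpixel extraction in the spirit of Steger's method [15]: each candidate pixel is moved along the Hessian eigenvector (the direction normal to the stripe) to the point where the first directional derivative vanishes. The candidate test and parameter values are simplifications of ours, and the full method additionally links the extracted points into lines.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def steger_line_points(img, sigma=2.0, thresh=1.0):
    """Sub-pixel stripe centers: move from each candidate pixel along the
    Hessian eigenvector (normal to the stripe) to the zero of the first
    directional derivative."""
    img = img.astype(float)
    rx  = gaussian_filter(img, sigma, order=(0, 1))   # d/dx
    ry  = gaussian_filter(img, sigma, order=(1, 0))   # d/dy
    rxx = gaussian_filter(img, sigma, order=(0, 2))
    rxy = gaussian_filter(img, sigma, order=(1, 1))
    ryy = gaussian_filter(img, sigma, order=(2, 0))
    points = []
    # Crude candidate test (the full method thresholds the Hessian eigenvalue).
    for y, x in zip(*np.nonzero(np.abs(rxx) + np.abs(ryy) > thresh)):
        H = np.array([[rxx[y, x], rxy[y, x]], [rxy[y, x], ryy[y, x]]])
        w, v = np.linalg.eigh(H)
        nx, ny = v[:, np.argmax(np.abs(w))]           # normal direction
        denom = rxx[y, x]*nx*nx + 2.0*rxy[y, x]*nx*ny + ryy[y, x]*ny*ny
        if denom == 0.0:
            continue
        t = -(rx[y, x]*nx + ry[y, x]*ny) / denom
        if abs(t*nx) <= 0.5 and abs(t*ny) <= 0.5:     # extremum inside pixel
            points.append((x + t*nx, y + t*ny))
    return points
```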
5.1. Accuracy Evaluation

A checkerboard-pattern target is used to evaluate the accuracy of the proposed calibration method.

Real distance (R-dist.): the grid pitch of the target is accurately known as $L$. Denote three collinear grid corners on one grid line by $A$, $B$, and $C$, with $AB = BC = L$, the intersection of the light stripe with that grid line by $E$, and their image points by $a$, $b$, $c$, and $e$ (see Figure 10). Based on the invariance of the cross-ratio, the following equation can be obtained:

$$\frac{AC}{BC} \cdot \frac{BE}{AE} = \frac{ac}{bc} \cdot \frac{be}{ae}. \tag{13}$$

Figure 10: The obtainment of the real distance.

The real length of $AE$ can be solved, and the same can be done on an adjacent grid line to obtain a second stripe point $F$. Then the distance between point $E$ and point $F$ can be worked out.
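A small sketch of this cross-ratio computation under the labeling introduced above (grid corners at world positions 0, $L$, and $2L$ along their line); the labels, function names, and example numbers are ours.

```python
import numpy as np

def cross_ratio(p1, p2, p3, p4):
    """Cross ratio (p1, p2; p3, p4) of four collinear points given as scalar
    parameters along their common line."""
    return ((p3 - p1) * (p4 - p2)) / ((p3 - p2) * (p4 - p1))

def stripe_position_on_grid_line(a, b, c, e, L):
    """Recover the real position E of the stripe point along a grid line from
    its image e and the images a, b, c of grid corners at world positions
    0, L, 2L (L = grid pitch), using cross-ratio invariance (equation (13))."""
    a, b, c, e = (np.asarray(p, dtype=float) for p in (a, b, c, e))
    u = (c - a) / np.linalg.norm(c - a)    # unit direction of the image line
    ta, tb, tc, te = ((p - a) @ u for p in (a, b, c, e))
    k = cross_ratio(ta, tb, tc, te)
    # Solve cross_ratio(0, L; 2L, E) = k  =>  2*(E - L)/E = k.
    return 2.0 * L / (2.0 - k)

# Hypothetical image measurements (pixels) of three corners and the stripe
# point on one grid line, with grid pitch L = 10 mm:
E = stripe_position_on_grid_line((100, 400), (180, 398), (260, 396),
                                 (215, 397), L=10.0)
```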

Calculated distance (C-dist.): combining (1) with (2) and substituting the image coordinates of points $e$ and $f$ under the ICS into them, we can work out the coordinates of $E$ and $F$ under the CCS, and then the distance between the two points can be solved. This is repeated until enough distances have been obtained. The results are shown in Table 3.

Table 3: The result of our experiment (mm).

Considering that the radius of the sphere used in our experiment is only 10 mm, that the reference board is placed in just 5 positions during the calibration, and that the sphere is moved into only four different positions on the reference board, the achieved accuracy of 0.102 mm is quite satisfactory. The calibration would be more accurate and stable if the experimental conditions were improved as discussed in Section 4.

5.2. The Number of Located Positions of the Reference Board

In our method, all the image points of the light stripe projected on the reference board are used to fit the structured-light plane. More feature points are obtained when more calibration images are used in the experiment, and the result becomes more stable. To determine a suitable number, the reference board is moved into 2–6 different positions, and the corresponding number of images is used to obtain the light stripes in the calibration experiment. The accuracy is evaluated by the method described in Section 5.1 and is shown in Table 4.

Table 4: The relation between the number of positions and the RMS error.
5.3. The Comparison with Other Methods

Our proposed method is compared with two typical calibration methods that are suited for onsite calibration: the method based on the invariance of the double cross-ratio (2D target based method) [7] and the method based on a 1D target [11]. As listed in Table 5, the three calibration methods achieve nearly the same accuracy in the experiments.

Table 5: The comparison with other methods.

Nevertheless, in the two typical calibration methods, only a few feature points on the light stripe are used to obtain the structured-light plane, whereas our proposed method uses all feature points projected on the reference board to fit the structured-light plane. This makes the calibration stable and robust. Meanwhile, the traditional calibration methods suffer from the viewing-angle problem: when observed from different angles, different images of the light stripe are obtained, which can affect the accuracy of the calibration and restrict the movement of the target. The sphere-based method avoids this problem, as the sphere's projection is independent of its orientation.

6. Conclusion

The method presented in this paper utilizes a sphere target to accomplish the calibration of a line structured light vision sensor. As the equation of the reference board can be deduced from the projections of the sphere in different positions under the camera coordinate system, enough feature points on the structured-light plane can be obtained by moving the target into several different positions. We have conducted extensive simulations and experiments to evaluate the proposed calibration method. As all feature points projected on the reference board are used to fit the structured-light plane and the sphere's projection is independent of its orientation, the calibration is more accurate and robust. Experimental results show that the accuracy of the method can reach 0.102 mm within a view field of about 200 mm * 200 mm, and it can be improved further under better experimental conditions. Additionally, the method is efficient and convenient, since the target is simply a free combination of one sphere and a reference board. It is therefore well suited for onsite calibration and no less accurate than other classical calibration methods.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (50875014) and the Natural Science Foundation of Beijing (3092014).

References

  1. G. Zhang, Vision Measurement, Science Press, 2008.
  2. R. Dewar, “Self-generated targets for spatial calibration of structured light optical sectioning sensors with respect to an external coordinate system,” in Proceedings of the Robots and Vision Conference, pp. 5–13, Detroit, Mich, USA, 1988.
  3. K. W. James, “Noncontact machine vision metrology with a CAD coordinate system,” in Autofact '88 Conference Proceedings, pp. 9–17, 1988.
  4. F. Duan, F. Liu, and S. Ye, “A new accurate method for the calibration of line structured light sensor,” Chinese Journal of Scientific Instrument, vol. 21, no. 1, pp. 108–110, 2000.
  5. G. Xu, L. Liu, and J. Zeng, “A new method of calibration in 3D vision system based on structure-light,” Chinese Journal of Computers, vol. 18, no. 6, pp. 450–456, 1995.
  6. D. Q. Huynh, R. A. Owens, and P. E. Hartmann, “Calibrating a structured light stripe system: a novel approach,” International Journal of Computer Vision, vol. 33, no. 1, pp. 73–86, 1999.
  7. Z. Wei, G. Zhang, and Y. Xu, “Calibration approach for structured-light-stripe vision sensor based on the invariance of double cross-ratio,” Optical Engineering, vol. 42, no. 10, pp. 2956–2966, 2003.
  8. Z. Liu, G. Zhang, Z. Wei, and J. Jiang, “An accurate calibration method for line structured light vision sensor,” Acta Optica Sinica, vol. 29, no. 11, pp. 3124–3128, 2009.
  9. Z. Wei, M. Xie, and G. Zhang, “Calibration method for line structured light vision sensor based on vanish points and lines,” in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 794–797, August 2010.
  10. Z. Wei, L. Cao, and G. Zhang, “A novel 1D target-based calibration method with unknown orientation for structured light vision sensor,” Optics and Laser Technology, vol. 42, no. 4, pp. 570–574, 2010.
  11. F. Zhou and F. Cai, “Calibrating structured-light vision sensor with one-dimensional target,” Journal of Mechanical Engineering, vol. 46, no. 18, pp. 7–12, 2010.
  12. J. Xu, J. Douet, J. Zhao, L. Song, and K. Chen, “A simple calibration method for structured light-based 3D profile measurement,” Optics and Laser Technology, vol. 48, pp. 187–193, 2013.
  13. Y. Shiu and S. Ahmad, “3D location of circular and spherical features by monocular model-based vision,” in Proceedings of the IEEE Conference on Systems, Man and Cybernetics, pp. 576–581, Cambridge, Mass, USA, 1989.
  14. R. Safaee-Rad, I. Tchoukanov, K. C. Smith, and B. Benhabib, “Three-dimensional location estimation of circular features for machine vision,” IEEE Transactions on Robotics and Automation, vol. 8, no. 5, pp. 624–640, 1992.
  15. C. Steger, Unbiased extraction of curvilinear structures from 2D and 3D images [Ph.D. dissertation], Technische Universität München, 1998.