Mathematical Problems in Engineering
Volume 2015, Article ID 283629, 19 pages
http://dx.doi.org/10.1155/2015/283629
Research Article

Machine Vision Based Automatic Detection Method of Indicating Values of a Pointer Gauge

1School of Automation and Electrical Engineering, University of Science and Technology Beijing, Beijing 100083, China
2Key Laboratory of Operation Safety Technology on Transport Vehicles, Ministry of Transport, Beijing 100088, China
3School of Automation, Shenyang Aerospace University, Shenyang 110136, China

Received 4 November 2014; Revised 27 December 2014; Accepted 4 February 2015

Academic Editor: Chih-Cheng Hung

Copyright © 2015 Jiannan Chi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This study proposes an automatic reading approach for a pointer gauge based on computer vision. The study highlights the defects of current automatic-recognition methods for the pointer gauge and introduces a method that uses a coarse-to-fine scheme and achieves superior accuracy and stability in reading identification. First, it uses the region growing method to locate the dial region and its center. Second, it uses an improved central projection method to determine the circular scale region under the polar coordinate system and detect the scale marks. Then, border detection is implemented in the dial image, and the Hough transform method is used to obtain the pointer direction by means of pointer contour fitting. Finally, the reading of the gauge is obtained by comparing the location of the pointer with the scale marks. The experimental results demonstrate the effectiveness of the proposed approach. This approach is applicable for reading gauges whose scale marks are either evenly or unevenly distributed.

1. Introduction

With the rapid development of information technology, the digital meter has been widely applied. However, the pointer gauge is still very popular in various fields due to its simple structure, high reliability, low price, and easy operation. Owing to the nondigital signal output of the pointer gauge, a computer cannot conduct the processing and remote transmission of the data it collects, which limits its application. Therefore, the problem of providing the pointer gauge with digital features, such as automatic reading and transforming the collected value into a digital signal, urgently needs to be solved for wider application. In particular, in the process of calibration or metrological verification [1], the checkers need to read the output values of the standard and the indicating value of the calibrated gauge, respectively, and then make a comparison. Random errors may occur in this case, due to the limited acuity of human observation. Furthermore, when an operator is far away from the gauge, the operator has to repeat the reading and manually record the indicating value. This not only increases the operator's burden but also lowers the efficiency of the gauge calibration. To overcome the above limitations and to make the pointer gauge more user-friendly, new technologies are expected to digitalize its reading. At present, machine vision is widely utilized to detect the gauge dial and pointer and then conduct a reading.

This study aims at the application of pressure gauge verification and proposes a computer vision based automatic reading approach for a pointer gauge that adopts a coarse-to-fine scheme. In this approach, the dial region and its center of the pointer gauge are located by, first, using the region growing method. Then, the circular scale region is determined by the adaptive threshold method under the polar coordinate system. In the circular region, the scale marks distribution diagram is produced using improved central projection. According to the scale marks distribution diagram, the main scale marks are located in the circular scale region of the dial. In the following steps, the Hough transformation is used to detect the pointer in the dial region and obtain its direction. Finally, the distance method is adopted to obtain the indicating value of the gauge by comparing the pointer direction with the position of the scale marks.

Since the indicating value is determined by comparing the pointer direction and the locations of the scale marks, this approach is also suitable for gauges wherein the scale marks are unevenly distributed. Therefore, the proposed approach is, in fact, a general approach for gauge calibration.

2. Related Work

2.1. Configuration of Hardware System Used in Gauge Verification

The hardware system used in gauge verification generally comprises the configuration shown in Figure 1 [2–5]. During the working process, the high-precision gauge and the calibrated gauge are placed on the worktable. A computer sends impulse signals to the motor driver, and the stepping motor is thereby controlled to move the worktable and adjust the location of the gauges, so that they can be appropriately placed in the camera's view, while the standard signal source sends reference signals to the calibrated pointer gauge and the high-precision gauge. The indicating value of the high-precision gauge is treated as standard data. Next, computer vision is applied to automatically recognize the values indicated by the calibrated and standard gauges. Finally, a comparison is made between the two values and the verification result is reached.

Figure 1: Hardware structure of the automatic recognition of the indicating value of the pointer gauge.

For some other pointer gauges, the standard data produced by the reference signal generator is directly inputted into the computer by the other signal-acquisition equipment. The verification result is achieved by comparing the standard data with the indicating value of the calibrated gauge, which is extracted using computer vision.

2.2. The Process of Automatic Reading of the Pointer Gauge Based on Computer Vision

Figure 2 shows the procedure of the image processing used to recognize the pointer gauge value. First, to improve the image quality, the gauge image has to be preprocessed by means of noise suppression and image enhancement. Next, image processing and analysis methods are used to detect the pointer and scale marks. According to the pointer direction and the location of the scale marks, the value indicated by the pointer gauge can be detected. Finally, the indicating value results are outputted. Therefore, the automatic recognition methods of the gauge indicating value mainly include three parts:
(1) image preprocessing, aimed at improving image quality and suppressing noise,
(2) gauge pointer and scale mark detection based on image processing and analysis,
(3) automatic recognition of the pointer gauge indicating value.

Figure 2: The image processing procedure of the automatic recognition of the indicating value of the pointer gauge.
2.3. Related Work of the Pointer Gauge Indicating Value Detection

In the existing literature, the pointer gauge image is first preprocessed to remove noise or enhance the image, especially for images captured in industrial fields. Image preprocessing improves the image quality. Thus, it becomes easier to detect the features relating to the pointer and scale marks by means of image analysis.

Image preprocessing of the pointer gauge mainly comprises two parts. First, image preprocessing is used to filter out interference and noise that result from the change of environment. For example, in [6], the homomorphic filtering method was adopted to process the gauge image that has a large area of glare and reflection. Second, image preprocessing is used to correct an image distortion caused by camera imaging or different shooting angles. For instance, [7] discussed how to reduce the reading error when the gauge dial was uneven with the camera lens surface. The papers [8, 9] proposed an intelligent method of gauge value reading to avoid the influence of imaging distortion; this method was able to correct the angle between the pointer and the zero-scale marks and the angle between the zero-scale marks and the other longer scale marks. It also adopted the Newton interpolation method to approximate the functional relationships between the pointer angle and its indicating value.

After image preprocessing, image segmentation should be implemented to obtain a binarized image, for which the threshold-segmentation method is usually applied [10–13]. In the following step, image thinning needs to be conducted on the binarized image to achieve a one-pixel-wide pointer image. Then, straight-line fitting is used to detect the pointer. So the key point of the automatic recognition of the gauge indicating value is the detection of the pointer and scale marks. In the existing literature, several methods for pointer detection were proposed, such as the subtraction method, the template matching method, the Hough transform method, and the central projection method. In the subtraction method [4, 14, 15], two gauge images with roughly the same background were taken by camera while the pointer pointed toward different scales. When one image was subtracted from the other, only the two pointers were left in the difference image. Next, the difference image was segmented to detect the pointer. Finally, the gauge's indicating value could be obtained by calculating the position of the pointer and scale marks. The subtraction method was easily implemented by means of image subtraction. However, the difference image not only preserves the pointer target but also contains a lot of noise due to the influence of illumination. Therefore, in many cases, the subtraction method does not produce reliable results.

In the template feature method [16], the features that represent the pointer and its direction, such as shape, area, and gray distribution, formed a multiple parameters template used to match the pointer in the gauge image for pointer detection. Since the template comprises multiple cues of the pointer and its direction, the template feature method could avoid the influence of an uneven illumination and other environmental changing factors. However, the process of a template match is too complicated to accurately detect the pointer.

The Hough transform method has often been used in pointer detection. In [17, 18], after edge detection and image thinning in the gauge image, the Hough transform method or its improved version was applied to detect the pointer by fitting the pointer edge. The Hough transform method could achieve good results for the pointer gauge that has many words and symbols on the dial. In [19, 20], a novel pointer detection method, called central projection, was described. The main idea of the central projection method was to search for the pointer location by means of calculating the projection from the feature points to the known central point. Every projection value corresponded to the angle of the straight line that connected the projection and central points. There must be an angle that corresponds to the largest number of projection points and represents the pointer. Hence, the central projection method was always used to determine the location of the gauge pointer.

The least squares method [21] was also a straight-line fitting method, just like the Hough transform method, which was used to detect the pointer. Compared with the Hough transform method, the least squares method required fewer calculations.

After the pointer was detected, angle and distance methods [22] were always used to determine the indicating value of the pointer gauge. The angle method recognized the indicating value by calculating the deflection angle between the detected pointer and the zero-scale mark, according to the formula v = (α / θ) · R. Here, R was the range, θ was the premeasured angle between the maximum- and zero-scale mark, and α was the angle between the current pointer and the zero-scale mark. The distance method [23] obtained the indicating value by calculating the distance between the pointer and the nearest scale mark. In this method, each scale reticule was located and labeled with the corresponding scale number in a two-dimensional coordinate system. If the distance between a reticule point (xi, yi) and the pointer line L: A·x + B·y + C = 0 is

di = |A·xi + B·yi + C| / sqrt(A² + B²),

then the indicating value of the pointer is

v = v1 + d1 / (d1 + d2) · (v2 − v1).

Here, (xi, yi) is the coordinate of a point on the scale marks, di is the distance between the point and the line L, v represents the indicating value of the pointer, v1 and v2 are the values of the scale marks which are located on both sides of the pointer, and d1 and d2 are the distances between the pointer and the nearest scale marks on both sides, respectively. The angle method is simpler than the distance method, but it is unsuitable for a pointer gauge whose scale is uneven.
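The two classical reading rules above can be sketched in a few lines (a minimal sketch; the function and variable names are ours, not those of the cited papers):

```python
# Angle method: reading proportional to the pointer's deflection
# from the zero-scale mark.
def angle_method(alpha, theta, gauge_range):
    """alpha: angle between pointer and zero-scale mark;
    theta: premeasured angle between maximum and zero marks;
    gauge_range: full-scale value R."""
    return (alpha / theta) * gauge_range

# Distance method: interpolate between the scale marks on both
# sides of the pointer, weighted by the distances to them.
def distance_method(v1, v2, d1, d2):
    """v1, v2: values of the marks on either side of the pointer;
    d1, d2: distances from the pointer line to those marks."""
    return v1 + d1 / (d1 + d2) * (v2 - v1)
```

For example, a pointer one third of the way around a 270° sweep on a 0–3 MPa gauge gives angle_method(90, 270, 3.0) = 1.0 MPa.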

3. Overview of Our Work

As mentioned above, the current methods for gauge reading recognition are limited in terms of their practical application, mainly due to two deficiencies. First, the existing methods are all implemented on the global gauge image; therefore, the pointer and scale marks cannot be accurately segmented when there is interference or complex writing on the gauge dial. Second, in many studies the scale marks are not precisely detected; hence, the indicating value of a pointer gauge with an uneven scale cannot be recognized.

This study aims to overcome the shortcomings of the existing methods. Within its application context, it also aims to improve the calibration of pressure gauges and to develop an automatic recognition method of pointer gauge readings based on machine vision. This method implements a coarse-to-fine scheme for the detection of the pointer and scale marks. First, the pointer gauge's dial region is located in the global image. Then, the dial's annular scale region is located. In the following steps, the scale marks are detected in the annular scale region and the pointer is detected in the dial region. Finally, based on the results of the pointer and scale mark detection, the improved distance method is used to recognize the gauge's value. Since the detection proceeds from the coarse region to the fine target, the method has a strong ability to suppress interference. Since the indicating value is obtained from the detected pointer and scale marks, the method is also suitable for reading gauges whose scale marks are unevenly distributed.

3.1. Machine Vision Based Pressure Gauge Verification System

Metrological departments usually calibrate pressure gauges with piston manometers. In this case, the standard pressure output of the piston manometer, given by manually added weights, is passed on to the tested pressure gauge. Then, the qualification of the pressure gauge is determined through reading, analysis, and comparison of the indicating values. In general, verification personnel handle this process manually, observing, recording, and analyzing the indicating value on the gauge panel. Many problems still exist with this verification method, including the low accuracy and repeatability of human visual observation and poor work efficiency due to the large workload of verification personnel. Hence, the development of an automatic verification method for the pressure pointer gauge will boost the advancement of the technology used in production and verification. The system composition for pointer pressure gauge verification is shown in Figure 3. Images of the pointer gauges are obtained through a camera and processed and analyzed by a computer, wherein the indicating values of the standard pressure gauge and the test gauge are detected and recognized. The final results of the gauge reading are displayed on a monitor, and a verification conclusion is reached based on the analysis of the readings. The detection software also provides functions such as the storage of historical data, graphical display, and image switching to realize the automatic verification of pointer gauges.

Figure 3: Schematic of a machine vision based pressure gauge verification system.
3.2. Introduction of Our Work

In this study, a new automatic recognition method of the pointer gauge indicating value is proposed. Figure 4 shows a block diagram of this method.

Figure 4: The flow chart of the pointer gauge indicating value recognition.

In general, the computer vision based automatic recognition method of the indicating value comprises two parts. One part is the detection of the indicating value of the pointer gauge. This mainly includes the scale detection and the pointer detection, which refer to the location of the dial region and the dial center, the determination of the annular scale region in the dial, the scale detection, and the pointer detection. The other is the automatic recognition of the reading. Namely, it makes the final judgment of the gauge reading according to the detected direction of the pointer and the distribution of the scale marks. As seen in Figure 4, the automatic recognition of the indicating value proposed in this paper falls into five steps.
(1) The dial region and its center location: first, in the entire image, the dial region is determined semi-interactively (using the mouse to drag a box in the image). The pixel point with the largest gray value is searched out in the dial region and taken as the seed point of the region growing method used to produce the circular region in the dial. In the process of region growing, the similarity threshold value between the pixel points is automatically determined. Finally, according to the results of the region growing, the radius and center of the circular dial are obtained.
(2) Determination of the scale region in the dial: the image coordinates are first converted into polar coordinates with the dial center as the origin, which are used to express the coordinates of each pixel in the image. Then, by comparing the number of low gray value pixels with that of high gray value pixels in the annular region, the annular scale region in the dial is determined. Finally, it is copied into a new image as the annular scale region.
(3) Scale mark detection using the central projection method: in the newly generated annular scale region image, the scale marks are first segmented by the Otsu method to form the binarized scale image. Then, the angle of each line that links a black pixel with the circle center is calculated, and the occurrence count of each angle is obtained. The angles with large counts (of connecting lines) in a narrow angle range correspond to the thick scale marks (main scales) on the dial. Thus, the angles of the scale marks in the scope of 0°–360° are obtained.
(4) Pointer detection: first, the Canny operator is used to implement the border detection. Then, the Hough transform method is used to fit the detected edges and obtain the pointer contour. Finally, the pointer direction is obtained according to the center position and the pointer contour.
(5) The indicating value is calculated according to the pointer direction and the dial scale distribution.

4. Indicating Value Recognition Approach of Pointer Gauge

4.1. Pointer Gauge Indicating Value Detection

As mentioned above, the detection of the indicating value is achieved through the scale and pointer detection. As for the scale detection, first, the gauge center and dial region should be located, followed by the extraction of the annular scale region and the scale mark detection. Finally, the detection of the pointer direction is performed.

4.1.1. Dial Region and Its Center Location

As for the detection of the indicating value, first the gauge pointer and its direction should be detected, which requires determining its rotation center. In general, the gauge's dial center is the pointer's rotation center. Hence, the dial center should be detected first. In this study, the circular dial region is detected to determine the dial center. As shown in Figure 5(a), there are typically labels, words, and scale marks in the circular dial. Therefore, it is difficult to delimit the circular region of the dial by segmentation methods such as border detection or thresholding. In view of the above, this paper introduces the adaptive-region-growing method to locate the gauge's circular region and, further, to obtain its dial center. The specific steps are as follows.
(1) The placement of the seed [24]: since the dial is usually white, the region with a high gray level, or the pixel point with the highest gray level in the gauge image, is certainly located in the dial. Therefore, the latter is selected as the seed point of the region growing.
(2) Subject to the similarity of the pixel gray levels, the dial region is expanded from the seed point. However, it is hard to fix an appropriate similarity threshold value of the gray level between pixels. As shown in Figure 5(b), the circular dial region cannot be completely produced with a small threshold value, while an oversized dial region is generated with a large threshold value.

Figure 5: The diagram of the dial region growing results.

This study proposes an adaptive method to determine the similarity threshold value of the gray level; the concrete process is as follows.
(A) A small similarity threshold value T0 of the gray level is empirically chosen, and the dial region growing is conducted from the seed point.
(B) The ratio of the length and width of the circumscribed rectangle of the newly generated dial region is calculated to check whether it lies in the 0.95–1.05 range.
(C) If the ratio of the length and width fails to meet the condition in (B), the similarity threshold value is increased by a certain step length ΔT, and step (A) is repeated to conduct the region growing from the seed point. If it meets the condition in (B), the grown region is taken as the circular dial region, the center of the circumscribed rectangle is taken as the circle center, and the smaller of the length and width is taken as the diameter of the circular region (Figure 6).
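The adaptive-region-growing loop above can be sketched as follows (a sketch only; the 4-connectivity, the brightest-pixel seed choice, and the concrete values of the initial threshold and step length are our own choices):

```python
import numpy as np
from collections import deque

def region_grow(img, seed, thresh):
    """4-connected region growing: accept neighbours whose gray value
    differs from the seed value by at most `thresh`."""
    h, w = img.shape
    seed_val = int(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(img[ny, nx]) - seed_val) <= thresh):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

def locate_dial(img, t0=5, step=5, t_max=100):
    """Adaptive threshold selection: grow from the brightest pixel and
    enlarge the similarity threshold until the bounding box of the
    grown region is roughly square (ratio in 0.95-1.05), as in steps
    (A)-(C); returns (cx, cy, diameter) or None."""
    seed = np.unravel_index(np.argmax(img), img.shape)
    t = t0
    while t <= t_max:
        mask = region_grow(img, seed, t)
        ys, xs = np.nonzero(mask)
        h_box = ys.max() - ys.min() + 1
        w_box = xs.max() - xs.min() + 1
        if 0.95 <= h_box / w_box <= 1.05:
            cy = (ys.max() + ys.min()) / 2.0
            cx = (xs.max() + xs.min()) / 2.0
            return cx, cy, int(min(h_box, w_box))
        t += step
    return None
```

On a synthetic image containing a single bright disk, the loop terminates at the first threshold with the disk's center and diameter.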

Figure 6: The region growing results under different pixel similarity threshold values.
4.1.2. Polar Coordinate Based Scale Region Location

(1) Image in Polar Coordinates. In general, the pointer gauge scale marks are arranged circularly around the center. Therefore, to correctly and conveniently show where the scale marks in the image are, the polar coordinate system is utilized to position the pixels in the image. In this study, a conversion from image coordinates into polar coordinates is performed. This means that the pixel coordinates expressed in the rectangular image coordinate system are converted into those expressed in polar coordinates. The polar coordinates defined in this study are shown in Figure 7, the circle center being the origin.

Figure 7: Schematic of the polar coordinate.

(2) Detection of the Scale Region in the Dial. The procedure for scale region detection based on the adaptive threshold method is as follows.
(A) The circular dial radius detected in Section 4.1.1 is taken as the initial outer radius r0, with a decreasing step length Δr and an initial inner radius r1 = r0 − Δr. In the gauge dial, the gray value difference between the scale marks and the dial background is generally rather large. Therefore, the adaptive threshold method can be adopted to segment the scale. In the polar coordinate system, the annular binarization image is obtained by employing the Otsu method [25] to perform binarization segmentation in the annular region.
(B) The ratio between the numbers of black and white pixel points in the annular binarization region between the radii r1 and r0 is calculated and denoted by q1. After reducing the inner radius by the step length Δr, the ratio between the numbers of black and white pixel points in the region between r1 − Δr and r1 is computed and denoted by q2.
(C) If q2 is obviously different from q1, the inner radius and the outer radius of the scale region are taken as r1 − Δr and r0, respectively.
(D) If q2 is approximately equal to q1, the annulus between r1 − Δr and r1 is still in the same region. At this moment, let q1 = q2 and r1 = r1 − Δr; the ratio between the numbers of black and white pixel points in the annulus between r1 − Δr and r1 is computed and is again denoted by q2. Then, go to (B) (Figure 8).
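The inward ring scan can be sketched as follows (a simplified sketch: the fixed jump threshold standing in for the paper's "obviously different" ratio test is our own choice, and the scan simply reports the outer and inner radii of the dark scale band):

```python
import numpy as np

def find_scale_band(binary, center, r_start, dr=5, jump=0.2):
    """Scan annular rings inward from r_start in steps of dr.
    A ring whose black-pixel fraction rises by more than `jump`
    (our stand-in for the 'obviously different' test) marks the
    outer edge of the scale band; the fraction falling back marks
    the inner edge. Returns (inner, outer) radii."""
    h, w = binary.shape
    yy, xx = np.mgrid[0:h, 0:w]
    rad = np.hypot(yy - center[1], xx - center[0])

    def black_frac(r_in, r_out):
        ring = binary[(rad >= r_in) & (rad < r_out)]
        return np.count_nonzero(ring == 0) / max(ring.size, 1)

    inner = outer = None
    prev = black_frac(r_start - dr, r_start)
    r = r_start - dr
    while r - dr > 0:
        cur = black_frac(r - dr, r)
        if outer is None and cur - prev > jump:
            outer = r              # entered the scale-mark band
        elif outer is not None and prev - cur > jump:
            inner = r              # left the band on its inner side
            break
        prev = cur
        r -= dr
    return inner, outer
```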

Figure 8: Schematic of scale mark region detection.

After obtaining the annular scale region in the above polar coordinate, the scale region is selected on the basis of angle and radius and copied into a new image, called the scale mark region image (Figure 9).

Figure 9: The scale mark region diagram and its binarized image.
4.1.3. Scale Mark Detection Based on the Improved Central Projection Method

In the newly generated image, which contains the annular scale region, the Otsu method is adopted to segment the scale marks to form the binarized scale image. In this image, the pixel value of the scale marks is 0 (black) and that of all other places is 255 (white). Each black pixel point is linked with the circle center, and the slope k of the connecting line is calculated, where k = tan θ and θ is the angle between the x-axis and the connecting line between the black pixel point and the circle center. Because the origin of the image coordinate system is in the top left corner and its y-axis runs downward, it is inconvenient to denote the central angle within the scope of 0°–360°. Therefore, in this study, the rectangular coordinate system with the image center as its origin is introduced, instead of the image coordinate system, to express the angle between the black pixel point on the scale mark and the circle center. As shown in Figure 10, the angles between the black pixel points and the circle center are in the 0°–360° range.

Figure 10: Schematic of the image coordinate system and the rectangular coordinate system.

After counting the occurrences of each angle θ, the angles with large counts (of connecting lines) within a small angle range are identified as those of the thick scale marks (main scales) on the dial. The steps to detect the scale marks by central projection are detailed as follows.

The black pixel point (x, y) and the image central point (x0, y0), that is, the circle center in the new image, are linked to determine the slope k = (y − y0) / (x − x0), which is converted to an angle according to the following formula:

θ = arctan(k).

As shown in Figure 10, the image coordinate system is transformed to a rectangular one whose origin is the circle center. The angle θ is converted into an angle φ in the 0°–360° range of the rectangular coordinate system according to the quadrant in which the black pixel lies relative to the center:

φ = θ (first quadrant), 180° − θ (second quadrant), 180° + θ (third quadrant), 360° − θ (fourth quadrant),

where θ is taken as the acute reference angle |arctan k|.
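In code, the quadrant case analysis can be folded into a single atan2 call (a minimal sketch; atan2 replaces the explicit per-quadrant conversion, which is equivalent):

```python
import math

def central_angle(px, py, cx, cy):
    """Angle (0-360 deg) of the ray from the centre (cx, cy) to the
    pixel (px, py), measured counter-clockwise from the +x axis.
    The image y axis points downward, so dy is negated."""
    ang = math.degrees(math.atan2(-(py - cy), px - cx))
    return ang % 360.0
```

For example, a pixel directly above the center maps to 90° and one directly below it to 270°, matching the rectangular coordinate system of Figure 10.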

An angle histogram is generated by recording the occurrence frequency of each angle. Figure 11 shows the angle histogram.

Figure 11: The angle histogram.

The angle with the largest occurrence count in the histogram is selected and denoted by θm; all the angles whose occurrence counts exceed a fixed proportion of that of θm are found. The occurrence counts of these angles are denoted by N(θi). Here, θi (i = 1, 2, …) represents the angles whose counts exceed the threshold.

For every angle θi, we calculate the weighted average value according to the following formula:

θ̄i = Σ θ·N(θ) / Σ N(θ),

where the sums run over the small angle range around θi. Here, i is the serial number of the angle.

The ratio ρi is calculated according to the following formula:

ρi = N(θi) / N̄,

where N̄ is the mean occurrence count over all the candidate angles.

If ρi is within the 0.9–1.1 range, θ̄i is taken as the angle of a thick scale mark. All the angles sorted out in this way, taken successively as θ1, θ2, …, correspond to the thick scale marks. Therefore, the thick scale marks are located, as shown in Figure 12.
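The central projection steps above can be sketched end to end (a sketch under our own assumptions: the candidate-count fraction, the bin-merging gap, and the 0.9–1.1 tolerance are parameterized, and NumPy's histogram stands in for the angle counting):

```python
import numpy as np

def thick_mark_angles(binary, center, frac=0.5, tol=0.1):
    """Locate the thick scale marks in a binarized annular scale image
    (marks = 0, background = 255) by central projection.
    frac: keep angle bins whose count exceeds frac * peak count.
    tol:  keep merged marks whose total count is within tol of the
          mean candidate count (the 0.9-1.1 test above)."""
    cx, cy = center
    ys, xs = np.nonzero(binary == 0)
    # Angle of each black pixel about the center, 0-360 deg,
    # with the image y axis (downward) negated.
    ang = np.degrees(np.arctan2(-(ys - cy), xs - cx)) % 360.0
    hist, _ = np.histogram(ang, bins=360, range=(0.0, 360.0))
    cand = np.nonzero(hist > frac * hist.max())[0]
    if cand.size == 0:
        return []
    # Merge adjacent candidate bins into single marks, then take the
    # count-weighted mean angle of each merged group.
    groups, group = [], [cand[0]]
    for b in cand[1:]:
        if b - group[-1] <= 2:
            group.append(b)
        else:
            groups.append(group)
            group = [b]
    groups.append(group)
    marks = [float(np.average(np.array(g) + 0.5, weights=hist[g]))
             for g in groups]
    counts = np.array([hist[g].sum() for g in groups], dtype=float)
    keep = np.abs(counts / counts.mean() - 1.0) <= tol
    return [m for m, k in zip(marks, keep) if k]
```

On a synthetic dial with four equally thick radial marks, the function returns four angles close to the drawn directions.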

Figure 12: The location of the thick scale marks.
4.1.4. Improved Hough Transform-Based Pointer Detection

The edge detection is implemented in the pointer gauge image, using the Canny operator to obtain the edge image of the gauge. Then, the Hough transform method is adopted to fit the straight line in the image to obtain all the straight lines among which the pointer border is included. Finally, only the pointer border can be determined by excluding the other straight edges.

The Hough transform method is adopted to fit the straight lines and find all the straight lines in the image. In addition, the parameters of each straight line Li, described in the image coordinate system by y = ki·x + bi, are calculated.

The distance di from the circle center O(x0, y0) to the straight line Li is calculated according to the following formula:

di = |ki·x0 − y0 + bi| / sqrt(ki² + 1).

Based on prior knowledge, we know that the distance from the circle center to the straight lines of the pointer border is much less than the diameter of the dial; all the straight lines whose distance di is far smaller than D are therefore picked out. Here, D is the diameter of the dial.

The angle θi of each straight line Li among those picked out is calculated according to the following formula and stored into an array:

θi = arctan(ki).

As the included angle between the two border lines of the pointer is an acute one, generally within a small range, the angle between any two of the picked-out lines is calculated subject to the following formula:

Δθij = |θi − θj|.

If Δθij is smaller than a preset small threshold, the straight lines Li and Lj are selected as the two sides of the pointer. Therefore, both lines on which the two sides of the pointer lie can be found.

According to the straight lines of the two sides, the pointer direction can be determined.
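The selection of the pointer's two border lines can be sketched as follows (a sketch: the distance threshold of one tenth of the diameter and the 10° included-angle bound are our own choices for the thresholds left unspecified above, and lines are assumed non-vertical in the y = k·x + b form):

```python
import math

def line_point_dist(k, b, x0, y0):
    """Distance from (x0, y0) to the line y = k*x + b."""
    return abs(k * x0 - y0 + b) / math.hypot(k, 1.0)

def find_pointer(lines, center, diameter, max_angle=10.0):
    """From Hough lines given as (k, b) pairs, keep those passing near
    the center and return the first pair whose included angle is small;
    such a pair bounds the pointer."""
    x0, y0 = center
    near = [(k, b) for k, b in lines
            if line_point_dist(k, b, x0, y0) < diameter / 10.0]
    for i in range(len(near)):
        for j in range(i + 1, len(near)):
            ti = math.degrees(math.atan(near[i][0]))
            tj = math.degrees(math.atan(near[j][0]))
            if abs(ti - tj) < max_angle:
                return near[i], near[j]
    return None
```

Lines that pass far from the center (dial text, scale marks) are rejected by the distance test before the angle pairing is attempted.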

4.2. The Pointer Gauge Indicating Value Recognition

As shown in Figure 13, the intersection point P of the two straight lines of the pointer border is calculated and then linked to the circle center O. The angle of the line PO is calculated. In accordance with the position of the intersection point P, the angle θp in the rectangular coordinate system is computed using the same quadrant conversion as in Section 4.1.3.

Figure 13: Schematic of pointer fitting and its direction.

According to the angles θ1, θ2, …, the readings v1, v2, … corresponding to the thick scale marks, which are detected in Section 4.1.3, are obtained. The pointer reading v is then determined by dividing the scale between adjacent thick scale marks by linear interpolation:

v = vi + (θp − θi) / (θi+1 − θi) · (vi+1 − vi).

Here, θi and θi+1 are the angles that correspond to the thick scale marks on both sides of the pointer.
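The final interpolation step can be sketched as follows (a minimal sketch; it assumes the thick-mark angles are given in monotonic order along the scale together with their known values):

```python
def gauge_reading(theta_p, mark_angles, mark_values):
    """Linear interpolation between the two thick scale marks whose
    angles bracket the pointer angle theta_p.
    mark_angles: angles of the thick marks, in scale order.
    mark_values: the readings associated with those marks."""
    for i in range(len(mark_angles) - 1):
        a0, a1 = mark_angles[i], mark_angles[i + 1]
        if min(a0, a1) <= theta_p <= max(a0, a1):
            t = (theta_p - a0) / (a1 - a0)
            return mark_values[i] + t * (mark_values[i + 1] - mark_values[i])
    raise ValueError("pointer angle outside the scale range")
```

With thick marks every 45° valued 0, 1, 2, 3, a pointer at 67.5° reads 1.5; because the interpolation is piecewise between detected marks, unevenly spaced scales are handled without change.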

5. Experimental Results and Analysis

To verify the effectiveness of the proposed approach, a series of experiments are conducted to test its performance.

5.1. Experiment Design

The experiments include two parts. First, images of different kinds of pointer gauges are selected to conduct simulation experiments and verify the effectiveness and applicability of the approach. Then, a pressure gauge calibration system, which is composed of a float gauge, a high-precision pressure gauge, a test pressure gauge, a camera, and a computer, is established to verify the actual performance of this approach in pointer gauge calibration.

5.2. Part One of the Experiments: Reading Identification of the Various Pointer Gauges
5.2.1. Samples Selection

As can be seen in Table 1, two groups of representative samples are selected for the experiment. They include the pressure gauge with a lot of writing on the dial and the flow gauge with an uneven scale.

Table 1: Experimental samples.
5.2.2. Experimental Results and Analysis

In this part of the experiment, we apply the proposed approach to the automatic recognition of the indicating values of the two types of gauges mentioned above. To verify the effectiveness of the proposed approach, Figures 14 and 15 illustrate the experimental results of each step.

Figure 14: The detection results of the pointer gauge with a lot of writing on the dial.
Figure 15: The detection results of the pointer gauge with an uneven scale.

Figure 14(a) is the original pointer gauge image. From Figure 14(a), we can see that the pointer and scale marks cannot be directly detected by segmentation methods, since there is a lot of writing on the dial. Therefore, the proposed approach adopts a coarse-to-fine search strategy. The dial and scale regions are located first, since they are more easily detected than the pointer and scale marks. As shown in Figures 14(c) and 14(d), the dial region is located using the adaptive-region-growing method, and the scale region is detected under the polar coordinate system. Then, in both regions, the pointer and scale marks are accurately detected. As shown in Figures 14(e) and 14(f), the improved Hough transform method is adopted to detect the pointer, and the improved central projection method is used to detect the scale marks. The location of the thick scale marks is determined according to the angle histogram generated from the central projection of the scale marks, and their indicating values are assigned according to this location. The scale values between any two adjacent thick marks are determined by linear division. According to the direction of the pointer and the location of the scale marks, the angle method, applied between the adjacent thick scale marks, is used to determine the pointer gauge value.

The proposed approach detects the pointer direction and the scale mark locations separately. In Figures 15(e) and 15(f), it can be seen that the scale mark locations are precisely obtained from the angle histogram and that the pointer is detected in the dial region. Based on the scale mark locations and the pointer direction, the distance between the pointer and the adjacent scale marks is calculated using the distance method, and the indicating value is then estimated. Therefore, whether or not the scale is evenly distributed, the proposed approach can precisely detect the gauge's indicating value, whereas most existing methods cannot.
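The final reading step can be sketched as a linear angular interpolation between the two scale marks adjacent to the pointer. This is an illustrative stand-in for the angle/distance methods described above, not the paper's exact formulation; it assumes the pointer angle and the sorted mark angles (with their known values) have already been detected, so uneven scales are handled by construction:

```python
import bisect

def gauge_reading(pointer_angle, mark_angles, mark_values):
    """Estimate the indicating value by linear interpolation between
    the two scale marks adjacent to the pointer. mark_angles must be
    sorted ascending; mark_values gives the value at each mark, so
    unevenly distributed scales work without any change."""
    if not (mark_angles[0] <= pointer_angle <= mark_angles[-1]):
        raise ValueError("pointer outside the scale range")
    i = bisect.bisect_right(mark_angles, pointer_angle) - 1
    i = min(i, len(mark_angles) - 2)          # clamp at the last interval
    a0, a1 = mark_angles[i], mark_angles[i + 1]
    v0, v1 = mark_values[i], mark_values[i + 1]
    t = (pointer_angle - a0) / (a1 - a0)      # fractional position in the interval
    return v0 + t * (v1 - v0)
```

For example, with marks at 0°, 30°, and 90° carrying values 0, 0.1, and 0.25 MPa (an uneven scale), a pointer at 60° interpolates halfway through the second interval.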

5.3. Part Two of the Experiments: The Recognition of the Float Pointer Gauge Indicating Value
5.3.1. The System and Procedure of Experiments

To verify the performance of the proposed approach in practical application, a float gauge is used as the pressure gauge verification system. The float gauge is of type Y-047, with the following main technical parameters: an output pressure range of 0.01–0.25 MPa, an accuracy class of 0.05%, a rated pressure of 0.5 MPa, and standard weights from 0.01 MPa to 0.25 MPa. A high-precision gauge with an accuracy class of 1.5 is used as the standard pressure gauge, and the test gauge is a gauge of general accuracy. A visual detection system is established in front of the float gauge. An optical lens with a focal length adjustable from 10 to 20 mm and a 5-megapixel HD CMOS camera with a 60 fps frame rate are used as the image sensor. The system is equipped with a coaxial light source mounted on the camera lens. The computer used in this experiment has an Intel Pentium(R) Dual-Core CPU E2200 running at 2.2 GHz, and the programming environment is VS2010. The computer-vision-based verification system of the pointer gauge is shown in Figure 16. The camera takes images of both the high-precision standard gauge and the test gauge, and the proposed approach is adopted to detect the indicating values of both.

Figure 16: The experimental system of gauge test.

According to the gauge calibration regulations, the experimental procedure is as follows. First, a positive stroke is conducted: the float gauge's output is increased by gradually loading the weights, so that the gauge's indicating value increases accordingly. At each test point, the camera takes images of both the standard gauge and the test gauge, and the proposed approach uses these images to recognize the indicating values. Then, a negative stroke is conducted: the float gauge's output is reduced from the highest test point by gradually unloading the weights, so that the indicating value decreases accordingly. An image is taken at each test point to recognize the indicating values.
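The positive/negative stroke procedure can be sketched as a simple loop over the test points. The two reader callables here are hypothetical stand-ins for image capture plus the vision-based reading of each gauge; the tolerance parameter is likewise illustrative:

```python
def run_verification(test_points, read_standard, read_test, tolerance):
    """Sketch of the verification procedure: visit each test point in a
    positive stroke (increasing pressure) and then a negative stroke
    (decreasing pressure), read both gauges at every point, and record
    the indication error of the test gauge against the standard gauge."""
    errors = []
    for point in list(test_points) + list(reversed(test_points)):
        std = read_standard(point)                 # standard gauge reading
        tst = read_test(point)                     # test gauge reading
        errors.append((point, tst - std, abs(tst - std) <= tolerance))
    return errors
```

A real run would drive the weight loading between readings; here the strokes are represented only by the forward and reversed traversal of the test points.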

5.3.2. Experimental Results and Analysis

The experimental results are shown in Figures 17 and 18. The gauge's upper limit is 0.25 MPa and its lower limit is 0 MPa. In both the positive and the negative stroke, the float gauge output is changed in intervals of 0.025 MPa. The detection results for the indicating values of the standard and test gauges are shown in Tables 2 and 3, respectively.

Table 2: The standard gauge detection results.
Table 3: The test gauge detection results.
Figure 17: The standard gauge results.
Figure 18: The test gauge results.

As can be seen from the experimental results, the proposed approach is applicable to pressure gauge verification. The indicating values of the test and standard gauges can be obtained by the proposed approach instead of through manual observation, and the test gauge is calibrated by comparing its indicating value with that of the standard gauge. From the experimental results in Tables 2, 3, and 4, which compare the visual data with manual readings, we can see that the computer-vision-based detection of the indicating value is more accurate and valid, provides more significant digits, and has a faster detection speed.

Table 4: Comparisons between manual reading and visual detection.
5.4. Experimental Conclusion

From the above experimental results, we can draw the following conclusions.

(1) The indicating value of a pointer gauge can be effectively detected using the method proposed in this paper, and the approach can be used for pressure gauge verification.

(2) The approach provides a stable framework for the reading identification of pointer gauges. Proceeding from large region to small target, it first locates the scale and pointer regions, then precisely segments the pointer and scale marks within the target area, and finally obtains the indicating value of the gauge from the direction of the pointer and the distribution of the scale marks. This framework is a general computer-vision-based recognition method for pointer gauges and has good interference immunity. However, to achieve more satisfactory results in practical application, each step has to be improved; for example, when locating the dial region with the region-growing method, many factors need to be considered to decide when the growing should be terminated.

(3) In practical application, especially in some industrial field applications, image preprocessing should be used together with the proposed method to obtain more satisfactory results.

6. Conclusion

This study proposes a computer-vision-based automatic reading approach for pointer gauges, which aims at overcoming the defects of current automatic reading approaches and at the application of pressure gauge verification. Following the framework of the proposed approach, which proceeds from large-region segmentation to small-target detection, the target region in the dial is located first, the pointer and scale marks are then detected within this region, and the indicating value of the gauge is finally obtained. As such, the proposed approach has good interference immunity and is applicable to gauges with either evenly or unevenly distributed scale marks. It can be widely applied to the reading identification of different types of pointer gauges.

The proposed method is robust and precise enough to be used in pointer gauge verification. However, in some practical applications, especially in the industrial field, it should be combined with image preprocessing methods to achieve better results, and the improved approach can also be applied to pointer gauges with noncircular dials. Therefore, in the future we will conduct the following studies to complete the digitization of pointer gauges and to develop a general indicating value reading method. Image preprocessing methods will be developed to eliminate the influence of stray or reflected light and to correct the image distortion caused by changes in the relative position between the camera and the gauge. A general automatic reading method will be developed for different types of pointer gauges, one that can run on various hardware systems and produce a unified standard output format.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the Beijing Key Discipline Development Program (no. XK100080537), the Beijing Natural Science Foundation (4122050), and the Opening Project of the Key Laboratory of Operation Safety Technology on Transport Vehicles, Ministry of Transport, China.
