Journal of Sensors
Volume 2018, Article ID 1037083, 9 pages
https://doi.org/10.1155/2018/1037083
Research Article

Measuring the Angular Velocity of a Propeller with Video Camera Using Electronic Rolling Shutter

School of Mechanical Engineering, Hebei University of Technology, Tianjin 300130, China

Correspondence should be addressed to Tiejun Li; li_tiejun@hebut.edu.cn

Received 7 October 2017; Accepted 12 February 2018; Published 21 March 2018

Academic Editor: Stephane Evoy

Copyright © 2018 Yipeng Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Noncontact measurement of rotational motion has advantages over traditional methods, which measure rotation by installing devices such as rotary encoders on the object. Cameras can be employed as remote monitoring or inspection sensors to measure the angular velocity of a propeller because of their widespread availability, simplicity, and potentially low cost. A drawback of camera-based measurement is the massive amount of data the camera generates. To reduce the amount of collected data, a camera using an electronic rolling shutter (ERS) is applied to measure angular velocities that are higher than the speed of the camera. The rolling shutter induces geometric distortion in the image when the propeller rotates while the image is being captured. To reveal the relationship between the angular velocity and the image distortion, a rotation model is established. The proposed method was applied to measure the angular velocities of a two-blade propeller and a multiblade propeller. The experimental results showed that this method could detect angular velocities higher than the camera speed with acceptable accuracy.

1. Introduction

Rotation is one of the basic motions and is common in machines such as motors, gears, and wheels. Rotation must be kept under control to keep machines in good operation, and many mechanical failures are caused by rotary movement, so measuring angular velocity is important. Contact-type sensors are widely used for this purpose, such as mechanical tachometers, optical tachometers, photoelectric encoders, and optical encoders [1]. Because these methods usually rely on mechanical contact, they can disturb the rotation of the target, especially when the target's inertia is small. Over the past twenty years, noncontact methods have been developed based on tomography, ultrasound, lasers, and computer vision [2]. These advanced sensors overcome the defects of contact-type sensors, and computer vision in particular could see wider use than the other noncontact approaches.

Over the past decade, several researchers have focused on angular measurement based on computer vision. Wang et al. [3] measured motor angular velocities from blurred images containing motion information; angular velocities could be extracted from those motion-blurred images in polar coordinates. Ait-Aider et al. [4, 5] obtained object pose and velocity with an electronic rolling shutter (ERS) camera; their method was based on the observation that lines that are straight in the real world appear warped in the image captured by the ERS camera. Magerand et al. [6] estimated object pose and motion from a single ERS image with automatic 2D-3D matching. He and Wei [7] measured a shaft's velocity using an ERS camera. Zhu and Yu [2] measured object angular velocities with the Hough transform.

Measuring angular velocity with computer vision also has drawbacks, such as sensitivity to environmental conditions and time-consuming processing. There are two ways to improve the processing speed. One is to reduce the resolution of the camera, but this runs counter to the development trend of cameras [8, 9]. The other is to reduce the speed of the camera [10], in other words, to measure a high-speed rotation with a low-speed camera. However, when the angular velocity is much higher than the camera's speed, it is difficult to determine the rotation angle because of ambiguity or distortion in the image. There are few studies on measuring high-speed rotation with a low-speed camera. The row speed of an ERS camera is much higher than its frame speed. This special property can help measure the high-speed rotation of objects such as wheels or propellers, which have symmetrical structures, are very common in machines, and are easy to analyze to extract angular velocities. Previous research did not attempt to measure high-speed rotation with an ERS camera [4–7].

In this paper, an ERS camera serves as a sensor to measure the angular velocity of a propeller that rotates faster than the speed of the camera. To measure the angular velocity, a simulation is created to generate the images produced when the propeller rotates at different speeds. Based on the geometric features in these images, an algorithm is proposed to calculate the angular velocity of the propeller. Experiments are also conducted to verify the proposed method. The rest of the paper is organized as follows. In Section 2, the working principle of ERS is presented, and an algorithm is proposed and verified in simulation. In Section 3, two experiments verify the proposed method in a real environment. Finally, Section 4 concludes the paper.

2. Method

In digital signal processing, the Nyquist–Shannon sampling theorem establishes a sufficient condition on the sample rate for a discrete sequence of samples to capture all the information in a continuous-time signal of finite bandwidth: the sample rate must be at least double the highest frequency to be sampled, and in practice it is usually four times that frequency. Measuring angular velocity with a camera therefore produces a massive amount of data to process. To decrease the data generated by the camera, a low-speed camera with ERS can be used. The ERS camera records images row by row, and this feature effectively increases the temporal sampling rate of the camera.
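The gap between these two sampling regimes can be sketched numerically. The 23.09 μs row delay is the value quoted later for the MT9J003 sensor; the 25 r/s rotation speed is an illustrative assumption in the range of the later experiments.

```python
# Frame rate a conventional camera would need (Nyquist, and the fourfold
# rule of thumb) versus the effective row rate of an ERS sensor.
rotation_speed = 25.0  # propeller speed in r/s (illustrative assumption)
blades = 2             # a two-blade propeller repeats its pattern twice per turn
pattern_freq = rotation_speed * blades   # 50 Hz
nyquist_fps = 2 * pattern_freq           # theoretical minimum: 100 fps
practical_fps = 4 * pattern_freq         # fourfold rule: 200 fps

row_delay = 23.09e-6        # seconds per row, MT9J003 ERS sensor
row_rate = 1.0 / row_delay  # rows sampled per second

print(nyquist_fps, practical_fps, round(row_rate))
```

The row rate (tens of thousands of rows per second) exceeds the required sample rate by orders of magnitude, which is what makes a 14 fps camera usable here.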

2.1. Working Principle of ERS

The ERS camera is a type of CMOS image sensor and is very common in cellphones. The amount of signal generated by the image sensor depends on the amount of light that falls on the pixels, that is, on the exposure.

Therefore, an on-chip electronic shutter is required to control the intensity and duration of exposure. CMOS shutters come in two types: global shutter and rolling shutter. With global shutter image sensors, every row of pixels in the image is exposed at the same time, so there are no motion artifacts in the resulting image. With rolling shutter image sensors, the rows of the image are exposed in sequence, starting at the top and proceeding row by row to the bottom. The integration time and row delay are fixed for every row, producing uniformly staggered exposure times across the frame. When the motion direction of the object is orthogonal to the row direction of the image, visual artifacts arise in the captured image. The working principle of ERS is demonstrated in Figure 1. Since the image sensor effectively integrates each row of the pixel array at a different point in time, the static lamppost, which is perpendicular to the expressway, appears inclined in Figure 1(b). Thanks to this characteristic of the ERS camera, a low-speed camera can record more information about a fast-moving object. Although some works have applied this property to the measurement of linear motion or vibration [7, 11], most works try to eliminate the effect from the image [12–14]. In this paper, this property is used to measure the angular velocity of a propeller.
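The staggered-exposure model above can be sketched in a few lines: each row r starts its exposure at t(r) = r · t_d, so a point moving horizontally at v pixels per second appears shifted by v · r · t_d in row r, which is what tilts the lamppost in Figure 1(b). The row delay is the value used in the Section 2.3 simulation; the pixel speed is an assumed illustrative number.

```python
# ERS timing model: a vertical edge moving horizontally is imaged as a
# slanted line because each row is exposed at a different time.
t_d = 996e-6  # row delay in seconds (value used in the paper's simulation)
v = 200.0     # apparent horizontal speed of the scene in px/s (assumption)

def row_shift(row):
    """Horizontal displacement (px) of the moving edge in a given row."""
    return v * row * t_d

# Over 1000 rows the edge drifts by v * 1000 * t_d pixels, so a straight
# vertical line appears slanted across the frame.
total_drift = row_shift(1000) - row_shift(0)
print(round(total_drift, 1))  # 199.2
```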

Figure 1: Working principle of ERS: (a) time sequence of row delay and integration and (b) lamppost shot by the ERS camera from a car moving at about 90 km/h.
2.2. Background Segmentation

Before the rotation velocity is calculated, the propeller must be segmented from the background. A Gaussian mixture model (GMM) [15] is used as an unsupervised image change detector to extract the propeller. Expectation maximization (EM) [16] is a popular technique for determining the parameters of a mixture with an a priori given number of components, and the EM algorithm implements maximum likelihood estimation of the GMM parameters. However, the M-step of the EM algorithm cannot evaluate the prior distribution in closed form because of the complexity of the maximum likelihood estimation. Therefore, in each EM iteration, a projection step [17] is applied after the M-step so that the prior probabilities remain positive and sum to one.
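The E-step/M-step loop can be sketched for a simple one-dimensional, two-component mixture of pixel intensities. This is generic textbook EM with the priors renormalized each iteration, not the spatially constrained model of [15, 17]; the intensity distributions are synthetic assumptions.

```python
import numpy as np

# Minimal EM for a two-component 1-D Gaussian mixture, the kind of model
# used to separate propeller pixels from background pixels.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.2, 0.05, 500),   # "background" intensities
                    rng.normal(0.8, 0.05, 500)])  # "propeller" intensities

pi = np.array([0.5, 0.5])   # prior probabilities of the two components
mu = np.array([0.3, 0.6])   # initial means
var = np.array([0.1, 0.1])  # initial variances

for _ in range(50):
    # E-step: posterior responsibility of each component for each pixel
    pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = pi * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters; renormalizing pi keeps the priors
    # positive and summing to one (the projection mentioned above)
    nk = resp.sum(axis=0)
    pi = nk / nk.sum()
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

labels = resp.argmax(axis=1)  # hard segmentation: component index per pixel
print(np.round(mu, 2))        # means converge near 0.2 and 0.8
```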

Segmentation by GMM leaves defects such as holes and noise in the image. Before segmentation, the images are denoised with the method proposed by Xu et al. [18]. The angular velocity calculation depends closely on the geometric characteristics of the propeller segmented from the background, and defects on the propeller would introduce errors into the result. Therefore, the holes are filled and small isolated areas are erased.
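The hole-filling step can be sketched with a border flood fill: any background region not connected to the image border must be a hole inside the segmented propeller. This is a generic morphological repair, not the paper's exact implementation; the 7×7 test mask is an illustrative assumption.

```python
import numpy as np
from collections import deque

def fill_holes(mask):
    """mask: 2-D bool array, True = propeller. Returns mask with holes filled."""
    h, w = mask.shape
    outside = np.zeros_like(mask)
    # Seed the flood fill with every background pixel on the image border.
    q = deque((r, c) for r in range(h) for c in range(w)
              if (r in (0, h - 1) or c in (0, w - 1)) and not mask[r, c])
    for r, c in q:
        outside[r, c] = True
    # BFS: mark all background pixels reachable from the border.
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc] and not outside[rr, cc]:
                outside[rr, cc] = True
                q.append((rr, cc))
    # Holes are background pixels NOT reachable from the border.
    return mask | ~outside

blob = np.zeros((7, 7), dtype=bool)
blob[1:6, 1:6] = True
blob[3, 3] = False                 # a one-pixel hole inside the blade
print(bool(fill_holes(blob)[3, 3]))  # the hole is filled
```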

2.3. Calculation of Angular Velocity

A propeller is selected as the measurand, as shown in Figure 2. This propeller has two isolated blades, and its initial angle is prescribed. To demonstrate the patterns recorded by the ERS camera, a simulation is illustrated in Figure 3 (in this paper, all angular velocities are given in r/s). The propeller is at the center of a single 1000 × 1000 image. The exposure sequence of the ERS camera runs from top to bottom. The row delay is 996 μs, and the exposure time of every row is 4996 μs. There is no frame delay, so the whole image time is one second. The propeller rotates clockwise at different speeds, as shown in Figure 3. The initial angle of the first-row images in Figure 3 is 0.25, and that of the second-row images is 0.5. Extracting the angular velocity from a single image involves two steps: searching for the center of the propeller and extracting the rotation angle. All the operations rest on the assumption that the two-blade propeller is centrally symmetric.

Figure 2: Two-blade propeller.
Figure 3: Simulation of a two-blade propeller: (a) 0.5 r/s, (b) 1 r/s, and (c) 2 r/s; rows (i) and (ii) correspond to the two initial angles.

The points on the different blade edges cluster into different sets, one set per isolated blade. A point is picked at random in the image, and the distance from this point to the points in each set is computed as in (1).

Taking the maximum of these distances, where max() is the function returning the maximum value of its input, if the distances satisfy the condition in (2), that point is the center.

However, this method of center calculation is sensitive to noise and is ambiguous in some situations, as shown in Figure 3(iic, iid). Therefore, it cannot be used to find the center directly, but it works as a quick search that narrows the area containing the rotation center. An expectation of the distance measure is therefore defined to narrow the search.

The rotation center lies in the area defined by a threshold in the range (0, 0.1). For the chosen threshold, the colored areas in Figure 4 mark the candidate range of the center. To find the precise center within this area, the angular velocity must be considered.

Figure 4: Searching the center of the propeller: (a)–(c) correspond to Figures 3(a)–3(c), and (i) and (ii) correspond to (i) and (ii) in Figure 3.

For a better solution, a scanning method is proposed. A point is picked from the candidate area and serves as the center of a circle of a given radius. This circle scans the blades anticlockwise from an initial position on its left horizontal axis. Along the circle there are rising edges and falling edges: the points where the scan enters a blade and the points where it leaves one, indexed by the number of edges on the circle. For each edge point, its ordinate and its angle from the initial position are recorded, and the minimum rising-edge and falling-edge angles are taken, where min() is the function returning the minimum value of its input. The angle through which the propeller rotates between these positions follows from (5), where the number of blades is two in the simulation. When the value is taken from the first class in (5), the rotational direction is clockwise; when it is taken from the second class, the direction is anticlockwise. The angular velocity can then be calculated from the points on the rising edges as in (6), in which the row delay time of the ERS camera appears. The points on the falling edges can be used in the same way, yielding a second estimate. Finally, an accumulative error is defined for each scanning radius at each candidate point.
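The circle-scan step can be sketched on a synthetic binary mask. Here the propeller is a static two-blade sector pattern, the 15° blade half-width, 60-pixel scan radius, and 720 sample angles are all illustrative assumptions; in the paper, the edge angles recovered this way on successive rows feed the angular velocity formula.

```python
import numpy as np

# Build a synthetic 201x201 binary image of a static two-blade propeller:
# a blade is the set of pixels whose polar angle lies within 15 deg of
# 0 deg or 180 deg (central symmetry, as the method assumes).
size, c = 201, 100
yy, xx = np.mgrid[0:size, 0:size]
ang = np.arctan2(yy - c, xx - c)              # polar angle of every pixel
half = np.deg2rad(15)                         # blade half-width (assumption)
mask = (np.abs(ang) < half) | (np.abs(np.abs(ang) - np.pi) < half)

# Scan a circle of radius r around the candidate centre and sample the mask.
r, K = 60, 720
theta = np.linspace(0, 2 * np.pi, K, endpoint=False)
px = np.round(c + r * np.cos(theta)).astype(int)
py = np.round(c + r * np.sin(theta)).astype(int)
samples = mask[py, px].astype(int)

# Rising edges enter a blade (0 -> 1), falling edges leave it (1 -> 0);
# the wrap-around difference closes the circle. Ideally two of each here.
d = np.diff(np.r_[samples, samples[0]])
rising = theta[np.where(d == 1)[0]]
falling = theta[np.where(d == -1)[0]]
print(len(rising), len(falling))
```

On a closed scan circle the counts of rising and falling edges always match, and the fraction of samples inside the blades (about 60°/360° here) checks the geometry.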

The accumulative errors of the different candidate points are normalized and indexed by a color map, as shown in Figure 4. The point with the minimum normalized error is taken as the center of the propeller. To improve the accuracy of the calculation, the final angular velocity is an error-weighted combination of the individual estimates.

The results of the simulation are shown in Table 1. The actual center position is (500, 500), and the largest deviation of the computed center is within 5 pixels. The relative errors of the angular velocity are no more than 2%.

Table 1: Result of the simulation.

3. Experiments and Results

To verify the measurement of angular velocity with the ERS camera, two experiments were designed and carried out under controlled conditions, one on a two-blade propeller and one on a multiblade propeller. All the angular velocities to be detected were higher than the speed of the camera.

3.1. Experimental Setup for the Two-Blade Propeller

The measurement system included an ERS camera, an extra light, and a propeller, as shown in Figure 5(a). The camera was a Basler (Germany) acA3800-14uc with an MT9J003 image sensor, whose shutter type is ERS. The row delay of the MT9J003 is 23.09 μs, and its exposure time can be set in software from 35 μs to 1,599,535 μs. The selected resolution was 3840 × 2748, and the maximum frame rate was 14 frames per second. The propeller had two centrosymmetric blades and a diameter of 135 mm. Because the propeller and the original background were the same color, a black poster board was used as the background for contrast. The propeller was driven by a motor with an encoder, and the motor was mounted on top of a tripod. If the propeller axis were not parallel to the camera axis, the propeller would appear inclined in the video; hence, the camera axis and the propeller axis were set coaxial. The row exposure time was 350 μs, and the extra light was adjusted so that the camera obtained good-quality pictures. A view from the camera is shown in Figure 5(b). Before the rotation of the propeller was recorded, the ERS camera was calibrated with the method proposed by Heikkila and Silven [19].
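A back-of-envelope check of this setup shows how far the propeller turns while the sensor scans one frame; the row delay, resolution, and fastest two-blade test speed are the values quoted in the text.

```python
# How much rotation one ERS frame captures, from the figures in Section 3.1.
row_delay = 23.09e-6  # seconds per row, MT9J003
rows = 2748           # image height at the selected 3840 x 2748 resolution
fps = 14              # maximum frame rate of the camera

frame_scan = rows * row_delay         # time to read one full frame
speed = 25.18                         # fastest two-blade test point, r/s
turns_per_frame = speed * frame_scan  # revolutions captured in one image

print(round(frame_scan * 1e3, 2), round(turns_per_frame, 2))
```

A single frame takes about 63 ms to scan, during which the fastest propeller completes roughly 1.6 revolutions, so each image contains multiple distorted blade passes to measure from.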

Figure 5: Rotation experiment on the two-blade propeller: (a) experimental layout and (b) view of the propeller from the ERS camera.
3.2. Results of the Two-Blade Propeller

The photoelectric encoder on the motor recorded the propeller velocities while the ERS camera shot the rotation of the propeller. The propeller rotated anticlockwise at different velocities, as illustrated in Figure 6. The contours of the propeller were extracted by the GMM and repaired by filling holes and filtering noise. The results of the background segmentation are shown in Figure 7. The colored areas in Figure 7 give the candidate range of the center of the two-blade propeller for a threshold value of 0.05, and the different colors within these areas indicate the normalized error of the corresponding points. The camera captured four images at a time, and four captures yielded 16 images. The actual velocities for these images are listed in the third column of Table 2. The second column of Table 2 shows the deviation of the computed center, which is at most six pixels. The angular velocities obtained by the method proposed in this paper are given in the fourth column of Table 2; the relative error of velocity is less than 4%.

Figure 6: Two-blade propeller rotated at different velocities: (a) 8.51 r/s, (b) 15.91 r/s, (c) 22.00 r/s, and (d) 25.18 r/s.
Figure 7: Searching the center of the two-blade propeller: (a)–(d) correspond to Figures 6(a)–6(d).
Table 2: Angular velocity of the two-blade propeller.
3.3. Experiment on the Multiblade Propeller

The multiblade propeller was measured with almost the same equipment as in the two-blade experiment. The ducted propeller, a product of QX MOTOR, is shown in Figure 8(a). It had five blades and a diameter of 64 mm. The row exposure time was again set to 350 μs, and a view from the camera is shown in Figure 8(b). Because no encoder was available, a handheld tachometer was used to measure the angular velocities of the propeller.

Figure 8: Rotation experiment on the multiblade propeller: (a) propeller driven by the motor and (b) view of the propeller from the ERS camera.
3.4. Results of the Multiblade Propeller

The tachometer recorded the multiblade propeller velocities while the ERS camera shot the rotation of the propeller. Figure 9 shows sample images taken during rotation at different velocities. With the threshold set to 0.06, the colored areas in Figure 10 give the candidate range of the center of the multiblade propeller. The ERS camera again captured four images at a time, for 16 images in four captures. With the method proposed in this paper, the angular velocities were obtained as shown in Table 3. The maximum deviation of the center is within seven pixels, and the relative error of velocity is less than 5%. The maximum angular velocity is more than six times the speed of the camera.
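The headline ratio can be checked directly from the figures in the text (the fastest Table 3 test point against the camera's maximum frame rate):

```python
# Fastest measured rotation versus the camera's frame rate, per the text.
max_speed = 109.50  # r/s, fastest multiblade test point (Figure 9(d))
fps = 14            # camera frame rate, frames per second
ratio = max_speed / fps
print(round(ratio, 1))  # comfortably more than six times the camera speed
```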

Figure 9: Multiblade propeller rotated at different velocities: (a) 29.06 r/s, (b) 57.77 r/s, (c) 75.28 r/s, and (d) 109.50 r/s.
Figure 10: Searching the center of the multiblade propeller: (a)–(d) correspond to Figures 9(a)–9(d).
Table 3: Angular velocity of the multiblade propeller.

4. Conclusions

Computer vision provides an effective way to measure angular velocities. In this paper, an algorithm based on the rolling shutter principle has been proposed to measure the angular velocity of a propeller with a low-speed camera. By analyzing the exposure procedure of the camera, a simulation model was established to reproduce the motion of a propeller captured by the ERS camera. From the recorded contour shape of the propeller, a method was proposed to find the rotation center and the angular velocity. To validate this method, the angular velocities of a two-blade propeller and a five-blade propeller were measured with this technique. The experimental results show that the proposed method can measure the high-speed rotation of a propeller with a low-speed camera; the maximum angular velocity was more than six times the speed of the camera. Furthermore, the simulation and the experiments show that this method is an efficient, applicable, noncontact, and inexpensive way to measure the angular velocity of a propeller.

The proposed method still has some limitations. The detection accuracy is sensitive to illumination: increasing the exposure time yields good-quality images, but the captured image becomes badly blurred when the propeller rotates fast. In future work, automatic exposure hardware and a matching algorithm should be added to the system. The method also depends on a precise contour of the propeller, so the background segmentation greatly affects the result; to improve its quality, an artificial neural network will be applied in the future.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. 51175145) and the Hebei Province Science and Technology Support Program (no. 13211910D).

References

  1. A. H. Kadhim, T. K. M. Babu, and D. O’Kelly, “Measurement of steady-state and transient load-angle, angular velocity, and acceleration using an optical encoder,” IEEE Transactions on Instrumentation and Measurement, vol. 41, no. 4, pp. 486–489, 1992.
  2. X.-D. Zhu and S.-N. Yu, “Measurement angular velocity based on video technology,” in 2011 4th International Congress on Image and Signal Processing, vol. 4, pp. 1936–1940, Shanghai, China, 2011.
  3. S. Wang, Q. Li, and B. Guan, “A computer vision method for measuring angular velocity,” Optics and Lasers in Engineering, vol. 45, no. 11, pp. 1037–1048, 2007.
  4. O. Ait-Aider, N. Andreff, J. M. Lavest, and P. Martinet, “Exploiting rolling shutter distortions for simultaneous object pose and velocity computation using a single view,” in Fourth IEEE International Conference on Computer Vision Systems (ICVS'06), p. 35, New York, NY, USA, 2006.
  5. O. Ait-Aider, A. Bartoli, and N. Andreff, “Kinematics from lines in a single rolling shutter image,” in 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–6, Minneapolis, MN, USA, 2007.
  6. L. Magerand, A. Bartoli, O. Ait-Aider, and D. Pizarro, “Global optimization of object pose and motion from a single rolling shutter image with automatic 2D-3D matching,” in Computer Vision – ECCV 2012, vol. 7572 of Lecture Notes in Computer Science, pp. 456–469, 2012.
  7. Z.-Y. He and P. Wei, “New method for 2D velocity measurement based on electronic rolling shutter,” in International Symposium on Photoelectronic Detection and Imaging 2007: Related Technologies and Applications, vol. 6625, Beijing, China, 2007.
  8. J. G. Chen, N. Wadhwa, Y.-J. Cha, F. Durand, W. T. Freeman, and O. Buyukozturk, “Modal identification of simple structures with high-speed video using motion magnification,” Journal of Sound and Vibration, vol. 345, pp. 58–71, 2015.
  9. A. Davis, K. L. Bouman, J. G. Chen, M. Rubinstein, F. Durand, and W. T. Freeman, “Visual vibrometry: estimating material properties from small motions in video,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 4, pp. 732–745, 2017.
  10. X. Zhu and S. Yu, “Researching of the method of measuring angle velocity based on non-high video camera,” Computer Measurement & Control, vol. 20, no. 1, pp. 53–55, 2012.
  11. A. Davis, M. Rubinstein, N. Wadhwa, G. J. Mysore, F. Durand, and W. T. Freeman, “The visual microphone: passive recovery of sound from video,” ACM Transactions on Graphics, vol. 33, no. 4, article 79, 2014.
  12. Y. Sun, G. Liu, and Y. Sun, “An affine motion model for removing rolling shutter distortions,” IEEE Signal Processing Letters, vol. 23, no. 9, pp. 1250–1254, 2016.
  13. C.-K. Liang, L.-W. Chang, and H. H. Chen, “Analysis and compensation of rolling shutter effect,” IEEE Transactions on Image Processing, vol. 17, no. 8, pp. 1323–1330, 2008.
  14. O. Grau and J. Pansiot, “Motion and velocity estimation of rolling shutter cameras,” in CVMP '12 Proceedings of the 9th European Conference on Visual Media Production, pp. 94–98, London, UK, 2012.
  15. K. Blekas, A. Likas, N. P. Galatsanos, and I. E. Lagaris, “A spatially constrained mixture model for image segmentation,” IEEE Transactions on Neural Networks, vol. 16, no. 2, pp. 494–498, 2005.
  16. A. T. Galecki, T. R. Ten Have, and G. Molenberghs, “Simple and fast alternative to the EM algorithm for incomplete categorical data and latent class models,” Computational Statistics & Data Analysis, vol. 35, no. 3, pp. 265–281, 2001.
  17. C. Nikou, N. P. Galatsanos, and A. C. Likas, “A class-adaptive spatially variant mixture model for image segmentation,” IEEE Transactions on Image Processing, vol. 16, no. 4, pp. 1121–1130, 2007.
  18. L. Xu, C. Lu, Y. Xu, and J. Jia, “Image smoothing via L0 gradient minimization,” ACM Transactions on Graphics, vol. 30, no. 6, article 174, 2011.
  19. J. Heikkila and O. Silven, “A four-step camera calibration procedure with implicit image correction,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 1106–1112, San Juan, PR, USA, 1997.