Journal of Sensors

Special Issue

Sensors and Applications in Agricultural and Environmental Monitoring


Research Article | Open Access

Volume 2020 | Article ID 3125708 | https://doi.org/10.1155/2020/3125708

Bo Zhao, Ye Wang, Jun Fu, Rongqiang Zhao, Yashuo Li, Xin Dong, Chengxu Lv, Hanlu Jiang, "Online Measuring and Size Sorting for Perillae Based on Machine Vision", Journal of Sensors, vol. 2020, Article ID 3125708, 8 pages, 2020. https://doi.org/10.1155/2020/3125708

Online Measuring and Size Sorting for Perillae Based on Machine Vision

Academic Editor: Yuan Li
Received 18 Sep 2019
Accepted 31 Oct 2019
Published 13 Jan 2020

Abstract

Perillae has attracted increasing research interest due to its wide use in medicine and food. Estimating the quality and maturity of a perillae requires information about its size. At present, measuring and sorting perillae by size mainly depend on manual work, which suffers from low efficiency and unsatisfactory accuracy. To address this issue, in this study, we develop an approach based on the machine vision (MV) technique for online measuring and size sorting. A geometrical model of the perillae and a corresponding mathematical model of the imaging process are built. Based on these models, the measuring and size sorting method is proposed, including image binarization, key point determination, information matching, and parameter estimation. Experimental results demonstrate that the average time consumption for a captured image, the average measuring error, the variance of the measuring error, and the overall sorting accuracy are 204.175 ms, 1.48 mm, 0.07 mm, and 93%, respectively, indicating the feasibility and satisfactory accuracy of the proposed approach.

1. Introduction

Perillae is widely distributed throughout Asia, and it has attracted increasing interest in the fields of medicine and pharmacy. Perillae has been applied as a bacteriostatic, detoxifying, antitussive, and phlegm-reducing agent for thousands of years in traditional Chinese medicine. Recently, more attention has been paid to studies of perillae, including the perillae herba ethanolic extract [1], luteolin [2], and rosmarinic acid [3]. Besides, perillae is also a popular food in countries such as China, Japan, and Korea. The size of a perillae indicates its quality and maturity, which closely relate to its medical and edible value. Hence, it is of great significance to measure and sort the size of perillae in advance. At present, measuring and sorting the size of perillae mainly depend on manual work, which is of low efficiency and unsatisfactory accuracy. Therefore, it is valuable to develop an approach for accurate online measurement and sorting of perillae.

Machine vision (MV) is the technique of providing imaging-based automatic inspection [4-6], and it has been widely used in industrial applications such as object detection and robot guidance [7, 8]. For applications in agriculture, previous work has explored various crops [9, 10]. In [11], a compact machine vision system based on hyperspectral imaging and machine learning is presented to detect aflatoxin in chili pepper. In [12], a hierarchical grading method is applied for real-time defect detection and size sorting of potatoes. In [13], a study is conducted to predict ripening quality in mangoes using RGB images, for which the hierarchical clustering method is employed to classify the ripening period into five stages based on quality parameters. In [14], the authors combine MV and the support vector machine to develop an intelligent system for sorting peeled pistachio kernels and shells. In [15], Sabliov et al. develop an MV-based method to measure the volume and surface area of ellipsoidal agricultural products by regarding the objects as the sum of superimposed elementary frustums of right circular cones. In [16], Yao et al. develop real-time detection instrumentation for aflatoxin-contaminated corn using a narrow-band fluorescence index. In [17], Pedreschi et al. present an inexpensive computer vision system for measuring the color of a highly heterogeneous food material such as potato chips. In [18], Huang et al. propose an approach for the identification of defect pleurotus geesteranus based on computer vision. In [19], Sun et al. develop a machine vision system and a dynamic weighing system for the measurement of egg external physical characteristics and weight, such that a nondestructive method for online estimation of egg freshness is achieved. Additionally, a set of studies have been conducted on the measurement of cabbage [20], flower mushroom [21], cherry [22], litchi [23], and other agricultural products [24-29].

However, the automatic online detection of perillae has not been explored until now. The geometry model and the corresponding mathematical model of perillae used for automatic detection have not been built yet, and the size sorting of perillae still requires manual work. To address this issue, in this study, we develop a novel approach for online measuring and size sorting of perillaes. A charge-coupled device (CCD) camera is employed to acquire the images of perillaes in the lighting box, and the MV method is proposed for image processing. The contributions of this study can be summarized as follows. (1) We build the general geometry model of perillae and the corresponding mathematical model for image processing. (2) We propose the MV-based method for accurate online measuring and size sorting of perillae under the proposed models. (3) We develop a practical system for our theoretical method, the feasibility and accuracy of which are verified by the experimental results.

The remainder of this paper is organized as follows. Section 2 introduces the proposed geometry model and mathematical model. Section 3 provides the MV-based online measuring method. Section 4 presents the experimental results and analysis. Section 5 concludes this paper.

2. Geometry and Mathematical Models

2.1. Geometry Model

Figure 1 shows a typical perillae for medical and food usage. The length of a perillae, i.e., the maximum size of the perillae leaf, is denoted by the notation defined in Figure 1. We mainly consider the length between the two key points marked in Figure 1, as this segment is the useful part of the perillae regardless of medical or food usage; the length of the remaining segment is not taken into the measurement.

Figure 2 presents the sorting principle for the size of perillae. In this study, the sizes are divided into four grades according to the measured length. As shown in Figure 2, the first grade, expressed as "Spec M," denotes the smallest perillae. The second grade, "Spec L(s)," and the third grade, "Spec L(b)," correspond to the two intermediate length ranges, respectively. The fourth grade, expressed as "Spec 2M," represents the largest size.
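The exact grade boundaries are given in Figure 2. As a minimal illustration of this sorting rule, the sketch below maps a measured length (in mm) to one of the four grades; the threshold values t1 < t2 < t3 are hypothetical placeholders standing in for the boundaries in Figure 2, not the values used in the paper.

```python
def sort_grade(length_mm, t1=60.0, t2=75.0, t3=90.0):
    """Map a measured perillae length to a size grade.

    The thresholds t1 < t2 < t3 are hypothetical placeholders for the
    grade boundaries shown in Figure 2, not the paper's values.
    """
    if length_mm < t1:
        return "Spec M"       # smallest grade
    elif length_mm < t2:
        return "Spec L(s)"
    elif length_mm < t3:
        return "Spec L(b)"
    else:
        return "Spec 2M"      # largest grade


print(sort_grade(82.3))  # -> "Spec L(b)" under the hypothetical thresholds
```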

2.2. Mathematical Model

Next, we provide the mathematical model for the imaging of perillae. The camera used in this study was calibrated before it was employed for imaging. As shown in Figure 3, we captured 20 chessboard images with different poses and then utilized the calibration tool in OpenCV to calibrate the camera.
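A minimal sketch of this chessboard calibration with the OpenCV API is shown below; the chessboard pattern size and the image file names are illustrative assumptions, not values taken from the paper.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)  # inner chessboard corners per row/column (assumed, not from the paper)

# 3D corner positions of the chessboard in its own plane (z = 0), in square units
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for path in glob.glob("chessboard_*.png"):  # the 20 captured poses (file names assumed)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]            # (width, height)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Returns the reprojection error, the intrinsic matrix, the distortion
# coefficients, and the per-view extrinsic rotations and translations.
err, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("reprojection error:", err)
print("intrinsic matrix:\n", K)
```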

Let the axes of the camera coordinate system denote the horizontal coordinate, the vertical coordinate, and the direction perpendicular to the chessboard, respectively, with corresponding axes defined in the world coordinate system. The calibration results demonstrate that both the distortion error and the reprojection error can be ignored. Assume a point in the world coordinate system is projected into the camera coordinate system as a pixel. Given the intrinsic matrix of the used camera, the transformation between the world coordinate system and the camera coordinate system obtained with the calibration tool can be expressed by the corresponding projection equation.
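Under the assumption that the standard pinhole model provided by the OpenCV calibration tool is used, this transformation can be sketched as follows, where (x_w, y_w, z_w) is a world point, (u, v) its pixel coordinates, s a scale factor, K the intrinsic matrix, and [R | t] the extrinsic rotation and translation; the symbols here are illustrative rather than the paper's notation.

```latex
% Standard pinhole projection (assumed form of the transformation):
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = K \left[\, R \mid t \,\right]
    \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix},
\qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}.
```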

The intrinsic matrix is obtained through this calibration.

By combining the geometrical model and (2), we can rewrite the size of the perillae in terms of the coordinates of the two key points in the world coordinate system and their pixel coordinates in the image coordinate system. Obviously, the required size can be calculated directly once the scale parameter and the pixel coordinates of the two key points are known. Hence, we introduce the proposed method to determine this information in the next section.
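A plausible form of this size computation, consistent with the scale-factor estimation in Subsection 3.5, is a scaled Euclidean distance between the pixel coordinates (u_1, v_1) and (u_2, v_2) of the two key points, with k the scale parameter; the notation is illustrative rather than the paper's.

```latex
% Assumed form of the size computation: scaled pixel distance between the key points
L = k \sqrt{(u_1 - u_2)^2 + (v_1 - v_2)^2}
```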

3. Method for Measuring Perillae

In this section, we present the proposed method for measuring and size sorting of perillae based on MV and the proposed models. The method is composed of the following steps: binarizing the original image, determining the positions of the key points, matching the information, and estimating the scale parameter.

3.1. Image Binarization

Figure 4(a) presents the original image of the perillaes. The image should be binarized before it is used for measuring. The binarization principle is given by (4), which assigns each pixel of the binarized image either a white or a black value. The depth of the image is 8 bits; therefore, the value of white pixels is set to 255. As indicated in (4), a pixel is determined to be white only when it satisfies three threshold conditions, two of which involve the values of the pixel in the blue channel and the red channel, respectively. The first condition ensures that the background is set to black so that only the area containing perillaes is considered, while the second and the third conditions are utilized to determine whether a pixel belongs to the area of a perillae. The binarized image is shown in Figure 4(b).
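The following sketch only illustrates this kind of channel-threshold binarization, with hypothetical conditions and constants standing in for the thresholds of (4); they are assumptions rather than the paper's values.

```python
import cv2
import numpy as np

def binarize(img_bgr, gray_thresh=40, blue_thresh=90, red_thresh=110):
    """Channel-threshold binarization in the spirit of (4).

    All three threshold values are hypothetical placeholders; the paper's
    actual conditions on the blue and red channels are not reproduced here.
    """
    b, g, r = cv2.split(img_bgr)
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)

    # White (255) only where all three hypothetical conditions hold: the first
    # separates the background, the other two test the blue and red channels.
    mask = (gray > gray_thresh) & (b < blue_thresh) & (r < red_thresh)
    return np.where(mask, 255, 0).astype(np.uint8)  # 8-bit image, white = 255

# binary = binarize(cv2.imread("perillae.png"))
```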

3.2. Search for the Endpoints of the Original Contour

We first derive the contours from the binarized image. The strategy is to set thresholds for the largest and the smallest possible areas of the perillaes; the regions whose areas satisfy these thresholds are selected, and their contours are regarded as the contours of the perillaes. This judging process involves the total number of contours in the binarized image, the area of each contour, and the resulting judgement. Then, the minimum enclosing circle and the convex hull of each perillae can be obtained. The perillaes are labelled as 1, 2, 3, and 4 from left to right by a judgement that sorts the contours according to the horizontal coordinate of the center of each enclosing circle, given the number of perillaes in the captured image. The results are displayed in Figure 5(b) and are saved for the subsequent steps.

Finally, we utilize the following strategy to find the two endpoints of each contour. The strategy operates on the pixels of the convex hull, the center and radius of the minimum enclosing circle, and the number of pixels of the convex hull: the first endpoint can be regarded as the point of the convex hull furthest away from the center of the enclosing circle, and the second endpoint is the point of the convex hull furthest away from the first one. These two points are generally the two extremities of the perillae. The coordinates of both endpoints are saved together with the corresponding contour data for the subsequent steps.
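A minimal OpenCV sketch of the contour selection, labelling, and endpoint search of this subsection is given below; the area thresholds are hypothetical, and the farthest-point search follows the textual description above rather than the paper's exact formulas.

```python
import cv2
import numpy as np

MIN_AREA, MAX_AREA = 2000, 80000  # hypothetical area thresholds, not from the paper

def find_endpoints(binary):
    """For each accepted contour, return its two farthest-apart hull points."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only contours whose area lies between the two thresholds.
    kept = [c for c in contours if MIN_AREA < cv2.contourArea(c) < MAX_AREA]
    # Label the perillaes from left to right by the enclosing-circle center.
    kept.sort(key=lambda c: cv2.minEnclosingCircle(c)[0][0])

    results = []
    for c in kept:
        (cx, cy), _radius = cv2.minEnclosingCircle(c)
        hull = cv2.convexHull(c).reshape(-1, 2).astype(np.float32)
        # First endpoint: hull point farthest from the enclosing-circle center.
        p1 = hull[np.argmax(np.linalg.norm(hull - np.array([cx, cy]), axis=1))]
        # Second endpoint: hull point farthest from the first endpoint.
        p2 = hull[np.argmax(np.linalg.norm(hull - p1, axis=1))]
        results.append((p1, p2))
    return results
```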

3.3. Search for the Endpoints of the Treated Contour

To acquire the position of the remaining key point, we successively conduct one close operation, four erode operations, and four dilate operations on the binarized image in OpenCV. The results are presented in Figure 6(a). Then, we derive the treated contours using the strategy given by (7) and (8). The results are shown in Figure 6(b) and are saved for the subsequent steps. Finally, the minimum enclosing circle and the convex hull of the derived contours are obtained, as shown in Figure 6(c). Similar to the procedure used in Subsection 3.2, two endpoints of each treated contour are selected as the alternative candidates of the remaining key point, and the coordinates of both candidates are saved together with the corresponding contour data.
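A minimal sketch of this morphological preprocessing with OpenCV is given below; the structuring element is an assumption, as the paper does not specify the kernel used.

```python
import cv2
import numpy as np

def preprocess(binary, kernel_size=5):
    """One close operation followed by four erosions and four dilations.

    The 5x5 rectangular kernel is a hypothetical choice; the paper does
    not state which structuring element was used.
    """
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    eroded = cv2.erode(closed, kernel, iterations=4)
    treated = cv2.dilate(eroded, kernel, iterations=4)
    return treated
```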

3.4. Information Matching

Next, we should recognize the two required key points from the four candidate points. For this purpose, we first compute the Euclidean distance between each candidate obtained in Subsection 3.2 and each candidate obtained in Subsection 3.3, yielding four distances. Then, we compare the four distances and find the pair of points that leads to the largest distance. Since the distance between one particular pair of candidates is always larger than the distances between the other pairs, the pair leading to the largest distance uniquely identifies the required key points. The procedure is repeated for all contours of perillaes, such that the positions of the two key points of each perillae can be obtained. We summarize all four possibilities in Table 1.


Table 1: The four possible candidate point pairs leading to the largest distance and the corresponding identification of the key points.
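Following the rule summarized in Table 1, a minimal sketch of the distance comparison is given below; the point names are hypothetical, and the final identification of the key points from the selected pair follows Table 1 rather than being reproduced here.

```python
import numpy as np

def largest_distance_pair(p1, p2, q1, q2):
    """Compare the four candidate distances and return the farthest pair.

    p1 and p2 are the candidate points from Subsection 3.2, while q1 and q2
    are those from Subsection 3.3 (hypothetical names). The required key
    points are then identified from the returned pair according to Table 1.
    """
    pairs = [(p1, q1), (p1, q2), (p2, q1), (p2, q2)]
    dists = [np.linalg.norm(np.asarray(a) - np.asarray(b)) for a, b in pairs]
    idx = int(np.argmax(dists))
    return pairs[idx], dists[idx]
```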

3.5. Scale Factor Estimation and Size Computation

As indicated in (8), the scale parameter is the factor controlling the scale of imaging. Hence, we utilize the following method to estimate it. As shown in Figure 7, a reference length in the original plane is measured manually, and the corresponding measurement in the camera coordinate system is acquired by using (8). Based on these two measurements, the relation between the manual measurement, the camera-coordinate measurement, and the scale parameter is obtained.

Therefore, the scale parameter can be calculated from this relation.

To eliminate the measurement error, we repeat the above process 20 times, and the average of the calculated values over all trials is used for the experiments.
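A minimal sketch of this averaging step is shown below; the per-trial estimate as a ratio of the manually measured reference length to the corresponding pixel distance, as well as the numeric values, are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

def estimate_scale(real_length_mm, pixel_length):
    """Single-trial estimate of the scale factor (assumed form: mm per pixel)."""
    return real_length_mm / pixel_length

# Hypothetical data from repeated trials: (manual measurement in mm, pixel distance)
trials = [(100.0, 412.3), (100.0, 410.8), (100.0, 413.1)]  # ... 20 pairs in total

k = np.mean([estimate_scale(r, p) for r, p in trials])
print(f"averaged scale factor: {k:.4f} mm/pixel")
```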

The proposed approach is summarized in Figure 8.

4. Experimental Results

Our experimental system, shown in Figure 9, is composed of a lighting box and a personal computer (PC). The lighting box contains four light-emitting diode (LED) lights arranged as a ring on the top of the box and a CCD camera in the center of the box. The experimental setup is described as follows. Before conducting the experiments, we obtained the scale parameter by using the strategy introduced in Subsection 3.5; this value is used for all experiments in this study. First, we randomly chose 100 perillaes and manually measured their sizes. Then, we used the developed system to measure these perillaes.

The results by both manual measurement and automatic online measurement are presented in Figure 10. We computed the measurement error for each perillae, and the results are provided in Figure 11. Obviously, the results obtained by online measurement are close to those obtained by manual measurement. As provided in Table 2, among the 100 perillaes, the maximum measuring error (MAME) is 3.66 mm, while the minimum measuring error (MIME) is 0.02 mm. Most of the errors are lower than 3 mm. The overall average measuring error (OAME) and the variance of measuring error (VME) are 1.47 mm and 0.07 mm, respectively.


Table 2: Overall experimental results, where ATCCI denotes the average time consumption for a captured image and OSA denotes the overall sorting accuracy.

ATCCI (ms)    MAME (mm)    MIME (mm)    OAME (mm)    VME (mm)    OSA (%)
204.175       3.66         0.02         1.47         0.07        93

5. Conclusions

We developed an MV-based approach for automatic measuring and size sorting of perillae. We first built the geometric model of the perillae and the mathematical model of the imaging process. Based on these models, the measuring and size sorting method was proposed, including image binarization, key point determination, information matching, and parameter estimation. We employed a CCD camera for imaging and OpenCV tools for image processing. Experimental results have verified the feasibility of our system and its high accuracy in measuring and size sorting. Using 100 perillaes for the experiments, the MAME and the MIME are 3.66 mm and 0.02 mm, respectively. Most of the errors are lower than 3 mm. The OAME and the VME are 1.47 mm and 0.07 mm, respectively. Further study could address the following issues. First, miniaturization of the developed system requires future work to make the proposed approach more practical. Second, utilizing RGB or hyperspectral images to detect the maturity of perillaes is also valuable for further study.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Acknowledgments

This work was supported by the National Key Research and Development Project under Grant 2017YFD0401305.

References

  1. J. Yang, J. Yoo, E. Lee et al., “Anti-inflammatory effects of Perillae Herba ethanolic extract against TNF-α/IFN-γ-stimulated human keratinocyte HaCaT cells,” Journal of Ethnopharmacology, vol. 211, pp. 217–223, 2018. View at: Publisher Site | Google Scholar
  2. X. Kong, G. Huo, S. Liu, F. Li, W. Chen, and D. Jiang, “Luteolin suppresses inflammation through inhibiting cAMP-phosphodiesterases activity and expression of adhesion molecules in microvascular endothelial cells,” Inflammopharmacology, vol. 27, no. 4, pp. 773–780, 2019. View at: Publisher Site | Google Scholar
  3. N. Ito, T. Yabe, Y. Gamo et al., “Rosmarinic acid from perillae herba produces an antidepressant-like effect in mice through cell proliferation in the hippocampus,” Biological and Pharmaceutical Bulletin, vol. 31, no. 7, pp. 1376–1380, 2008. View at: Publisher Site | Google Scholar
  4. V. G. Narendra and K. S. Hareesh, “Quality inspection and grading of agricultural and food products by computer vision - a review,” International Journal of Computer Applications, vol. 2, no. 1, pp. 43–65, 2010. View at: Publisher Site | Google Scholar
  5. B. Zhang, W. Huang, J. Li et al., “Principles, developments and applications of computer vision for external quality inspection of fruits and vegetables: a review,” Food Research International, vol. 62, no. 62, pp. 326–343, 2014. View at: Publisher Site | Google Scholar
  6. Y. Chen, K. Chao, and M. S. Kim, “Machine vision technology for agricultural applications,” Computers and Electronics in Agriculture, vol. 36, no. 2-3, pp. 173–191, 2002. View at: Publisher Site | Google Scholar
  7. T. Brosnan and D. Sun, “Inspection and grading of agricultural and food products by computer vision systems--a review,” Computers and Electronics in Agriculture, vol. 36, no. 2-3, pp. 193–213, 2002. View at: Publisher Site | Google Scholar
  8. H. S. El-Mesery, H. Mao, and A. E. F. Abomohra, “Applications of non-destructive technologies for agricultural and food products quality inspection,” Sensors, vol. 19, no. 4, p. 846, 2019. View at: Publisher Site | Google Scholar
  9. Y. Ying, X. Rao, Y. Zhao, and J. YiYuan, “Application of machine vision technique to automatic quality identification of agricultural products (I),” Transactions of the Chinese Society of Agricultural Engineering, vol. 16, no. 3, pp. 103–108, 2000. View at: Google Scholar
  10. N. Zhao, P. Zhao, and Y. Gao, “Study on application of machine vision technology to modern agricultural in China,” Journal of Tianjin Agricultural University, vol. 22, pp. 55–58, 2015. View at: Google Scholar
  11. M. Atas, Y. Yardimci, and A. Temizel, “A new approach to aflatoxin detection in chili pepper by machine vision,” Computers and Electronics in Agriculture, vol. 87, pp. 129–141, 2012. View at: Publisher Site | Google Scholar
  12. N. Razmjooy, B. S. Mousavi, and F. Soleymani, “A real-time mathematical computer method for potato inspection using machine vision,” Computers & Mathematics with Applications, vol. 63, no. 1, pp. 268–279, 2012. View at: Publisher Site | Google Scholar
  13. V. N. Eyarkai, K. Thangave, S. Shahir, and V. Thirupathi, “Comparison of various RGB image features for nondestructive prediction of ripening quality of “alphonso” mangoes for easy adoptability in machine vision applications: a multivariate approach,” Journal of Food Quality, vol. 39, no. 6, 825 pages, 2016. View at: Publisher Site | Google Scholar
  14. H. Nouri-Ahmadabadi, M. Omid, S. S. Mohtasebi, and M. Soltani Firouz, “Design, development and evaluation of an online grading system for peeled pistachios equipped with machine vision technology and support vector machine,” Information Processing in Agriculture, vol. 4, no. 4, pp. 333–341, 2017. View at: Publisher Site | Google Scholar
  15. C. M. Sabliov, D. Boldor, K. M. Keener, and B. E. Farkas, “Image processing method to determine surface area and volume of axi-symmetric agricultural products,” International Journal of Food Properties, vol. 5, no. 3, pp. 641–653, 2002. View at: Publisher Site | Google Scholar
  16. H. Yao, Z. Hruska, R. Kincaid et al., “Development of narrow-band fluorescence index for the detection of aflatoxin contaminated corn,” in Conference on the Sensing for Agriculture and Food Quality and Safety III, vol. 8027, Orlando, Florida, United States, APRIL 2011. View at: Publisher Site | Google Scholar
  17. F. Pedreschi, J. Leon, D. Mery, and P. Moyano, “Development of a computer vision system to measure the color of potato chips,” Food Research International, vol. 39, no. 10, pp. 1092–1098, 2006. View at: Publisher Site | Google Scholar
  18. X. Huang, S. Jiang, Q. Chen, and Z. Jiewen, “Identification of defect pleurotus geesteranus based on computer vision,” Transactions of the Chinese Society of Agricultural Engineering, vol. 26, no. 10, pp. 350–354, 2010. View at: Google Scholar
  19. L. Sun, L. Yuan, J. Cai, H. Lin, and J. W. Zhao, “Egg freshness on-line estimation using machine vision and dynamic weighing,” Food Analytical Methods, vol. 8, no. 4, pp. 922–928, 2015. View at: Publisher Site | Google Scholar
  20. H. Li, H. Sun, and M. Li, “Identification of cabbage ball shape based on machine vision,” Transactions of the Chinese Society for Agricultural Machinery, vol. s1, pp. 141–146, 2015. View at: Google Scholar
  21. H. Chen, Q. Xia, T. Zuo, T. HeQun, and B. YinBing, “Determination of shiitake mushroom grading based on machine vision,” Nongye Jixie Xuebao, vol. 45, no. 1, pp. 281–287, 2014. View at: Google Scholar
  22. W. Hui, L. Yuchun, K. Feng, W. Qi, Z. Bo, and Z. Qin, “Size detection for cherry fruit based on machine vision,” Transactions of the Chinese Society for Agricultural Machinery, vol. 43, no. s1, pp. 246–249, 2012. View at: Google Scholar
  23. J. Xiong, X. Zou, N. Liu, P. HongXing, L. JinHong, and L. GuiChao, “Fruit quality detection based on machine vision technology when picking litchi,” Nongye Jixie Xuebao, vol. 45, no. 7, pp. 54–60, 2014. View at: Google Scholar
  24. F. Zhang, S. Li, and Z. Liu, “Screening method of abnormal corn ears based on machine vision,” Transactions of the Chinese Society for Agricultural Machinery, vol. 46, pp. 45–49, 2015. View at: Google Scholar
  25. Z. Ping, Z. ChunJiang, W. JiHua, Z. Wen'gang, S. ZhongFu, and W. YouXian, “Egg geometry calculations based on machine vision,” Nongye Jixie Xuebao, vol. 41, pp. 168–171, 2010. View at: Google Scholar
  26. H. Wang, J. Xiong, Z. Li, J. Deng, and X. Zou, “Potato grading method of weight and shape based on imaging characteristic parameters in machine vision,” Transactions of the Chinese Society of Agricultural Engineering, vol. 32, no. 8, pp. 272–277, 2016. View at: Google Scholar
  27. W. Wei, Y. Xing, Y. Li, Y. Peng, and W. Zhang, “Online detection and classification system of external quality of leaf for dining hall and family,” Transactions of the Chinese Society of Agricultural Engineering, vol. 34, no. 5, pp. 264–273, 2018. View at: Google Scholar
  28. L. Li, Y. Peng, and Y. Li, “Design and experiment on grading system for online non-destructive detection of internal and external quality of apple,” Transactions of the Chinese Society of Agricultural Engineering, vol. 34, no. 9, pp. 267–273, 2018. View at: Google Scholar
  29. G. Wang, L. Sun, X. Li, M. Zhang, Q. Lyu, and J. Cai, “Design of postharvest in-field grading system for navel orange based on machined vision,” Journal of Jiangsu University, vol. 38, no. 6, pp. 672–676, 2017. View at: Google Scholar

Copyright © 2020 Bo Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

