In this paper, a method to monitor the growth of bean plants from images taken in their vegetative stage is presented. Through a fuzzy system and the RGB image components, the growth stages of the plant are classified to provide a powerful tool for the precision agriculture field, one that can reduce cost and increase system portability. This also serves as a reference for the growth of the plant according to its age, vigor, and health, which could help to create the necessary environmental conditions. To carry out this research, the development of twenty bean plants was periodically monitored from the germinal to the first trifoliate leaf stage. Images were obtained with a controlled background and lighting; they were then segmented by color to compute the average pixel count for each case. These data were used to propose six different fuzzy systems and to choose the best one among them. Finally, it was found that the artificial vision system can identify the vegetative stages of germination, emergence, primary leaves, and first trifoliate leaf.

1. Introduction

The technification of agriculture is one of the most important challenges for primary production worldwide because, through new technologies to monitor and control crops, it is possible to guarantee food production with fewer resources, for instance, soil, water, and fertilizers, among others. In this regard, the sophistication of bean cultivation is crucial for food production because the bean crop is an important source of proteins and minerals, is low in fat, and has antioxidant properties. Indeed, it is estimated that production will increase by 5.74% annually in the coming years [1]. Computer vision and artificial intelligence are among the techniques used for controlling and monitoring different kinds of crops [2, 3], and many authors have worked in this direction. In [4], a new algorithm based on computer vision was proposed for the automated detection of individual leaves and the stem of the corn plant to calculate new phenotypes, the same as in [5]. Also, in [6] the authors identified nutritional deficiencies by analyzing coffee leaves with artificial vision, and different classification techniques such as K-Nearest Neighbors (KNN), Naive Bayes, and neural networks were used in [7]. In [8, 9], the authors presented characteristics for the identification and classification of diseases through digitally processed images. In [10], the authors reported computer vision for the phenotyping of stress in the soybean plant, with a specific focus on the rapid and automatic assessment of the severity of iron deficiency chlorosis.

In [11, 12], the authors used an artificial vision system to detect the nutritional status of corn plants and canopies. On the other hand, a real-time stereoscopic system that identifies the corn's growth stages from the plant's skeleton obtained a high classification success rate, as reported in [13].

As can be noticed from the preceding paragraphs, there is a need to increase the monitoring and technification of agricultural crops, as well as to guarantee their growth. Therefore, in this paper the farmer's visual inspection method is taken as a reference. Farmers usually determine by visual inspection the approximate stage of development a plant has reached, using the days elapsed since sowing as well as the climatic conditions to estimate the quality of the crop. By using artificial vision and fuzzy logic techniques, this paper emulates the farmer's task, classifying the growth stages of bean plants in real time and in an automated way by estimating their biomass. This work is organized as follows: Section 2 explains the methodology and experiments, Section 3 shows the results of the proposal, Section 4 reports a final discussion, and the last section gives the conclusions of the proposed approach.

1.1. Concepts

To correctly classify the growth stages of a bean plant, prior knowledge of its phenological stages, appearance, and size is necessary. Additionally, the distinctive characteristics and the lifetime of each state of the plant are also required. This information on bean plants is detailed in the following paragraphs.

The growth process of bean plants has two main phases [14]: the vegetative and the reproductive ones (see Figure 1). The first one corresponds to the interval between the germination of the seed and the moment when the stem branches (known as the floral cluster). This phase usually lasts 35 days, and the bean seed undergoes several changes that are classified into five stages: V0: germination, V1: emergence, V2: primary leaves, V3: first trifoliate leaf, and V4: third trifoliate leaf (see Figure 2).

The reproductive phase is divided into five more stages: R5: preflowering, R6: flowering, R7: pod formation, R8: pod filling, and R9: maturation. This phase can last until day 77; from this moment the seed can be sown again. In Figure 2, the physical appearance of the bean seed [14] is observed in its vegetative phase. The present experiment focuses only on some stages of this phase; the reasons for doing so are detailed in the following sections.

2. Materials and Method

The raw material of this work was a bean seed crop analyzed in its vegetative stage (V4) under controlled conditions inside a laboratory, with a maximum westward sunlight of 3300 lux and a temperature range of 8 to 25 °C. Each seed was placed in a transparent plastic cup with 6 grams of cotton and watered with 15 milliliters of water daily (see Figure 3). To obtain reliable data for this study, twenty bean seeds were planted for 25 days; the growth of the samples was registered by means of photographs taken in a light box specially constructed for the experiment in the laboratory.

2.1. Data Acquisition

To create the classification system, an experimental setup for the acquisition was designed, which is shown in Figure 4.

A 1 m × 1 m × 1 m light box with white walls was built so that the dark color of the bean plant contrasts with the background, making its thresholding easier. Two cameras were placed inside. One camera was on the left side wall of the box, 10 cm from the floor, so that it could have the widest range of vision towards the opposite wall of the box. A second camera was placed on the roof of the light box, 70 cm away from the left side wall (also to achieve the widest range of vision). The sample cup was placed just below the top camera and centered in front of the lateral camera.

To capture the images, the cup containing the sample was marked on its base with the four cardinal points, and it was rotated by placing each of these points in front of the side camera (as shown in Figure 5). In this way, eight photos of each sample were acquired: four from the lateral camera and four from the top camera. By following these steps, the greatest possible amount of information to estimate the size of the plant in pixels was obtained.

The database was organized in folders named after the growth stage to which the images belong; within each of these, folders were created to separate the images by day, and within the latter, the images were named by sample. Illumination conditions inside the light box were directly related to the illumination of the exterior environment. To determine its numerical value, a lux meter was used for each experiment; the averaged illumination was 210 lux.

In the experiments, two Logitech C525 cameras with the following technical characteristics were used: image capture (4:3) with a resolution of 2 Mpx, a diagonal field of view of 69°, focal length: N/A, and a plastic-based lens and sensor.

2.2. Automatic Visual Classification Method

The proposed system is shown in Figure 6 and consists of three main blocks: thresholding, feature extraction, and fuzzy system classification. The general operation of the system is as follows: the photographs taken in the data acquisition stage are used as input to block 1, where they are divided into the red, green, and blue (RGB) color channels. After that, each channel is subjected to a binarization process. The result of each channel is multiplied with the others to obtain the intersection of the three segmented channels; this intersection is considered the number of pixels occupied by the plant in its image.
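The channel-wise binarization and intersection described above can be sketched in a few lines of NumPy. The threshold intervals below are illustrative placeholders, not the calibrated values found with the MATLAB Color Thresholder:

```python
import numpy as np

def threshold_plant(rgb, r_lim=(40, 200), g_lim=(60, 255), b_lim=(0, 150)):
    """Binarize each RGB channel against its threshold interval and
    intersect the three masks (logical AND, i.e., multiplication of the
    binary images) to isolate the plant.  Limit values are illustrative."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask_r = (r >= r_lim[0]) & (r <= r_lim[1])
    mask_g = (g >= g_lim[0]) & (g <= g_lim[1])
    mask_b = (b >= b_lim[0]) & (b <= b_lim[1])
    # A pixel survives only if it falls inside all three intervals.
    return mask_r & mask_g & mask_b

# Tiny synthetic image: one pixel inside all intervals, one outside
img = np.array([[[100, 120, 80], [255, 10, 255]]], dtype=np.uint8)
mask = threshold_plant(img)
print(int(mask.sum()))  # 1 plant pixel
```

The resulting binary mask plays the role of the intersection image of Figure 9, from which the plant's pixel area is counted.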

In block 2, the thresholding results of each image are taken to calculate their averaged area and standard deviation in pixels; this process is repeated for each sample plant. These features become the inputs of block 3, where they are used for the fuzzy training and validation in which the means and standard deviations are classified. Finally, the system is tested with a group of reserved data to observe the performance of the classification system; as a result, the bean plant's growth stage is obtained.

2.3. Block 1: Thresholding

To determine the size of each plant, the number of pixels it occupies in the photographs must be extracted. In the proposed approach, a color thresholding of the images must be performed. The bean in its seed form contains red tones; in its germination and emergence states it contains combinations of reds and blues; and in more mature stages, the green color predominates. By using the MATLAB Color Thresholder tool, the RGB thresholding that offered the best results was found, which is shown in Figure 7.

In Figure 7, it can be seen (at the top) that each color channel is represented on a scale from 0 to 255; by analyzing twenty plants, the thresholds were obtained. These were the thresholds that produced the best thresholding response across the overall stages of bean growth that were classified. The result of the binarization of each channel is shown in Figure 8.

Figure 8 shows that, for each color channel, a part of the surface of the plant appears (marked in white); these parts are combined to reconstruct the whole plant and make the final pixel count. The intersection of the R, G, and B channels, made with the AND operation, is shown in Figure 9.

In order to take advantage of all the information provided by the light sensors, the raw data were taken from the image matrix in RGB format. Based on the training images, the limits that separate each of the color components of the image were chosen with the MATLAB Color Thresholder tool in RGB. The thresholding boundaries were calibrated only for the bean plant in the experimental box under controlled illumination.

2.4. Block 2: Feature Extraction

The statistical mean of the pixel counts is employed to describe the plant size for each vegetative maturity stage, and the standard deviation is used to represent a range of size ambiguity. This is because fuzzy logic is inspired by the principle of incompatibility [15], which states that "as the complexity of a system increases, our ability to be precise and build instructions about its behaviour decreases to the threshold beyond which precision and meaning are exclusive characteristics." Following this principle, the fuzzy system was designed using the linguistic terms people use to describe each vegetative stage of the bean plant: seed, very small plant, small plant, medium plant, large plant, and very large plant.

The first step was to obtain the area from the plant's pixel count; for this stage, as seen in Figure 9, the black background is composed of pixels with value zero and the white pixels contain ones. So, the plant area in the image was obtained by counting the pixels equal to one. The second step was to obtain the average and standard deviation of the four lateral photos and the four top-view photos for each sample. The third step was labeling each photograph according to the stage of the plant that appears in the image. These three steps provide the inputs for training and testing the fuzzy system. Some results of this data processing stage are shown in Table 1.
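The three steps above can be sketched as follows. This is a minimal Python sketch with hypothetical masks; the function names and dictionary keys are assumptions for illustration:

```python
import numpy as np

def plant_area(mask):
    """Step 1: area in pixels = count of foreground (value 1) pixels."""
    return int(np.sum(mask))

def sample_features(lateral_masks, top_masks):
    """Step 2: per-sample descriptors, the mean and standard deviation of
    the plant area over the four lateral and the four top-view images."""
    lat = [plant_area(m) for m in lateral_masks]
    top = [plant_area(m) for m in top_masks]
    return {
        "lateral_mean": float(np.mean(lat)),
        "lateral_std": float(np.std(lat)),
        "top_mean": float(np.mean(top)),
        "top_std": float(np.std(top)),
    }

# Four hypothetical lateral and top binary masks (2x2) for one sample
lat = [np.eye(2, dtype=int) for _ in range(4)]        # 2 plant pixels each
top = [np.ones((2, 2), dtype=int) for _ in range(4)]  # 4 plant pixels each
feats = sample_features(lat, top)
print(feats["lateral_mean"], feats["top_mean"])  # 2.0 4.0
```

Step 3, labeling, amounts to attaching the known growth-stage number of the photographed plant to each feature record before training.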

To convert the area from pixels to cm², a black cube of 1.5 cm on each side was used, giving a base area of 2.25 cm². This cube was photographed with the two cameras of the light box, and the images were subjected to a thresholding to determine the area in pixels occupied by the cube, resulting in 130 px for the top camera and 165 px for the side camera. Finally, with (4) and (5), the areas were transformed from pixels to cm² (Table 1).
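The unit conversion reduces to a per-camera scale factor derived from the cube calibration. A minimal sketch (the function name is an assumption; the 130 px and 165 px figures are the paper's calibration values):

```python
# Calibration: a 1.5 cm cube face (2.25 cm^2) occupies 130 px in the
# top-camera image and 165 px in the lateral-camera image.
CM2_PER_PX_TOP = 2.25 / 130.0
CM2_PER_PX_SIDE = 2.25 / 165.0

def px_to_cm2(area_px, camera="side"):
    """Convert a plant area from pixels to cm^2 using the cube calibration."""
    scale = CM2_PER_PX_TOP if camera == "top" else CM2_PER_PX_SIDE
    return area_px * scale

# Sanity check: the cube itself converts back to its known face area
print(round(px_to_cm2(165, "side"), 2))  # 2.25
print(round(px_to_cm2(130, "top"), 2))   # 2.25
```

Any measured plant area in pixels is multiplied by the scale factor of the camera that produced it.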

2.5. Block 3: Fuzzy System

The fuzzy system was designed to classify from the emergence to the first trifoliate leaf stage. The information acquired from the feature extraction (see Table 1) indicates that it is possible to train the fuzzy system using the information obtained from the images taken by the lateral camera, that is, the average in pixels of the area of the plant in the image and its corresponding standard deviation. This is because the features extracted from the lateral camera's images had greater differences between stages compared with those of the top camera, which facilitates their classification. The fuzzy system was created using the MATLAB ANFISEDIT tool, which has the advantage of designing a fuzzy system from data [16]. With this tool, the set of data that works as inputs for training can be selected; in our case, the image's averaged pixel area (lateral camera), the standard deviation, and the expected output for each input must be indicated. In addition, the number of membership functions that describe the data of each input must be specified, along with their type, which can be triangular, trapezoidal, or Gaussian, among others. Finally, in the training process, the structure of the neural network was defined, together with how many epochs or iterations were required to reduce the classification error. MATLAB's ANFISEDIT tool uses the Takagi-Sugeno fuzzy inference model to calculate the weights (w_i) of the membership functions (MF); the AND method (or multiplication) is applied between the values of the MFs evaluated at a point of interest as follows:

w_i = μ_Ai(x) · μ_Bi(y)  (6)

The classified output of the fuzzy system is given by (7), where Z_i represents the weight of the fuzzy rule in the output:

z = (Σ_i w_i Z_i) / (Σ_i w_i)  (7)

Considering (6) and (7) and the logic of the Takagi-Sugeno model, several fuzzy system design configurations were tested to compare the training errors against the training data in continuous numbers. The systems designed and trained with the ANFISEDIT tool are shown in Figure 10.
The RMSE (Root Mean Square Error) values are given in Figure 10; for each fuzzy system, these values are the training and testing errors delivered by the ANFISEDIT tool.
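The Takagi-Sugeno inference described here can be illustrated with a short sketch. The rule parameters and function names below are hypothetical, not the trained values of the paper's systems:

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sugeno_classify(mean_px, std_px, rules):
    """Zero-order Takagi-Sugeno inference: each rule i has triangular MFs
    over the two inputs and a constant consequent Z_i.  The firing weight
    w_i is the product (AND) of the two memberships, per (6), and the
    crisp output is the weighted average sum(w_i*Z_i)/sum(w_i), per (7)."""
    w = np.array([trimf(mean_px, *r["mf_mean"]) * trimf(std_px, *r["mf_std"])
                  for r in rules])
    z = np.array([r["z"] for r in rules])
    return float(np.dot(w, z) / w.sum())

# Two illustrative rules (parameters are invented for the example):
rules = [
    {"mf_mean": (0, 100, 200),   "mf_std": (0, 10, 20),  "z": 2},  # emergence
    {"mf_mean": (150, 300, 450), "mf_std": (15, 30, 45), "z": 3},  # primary leaves
]
print(sugeno_classify(100, 10, rules))  # only rule 1 fires -> 2.0
```

ANFIS additionally tunes the MF parameters by gradient descent over epochs; the sketch covers only the inference pass.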

Initially, six linguistic variables were chosen: seed, very small plant, small plant, medium plant, large plant, and very large plant. In addition, experiments were conducted to observe the improvement in the accuracy of the proposed system by varying the number and type of membership functions, as shown in the results. With respect to the parameters of each function, their values were optimized using MATLAB's ANFIS tool; this process is also known as adjustment or tuning.

3. Results and Discussion

To test the designed fuzzy systems, some samples of each bean growth stage were randomly separated from the database. These data were not used for the fuzzy systems' training or testing. At the output of the classification systems, the approximate number assigned to each stage of bean growth (considering the RMSE error of each training) must be obtained when the average plant area in pixels and the standard deviation are given at the input. It is expected to obtain a two for the emergence stage, a three for the primary leaves stage, and a four for the first trifoliate leaf stage. Table 2 reports the performance of each designed fuzzy system tested with the features of the same samples. Failure cases are marked in bold font. This information provides a good basis for selecting the fuzzy system that should be used for the classification of the bean growth stage. Therefore, their receiver operating characteristic (ROC) curves were constructed, which serve to evaluate the performance of the classifiers' outputs.

The first filter for the selection of the classifier is the comparison of the ROC curves of the two fuzzy systems with triangular membership functions, namely triangular 6x6 and triangular 7x7, as seen in Figure 11(a).

The same task was done to compare the two designed fuzzy systems whose membership functions are of the trapezoidal type (Figure 11(b)) and the two fuzzy systems with Gaussian MFs (Figure 11(c)). The criterion for evaluating the performance of a classifier with ROC curves consists of seeing which of the curves is closest to the maximum value of the vertical axis (True Positive Rate), i.e., 1, while staying closest to zero on the horizontal axis (False Positive Rate). Mathematically, it can be said that the classifier with more area under the ROC curve has the best performance in classifying the test data.

From Figures 11(a), 11(b), and 11(c), the designed fuzzy systems with the most significant area are those with the fewest membership functions in each case. In Figure 11(d), it is observed that the system with the worst behavior is the Gaussian MF one. On the other hand, differentiating which of the other two is the best requires integrating each curve in order to select one for the developed application.

According to the results, the fuzzy system with the best behavior in terms of the number of failures is the one assigned seven triangular membership functions for the pixel-average input and seven triangular functions for the standard deviation input (Triangular 7x7). On the other hand, the design with the worst classification is the proposal with ten Gaussian membership functions. In this design, the classification fails in almost 50% of the data of the growth stage labeled with the number 3, and the same happens for the growth stage labeled with the number 4, according to Table 1. In Table 3, the area under the ROC curve of the three systems represented in Figure 11(d) is presented; it was calculated with MATLAB's trapezoidal integration method.
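The trapezoidal integration of a ROC curve is straightforward to reproduce. A minimal sketch with invented ROC points (not the paper's data; the function name is an assumption):

```python
import numpy as np

def auc_trapezoid(fpr, tpr):
    """Area under a ROC curve by the trapezoidal rule, analogous to
    MATLAB's trapezoidal integration.  Points must be sorted by
    false-positive rate."""
    fpr = np.asarray(fpr, dtype=float)
    tpr = np.asarray(tpr, dtype=float)
    # Sum of trapezoid areas between consecutive ROC points
    return float(np.sum(np.diff(fpr) * (tpr[:-1] + tpr[1:]) / 2.0))

# Illustrative ROC: a classifier reaching TPR = 1 at FPR = 0.2 has AUC = 0.9;
# the chance diagonal has AUC = 0.5
print(auc_trapezoid([0.0, 0.2, 1.0], [0.0, 1.0, 1.0]))  # 0.9
print(auc_trapezoid([0.0, 1.0], [0.0, 1.0]))            # 0.5
```

The classifier whose curve yields the largest such area is the one selected, which is the comparison summarized in Table 3.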

Based on the results presented in Tables 2 and 3, the system that offers the best classification of the three stages of bean growth (emergence, primary leaves, and first trifoliate leaf) is composed of 6 triangular membership functions for the average of pixels (area of the plant in the image) and 6 triangular membership functions for the standard deviation. The RMSE training and validation errors seen in Figure 10 are presented in the last column of Table 3. Even though they are not determinant, because the output is discrete (1, 2, 3, 4), they show the quality of the descriptors, in this case, the mean and standard deviation of the area in pixels of the bean plant.

4. Conclusions

A method and an artificial vision system were designed that take photographs to automatically classify the bean plant in three stages of its vegetative phase, with a recognition rate of 95.24%, according to the ROC analysis. The proposed estimation is based on the calculation of the number of pixels occupied by the plant in images taken at fixed distances between the cameras and the study sample. The photograph acquisition was carried out in a controlled environment to isolate the studied variable, the bean plant, from other elements, with a white background to facilitate the thresholding of its RGB components. Different fuzzy systems were tested to optimize the number of membership functions and their type, and to find the most significant dispersion data (average and standard deviation) of the segmented pixel counts. It is inferred that this work can be extended to more phases of the bean plant and to other plants whose life cycle increases their size in photographs taken at a fixed distance.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Acknowledgments

The authors greatly appreciate the support of TecNM, CONACyT, PRODEP, UG, ITESI, and ITESS. This work was partially funded by the CONACYT grant FC-2016-1961.