Abstract

In order to spray onto the canopy of interval-planted crops, an approach using a target spray robot with a composite visual servo system based on monocular scene vision and monocular eye-in-hand vision was proposed. A scene camera was used to roughly locate the target crop, for which image-processing methods for background segmentation, crop canopy centroid extraction, and 3D positioning were studied. An eye-in-hand camera was used to precisely determine the spray position for each crop. Based on the center and area of the 2D minimum-enclosing-circle (MEC) of the crop canopy, a method to calculate the spray position and spray time was established. In addition, a locating algorithm for the MEC center in the nozzle reference frame and the hand-eye calibration matrix were studied. The process of the mechanical arm guiding the nozzle to spray was divided into three stages, reset, alignment, and hovering spray, and the servo method of each stage was investigated. For preliminary verification of the theoretical studies, a simplified experimental prototype containing one spray mechanical arm was built, and performance tests were carried out under a controlled environment in the laboratory. The results showed that the prototype could achieve the effect of "spraying while moving and accurately spraying on target."

1. Introduction

With multiple advantages such as improving the efficiency of pesticide use and reducing environmental pollution, target spraying has become a hot topic in the field of precision agriculture. A target spray system automatically sprays on a target after acquiring the target's information (such as the crop's location, morphology, diseases, and insect pests) and combining it with the spray requirements and the operation information of the actuator (such as the sprayer's speed, mechanical arm motion control, and nozzle features). It is significant for increasing the utilization ratio of pesticide, reducing pesticide residues on crops, and protecting the environment and workers [1, 2].

Compared with robot control based on infrared sensors, ultrasonic sensors, and laser radar, visual servo control exhibits high flexibility and intelligence and is more suitable for the agricultural environment [3–5]. In recent years, many scholars in America, Europe, Japan, China, and South Korea have carried out research on target spray technology based on visual servoing and have made notable progress. Giles and Slaughter developed a visual servo spray system which could make the spray nozzle move along the center line of a crop row [6]. The precision herbicide application system reported by Tian can separately control each nozzle to perform variable spraying in each control zone according to weed infestation conditions detected by vision [7, 8]. Steward and Tang designed a target spraying system based on machine vision and a finite state machine controller, used for accurate spraying of herbicide to remove weeds in the field [9]. Li et al. studied a spray system to detect diseases and insect pests of plants in a greenhouse and spray automatically [10]. Zhang et al. developed a mobile spray robotic system for greenhouses, which achieved spray control based on regional differences in plant diseases and insect pests detected by machine vision [11, 12]. Yufeng et al. established an indoor smart pesticide spray system based on machine vision and carried out an in-depth study of problems such as image processing, application decision-making, and data exchange [13]. To the best of our knowledge, however, direct and accurate spraying onto the canopy of interval-planted crops has not yet been implemented.

To better solve the problem of accurate spraying onto the canopy of interval-planted crops, this paper proposes a schematic design of a target spray robot system based on composite visual servoing and builds a simplified experimental prototype to verify part of the design's performance in the laboratory. Section 2 describes the construction and working method of the system. Section 3 details the solutions to four key issues involved in realizing the system's functions. Section 4 presents a simplified prototype containing one spray mechanical arm, used to test the accuracy of the spray position and spray time and other functions of the design.

2. Material and Methods of the Target Spray Robotic System

2.1. System Structure

The target spray robot system based on composite visual servoing is mainly composed of the robot body, the spray system, and the visual processing and servo control system.

The robot body is shown in Figure 1. The rigid frame, the mounting base of the other parts, is fixed on the front of the motion platform (not shown). The movable frame, mounted on the rigid frame, can move up and down (z direction) along the guide rail of the rigid frame under the action of the lifting linear actuator. The translation stage is fixed on the front end of the movable frame. The spray mechanical arms, mounted on the translation stage and arranged as one or more as desired (three in Figure 1), can move horizontally (y direction) along the guide rail of the translation stage. Each spray mechanical arm is a three-degree-of-freedom (DOF) serial mechanism and can move the nozzle in the x, y, and z directions.

The composition of the spray system is shown in Figure 2. A manual switch is used to manually control the on/off state of the liquid box. A DC pump provides power for the spray process. A filter removes impurities from the liquid to protect the pipeline and nozzles. A pressure sensor and a flow sensor acquire the relevant information of the main line, which is provided to the control system for decision-making. A throttle in the main line and an overflow valve in the return line are used to adjust the flow and pressure of the main line. Each nozzle is equipped with an independent solenoid valve to control the start and stop of its spray process.

The visual processing and servo control system is composed of a vision unit, a sensor unit, and a control unit. The vision unit includes one scene camera and several eye-in-hand cameras. The sensor unit mainly consists of one speed sensor to detect the speed of the mobile platform and several position sensors to detect the positions of all moving parts of the spray mechanical arms. The control unit consists of one PC (host computer) and several DSPs (slave computers). Each DSP acquires and processes the information of the eye-in-hand camera, position sensors, speed sensor, flow sensor, and pressure sensor; combines it with PC instructions to control the spray mechanical arm and solenoid valve; and uploads the relevant information to the PC. The PC analyzes the scene camera images and the DSP feedback information to generate control instructions, which are transmitted to the corresponding DSP through a switch. The working principle of the visual processing and servo control system is shown in Figure 3.

2.2. Working Method

The basic working process of the system can be expressed as follows. Before spraying, the lifting linear actuator is moved to adjust the initial height of the spray mechanical arms so that the nozzles are basically consistent with the average spray height of the crops. Then the position of each spray mechanical arm on the translation stage is adjusted so that the distances between them are basically consistent with the row spacing and each nozzle corresponds to a row of crops. During spraying, the scene camera roughly positions the crops and transmits the information to the processing system, which controls the spray mechanical arm to move in advance. When a crop enters the field-of-view (FOV) of the eye-in-hand camera, the system pinpoints the crop and, by controlling the spray mechanical arm, guides the nozzle to the appropriate location above the centroid of the crop canopy to complete target spraying.

2.3. Composite Visual Servo System Structure

A monocular vision system can indirectly obtain depth information through a motion compensation method. Despite the more complex algorithm, it consumes fewer computing resources and has better real-time performance than a binocular system. Compared with a scene vision system, an eye-in-hand vision system shows higher target-positioning accuracy but covers a smaller working space [14, 15]. Considering that the spray robot has a large working space and requires real-time performance and high target-positioning accuracy, a composite visual servo method was designed based on monocular scene vision and monocular eye-in-hand vision. The scene camera, mounted at the front of the robot body at a fixed angle to the ground, acquires crop image information in real time as the robot marches through the field. The centroid coordinates of the crop canopies, acquired by recognizing the crop image information, are used to roughly position the crops and to direct the spray mechanical arm to move in advance so that the crops smoothly enter the FOV of the eye-in-hand camera. The image information acquired by the eye-in-hand cameras, mounted together with the nozzles at the ends of the spray mechanical arms, is recognized to get the accurate spray position and time of the target crop and to guide the nozzle to complete spraying.

The composite visual servo system used a double closed-loop structure (as shown in Figure 4). The inner loop adopted the location information given by the position sensors of the spray mechanical arms, while the outer loop adopted the location information of the crop acquired from the visual images. The sampling period of the outer loop depended on the real-time processing speed of the image detection system, and its sampling frequency was far lower than that of the inner loop. Such a control structure helps to improve the stability and dynamic performance of the system. A feature selector controlled the source of the feedback information: when the target crop entered the FOV of the eye-in-hand camera, the decision information of the vision controller used the features detected by eye-in-hand vision and did not switch back to scene vision features until spraying was completed.
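The switching behavior of the feature selector can be illustrated with a small sketch (hypothetical names; the paper does not specify the controller's implementation):

```python
class FeatureSelector:
    """Chooses which vision source feeds the outer control loop.

    Switches to eye-in-hand features when the target is inside the
    eye-in-hand camera's FOV and holds that choice until spraying of
    the current target finishes (the hysteresis avoids chattering).
    """

    def __init__(self):
        self.using_eye_in_hand = False

    def select(self, scene_feat, hand_feat, target_in_hand_fov, spray_done):
        if self.using_eye_in_hand:
            if spray_done:                 # switch back only after spraying
                self.using_eye_in_hand = False
        elif target_in_hand_fov:           # target entered eye-in-hand FOV
            self.using_eye_in_hand = True
        return hand_feat if self.using_eye_in_hand else scene_feat
```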

3. Principle and Method of Implementation

To realize visual servoing of the spray robot, two problems need to be solved. One is image interpretation, that is, how to quickly and accurately extract effective features from images. The other is servo control, that is, how to map visual features to the control space of the spray mechanical arm. The solution involves four key issues: background segmentation, feature extraction and target location, camera calibration, and the visual servo method. This study uses the spraying operation of field gourd seedlings as an example.

3.1. Background Segmentation

Gourd seedlings are generally planted at intervals, with a spacing of 0.4~0.6 m. When 4–6 true leaves have sprouted, the foliage needs to be sprayed with ethephon 2-3 times to promote the formation of female flowers and increase production. The crop images acquired under natural light mainly contain gourd seedlings, soil, dead leaves, small weeds, and black (or white) plastic film (as shown in Figure 5(a)). The gourd seedlings and weeds are green while the others are nongreen, and the weeds are smaller than the gourd seedlings.

Through experiments, the CIVE operator (CIVE = 0.441R − 0.811G + 0.385B + 18.78745) [16] and the OTSU method [17] were selected to carry out the background segmentation of the acquired RGB images (R: red, G: green, B: blue). Figure 5(b) is the gray image obtained by graying the original image with the CIVE operator, in which crop and noncrop (background) can be easily distinguished.

When the target occupies a proper proportion of the image, adaptive threshold segmentation based on the OTSU method can effectively suppress the impact of environmental change on image quality [18]. Figure 5(c) is the binary image after segmentation by this method; it has few noise points and an ideal segmentation effect.
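As a concrete sketch of this segmentation pipeline, the following Python/OpenCV fragment grays an RGB image with the CIVE operator and binarizes it with the OTSU method. It is a minimal illustration, not the authors' exact implementation; the file name is a placeholder:

```python
import cv2
import numpy as np

def segment_crop(bgr):
    """CIVE graying followed by OTSU binarization (crop = white)."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    cive = 0.441 * r - 0.811 * g + 0.385 * b + 18.78745
    gray = cv2.normalize(cive, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # CIVE is low for green vegetation, so invert when thresholding
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary

mask = segment_crop(cv2.imread("scene.jpg"))   # path is illustrative
```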

3.2. Feature Extraction and Target Positioning

Target positioning means acquiring the location of the target crop in the robot reference frame, so that the spray mechanical arm can be controlled to bring the nozzle to the operation location and complete spraying. The target crop moves backward relative to the spray robot during work. The robot reference frame is defined by the right-hand rule: the positive x-axis is the horizontal forward direction and the positive z-axis is the vertical downward direction (as shown in Figure 1), so the y-axis is horizontal and lateral.

3.2.1. Scene Camera Image

The scene camera image is used for coarse positioning of the target crop to guide the spray mechanical arm to move in advance. The centroid of the crop canopy is selected as the reference feature for positioning. It is possible to extract the image coordinates of the centroid first and then calculate its 3D coordinates in the robot reference frame according to the transformation relationship between the image reference frame and the robot reference frame, which is given in Section 3.3.1. The extraction method of the centroid is as follows.

Generally, the binary image after threshold segmentation includes some defects, such as weed images, random noise, and separation of individual crop leaves. A feasible processing method is as follows: firstly, apply a morphological closing operation to improve image connectivity; secondly, label connected regions with the dual-scanning method, calculate the pixel area of each region, and remove small regions such as weeds, random noise, and distant crops by area filtering; finally, take the average pixel coordinates of each remaining connected region as the centroid coordinates of the target crops. The results are shown in Figure 5(d), in which the centroid position of each crop is marked with a cross.
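These steps can be sketched with OpenCV as follows; note that cv2.connectedComponentsWithStats stands in for the dual-scanning labeling described above, and the area threshold is an assumed parameter:

```python
import cv2
import numpy as np

def crop_centroids(binary, min_area=2000):
    """Close small gaps, drop small regions (weeds/noise), return centroids."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(closed)
    # label 0 is the background; keep regions passing the area filter
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]
```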

3.2.2. Eye-in-Hand Camera Image

The eye-in-hand camera image provides the feature parameters used to determine the spray position and spray time of the target crop.

For vertical downward spraying with a full cone nozzle (such as TeeJet D5 and TeeJet DVP-4, Spraying System Co., IL, USA), the droplet coverage is approximately circular (as shown in Figure 6), and its area can be approximated as

$$S = \lambda \pi \left( H \tan\frac{\theta}{2} \right)^{2}, \quad (1)$$

where $\lambda$ denotes the proportion coefficient of the actual coverage of the droplet relative to its theoretical coverage, which is related to liquid viscosity, liquid temperature, liquid surface tension, spray height, and so forth; $H$ denotes the spray height; and $\theta$ denotes the spray cone angle. Let $Q$ denote the flow through the nozzle per unit time and let $q$ denote the amount of liquid needed by the crop per unit area. Then the spray time for the target crop will be

$$t = \frac{qS}{Q}. \quad (2)$$

By formulas (1) and (2), we know that if $\lambda$, $\theta$, $Q$, and $q$ are kept constant during spraying, then the area of the actual coverage $S$ is proportional to the square of the spray height $H$, and the spray time $t$ is proportional to $S$.

In order to meet the requirement of precision spraying, we can build a 2D minimum-enclosing-circle (MEC) of the crop canopy and make the actual coverage of the droplet match it during spraying. Then the center of the MEC can be used to determine the nozzle's $x$, $y$ position; the area (or radius) of the MEC can be used, through formula (1), to determine the relative height between the nozzle and the crop canopy (namely, the nozzle's $z$ position); and the spray time $t$ follows from formula (2).
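To make this concrete, the sketch below inverts the two reconstructed formulas: matching the droplet coverage to the MEC area yields the required spray height, and formula (2) yields the spray time. The parameter values for $\lambda$, $\theta$, $Q$, and $q$ are assumptions, not values from the paper:

```python
import math

def spray_height_and_time(r_mec, lam=0.9, theta_deg=65.0, Q=1.2e-5, q=0.5e-3):
    """r_mec: MEC radius of the canopy [m]; lam: coverage coefficient;
    theta_deg: nozzle cone angle; Q: nozzle flow [m^3/s];
    q: liquid needed per unit area [m^3/m^2]. All values are examples."""
    S = math.pi * r_mec ** 2                       # target coverage = MEC area
    # invert formula (1): S = lam * pi * (H * tan(theta/2))^2
    H = r_mec / (math.tan(math.radians(theta_deg) / 2.0) * math.sqrt(lam))
    t = q * S / Q                                  # formula (2)
    return H, t
```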

Figure 7(a) shows the original eye-in-hand camera image of a crop, and Figure 7(b) shows the binary image processed with the above segmentation and filtering method, in which the MEC of the crop canopy and a circular mark at its center are drawn.
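As a sketch, OpenCV computes the MEC of the segmented canopy directly (a minimal illustration assuming the binary mask produced by the method of Section 3.1):

```python
import cv2
import numpy as np

def canopy_mec(binary):
    """Return the center (u, v) and radius in pixels of the canopy's MEC."""
    pts = cv2.findNonZero(binary)                  # all foreground pixels
    (u, v), radius = cv2.minEnclosingCircle(pts)
    return (u, v), radius
```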

The $z$ coordinate of the MEC center in the eye-in-hand camera reference frame can be determined from the corresponding relationship between the target's image size and the camera position. According to the pinhole imaging model (Figure 8), the target's image size is related to the movement of the camera along the optical axis ($z$ direction) but basically not to movements along the $x$, $y$ directions.

Suppose that the camera moves from a starting position to an end position along the $z$-axis, and the distance moved is $d$. Let $R$ denote the actual radius of the target, $f$ the focal length, $r_1$ and $r_2$ the image radii of the target at the starting and end positions, and $z_1$ and $z_2$ the corresponding depths of the target. Based on the geometrical relationship, it can be seen that

$$\frac{r_1}{f} = \frac{R}{z_1}, \qquad \frac{r_2}{f} = \frac{R}{z_2}, \qquad z_1 = z_2 + d. \quad (3)$$

Combining the above and solving, the following can be obtained:

$$z_2 = \frac{r_1 d}{r_2 - r_1}. \quad (4)$$

The distance in the $z$ direction between the target crop canopy and the current camera position is $z_2$, which is the $z$ coordinate of the MEC center in the current camera reference frame. It is possible to obtain the 3D coordinates of the MEC center in the camera reference frame in accordance with $z_2$, the image projection coordinates $(u, v)$ of the MEC center, and the intrinsic parameter matrix of the camera. Besides, combining this with the eye-in-hand calibration matrix provided in Section 3.3.2, the 3D coordinates of the MEC center in the nozzle reference frame can be obtained.
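A compact sketch of this positioning step, combining the depth-from-motion result of formula (4) with pinhole back-projection (the intrinsic matrix K and the variable names are illustrative):

```python
import numpy as np

def mec_center_in_camera(r1, r2, d, u, v, K):
    """Depth from camera motion d along the optical axis (formula (4)),
    then back-project pixel (u, v) of the MEC center to 3D camera coords."""
    z2 = r1 * d / (r2 - r1)                        # depth after the motion
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return z2 * ray                                # [x, y, z] in camera frame
```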

3.3. Camera Calibration

In order to accurately acquire the crop's location information from images, the intrinsic and extrinsic parameters of the cameras need to be calibrated. The calibration of intrinsic parameters can use the classic Zhang calibration algorithm [19, 20]. The extrinsic parameters to be calibrated mainly include the transformation matrix between the scene camera reference frame and the robot reference frame and the transformation matrix between the eye-in-hand camera reference frame and the nozzle reference frame. The intrinsic and extrinsic parameters, assumed to be fixed in normal use, can be calibrated offline.

3.3.1. Extrinsic Parameter Calibration of Scene Camera and Target Location Estimation

While calibrating the camera intrinsic parameters by Zhang's calibration algorithm, we can obtain the transformation matrix between each calibration target reference frame and the camera reference frame (extrinsic parameters) at the same time. If the calibration target is placed at a set position of the robot reference frame and the two reference frames are made to have the same orientation, there is only a translation between them. Let $t$ denote the translation vector (the position of the calibration target origin in the robot reference frame) and let $T_{tc}$ denote the transformation matrix from the calibration plate reference frame to the camera reference frame; then the transformation matrix from the robot reference frame to the camera reference frame will be

$$T_{rc} = T_{tc}\begin{bmatrix} I_{3} & -t \\ 0^{T} & 1 \end{bmatrix}. \quad (5)$$

Assume that the heights of the crops are equal and that the canopies lie within the plane $z = h_c$ of the robot reference frame. Let $p$ denote the image projection point of the centroid (denoted by $P$) of a crop canopy and let $(u, v)$ denote the image coordinates of $p$; then the coordinates of $p$ in the camera reference frame will be

$$p_c = f\,K^{-1}\,[u, v, 1]^{T}, \quad (6)$$

where $K$ denotes the intrinsic parameter matrix of the scene camera and $f$ denotes its focal length. The homogeneous coordinates of $p$ may be represented as $\tilde{p}_c = [p_c^{T}, 1]^{T}$. Let $p_r$ denote the mapping point of $p$ in the robot reference frame; then

$$\tilde{p}_r = T_{rc}^{-1}\,\tilde{p}_c. \quad (7)$$

Similarly, the homogeneous coordinates of the mapping point (denoted by $O_r$) in the robot reference frame of the origin of the scene camera reference frame will be $\tilde{O}_r = T_{rc}^{-1}[0, 0, 0, 1]^{T}$. From the geometrical relationship it can be seen that $P$ is the intersection point between the plane $z = h_c$ and the extension line of $O_r$ and $p_r$. Therefore, the coordinates of $P$ in the robot reference frame can be approximately expressed as

$$X_P = X_O + k\,(X_p - X_O), \qquad k = \frac{h_c - z_O}{z_p - z_O}, \quad (8)$$

where $X_P$, $X_O$, and $X_p$ denote the 3D coordinates of $P$, $O_r$, and $p_r$, respectively, and $z_O$ and $z_p$ are the $z$ components of $X_O$ and $X_p$.
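Formulas (6)-(8) amount to intersecting the camera ray through the centroid's image point with the canopy plane. The following is a minimal sketch under the reconstructed equations; the function name and arguments are illustrative:

```python
import numpy as np

def locate_centroid(u, v, K, f, T_rc, h_c):
    """Map image centroid (u, v) to robot-frame coordinates on plane z = h_c."""
    T_cr = np.linalg.inv(T_rc)                             # camera -> robot
    p_c = f * (np.linalg.inv(K) @ np.array([u, v, 1.0]))   # formula (6)
    p_r = (T_cr @ np.append(p_c, 1.0))[:3]                 # formula (7)
    O_r = T_cr[:3, 3]                       # camera origin in robot frame
    k = (h_c - O_r[2]) / (p_r[2] - O_r[2])
    return O_r + k * (p_r - O_r)                           # formula (8)
```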

3.3.2. Eye-in-Hand Calibration

Eye-in-hand calibration means estimating the rigid transformation matrix between the eye-in-hand camera reference frame and nozzle reference frame.

Let $X_c$ denote one point in the eye-in-hand camera reference frame and let $X_n$ denote the mapping point of $X_c$ in the nozzle reference frame; then

$$\tilde{X}_n = \begin{bmatrix} R_{cn} & t_{cn} \\ 0^{T} & 1 \end{bmatrix} \tilde{X}_c, \quad (9)$$

where $R_{cn}$ and $t_{cn}$ denote the rotation matrix and translation vector of the transformation from the eye-in-hand camera reference frame to the nozzle reference frame, respectively. They remain unchanged because the relative position between the eye-in-hand camera and the nozzle is fixed. Let the optical axis of the eye-in-hand camera be arranged vertically downward and parallel to the axis of the nozzle; then the two reference frames have no relative rotation and $R_{cn}$ is an identity matrix. Therefore, we only need to calculate $t_{cn}$.

Similarly, let $X_r$ denote the mapping point of $X_n$ in the robot reference frame; then

$$\tilde{X}_r = \begin{bmatrix} R_{nr} & t_{nr} \\ 0^{T} & 1 \end{bmatrix} \tilde{X}_n, \quad (10)$$

where $R_{nr}$ and $t_{nr}$ denote, respectively, the rotation matrix and translation vector of the transformation from the nozzle reference frame to the robot reference frame. They are determined by the position and orientation of the nozzle. The matrix $R_{nr}$ is also an identity matrix because the spray mechanical arm can only translate the nozzle along the $x$, $y$, and $z$ directions.

Combining formula (9) and formula (10) gives

$$\tilde{X}_r = \begin{bmatrix} I & t_{nr} \\ 0^{T} & 1 \end{bmatrix}\begin{bmatrix} I & t_{cn} \\ 0^{T} & 1 \end{bmatrix}\tilde{X}_c = \begin{bmatrix} I & t_{cn} + t_{nr} \\ 0^{T} & 1 \end{bmatrix}\tilde{X}_c. \quad (11)$$

Then

$$X_r = X_c + t_{cn} + t_{nr}, \quad (12)$$

where $X_r$ and $X_c$ denote the 3D coordinates of the point in the two reference frames, respectively.

Using formula (12), the following can be obtained:

$$t_{cn} = X_r - X_c - t_{nr}. \quad (13)$$

Here $t_{nr}$ can be directly acquired from the state of the spray mechanical arm. The calibration point can be some special point on a regular object (such as the center point or centroid of a square, or the midpoint of an edge); $X_r$ can be obtained by measurement, and $X_c$ can be calculated according to the method provided in Section 3.2.2. This study selected several points and used the least-squares method to improve computational accuracy when practically calculating $t_{cn}$.
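Since both rotations are identities, the least-squares estimate of $t_{cn}$ from several calibration points reduces to the mean of the per-point residuals of formula (13). A minimal sketch (names are illustrative):

```python
import numpy as np

def estimate_t_cn(X_r_list, X_c_list, t_nr_list):
    """Least-squares t_cn from formula (13): t_cn = X_r - X_c - t_nr.
    Each argument is a list of 3-vectors, one entry per calibration pose."""
    residuals = [np.asarray(Xr) - np.asarray(Xc) - np.asarray(tnr)
                 for Xr, Xc, tnr in zip(X_r_list, X_c_list, t_nr_list)]
    return np.mean(residuals, axis=0)   # LS solution for a pure translation
```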

3.4. Visual Servo Method

The process of the spray mechanical arm guiding the nozzle to complete spraying can be divided into a reset stage, an alignment stage, and a hovering spray stage while the movable platform is moving forward. The reset stage begins at the end of spraying the former crop and stops when the distance between the robot and the target crop equals a set value, which should ensure that the target crop is completely inside the FOV of the eye-in-hand camera. During this stage, the X linear actuator returns to its positive limit position, the Y linear actuator returns to its middle position, and the Z linear actuator returns to its upper limit position. The alignment stage begins at the end of the reset stage and stops when the nozzle reaches the spraying position above the canopy of the target crop. During this stage, the X linear actuator is fixed, the Z linear actuator moves downward, and the Y linear actuator moves to the target position. During the hovering spray stage, the X linear actuator goes backward while the Y and Z linear actuators track the target position, to make the nozzle hover over the spraying position until the end of the spray process.

During the reset stage, the system acquires crop location information in real time from scene camera images to control the spray mechanical arm. At the end of the reset stage, the target crop has completely entered the FOV of the eye-in-hand camera, and the control information is switched to be provided by eye-in-hand camera images. That is to say, eye-in-hand camera images are used for accurate real-time control of the nozzle during the alignment and hovering spray stages.
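The stage sequencing and the vision-source switch can be summarized as a small state machine. This is an illustrative sketch only; the predicate names and structure are assumptions, not the authors' implementation:

```python
from enum import Enum, auto

class Stage(Enum):
    RESET = auto()
    ALIGNMENT = auto()
    HOVER_SPRAY = auto()

def next_stage(stage, dist_to_target, s0, at_spray_pos, spray_done):
    """Advance the spray cycle; scene vision drives RESET, while
    eye-in-hand vision drives ALIGNMENT and HOVER_SPRAY."""
    if stage is Stage.RESET and dist_to_target <= s0:
        return Stage.ALIGNMENT          # target fully in eye-in-hand FOV
    if stage is Stage.ALIGNMENT and at_spray_pos:
        return Stage.HOVER_SPRAY
    if stage is Stage.HOVER_SPRAY and spray_done:
        return Stage.RESET              # move on to the next crop
    return stage
```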

3.5. Speed Control of Three Electric Cylinders at Each Stage
3.5.1. Reset Stage

Let $v$ denote the forward speed of the movable platform, and let $d_x$, $d_y$, and $d_z$ denote the reset distances of the three electric cylinders detected by the position sensors, respectively. Also let $s$ denote the $x$ direction space between the centroid of the target crop canopy and the movable platform detected by the vision system, and let $s_0$ denote the set space between them at the end of the reset stage. Then the reset speeds of the three linear actuators are, respectively,

$$v_x = \frac{d_x\,v}{s - s_0}, \qquad v_y = \frac{d_y\,v}{s - s_0}, \qquad v_z = \frac{d_z\,v}{s - s_0}. \quad (14)$$
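A direct transcription of formula (14) as reconstructed above (each cylinder is given just enough speed to finish its reset while the platform covers the distance $s - s_0$):

```python
def reset_speeds(v, d_x, d_y, d_z, s, s0):
    """Formula (14): reset each cylinder within the platform travel s - s0."""
    t_reset = (s - s0) / v       # time left before alignment must begin
    return d_x / t_reset, d_y / t_reset, d_z / t_reset
```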

3.5.2. Alignment Stage

The X linear actuator is fixed, so

$$v_x = 0. \quad (15)$$

The nozzle approaches the target position as the movable platform advances, and its $x$ direction approach speed is

$$v_a = \frac{x_{k-1} - x_k}{T}. \quad (16)$$

The time needed for the nozzle to arrive at the target position in the $x$ direction is therefore $t_r = x_k / v_a = x_k T / (x_{k-1} - x_k)$.

Here $T$ denotes the sampling period, and $x_k$ and $x_{k-1}$ denote the $x$ coordinate of the spray target position in the nozzle reference frame at the $k$ and $k-1$ moments, respectively.

When the movable platform has moved for $t_r$ after the $k$ moment, it can be judged that the nozzle has arrived at the target position in the $x$ direction and the alignment stage is over. The Y and Z linear actuators should make the nozzle arrive at the target position in the $y$ and $z$ directions before the end of the alignment stage. The speeds are

$$v_y = \begin{cases} \beta\,\Delta y_k / t_r, & |\Delta y_k| > \delta_y \\ 0, & |\Delta y_k| \le \delta_y \end{cases} \qquad v_z = \begin{cases} \beta\,\Delta z_k / t_r, & |\Delta z_k| > \delta_z \\ 0, & |\Delta z_k| \le \delta_z. \end{cases} \quad (17)$$

In the above, $\beta$ is a scale factor greater than 1, chosen so that the nozzle arrives at the target position in the $y$ and $z$ directions in advance of the $x$ direction; $\Delta y_k$ and $\Delta z_k$ denote the $y$ and $z$ direction deviations of the nozzle's position relative to the target position at the $k$ moment, and $\delta_y$ and $\delta_z$ are the allowable variation ranges of the $y$ and $z$ direction deviations, respectively.
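Formulas (15)-(17), as reconstructed above, translate into the following sketch; the dead-band form and the example values of $\beta$, $\delta_y$, and $\delta_z$ are assumptions:

```python
def alignment_speeds(x_k, x_k1, T, dy_k, dz_k, beta=1.2,
                     delta_y=1e-3, delta_z=1e-3):
    """Return (v_x, v_y, v_z) during alignment; parameter values are examples.
    x_k, x_k1: target x coordinate in nozzle frame at moments k and k-1."""
    v_x = 0.0                                    # formula (15): X cylinder fixed
    t_r = x_k * T / (x_k1 - x_k)                 # time to reach target in x
    v_y = beta * dy_k / t_r if abs(dy_k) > delta_y else 0.0   # formula (17)
    v_z = beta * dz_k / t_r if abs(dz_k) > delta_z else 0.0
    return v_x, v_y, v_z
```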

3.5.3. Hovering Spray Stage

To ensure that the nozzle hovers over the target position during spraying, the backward speed of the X linear actuator should match the movable platform's forward speed, and $\Delta y$ and $\Delta z$ should be restricted within the corresponding allowable ranges $\delta_y$ and $\delta_z$ by controlling the Y and Z linear actuators. The speeds of the three linear actuators are, respectively,

$$v_x = -v, \qquad v_y = \gamma\,\frac{y_k - y_{k-1}}{T}, \qquad v_z = \gamma\,\frac{z_k - z_{k-1}}{T}, \quad (18)$$

where $\gamma$ is another scale factor, and $y_k$, $y_{k-1}$, $z_k$, and $z_{k-1}$ denote the $y$ and $z$ coordinates of the spray target position in the nozzle reference frame at the $k$ and $k-1$ moments, respectively.
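Likewise for formula (18) as reconstructed above; the value of $\gamma$ shown is only an example:

```python
def hover_speeds(v, y_k, y_k1, z_k, z_k1, T, gamma=1.0):
    """Formula (18): hold the nozzle over the target while the platform moves."""
    v_x = -v                              # cancel the platform's forward speed
    v_y = gamma * (y_k - y_k1) / T        # track target drift in y
    v_z = gamma * (z_k - z_k1) / T        # track target drift in z
    return v_x, v_y, v_z
```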

4. Experiment Verification

A simplified prototype containing one spray mechanical arm (as shown in Figure 9) was built to demonstrate the feasibility of the design. Two of the arm's linear actuators (01TS, Jike Instrument Co., Beijing, China) had a stroke of 200 mm, and the third (XC800, Xunchi Electric Co., Zhejiang, China) had a stroke of 150 mm. The scene camera (acA1300, Basler AG, Ahrensburg, Germany), which had a resolution of 1296 × 966 pixels and a frame rate of 30 fps, was mounted on the top right of the frame with its optical axis declined 46 degrees. The eye-in-hand camera (a1200, Lanseyaoji Co., Jiangsu, China), which had a resolution of 640 × 480 pixels and a frame rate of 30 fps, was mounted vertically on the extending end of the Z linear actuator. A quad-core (Core i5) portable computer was used as the main image-processing computer, and a low-cost microcontroller (STC89C52RC) was used to control the three linear actuators and all sensors. The link between the PC and the controller was an RS-232 serial communication line. To measure the spray position more easily, the nozzle was replaced with a laser light (SU6505, Sulei Laser Technology Co., Guangdong, China), and the crops were replaced with printed crop pictures marked with the MEC, a ruler, and grids (as shown in Figure 10). The ruler's origin was the center of the MEC and its index value was 1 mm.

For the spray process, spray position and spray time are important indicators, whose accuracy was tested through prototype experiments under a controlled environment in the laboratory. In the experiments, the expected spray position of the nozzle in the $x$ and $y$ directions coincides with the center of the MEC in the crop picture, and the actual spray position in the $x$ and $y$ directions can be obtained by measuring the spot of the laser light on the crop picture. The expected spray position in the $z$ direction is calculated from formula (1) with preset values of $\lambda$ and the spray parameters (owing to the limitations of the prototype structure, the calculated height needed to be multiplied by a factor of 2), and the actual spray position in the $z$ direction is provided by the position sensor of the Z linear actuator. The expected spray time of the nozzle is calculated from formula (2) with preset values of $q$ and $Q$, and the actual spray time is obtained by the software system from the difference between the starting and stopping times of spraying.

Five crop pictures placed at intervals were used as experimental objects, with spaces of about 400~600 mm in the $x$ direction and 30~60 mm in the $y$ direction. The experiments, measuring the spray position and spray time of each crop picture, were repeated 20 times under the same circumstances. During the experiments, the actual spray position in the $x$ and $y$ directions showed some fluctuation, and the center of the fluctuation range was treated as the actual spray position. Because the desired values can be regarded as 0, the deviation (actual value − expected value) is equal to the actual value. The deviation of the spray position in the $x$, $y$, and $z$ directions is shown in Figure 11(a), from which the positioning accuracy of the prototype in each direction can be estimated. Figure 11(b) shows the fluctuation of the actual spray position in the $x$ and $y$ directions; the fluctuation in the $x$ direction is obviously larger than that in the $y$ direction, which was mainly related to the velocity fluctuation of the motion platform. Figures 11(c) and 11(d) compare the average spray position with the expected position in the $x$, $y$ directions and in the $z$ direction, respectively; the deviation of the average spray position relative to the expected position is about −2.5 mm~1 mm in $x$, −0.5 mm~1 mm in $y$, and −1 mm~1 mm in $z$. Figure 11(e) shows the deviation of the actual spray time relative to the expected spray time, from which the spray time accuracy can be estimated, and Figure 11(f) compares the average spray time with the expected time; the deviation of the average spray time relative to the expected time is about −0.05 s~0.07 s. Moreover, the switching between scene visual servo and eye-in-hand visual servo was fluent, and the real-time performance of the system was good throughout the experiments.

5. Conclusions

(1) A schematic design of a target spray robotic system for spraying onto the canopy of interval-planted crops was proposed, and a composite visual servo method based on monocular scene vision and monocular eye-in-hand vision, together with a double closed-loop control structure, was studied for the system.

(2) On the basis of image segmentation and feature extraction, methods for roughly positioning the target with a single scene camera and for positioning the spray location with a single eye-in-hand camera were studied. A mechanical arm servo control method that divides the spray process into a reset stage, an alignment stage, and a hovering spray stage was proposed.

(3) The experimental results showed that the system exhibits high control accuracy and acceptable real-time performance, and the switch between scene visual servo and eye-in-hand visual servo was smooth. Using this system, the effect of "spraying while moving and accurately spraying on target" can be achieved.

Competing Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the Specialized Research Fund for the Doctoral Program of Higher Education, China (Grant no. 20120008110046), the Natural Science Foundation of Shandong Province, China (Grant no. ZR2012CQ026), and the Scientific Research Fund of Liaocheng University, China (Grant no. 318011519).