Journal of Sensors


Research Article | Open Access

Volume 2017 | Article ID 8367979 | https://doi.org/10.1155/2017/8367979

Yue Shen, Destaw Addis, Hui Liu, Fida Hussain, "A LIDAR-Based Tree Canopy Characterization under Simulated Uneven Road Condition: Advance in Tree Orchard Canopy Profile Measurement", Journal of Sensors, vol. 2017, Article ID 8367979, 13 pages, 2017. https://doi.org/10.1155/2017/8367979

A LIDAR-Based Tree Canopy Characterization under Simulated Uneven Road Condition: Advance in Tree Orchard Canopy Profile Measurement

Academic Editor: Ayman Suleiman
Received: 07 Jul 2017
Accepted: 09 Nov 2017
Published: 27 Dec 2017

Abstract

In real outdoor canopy profile detection, the accuracy of a LIDAR scanner in measuring canopy structure is affected by potentially uneven road conditions. The level of error associated with attitude angles arising from undulations in the ground surface can be reduced by developing an appropriate correction algorithm. This paper proposes an offline attitude angle offset correction algorithm based on a 3D affine coordinate transformation. The validity of the correction algorithm was verified by conducting an indoor experiment on a specially designed canopy profile measurement platform. During the experiment, an artificial tree and a tree-shaped carved board were continuously scanned at a constant laser scanner travel speed and detection distance under simulated bumpy road conditions. The acquired raw LIDAR laser scanner data was processed offline by a purpose-built MATLAB program. The results before and after correction show that the single-attitude angle offset correction method is able to correct the distorted data points in the tree-shaped carved board profile measurement, with a relative error of 5%, while the compound attitude angle offset correction method is effective in reducing the error associated with compound attitude angle deviation from the ideal scanner pose, with a relative error of 7%.

1. Introduction

Laser scanning sensors can provide more accurate detection of tree crop structures than infrared sensors and have the potential to be incorporated in intelligent machines in precision agriculture [1–3]. As Rosell and Sanz stated in [1], tree crop canopy characterization is a significant factor in numerous agricultural applications. Important agricultural tasks that can benefit from this plant-geometry characterization include the application of pesticides, irrigation, fertilization, and crop training. In the field of pesticide application, knowledge of the geometrical characteristics of plantations permits a better adjustment of the dose of the product applied, improving the environmental and economic impact [1, 4]. Obtaining a precise tree crop canopy profile at any point during its production cycle by means of a fast and accurate detection system helps to establish precise estimations of crop water needs, as well as valuable information that can be used to quantify its nutritional requirements [1, 5]. The development of fast, easy, and efficient methods to determine the fundamental parameters used to characterize a canopy structure is thus an important need. Recently, many research papers have investigated the use of LIDAR laser scanners for canopy measurement, owing to their high accuracy, high scan speed, and insensitivity to light sources, and have found good agreement between LIDAR and field measures, with values typically ranging from 0.85 to 0.95 [6–10]. More recently, Wei and Salyani [11] developed a laser scanning system to measure canopy height, width, and volume in citrus trees; the device showed an accuracy of 96% in length measurements in three perpendicular directions. In [12], a 270° radial range laser scanning sensor was evaluated for its accuracy in measuring target surfaces with complex shapes and sizes in X, Y, and Z Cartesian coordinates at different travel speeds and detection distances, and good results were reported. In [13], Grella et al.
designed a sprayer prototype able to automatically adapt spray and air distribution according to the characteristics of the target, the level of crop disease, and the environmental conditions. Several technological questions still need to be resolved, including improved detection systems able to characterize the tree crop canopy profile over complex terrain. In most side-view monitoring activities in orchards and high row-cultivated plants, the detection system must constantly contend with an uneven and complex path while collecting the laser scanning sensor data. The error associated with attitude angles arising from undulations in the ground surface results in a distorted dataset, which leads to incorrect target profile measurement [3]. Most studies in the field have evaluated the accuracy of the LIDAR sensor in measuring canopy structures regardless of the level of error associated with these attitude angles. This paper proposes an offline attitude angle offset correction method to adjust the distorted sensor dataset caused by undulations in the ground surface for precise tree crop detection and profile characterization. The method was developed based on a 3D affine coordinate transformation. In order to test the proposed correction method, an indoor target detection platform with a laser scanner was built. The platform was used to conduct a spray target detection experiment under simulated uneven road conditions. The experiment was divided into three test conditions based on the orientation of the laser sensor: (1) an ideal measurement condition test, (2) a single-attitude angle offset test, and (3) a compound attitude angle deviation test. The laser sensor data acquisition, data analysis, and data correction method are discussed in the subsequent sections.

2. Methods and Materials

2.1. Canopy Profile Measurement Platform

An indoor LIDAR-based target profile measurement platform was developed to measure a tree canopy profile under simulated complex terrain. Figure 1 shows the structure and main components of the platform, and Figure 2 shows the block diagram of the complete system, which is composed of a sliding motion control system and a LIDAR-based target detection unit.

2.1.1. Sliding Motion Control System

The sliding motion control system mainly consisted of a host controller, a custom-designed speed sensor, a high-performance AC servo drive (Servopack SGDM-08ADA, Yaskawa Electric Corporation, Japan), an AC servo motor (SGMSH-20A2A61, Yaskawa Electric Corporation, Japan), and a linear aluminum slider. The servo drive provided the link from the motor to the host controller and served as the “core of the control.” It was selected to be compatible with the servo motor and the host controller. The servo drive received command signals from the host control unit through the HMI (Human-Machine Interface) settings, processed the signals, and transmitted them to the servo motor in order to produce motion proportional to the command signals [14]. The command signals represented the position, the rotating speed, and the direction of the motor. The servo drive has several built-in control loop functions; in this design, we used the position loop control of the servo drive to manipulate the speed and the position of the laser sensor mounted on the sliding table [15]. The platform used a customized aluminum GT 80 series synchronous belt slider (FA80GT-5900, Shanghai Pei Machinery Co. Ltd., China) to realize the rectilinear movement of the sliding table. The slider is 6.4 m long, and the table on which the laser sensor was mounted is 30 cm long and 18 cm wide. The sliding table was driven by the servo motor, which has a 2 kW rated output power, a 3000 rpm rated speed, and a 6.36 N·m rated torque. The STM32 ARM processor-based host control unit (STM32F103VET6, STMicroelectronics, France) was designed to control the laser sensor movement along the slipway by specifying a position and speed as a set point. The system incorporated an embedded integrative touchscreen (MCGS, model TPC1061Ti, Beijing Kunlun Tongtai Automation Software Technology Co. Ltd.)
to provide effective user-machine interaction through a custom-designed graphical user interface (GUI). The integrative touchscreen was configured with the MODBUS communication protocol and the MCGS (monitor and control generated system) full-function configuration software to facilitate human-machine interaction, including manual parameter setting and real-time system status monitoring. The touchscreen was connected to the embedded processor via a standard serial RS-485 COM port. The embedded program was developed to attain the required system operation and monitoring; the embedded software comprised a system initialization program, a switch button input program, a communication program, and a sliding motion control program. The entire software was coded with the MDK-ARM Keil IDE in the C programming language.

2.1.2. Target Detection Unit

The LIDAR-based target detection unit included a high-speed laser scanning sensor and an industrial microcomputer for real-time target detection and data acquisition. The target detection unit used a 270° range laser scanning sensor (Model UTM-30LX, Hokuyo Automatic Co. Ltd., Japan) to measure target object surface distances based on the time-of-flight principle. This sensor is able to continuously transmit and receive 1080 signals in a 270° radial range at 0.25° angular resolution within a 0.025 s measurement cycle. The time between transmission and reception of laser signals was used to measure the distance between the sensor and the target object surface [16]. The laser sensor was mounted on the adjustable frame at a height of 1.65 m above the ground surface in such a way that its 90° blind surface faced downward toward the ground. The adjustable frame was designed to set the orientation of the laser sensor to simulate actual uneven road conditions. The sensor was connected to the industrial microcomputer via a universal serial bus (USB) interface for data communication. A data acquisition program was developed in the C++ programming language on the Visual Studio platform (Microsoft Visual Studio 2005, Microsoft Corporation, USA) to control the laser sensor and to acquire the measurement data from the laser scanning sensor in real time.
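The ranging principle and scan geometry above can be sketched numerically. The snippet below is an illustrative Python sketch (not part of the authors' C++ acquisition software); it uses only the sensor specifications quoted in the text (270° span, 0.25° angular resolution, 0.025 s measurement cycle).

```python
# Illustrative sketch of time-of-flight ranging and the UTM-30LX scan
# geometry, using only the specifications quoted in the text.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Range from a round-trip laser flight time (distance = c * t / 2)."""
    return C * round_trip_s / 2.0

# A target 2 m away returns the pulse after roughly 13.3 ns:
t = 2 * 2.0 / C
print(round(tof_distance(t), 6))  # 2.0

# 270 deg span at 0.25 deg angular resolution -> 1080 beams per frame,
# and a 0.025 s measurement cycle -> 40 frames per second.
beams = int(270 / 0.25)
print(beams)        # 1080
print(1 / 0.025)    # 40.0
```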

2.2. Attitude Angle Offset Correction Algorithm Development

The laser sensor can be rotated about three orthogonal axes, as shown in Figure 3(a). These rotations will be referred to as yaw, pitch, and roll, and they can be used to place the laser sensor in any orientation in 3D space. Figure 3(b) shows two reference coordinate systems, defined as a sensor-fixed coordinate system (u, v, w) and a ground coordinate system (x, y, z), based on the laser sensor installation on the platform. The sensor-fixed coordinate system is rigidly attached to the laser sensor, and its origin is the center of the sensor [17].

In an ideal measurement condition, where the laser sensor is positioned perfectly parallel to the ground surface, the sensor-fixed coordinate system coincides with the ground coordinate system. In this case, the laser sensor’s data points lie on the same vertical fan-shaped scan plane. When the laser sensor is in a static mode, its dataset (1080 points per frame) provides a 2-dimensional grid of line-of-sight distances between the sensor and the target object. When the sensor travels horizontally, it offers an array of distance data from which a 3D surface can be formed with proper algorithms. Since the 90° blind surface of the sensor faced downward toward the ground, the sensor detects objects on both sides of the slipway, and each side can have a maximum of 540 detected target surface points [12]. As illustrated in Figure 4, each laser sensor data point p can be defined by a distance and an angle referred to 0° at the position of the laser sensor in the sensor-fixed coordinate system as

p_i = (d_i, θ_i),     (1)

where i is the detected point index (1, 2, 3, …), d_i is the distance from the center of the laser sensor to the target surface point, and θ_i is the scan angle.

From (1), the projection of each distance vector d_{n,j} onto the u-, v-, and w-axes for the nth frame (1080 points) can be defined as [18, 19]

u_{n,j} = 0,   v_{n,j} = d_{n,j} sin θ_j,   w_{n,j} = d_{n,j} cos θ_j,     (2)

where d_{n,j} is the distance measured from the sensor to the target for frame n and beam (detected point) j, and θ_j is the angle between the vertical 0° line and laser beam j (Figure 4).
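The frame projection just described can be illustrated with a short sketch. This is a hedged Python illustration (the original processing was done in MATLAB); `project_frame` is a hypothetical helper, and the angle convention (θ measured from the vertical 0° line) follows the text.

```python
import math

def project_frame(distances, angles_deg):
    """Project each beam's range onto the sensor-fixed v (horizontal)
    and w (vertical) axes; the scan angle theta is measured from the
    vertical 0 degree line, so v = d * sin(theta), w = d * cos(theta)."""
    points = []
    for d, theta_deg in zip(distances, angles_deg):
        theta = math.radians(theta_deg)
        points.append((d * math.sin(theta), d * math.cos(theta)))
    return points

# A beam along the 0 degree line has no horizontal offset:
print(project_frame([2.0], [0.0]))  # [(0.0, 2.0)]
```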

The essential computation in LIDAR dataset processing is the calculation of the coordinates of the detected target surface point using the LIDAR parameters with respect to the relevant coordinate systems [17, 20]. In the ideal measurement condition, the relative dataset distortion is insignificant. In the real field, however, due to undulations in the soil surface, the sensor rotates randomly about the sensor-fixed coordinate axes by β, γ, and α in the u, v, and w directions, respectively. This rotation causes dataset distortion. Since the orientation of the laser sensor can be defined by its roll, pitch, and yaw rotations from an initial position, we used a 3D affine coordinate transformation to develop an attitude angle offset correction algorithm for proper target profile measurement [21]. The algorithm was developed according to the following three steps:

(1) Determine the laser sensor range vector r for each frame of the dataset by using the scan angle (θ_j).

(2) Generate the three rotation matrices that align the sensor-fixed coordinate frame with the ground coordinate frame by using the respective attitude angles (β, γ, and α). The rotation matrices (roll, pitch, and yaw), which transform the range vector under a rotation of the sensor-fixed coordinate system by the angle β in roll, γ in pitch, and α in yaw about the u-, v-, and w-axes, respectively, are

R_u(β) = | 1       0       0      |
         | 0       cos β   −sin β |
         | 0       sin β   cos β  |

R_v(γ) = | cos γ   0       sin γ  |
         | 0       1       0      |
         | −sin γ  0       cos γ  |

R_w(α) = | cos α   −sin α  0      |
         | sin α   cos α   0      |
         | 0       0       1      |     (3)

(3) Apply the rotation matrices to the range vector based on the orientation of the laser sensor.
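The three rotation matrices and their application to a range vector can be sketched as follows. This is an illustrative Python/NumPy sketch under the convention used here (roll about u, pitch about v, yaw about w), not the authors' MATLAB implementation; the sample range vector is hypothetical.

```python
import numpy as np

def R_roll(beta):
    """Rotation about the u-axis by the roll angle beta."""
    c, s = np.cos(beta), np.sin(beta)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def R_pitch(gamma):
    """Rotation about the v-axis by the pitch angle gamma."""
    c, s = np.cos(gamma), np.sin(gamma)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def R_yaw(alpha):
    """Rotation about the w-axis by the yaw angle alpha."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Rotate a (hypothetical) range vector for a 20 degree
# single-attitude roll offset; the vector's length is preserved.
r = np.array([0.0, 1.2, 1.6])
r_ground = R_roll(np.radians(20.0)) @ r
```

A rotation matrix is orthogonal, so applying it never changes the measured range, only its decomposition onto the ground axes.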

The resulting formula for a single-attitude angle offset correction is

r′ = R r,     (4)

where r is the range vector with respect to the sensor-fixed coordinate system, r′ is the rotated range vector with respect to the ground coordinate system, and R is the rotation matrix.

In (4), R is the rotation matrix which rotates the range vector counterclockwise about one of the three orthogonal axes of the sensor-fixed coordinate frame by the respective rotation angle. For instance, if there is only a rotation about the u-axis by the roll angle β, then the above equation becomes r′ = R_u(β) r.

In an actual spray application field, however, the sensor rotates randomly about the sensor-fixed coordinate axes by the roll, pitch, and yaw angles, and the single-attitude angle offset correction formula (4) cannot be applied to this randomly combined attitude angle offset condition. Therefore, a single composite rotation matrix must be used to determine the orientation of the sensor in 3D space. The single composite rotation matrix can be formed by multiplying the yaw, pitch, and roll rotation matrices as

R = R_w(α) R_v(γ) R_u(β)
  = | cα cγ   cα sγ sβ − sα cβ   cα sγ cβ + sα sβ |
    | sα cγ   sα sγ sβ + cα cβ   sα sγ cβ − cα sβ |
    | −sγ     cγ sβ              cγ cβ            |     (5)

where c and s abbreviate the cosine and sine operations, respectively, and R is the composite rotation matrix.

The composite rotation matrix can describe the orientation of the laser sensor relative to the ground coordinate system as expressed in the subsequent formula:

r′_{n,j} = R r_{n,j},     (6)

where r_{n,j} is the range vector for frame n and beam j with respect to the sensor-fixed coordinate system, r′_{n,j} is the rotated range vector for frame n and beam j with respect to the ground coordinate system, and R is the composite rotation matrix.

It is important to note that R performs the roll first, then the pitch, and finally the yaw. If the order of these operations is changed, a different rotation matrix results [22]. There are six possible orderings of these three rotation matrices and, in principle, all are equally valid; however, the rotation matrices do not commute, meaning that the composite rotation matrix depends on the order in which the roll, pitch, and yaw rotations are applied. If the laser sensor rotates randomly about the three principal axes of the sensor-fixed coordinate system by the yaw, roll, and pitch angles, then the composite rotation matrix can be used to rotate each detected data point by the respective attitude angles (β, γ, and α) in the u, v, and w directions, respectively, as indicated in (6).
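The non-commutativity noted above is easy to check numerically. Below is a hedged Python/NumPy sketch; the angle values are chosen arbitrarily for illustration.

```python
import numpy as np

def rot(axis, angle):
    """Elementary rotation about one principal axis of the sensor frame."""
    c, s = np.cos(angle), np.sin(angle)
    mats = {"u": [[1, 0, 0], [0, c, -s], [0, s, c]],
            "v": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
            "w": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}
    return np.array(mats[axis])

beta, gamma, alpha = np.radians([10.0, 20.0, 30.0])  # roll, pitch, yaw

# Roll first, then pitch, then yaw (the composition used for R here):
R_wvu = rot("w", alpha) @ rot("v", gamma) @ rot("u", beta)
# The reversed order gives a different composite matrix:
R_uvw = rot("u", beta) @ rot("v", gamma) @ rot("w", alpha)

print(np.allclose(R_wvu, R_uvw))  # False: the rotations do not commute
```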

It is also essential to analyze the effect of each attitude angle offset (roll, pitch, and yaw) on the relative accuracy of the laser sensor dataset. The following paragraphs discuss the errors associated with each attitude angle offset along with their correction methods. Figure 5(a) illustrates the laser sensor’s roll, pitch, and yaw rotations, and Figure 5(b) depicts a detected target surface point with respect to the sensor-fixed and ground coordinate frames with the corresponding attitude angle offset.

When the laser sensor rotates around the u-axis from the ideal position by the roll angle (β), as shown in Figure 6, the blind area of the sensor moves onto the detection side as the roll angle increases. This situation has two effects on the detection result: first, the roll angle offset changes the depth value of the target canopy profile, because the measured depth becomes the horizontal projection of the range (the distance from the center of the sensor to the target surface); second, it causes wrong target detection or a missing part of the target profile. As the roll angle increases in the scanning direction, the laser sensor scans only a part, or none, of the target profile. These data distortions and wrong detections can be corrected by applying (7) to each frame of the laser sensor data.

If the laser sensor rotates about the v-axis by the pitch angle (γ), then, as explained in [21], the measured height of the target becomes a function of the inclination angle of the laser sensor. Figure 7 shows three target measurement conditions under pitch angle deviation. In the ideal measurement condition (Figure 7, left), the sum of the measured height H1 and the installation height of the device Hdev is equal to the true height H. However, if the sensor is aligned along a downward slope (Figure 7, center), the measured height becomes H2, which is larger than H1; this leads to a measured total height H2 + Hdev that is larger than the true height H. Similarly, if the sensor is aligned along an upward slope (Figure 7, right), the measured total height H3 + Hdev is smaller than the true height.

The measurement error associated with the inclination angle offset can be corrected by rotating the laser sensor data about the v-axis by the inclination (pitch) angle (γ).

If there is only a rotation about the w-axis by the yaw angle (α), then the sensor detects the target surface and the sliding table, which affects the detection result in three ways: (1) the depth value becomes too large, (2) a detected point leads or lags the detected frame, and (3) the distorted dataset results in missing or wrong detection points. The yaw angle deviation can be corrected by applying (9), which rotates each detected laser data point about the w-axis by the offset yaw angle (α) [23].

2.3. Attitude Angle Offset Correction Algorithm Validity Verification

The validity of the proposed attitude angle offset correction algorithm was verified by conducting an indoor target profile measurement test under a simulated laser sensor path. Figure 8 shows the developed spray target detection platform with an adjustable laser sensor frame. The adjustable frame was built to set the orientation of the laser sensor for the sensor path simulation.

In this indoor experiment, an artificial tree and a tree-shaped carved board were included as targets of interest (Figure 9). During the experiment, the target objects were positioned on the ground in such a way that their centerlines were located on the same straight line, parallel to the laser sensor travel direction. To meet the required detection distance in a real spray application field, the distance between the target objects and the laser sensor was set to 2 m. The actual physical dimensions of the target objects are listed in Table 1.


Objects                  | Height (m) | Width (m) | Canopy height (m)
Tree-shaped carved board | 1.80       | 1.20      | 1.40
Artificial tree          | 1.60       | 1.10      | 1.00

The experiment mainly included two parts: data collection and data analysis; the related operation flowchart is given in Figure 10. During data collection, the laser sensor was driven along the slipway at a constant travel speed to continuously scan the target objects. Real-time measurement data was stored on the onboard computer in the form of polar coordinates. The target profile measurement calculation, attitude angle offset correction, and 3D image construction were performed offline by an exclusively designed program in MATLAB (Version 7.7.0.471, MathWorks, Inc., Natick, Massachusetts). The distance data matrix was kept in a format of gray-scale values for constructing pseudo-color images that mapped the 3D object surface [12]. During the image construction process, the program also filtered out unnecessary measurement points reflected from the ceiling and the ground based on a distance threshold value. The threshold was determined based on the distance of the target object from the laser sensor; based on this value, the data points were classified as either necessary or unnecessary measurement points.
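The distance-threshold filtering step can be sketched as follows. This is an illustrative Python sketch, not the authors' MATLAB program; `filter_points`, the threshold value, and the sample frame are all hypothetical.

```python
def filter_points(frame, threshold_m):
    """Keep returns nearer than the distance threshold; farther returns
    (ceiling, ground, back wall) are discarded as unnecessary points."""
    return [(d, a) for (d, a) in frame if d <= threshold_m]

# Hypothetical frame of (distance m, scan angle deg) pairs, with a
# threshold set slightly beyond the 2 m target detection distance:
frame = [(1.9, 85.0), (2.1, 90.0), (5.6, 120.0)]
print(filter_points(frame, 2.5))  # [(1.9, 85.0), (2.1, 90.0)]
```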

The verification experiment for the proposed correction algorithm consisted of three test scenarios. First, the laser sensor was adjusted to the ideal measurement condition, in which there is no attitude angle deviation (the sensor is perfectly parallel to the ground surface). Under this condition, the tree-shaped carved board and the artificial tree geometrical profiles were continuously measured at a 0.6 m s−1 sensor travel speed and a 2 m detection distance. The real-time measurement data was stored on the onboard computer in the form of polar coordinates and then analyzed offline by the MATLAB program. Second, the tree-shaped carved board was selected as the laser sensor target object to test the single-attitude angle offset correction algorithm. The laser sensor was manually oriented at six selected attitude angle values (−30°, −20°, −10°, 10°, 20°, and 30°) for roll, pitch, and yaw, respectively, to simulate single uneven road conditions. For each attitude angle (yaw, roll, and pitch) deviation, the profile of the target object was measured five times at the same travel speed and detection distance as in the ideal measurement condition. The single-attitude angle offset correction algorithm discussed in Section 2.2 was used to correct errors associated with these attitude angle deviations. The entire data collection and data processing of the experiment were performed according to the test procedure flowchart shown in Figure 10.

In the last test scenario, the three kinds of attitude angles were combined to verify the validity of the combined attitude angle deviation correction algorithm. During the test, the orientation of the laser sensor was adjusted by three groups of combined roll, yaw, and pitch angles in the u, v, and w directions to simulate a complex laser sensor path; a distinct set of attitude angle (roll, pitch, and yaw) values was selected for each of group 1, group 2, and group 3. For each group of combined attitude angle deviations, the artificial tree was scanned five times at the same sensor travel speed and detection distance mentioned in the above two test scenarios. The data collection and data processing were performed based on the operation flowchart described in Figure 10.

3. Results and Discussion

In this section, the experiment results obtained from the above three correction algorithm validity tests are presented.

3.1. Test under Ideal Measurement Condition

During the first experiment in which the laser sensor was oriented in the ideal measurement condition, the real-time measurement data was processed offline by the MATLAB program. The program calculated the dimensions of the two target objects (tree-shaped carved board and the artificial tree) from the acquired raw distance matrix. 3D images of the targets were constructed and analyzed with respect to digital photos of the targets. Figure 11 and Table 2 show the reconstructed 3D images and profile dimensions of the target objects, respectively.


Each cell: actual value (m) / average value (m) / absolute error (m) / relative error (%).

Target parameters | Tree-shaped carved board | Artificial tree
Height            | 1.80/1.82/0.02/1.11      | 1.60/1.57/0.03/1.87
Width             | 1.20/1.16/0.04/3.33      | 1.10/1.07/0.03/2.73
Canopy height     | 1.40/1.36/0.04/2.85      | 1.00/0.98/0.02/2.00

Figure 11 shows images of the two target objects obtained from the digital camera and the laser sensor, respectively. The images from the laser sensor were reconstructed by the MATLAB program from the dataset acquired at a 2 m detection distance and a 0.6 m s−1 travel speed. The colors in the image represent distances from the laser sensor to the surfaces of the target objects. As depicted in the figure, each pair of target object images under the ideal measurement condition matched each other comparatively well. Table 2 shows the actual value, the average value, the absolute error, and the relative error of the target object parameters. As reported in the table, the relative error for both target object canopy height measurements at the specified detection distance and travel speed is less than 3.0%, while for width and height measurements the relative errors range from 1.11% to 3.33%.
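The absolute and relative errors in Table 2 follow the usual definitions. Below is a minimal Python sketch, reproducing two carved-board entries from the table as a check:

```python
def errors(actual_m, measured_m):
    """Absolute error (m) and relative error (%) of a measured dimension."""
    abs_err = abs(actual_m - measured_m)
    rel_err = 100.0 * abs_err / actual_m
    return round(abs_err, 2), round(rel_err, 2)

# Carved-board height from Table 2: actual 1.80 m, LIDAR average 1.82 m.
print(errors(1.80, 1.82))  # (0.02, 1.11)
# Carved-board width: actual 1.20 m, LIDAR average 1.16 m.
print(errors(1.20, 1.16))  # (0.04, 3.33)
```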

The obtained result shows that the accuracy of the laser sensor to measure the geometrical profile of the target objects in the ideal measurement condition is in an acceptable range.

3.2. Test under Single-Attitude Angle Deviation

In the second test scenario, where a single-attitude angle deviation existed, the results with and without the correction algorithm were compared to verify the performance of the single-attitude angle offset correction algorithm. The results show that the correction algorithm consistently improved the accuracy across the selected attitude angle values. Figure 12 shows the reconstructed 3D images of the tree-shaped carved board before and after the correction algorithm was applied. The 3D images were developed from the laser sensor measurement dataset acquired at a 0.6 m s−1 travel speed and a 2 m detection distance with 20° and −20° attitude angle offsets. The color of the image in the figure represents the depth value of the target object. The change in color in Figures 12(a)–12(d) and 12(g)–12(j) before and after correction indicates the depth value data correction. As depicted in Figures 12(e), 12(f), 12(k), and 12(l), the 20° and −20° pitch angle offsets have a significant influence on the target object canopy characterization; the influence is clearly seen on the reconstructed 3D images before correction. The algorithm reduced the relative error in canopy height measurement from 5.71% to 2.14% for the 20° pitch angle deviation and from 8.57% to 2.14% for the −20° deviation (Table 3), as also illustrated in the corrected 3D images of the target object (Figures 12(k) and 12(i)). Table 3 also shows the relative errors of the target object parameters before and after the correction algorithm for each attitude angle deviation.


Each cell: average value with correction (m) / relative error with correction (%) / average value without correction (m) / relative error without correction (%).

Roll angle
Deviation (°) | Height               | Width               | Canopy height
−10           | 1.83/1.66/1.76/2.22  | 1.18/1.66/1.26/3.33 | 1.41/0.07/1.44/2.85
−20           | 1.85/2.78/1.72/4.44  | 1.18/1.66/1.26/3.33 | 1.42/1.43/1.45/3.57
−30           | 1.65/8.33/1.32/26.44 | 1.18/1.66/1.27/5.83 | 1.42/1.43/1.47/5.00
10            | 1.83/2.22/1.85/2.77  | 1.18/1.66/1.17/2.14 | 1.41/0.07/1.38/1.43
20            | 1.84/2.22/1.86/3.33  | 1.18/1.66/1.17/2.14 | 1.42/1.43/1.36/2.85
30            | 1.84/2.22/1.88/4.44  | 1.18/1.66/1.16/2.85 | 1.43/2.14/1.35/3.57

Yaw angle
Deviation (°) | Height               | Width               | Canopy height
−10           | 1.82/1.67/1.84/2.22  | 1.18/1.67/1.17/2.50 | 1.42/1.42/1.34/4.28
−20           | 1.83/1.66/1.75/2.77  | 1.18/1.67/1.17/2.50 | 1.38/1.42/1.44/2.85
−30           | 1.78/1.11/1.71/5.00  | 1.18/1.67/1.17/2.50 | 1.38/1.42/1.34/4.28
10            | 1.78/1.11/1.76/2.22  | 1.18/1.67/1.17/2.50 | 1.38/1.42/1.43/2.14
20            | 1.78/1.11/1.73/3.89  | 1.22/1.66/1.18/1.66 | 1.39/0.71/1.45/3.57
30            | 1.76/2.22/1.72/4.44  | 1.22/1.67/1.17/2.50 | 1.38/1.42/1.47/5.00

Pitch angle
Deviation (°) | Height               | Width               | Canopy height
−10           | 1.82/1.11/1.85/2.77  | 1.18/1.67/1.15/4.17 | 1.41/0.71/1.47/5.00
−20           | 1.78/1.11/1.84/2.22  | 1.23/2.50/1.16/3.33 | 1.38/1.42/1.44/2.85
−30           | 1.83/1.66/1.86/3.33  | 1.15/4.17/1.12/6.66 | 1.42/1.43/1.46/4.28
10            | 1.82/1.11/1.77/1.66  | 1.17/2.50/1.23/5.38 | 1.42/1.43/1.43/2.14
20            | 1.82/1.11/1.75/2.77  | 1.24/3.33/1.26/5.00 | 1.37/2.14/1.38/1.42
30            | 1.84/2.22/1.73/3.88  | 1.23/3.33/1.25/4.16 | 1.43/2.14/1.45/3.57

As reported in Table 3, the relative errors of the target object parameters improved after the correction algorithm was applied. The method performed well for small attitude angle values. However, for a roll angle of −30°, the target height relative error before correction is about 26.44% due to the large roll angle offset; this large roll angle deviation causes the laser sensor to have a large blind spot on the detection side. In the actual spraying process, the undulation of the soil surface is unlikely to be large enough to cause such large angular offsets. Overall, the experiment results verify that the correction method for the single-attitude angle offset has a significant positive influence on error reduction.

3.3. Test under Compound Attitude Angle Offset

As discussed in Section 2.3, three different attitude angles were combined to verify the validity of the combined attitude angle deviation correction algorithm. During the test, the orientation of the laser sensor was adjusted by three groups of roll, yaw, and pitch angles in the u, v, and w directions. The experiment results with and without the correction algorithm are shown in Figures 13 and 14 and Table 4. The effect of the correction algorithm on the reconstructed 3D image of the target object can be clearly seen in Figure 13: the substantial change in color and shape in the images before and after the correction demonstrates that the developed correction algorithm can considerably adjust the distorted dataset for correct target profile measurement. As reported in Table 4, the measurement errors associated with the attitude angle deviations are significantly reduced by the correction algorithm; this is also shown in Figure 13. The bar graph in Figure 14 demonstrates that the relative errors in width, height, and canopy height measurement after correction became less than 7%.


Table 4: Artificial tree parameters measured under three groups of compound attitude angle deviations. For each parameter, the entries give the LIDAR-measured average with correction (and its relative error) and without correction (and its relative error).

Height (actual 1.60 m):
  Group 1: with correction 1.54 m (1.25%); without correction 1.67 m (4.37%)
  Group 2: with correction 1.56 m (2.50%); without correction 1.48 m (7.50%)
  Group 3: with correction 1.53 m (4.37%); without correction 1.49 m (6.87%)

Width (actual 1.10 m):
  Group 1: with correction 1.08 m (1.81%); without correction 1.02 m (7.27%)
  Group 2: with correction 1.07 m (2.73%); without correction 1.15 m (4.54%)
  Group 3: with correction 1.15 m (4.54%); without correction 1.03 m (6.36%)

Canopy height (actual 1.00 m):
  Group 1: with correction 0.94 m (6.00%); without correction 1.13 m (13.0%)
  Group 2: with correction 0.97 m (3.00%); without correction 1.11 m (11.0%)
  Group 3: with correction 0.95 m (5.00%); without correction 1.09 m (9.00%)

4. Conclusion

An erroneous dataset from the simulated uneven laser sensor path resulted in incorrect target profile measurement. A measurement error correction method was developed to efficiently correct the distorted data points. Since the occurrence of distorted data points is governed by the attitude angle offset of the laser scanning sensor, the developed method is based on coordinate system transformation and data point rotation under controlled conditions. To verify the validity of the correction method, an indoor LIDAR-based target profile measurement platform was built. The platform consists of a sliding motion control unit and a target detection unit, providing precise laser sensor motion along the slipway and real-time laser sensor data collection. In the verification experiment, an artificial tree and a tree-shaped carved board were used as laser scanner targets. These targets were continuously scanned at constant sensor travel speed and detection distance under three different measurement conditions. First, both targets were scanned under the ideal measurement condition, with the laser sensor oriented perfectly parallel to the ground surface. Second, the tree-shaped carved board was scanned under a single-attitude angle offset. Third, the profile of the artificial tree was measured under a controlled compound attitude angle offset. The results indicate that the single-attitude angle offset correction method corrects the distorted data points in the tree-shaped carved board profile measurement with a relative error of 5%, while the compound attitude angle offset correction method reduces the error associated with compound attitude angle deviation from the ideal sensor pose with a relative error of 7%.
The attitude angle offset correction methods, for both single and compound attitude angle deviations, are limited to a controlled, simulated environment, which makes them not directly applicable to practical automated crop monitoring in precision agriculture, especially in orchard cultivation. Nevertheless, this study serves as a starting point for future research in the field. Our approach demonstrated that deviations of the laser sensor from its ideal pose affect the final result of canopy detection. The real-time orientation of the laser sensor could be measured by an IMU (inertial measurement unit) and DGPS. By developing a real-time attitude angle inclination correction algorithm based on real-time laser sensor pose data, the detection system could be improved for several applications in precision agriculture; this will be the extension of this research work.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors gratefully acknowledge the funds provided by the National Natural Science Foundation of China (Grant no. 51505195) and the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions (China).


Copyright © 2017 Yue Shen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
