Guidance of an agricultural robot with a variable angle-of-view camera arrangement in a cornfield

Guidance performance of an agricultural robot was evaluated as the robot traveled along cornrows and performed a row-to-row turning maneuver in a cornfield using a variable angle-of-view (AOV) camera arrangement. The arrangement, built around one low-cost camera, achieves a flexible AOV by controlling servo motor motions, much like human vision. In this work, a forward far AOV was used to segment cornrows clearly and quickly, because it captures more crop information and a strong contrast between cornrows and soil. A forward near AOV with less crop information was chosen once the number of crop pixels decreased due to the approaching headland. Because the near AOV arrangement leaves a blind-spot area in front of the robot, a lateral near AOV was then adopted to guide the robot by acquiring crop images near the end of the cornrows, after which the headland turning was performed. During the headland turning, the robot obtained cornrow information using a lateral far AOV arrangement. After the turning, the robot traveled along the next cornrow with the same AOV as the first far AOV arrangement. Guidance information was detected with an image-processing algorithm employing mathematical morphological operations, and fuzzy logic control was applied to guide the robot according to this information. Tests with three replications showed that the robot traveled the cornrows successfully and performed the headland turning. GPS data showed stable guidance behavior, with a maximum guidance error of 22.58 mm, and good headland turning performance. The low-cost variable AOV camera arrangement is promising for field application on autonomous agricultural vehicles.


INTRODUCTION
Driving along the crop row and turning at the headland (headland turning) are the main maneuvers for autonomous agricultural vehicles. For the former, although much research has been based on a variety of sensors, visual navigation has been a research hotspot due to its low cost and good navigation performance (Reid et al., 2000; Mousazadeh, 2013). For example, camera-based guidance systems have been developed for tractors (Easterly et al., 2010), a field sprayer (Tian et al., 1999), a weeding cultivator (Tillett et al., 2002) and a grain harvester (Benson et al., 2003). The same method has also been applied to autonomous agricultural robots (Bakker et al., 2008). These applications share a common characteristic: a fixed forward angle of view (AOV) of the camera. However, a camera with a fixed AOV at a specific height produces only a fixed vision scene; it cannot obtain different scenes and additional information through flexible vision angles the way human vision can by looking up, looking straight ahead, looking down and turning. Meanwhile, a farther forward AOV, including cameras with a larger field of view, implies a longer blind distance in front of the vehicle, which is not appropriate for detecting crop plants near the end of crop rows. On the other hand, cameras with a wider field of view, such as omnidirectional cameras, are becoming attractive in some mobile robotic applications, but they raise issues such as geometric constraints, distortion and higher prices (Mariottini et al., 2012).
Meanwhile, increasing attention has been paid to headland turning in recent years. Zhu et al. (2007) created a suboptimal reference course for headland turning of a robot tractor and designed a path-tracking controller to guide the tractor along the reference course using a fiber-optic gyroscope, a laser auto-tracking range finder and a potentiometer; computer simulation and field tests were performed to validate the path-tracking performance. Subramanian and Burks (2009) developed a row-to-row turning maneuver for an autonomous vehicle at the headlands of citrus groves based on sensors such as a video camera, laser radar, an inertial measurement unit and wheel encoders. Bakker et al. (2009) identified the advantages (smaller headland space and less turning time) of using all degrees of freedom of four-wheel-steered vehicles for headland turning by simulating the kinematic model, compared with manual control using only two degrees of freedom. Huang et al. (2010) presented path planning and headland turning control algorithms based on an improved pure pursuit model for autonomous agricultural machines; simulation results showed that the algorithms were simple, required small headland space and achieved high tracking accuracy.
The author and his colleagues (Xue, 2011; Xue et al., 2012) have previously studied the guidance and headland turning of a robot with a variable AOV camera arrangement. The arrangement, consisting mainly of a low-cost webcam and two or three low-cost servo motors, has a flexible AOV like human vision. Building on that work, the author conducted further tests to validate row guidance and headland turning with the variable AOV camera arrangement as the robot traveled along cornrows and performed the row-to-row turning maneuver in a cornfield.

Variable AOV camera arrangement and robot platform
A fixed forward AOV has a specific vision range, such as the area ABCD shown in Figure 1, when the camera is fixed at a certain height and vision angle. Outside the area ABCD, the remaining region is a blind spot: crop plants cannot be detected within the blind distance L in front of the camera in the horizontal plane. To acquire more information, a smaller vision angle α can be taken, which yields a larger vision range but a longer blind distance L, as shown in Figure 1(a). A larger vision angle α yields a smaller vision range and less information, but a shorter blind distance L, as shown in Figure 1(b).
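As a rough geometric illustration of this trade-off, the blind distance of a pinhole camera can be sketched as follows; the camera height, pitch angle and vertical field of view used here are hypothetical values, not measurements from this work:

```python
import math

def blind_distance(height_m: float, pitch_deg: float, vfov_deg: float) -> float:
    """Horizontal blind distance L in front of the camera.

    The nearest visible ground point is where the lower edge of the
    field of view meets the ground: with the optical axis pitched
    pitch_deg below horizontal and a vertical field of view vfov_deg,
    the lower edge lies (pitch + vfov/2) below horizontal, so
    L = h / tan(pitch + vfov/2).
    """
    lower_edge = math.radians(pitch_deg + vfov_deg / 2.0)
    return height_m / math.tan(lower_edge)

# A steeper pitch (looking further down, as in the near AOV) shortens the
# blind distance; a shallower pitch (far AOV) lengthens it.
near = blind_distance(1.0, 48.0, 40.0)
far = blind_distance(1.0, 25.0, 40.0)
assert near < far
```

This makes the trade-off of Figure 1 concrete: lowering the vision angle grows the visible range ahead but also pushes the nearest visible ground point away from the vehicle.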
The variable AOV camera arrangement was developed by mounting the camera on a fixture driven by servo motors, as shown in Figure 2, where Motors 1 and 2 controlled the pitch and yaw motion of the camera, respectively. All camera angles were measured from the position in which the camera looks straight ahead, where the vision angle is set to 0°. The pitch angle was defined as α up or α down when the camera looks up or down, respectively, and the yaw angles were defined as β left and β right when the camera turns to the left or to the right, respectively. The camera arrangement therefore has characteristics similar to human vision: looking up, looking straight ahead, looking down and turning.
The variable AOV camera arrangement was mounted on a robot platform named "AgTracker"; its details were described by Xue et al. (2012).

Row guidance methods
The variable AOV camera arrangement can find the guidance line by detecting crop plants flexibly. In this work, several AOVs, including a far AOV, a near AOV and a lateral AOV, were used to guide the robot along the cornrows at both of its sides. When the robot was far from the headland, the first far AOV, with a large amount of crop information, was employed to segment the two cornrows clearly and quickly thanks to the strong contrast between the green cornrows and the soil. Once the green corn pixel count decreased due to the upcoming headland space, a near AOV containing less crop information was chosen. Because of the blind distance in front of the robot, a lateral AOV with the least crop information was used to guide the robot near the end of the cornrows when the crop pixel count in the near AOV fell below the second threshold value, implying that the end of the cornrows was approaching. The second far AOV was chosen during the headland turning, and the first far AOV was used again to guide the robot in the new cornrows after the turning.
The measurement principle of the offset and heading angle in the different AOV arrangements was detailed by Xue et al. (2012). Taking the characteristics of corn plants and cornrows into account, a detection algorithm for the guidance row was developed to guide the robot between cornrows, as shown in Figure 3. The region of interest was not the whole image but a local window containing the two cornrows, or one side cornrow, as the robot traveled along the cornrows. The local window improves the real-time performance of the detection algorithm and minimizes the effect of useless information and noise. Different window sizes were used for the different AOV camera arrangements; the details are given below. Next, green feature extraction was conducted to separate the crop from the soil, and the image was converted to a binary image. Mathematical morphological opening operations with a line-shaped structuring element were then applied to segment and identify corn plants and cornrows. The new binary image was divided into equidistant image strips, and the edge pixel coordinates of the two side cornrows were searched in each strip for the forward AOVs. For the lateral AOV, the edge pixel coordinates nearest to the maximum pixel column were found for every corn plant. Based on these edge coordinates, linear fitting was used to calculate the line(s) of the cornrow(s), and finally the guidance line was computed from the cornrow line(s).
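A minimal sketch of this detection pipeline in Python/NumPy, under simplifying assumptions (an excess-green index for crop/soil separation, a vertical-line structuring element, and a leftmost-edge search in each strip; the actual thresholds, element shapes and window sizes are calibration values of this work and are not reproduced here):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def excess_green(rgb):
    """ExG = 2G - R - B, a common greenness index for crop/soil separation."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    return 2.0 * g - r - b

def _vertical(binary, k, reduce_fn):
    # Slide a k-pixel vertical window down each column (False padding).
    pad = np.pad(binary, ((k // 2, k // 2), (0, 0)), constant_values=False)
    return reduce_fn(sliding_window_view(pad, k, axis=0), axis=-1)

def open_vertical(binary, k=5):
    """Morphological opening with a vertical line structuring element:
    erosion (all True in window) then dilation (any True).  Removes
    blobs shorter than k pixels while keeping elongated crop rows."""
    return _vertical(_vertical(binary, k, np.all), k, np.any)

def fit_row_line(binary, n_strips=8):
    """Leftmost crop-pixel coordinate per horizontal strip, then a
    least-squares line column = a * row + b through the edge points."""
    h = binary.shape[0]
    rows, cols = [], []
    for s in range(n_strips):
        top = s * h // n_strips
        ys, xs = np.nonzero(binary[top:top + h // n_strips])
        if xs.size:
            j = int(np.argmin(xs))
            rows.append(top + ys[j])
            cols.append(xs[j])
    return np.polyfit(rows, cols, 1)  # slope a, intercept b

# Synthetic scene: brown "soil" with one green "cornrow" at columns 20-25.
img = np.zeros((80, 60, 3), np.uint8)
img[:] = (120, 90, 60)         # soil: ExG = 2*90 - 120 - 60 = 0
img[:, 20:26] = (40, 160, 40)  # crop: ExG = 2*160 - 40 - 40 = 240
binary = open_vertical(excess_green(img) > 20.0)
slope, intercept = fit_row_line(binary)
```

For a perfectly vertical row the fitted slope is near zero and the intercept near the row's left edge; the guidance line would then be derived from one such fitted line per cornrow.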

Headland turning methods
Once no crop plants appear in the image, the robot is approaching the headland and is about to perform the headland turning. Assuming the robot turns left to the next row, the headland turning operation is as follows: (1) Moving forward a distance: the robot moves forward along its current direction without image acquisition or processing. In this work, the distance was set to 1 m, which ensured the robot was completely clear of the end of the cornrows.
(2) First steering operation: as no direction sensor was used, indoor tests and then field trials were conducted to determine the speeds of both side wheels and the steering time for turning left, as introduced below. To simplify the headland turning control, the wheel speeds and steering time were set to fixed values that ensured the robot turned 90° to the left.

u\\E
(3) First position calculation: the robot stopped and changed the camera to the second far AOV arrangement, that is, the camera was turned perpendicular to the longitudinal direction of the robot in the horizontal plane. The camera then captured an image of the left cornrows, from which the position and direction of the robot relative to the next cornrows were calculated.
(4) Backing operation: this operation must be performed, with parameters depending on the inter-row distance in the cornfield and the length of the robot. Based on Step (3), the backward distance and the wheel speeds are determined. If the robot is perpendicular to the cornrows, it backs up for a distance with both side wheels at the same speed; otherwise, the wheel speeds are adjusted to make the robot as perpendicular to the cornrows as possible.
(5) Second steering operation: the same as Step (2). (6) Second position calculation: the robot stopped again, changed the camera back to the first far AOV, and then traveled along the next cornrows according to the position and heading calculated using that far AOV arrangement.
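For the two steering operations, the relation between fixed wheel speeds and the time needed to yaw 90° can be sketched with the standard differential-drive model; the 0.95 m track width below is an assumed value for illustration, since the robot's track is not given here:

```python
import math

def steering_time_90deg(v_left: float, v_right: float, track_m: float) -> float:
    """Time for a differential-drive robot to yaw 90 degrees at fixed
    wheel speeds.  Yaw rate: omega = (v_right - v_left) / track."""
    omega = (v_right - v_left) / track_m  # rad/s
    return (math.pi / 2.0) / omega

# With the wheel speeds used in the field tests (0.01 and 0.2 m/s) and
# an assumed 0.95 m track width, the model gives a time close to the
# 8 s that was determined empirically.
t = steering_time_90deg(0.01, 0.2, 0.95)
```

In practice the steering time was fixed experimentally rather than computed, since wheel slip on soil makes the purely kinematic model optimistic.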

Row guidance and headland turning control
The robot was driven by servo motors via chains and gearboxes and steered according to the speed difference between the left and right wheels. Once the offset and heading angle of the robot were determined, row guidance control and backing control during headland turning were implemented by adjusting the pulse-width modulation (PWM) values of the motors, which changed the wheel speeds. Fuzzy control was adopted for row guidance, with two inputs, the lateral offset E and the heading angle θ of the robot platform, and one output U, the difference between the two side PWM values, as shown in Table 1. The input and output fuzzy sets were {NG, NM, NS, ZE, PS, PM, PG}; the universes of discourse were [-30 cm, 30 cm], [-30°, 30°] and [-300, 300] for the lateral offset E, heading angle θ and PWM difference U, respectively; and their fuzzy domains were all {-3, -2, -1, 0, 1, 2, 3}. It should be noted that a negative sign of the lateral offset, heading angle or PWM difference means that the robot deviates from the centerline to the right side, deflects clockwise, or steers to the right, respectively. Triangular membership functions with uniform distribution, the Max-Min fuzzy inference algorithm and centroid (center-of-gravity) defuzzification were applied in the fuzzy guidance controller.
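A minimal sketch of such a Mamdani controller is given below. The actual rule base of Table 1 is not reproduced in the text, so a plausible stand-in rule table is used, and the scaling factors simply map the universes of discourse onto the fuzzy domain {-3, ..., 3}:

```python
import numpy as np

CENTERS = np.arange(-3, 4)  # NG, NM, NS, ZE, PS, PM, PG on {-3, ..., 3}

def tri(x, c, half_width=1.0):
    """Triangular membership centred at c (uniformly distributed sets)."""
    return max(0.0, 1.0 - abs(x - c) / half_width)

def fuzzy_pwm_difference(offset_cm, heading_deg):
    """Mamdani controller: Max-Min inference, centroid defuzzification.

    Stand-in rule: each (E, theta) set pair fires the output set at
    -clip(cE + ctheta, -3, 3), so a rightward deviation (negative E)
    commands a left steer (positive U), matching the sign convention."""
    e = np.clip(offset_cm / 10.0, -3, 3)     # [-30, 30] cm  -> [-3, 3]
    th = np.clip(heading_deg / 10.0, -3, 3)  # [-30, 30] deg -> [-3, 3]
    strength = np.zeros(7)                   # aggregated firing per output set
    for ci in CENTERS:
        for cj in CENTERS:
            w = min(tri(e, ci), tri(th, cj))           # Min inference
            out = int(np.clip(-(ci + cj), -3, 3)) + 3  # output-set index
            strength[out] = max(strength[out], w)      # Max aggregation
    if strength.sum() == 0:
        return 0.0
    u = (strength * CENTERS).sum() / strength.sum()    # centroid on {-3..3}
    return float(u * 100.0)                            # -> [-300, 300]

# On the centerline with zero heading error, no correction is commanded.
assert abs(fuzzy_pwm_difference(0.0, 0.0)) < 1e-9
# Deviated 15 cm to the right: a positive (left-steering) PWM difference.
assert fuzzy_pwm_difference(-15.0, 0.0) > 0
```

The stand-in rule table should be replaced by the paper's Table 1 to reproduce the actual controller; the inference and defuzzification machinery is unchanged.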
During headland turning, the second far AOV camera arrangement was employed to calculate the robot's position and heading, which in turn determined how to perform the backing operation. An indoor simulation was used to work out the robot's position relative to the cornrows, as shown in Figure 4, where three black tape lines represented three actual cornrows. After the robot accomplished the first steering operation, the second far AOV was used to calculate its position and direction relative to the rows.

Experimental procedures
To verify the performance of the variable AOV guidance system, tests were conducted in a cornfield of the experimental farm of the University of Illinois, as shown in Figure 5. Corn was planted in rows with about 750 mm inter-row distance and about 150 mm intra-row distance, and the corn height was about 200 mm. A webcam was selected as the visual device and connected to a laptop computer. Crop images were acquired and processed with Matlab image acquisition and processing software. After processing an image, the computer sent commands to the microcontroller of the robot platform through serial communication.
For image segmentation, a large number of crop images taken at different times and under different weather conditions were sampled to obtain appropriate threshold values and structuring elements. The structuring element, an essential part of the opening and closing operations in morphology, was used to probe the input binary images. With a 0.2 m/s initial speed, the robot traveled along the cornrows from the starting point, 30 m away from the end of the cornrows, to the stopping point in the next cornrows, also 30 m from the end of the cornrows. During the headland turning operation, the left and right wheel speeds were set to 0.01 and 0.2 m/s for the two steering operations, the steering time was 8 s for both the first and the second steering, and the backward speed was set to 0.1 m/s.
At the beginning of the tests, the first far AOV (α down = 25°, β left = 0°) was chosen to guide the robot (Figure 6(a)). When the crop pixel count fell to the first threshold, a near AOV (α down = 48°, β left = 0°) was adopted automatically (Figure 6(b)). Once the crop pixel count became smaller than the second threshold, implying that the end of the cornrows was approaching, a lateral AOV (α down = 80°, β left = 30°) was employed to find corn plants on the left side (Figure 6(c)). Once no corn plants appeared in the images, the headland turning operation was performed. During this operation, the second far AOV (α down = 0°, β left = 90°) was selected to calculate the robot position and direction and thus plan the backing operation (Figure 6(d)). The robot then traveled to the stopping point, corresponding to the starting point, in the next cornrows after the headland turning.
According to the row guidance method above, images from the first far AOV, the near AOV and the lateral AOV camera arrangements were processed (Figures 7, 8 and 9). It should be noted that a small rectangular window from pixel row 110 to pixel row 330 was used for the far AOV image, a large rectangular window from pixel row 1 to pixel row 280 for the near AOV image, and a large side window from pixel column 1 to pixel column 340 for the lateral AOV image.
The algorithm of the whole test is shown in Figure 10. Once the corn pixel count Np fell to the first threshold value Thresh 1 due to the upcoming headland space, the near AOV was adopted. Because of the 1.5 m blind distance in front of the robot in this work, the lateral AOV, with the least crop information, was used to guide the robot near the end of the cornrows once the crop pixel count Np in the near AOV fell below the second threshold value Thresh 2, which implied that the end of the cornrows was approaching. When the robot arrived at the end of the row, the headland turning operation was performed immediately. After the headland turning, the robot traveled along the new cornrows according to the information from the far AOV arrangement.
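The AOV-switching logic of Figure 10 can be sketched as a small state machine; the threshold values below are illustrative placeholders, not the calibrated Thresh 1 and Thresh 2 of this work:

```python
# Placeholder thresholds for illustration only; the paper's Thresh 1 and
# Thresh 2 were calibrated from sampled field images.
THRESH_1, THRESH_2 = 5000, 1000

def next_aov(current: str, n_crop_pixels: int) -> str:
    """One step of the simplified AOV-switching logic.

    States: 'far1' -> 'near' -> 'lateral' -> 'turning' -> 'far1'.
    """
    if current == "far1" and n_crop_pixels <= THRESH_1:
        return "near"
    if current == "near" and n_crop_pixels <= THRESH_2:
        return "lateral"
    if current == "lateral" and n_crop_pixels == 0:
        return "turning"   # headland turning using the second far AOV
    if current == "turning":
        return "far1"      # back to the first far AOV in the new row
    return current

# A shrinking crop-pixel count walks the robot through the sequence.
state, trace = "far1", []
for np_count in (9000, 4000, 2500, 800, 0, 0):
    state = next_aov(state, np_count)
    trace.append(state)
assert trace == ["far1", "near", "near", "lateral", "turning", "far1"]
```

The real system additionally inserts the fixed-distance forward move, the steering operations and the backing operation inside the 'turning' state, as described in the headland turning methods above.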
According to this algorithm, the tests were performed with three replications from the same starting point to the same stopping point. A real-time kinematic (RTK) GPS receiver (Trimble 5800) with 1 cm accuracy was mounted to record the running trajectory of the robot for analyzing row guidance and headland turning performance. Once positioned, the robot was guided autonomously and was stopped at the stopping position by an operator.
From the GPS data, the maximum error, average error, RMS error and standard deviation were calculated to judge row guidance accuracy. The point after the first steering operation, the point after the second steering operation and the farthest point of the robot trajectory were identified to characterize headland turning performance. By comparing the end points after the two steering operations and the farthest point during the first steering operation in every test, and by surveying the turning trajectory, the effect of the moving-forward stage on headland turning was evaluated. Moreover, the backing control was evaluated according to the backward distance and the backing trajectory, and the second steering operation was analyzed according to the turning trajectory and its end point.
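The error statistics named above can be computed from the recorded cross-track errors as follows (a generic sketch; the sign convention follows Table 2, with negative errors to the right of the centerline):

```python
import numpy as np

def guidance_error_stats(errors_mm):
    """Maximum, mean, RMS and sample standard deviation of signed
    cross-track errors (mm) of GPS points relative to the cornrow
    centerline.  Only the maximum uses absolute values, so the mean
    keeps the left/right sign convention."""
    e = np.asarray(errors_mm, dtype=float)
    return {
        "max": float(np.max(np.abs(e))),
        "mean": float(np.mean(e)),
        "rms": float(np.sqrt(np.mean(e ** 2))),
        "std": float(np.std(e, ddof=1)),
    }

# Hypothetical error sequence, not data from the actual tests.
stats = guidance_error_stats([-5.0, 3.0, 8.0, -2.0, 6.0])
```

The same four quantities were tabulated per AOV stage in Table 2, one column per replication.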

RESULTS AND DISCUSSION
Tests were performed to judge the guidance accuracy of the variable AOV machine vision arrangement and the headland turning performance under the test conditions above. During each test, the GPS data were recorded automatically at 1 s intervals via a serial cable to the laptop computer. Row guidance performance in each test was evaluated according to the maximum error, average error, RMS error and standard deviation of the errors. The end points after the two steering operations, the farthest point and the backward distance, all relative to the starting point, were tabulated for each test.
The statistical results of the row guidance are shown in Table 2, where the errors were calculated relative to the centerline of the next cornrows when using the third AOV arrangement (the same as the first AOV arrangement), and the measurement results of the headland turning are shown in Table 3. The "-" sign in Table 2 indicates that the error is biased to the right side of the robot at the starting point, and X and Y in Table 3 denote position coordinates parallel and perpendicular to the cornrows, respectively. Figure 11 shows one trajectory of the three replications: the robot traveled along the cornrows from the starting point 30 m from the headland, performed the headland turning to the next cornrows, and then traveled 30 m along the new cornrows.
From Table 2, the maximum average error and standard deviation over the total run were less than 8.78 and 6.46 mm, respectively, which indicates that the fuzzy control kept driving the error toward zero (Figure 11). Comparing the average errors of the different AOV stages in Table 2, the far AOV stage had the smallest value, demonstrating more accurate crop-row detection and better control performance. The maximum RMS error of 10.64 mm appeared in the near AOV stage, indicating an acceptable dispersion of the error data in this stage. Although the lateral AOV stage had the least crop information, it had the smallest maximum error in Table 2 due to its short traveling distance of 2 m.
The maximum error of 22.58 mm, which may be related to wet soil, and the convergence of the track toward zero error indicate that the variable AOV machine vision guidance system performed well in the cornfield. In addition, the distances from the starting point at which the camera view was switched from the first far AOV to the near AOV and then to the lateral AOV were essentially the same for the three replications: about 10 m for the first far AOV stage and about 18 m for the near AOV stage.
From Table 3, there are obvious differences in the Y direction among the positions after the two steering operations and the farthest point. At the point after the first steering operation and at the farthest point, the offsets in the Y direction were 23.07 and 19.53 mm, respectively, indicating that the moving-forward stage affected the first steering operation, although this effect did not worsen the headland turning performance thanks to the good backing control described below. The difference in backward distance among the three tests was no more than 18.79 mm. Moreover, Figure 11 shows that the robot always strove to stay perpendicular to the cornrows during the backing operation. These results demonstrate good backing control using the second far AOV camera arrangement. In addition, the maximum deviation from the centerline after the headland turning was 17.26 mm, in the second test. Together with the trajectory in Figure 11, this shows that the developed algorithm enables the robot to approach the centerline of the next cornrows with high accuracy.

Conclusions
A variable AOV machine-vision-based guidance system was applied to guide a robot along cornrows and perform the headland turning operation. Different AOV arrangements were tested: far AOV, near AOV and lateral AOV. Guidance lines were detected in the cornfield using morphological operations, and fuzzy logic control was implemented for row guidance. GPS data were recorded to evaluate the guidance performance and the headland turning operation.
According to the test data, the variable AOV camera arrangement achieved acceptable accuracy, with a maximum guidance error of 22.58 mm, and the headland turning operation performed well; in particular, using the far AOV before the backing control ensured completion of the operation. The tests show that the method is capable of guiding a robot and accomplishing headland turning in a cornfield with acceptable accuracy and stability. Therefore, the low-cost variable AOV machine vision arrangement has a promising field application for autonomous vehicles.

Figure 1. Vision range and blind distance of a fixed forward AOV camera. (a) Small vision angle; (b) big vision angle.

Figure 3. Detection algorithm of the guidance line.

Figure 6. The AOV arrangements used in this work. (a) The first far AOV; (b) the near AOV; (c) the lateral AOV; (d) the second far AOV.

Table 1. Fuzzy control rules.

Table 2. Statistical results of the row guidance.

Table 3. Measurement results of the headland turning.