Autonomous navigation is crucial for many robotics applications in agriculture. However, many existing methods depend on RTK-GPS devices, which are susceptible to radio-signal loss or intermittent reception of correction data from the internet. Consequently, research has increasingly focused on using RGB cameras for crop-row detection, though challenges persist once plants are fully grown. This paper introduces a LiDAR-based navigation system that achieves crop-agnostic, over-canopy autonomous navigation in row-crop fields, even when the canopy fully blocks the inter-row spacing. Our algorithm detects crop rows across diverse scenarios, encompassing various crop types, growth stages, the presence of weeds, curved rows, and discontinuities. Without relying on a global localization method (e.g., GPS), our navigation system performs autonomous navigation in these challenging scenarios, detects the end of the crop rows, and proceeds to the next crop row autonomously, providing a crop-agnostic approach to covering an entire field. The proposed navigation system has been tested in various simulated and real agricultural fields, achieving an average cross-track error of 3.55 cm without human intervention. The system has been deployed on a customized UGV robot, which can be reconfigured depending on field conditions.
Our crop-row detection algorithm comprises three key components. First, we estimate the ground plane using the LiDAR's tilt angle θ, raise it to pass through the point-cloud centroid, and filter out points below this plane. This step isolates the returns corresponding to the tops of the plants. Second, using this filtered LiDAR data, we apply the K-means clustering algorithm to segment crop rows autonomously; the resulting cluster centroids represent the centers of the crop rows. We use the robot's filtered odometry, [x, y, ψ], to accumulate detected centroids in the robot frame over a short time window, which provides accurate local positioning and avoids odometry drift. Finally, we run the RANSAC line-fitting algorithm on this map of crop-row centroids, extracting 2D line parameters for the first row on the left and the first row on the right in the robot frame.
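The three-stage pipeline above can be sketched in Python. This is a simplified illustration, not the authors' implementation: for brevity it clusters only on the lateral (cross-row) coordinate with a 1-D K-means and uses a minimal two-point RANSAC, and the function names and tilt-plane parameterization are assumptions.

```python
import numpy as np

def filter_canopy(points, tilt_theta):
    """Stage 1: drop returns below a ground plane tilted by the LiDAR
    angle theta and raised to pass through the cloud centroid, so that
    only canopy-top returns remain.  points: (N, 3) in the sensor frame."""
    normal = np.array([-np.sin(tilt_theta), 0.0, np.cos(tilt_theta)])
    heights = (points - points.mean(axis=0)) @ normal
    return points[heights > 0.0]

def row_centers(lateral, k, iters=20):
    """Stage 2: 1-D K-means on the lateral coordinate of the filtered
    points; the k cluster centers approximate row centerlines.
    (The paper clusters the filtered points directly; 1-D here for brevity.)"""
    centers = np.quantile(lateral, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.abs(lateral[:, None] - centers[None, :]).argmin(axis=1)
        centers = np.array([lateral[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return np.sort(centers)

def ransac_line(pts, n_iter=200, tol=0.05, seed=0):
    """Stage 3: robust fit of y = m*x + b to accumulated row centroids,
    tolerating outliers from weeds or row discontinuities."""
    rng = np.random.default_rng(seed)
    best, best_inliers = (0.0, 0.0), 0
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), 2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if abs(x2 - x1) < 1e-9:
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        n_in = int((np.abs(pts[:, 1] - (m * pts[:, 0] + b)) < tol).sum())
        if n_in > best_inliers:
            best, best_inliers = (m, b), n_in
    return best
```

Running `ransac_line` once per side (on centroids left and right of the robot) yields the two row lines used by the controller.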
After crop-row detection, we generate waypoints along the centerline between the two predicted crop rows. We apply a nonlinear Model Predictive Control (MPC) algorithm in the robot's local frame to track the generated waypoints, using ACADO to solve the underlying quadratic programming problem in real time.
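To illustrate the control stage, the sketch below generates waypoints on the centerline between the two fitted row lines and then selects controls for a unicycle model by brute-force search over a short horizon. The grid search is a toy stand-in for the ACADO-based nonlinear MPC used in the paper, and all names, costs, and parameter values are illustrative assumptions.

```python
import numpy as np
from itertools import product

def centerline_waypoints(left, right, xs):
    """Waypoints midway between the two fitted row lines, each given as
    an (m, b) pair of y = m*x + b in the robot's local frame."""
    m = 0.5 * (left[0] + right[0])
    b = 0.5 * (left[1] + right[1])
    return np.stack([xs, m * xs + b], axis=1)

def unicycle_step(state, v, w, dt):
    """Forward-integrate the unicycle model [x, y, psi] one time step."""
    x, y, psi = state
    return np.array([x + v * np.cos(psi) * dt,
                     y + v * np.sin(psi) * dt,
                     psi + w * dt])

def mpc_control(state, waypoints, horizon=3, dt=0.1,
                vs=(0.3, 0.6), ws=(-0.5, 0.0, 0.5)):
    """Toy receding-horizon controller: roll out each constant (v, w)
    pair over the horizon and return the pair minimizing the summed
    distance to the nearest waypoint (a stand-in for a real QP/SQP solve)."""
    best_u, best_cost = (vs[0], 0.0), np.inf
    for v, w in product(vs, ws):
        s, cost = np.asarray(state, float), 0.0
        for _ in range(horizon):
            s = unicycle_step(s, v, w, dt)
            cost += np.linalg.norm(waypoints - s[:2], axis=1).min()
        if cost < best_cost:
            best_cost, best_u = cost, (v, w)
    return best_u
```

For a robot offset to the left of the centerline, the search returns a negative yaw rate, steering it back toward the row center; a real MPC additionally optimizes a full control sequence with state and input constraints.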
We conducted experiments in both Gazebo-simulated environments and real fields with different crops (corn and soybean) and growth stages (young and grown) to evaluate the performance of our crop-row detection and crop-row following algorithms.
In this paper, we present a novel LiDAR-based crop-row detection approach that integrates Model Predictive Control (MPC) and a lane-switching algorithm to create an autonomous navigation system for agricultural robots in row-crop fields. This system enables independent robot navigation for diverse agricultural tasks, contributing to precision farming. Our crop-row detection method uses 3D LiDAR data to extract height information and accurately detects crop rows in challenging scenarios such as canopy obstruction. The full navigation system incorporates the crop-row detection, following, and switching algorithms, enabling automated tracking of detected crop rows and full field coverage. The system was evaluated in both real fields and Gazebo-simulated fields with a 1:1-scale Amiga robot model. The crop-row detection algorithm achieves an average detection error of 3.35 cm, while the crop-row following algorithm achieves an average cross-track error of 3.55 cm. Future work will focus on improving the robustness of the crop-row perception algorithm by integrating camera data, especially to handle gaps between plants during the germination stage.
@misc{liu2024overcanopyautonomousnavigationcropagnostic,
  title={Towards Over-Canopy Autonomous Navigation: Crop-Agnostic LiDAR-Based Crop-Row Detection in Arable Fields},
  author={Ruiji Liu and Francisco Yandun and George Kantor},
  year={2024},
  eprint={2403.17774},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2403.17774},
}