Robotic Perception for Precision Agriculture
Open access
Author
Date
2019-09
Type
- Doctoral Thesis
ETH Bibliography
yes
Altmetrics
Abstract
To feed a growing world population with a limited amount of arable land, we must develop new methods of sustainable farming that maintain or increase yield while minimizing chemical inputs such as fertilizers, herbicides, and pesticides. Precision agriculture techniques seek to address this challenge by monitoring key indicators of crop health and targeting timely treatment only to the plants or infested areas that need it. Such monitoring is often still a time-consuming and expensive activity and is thus not performed as standard practice. Developing automated methods for such monitoring using unmanned aerial and ground vehicles can therefore provide a major impetus for the adoption of precision agriculture practices at scale. This thesis deals with improving the perception capabilities of autonomous systems deployed for environmental monitoring, especially in agricultural contexts. Deploying autonomous systems on agricultural fields is a challenging task, since the perception system must deal with an unstructured, dynamic environment under widely varying illumination and weather conditions. This thesis makes contributions to three parts of the perception pipeline required by most autonomous systems: calibration, mapping, and inference.
In the first part, we look at radiometrically calibrating vision sensors, i.e. cameras, in a field context. In contrast to existing laboratory-based approaches requiring specialized and expensive equipment, we develop a practical and modular method to radiometrically calibrate monochrome, colour, and hyperspectral cameras with data that can be collected in the field. Our data-driven, parameter-free approach, based on maximum likelihood estimation, allows robust estimation of the sensor response, lens vignetting, and global illuminant with minimal prior knowledge about the camera and lens setup in use. In the second part, leveraging developments in 3D photogrammetry, we propose methods to metrically map and extract plant trait indicators from crop fields using a low-cost Unmanned Aerial Vehicle (UAV) carrying a camera. We show that crop parameters extracted from these 3D maps compare reasonably well with measurements taken on the ground and can hence be used to estimate crop status, providing a practical and effective means of monitoring fields at high spatial and temporal resolution with minimal user intervention. Additionally, we propose a method by which an Unmanned Ground Vehicle (UGV) can localize itself in such a 3D map, enabling higher-resolution map updates and automated targeted interventions, such as delivering fertilizer or herbicide. We show that our collaborative mapping approach, which combines the 3D geometry of the crop field with a weakly semantic feature, a vegetation index, performs well on a wide variety of crop fields and outperforms several state-of-the-art map registration techniques in such scenarios. In the third part of this thesis, we present a generic, machine-learning-based framework for determining the types and severities of different stress factors affecting regions of a crop field using only remotely sensed data.
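The abstract does not specify which vegetation index serves as the weakly semantic registration feature; the Excess Green (ExG) index is a common choice for RGB imagery and is used here purely as an illustrative stand-in. A minimal sketch (function name and normalization scheme are our assumptions, not the thesis's):

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """Compute the Excess Green (ExG) vegetation index per pixel.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns an (H, W) array; larger values indicate more vegetation.
    """
    # Normalize each channel by per-pixel brightness to reduce
    # sensitivity to illumination changes.
    total = rgb.sum(axis=-1, keepdims=True)
    total = np.where(total > 0, total, 1.0)  # avoid division by zero
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2.0 * g - r - b

# Example: a strongly green pixel scores higher than a gray one.
patch = np.array([[[0.1, 0.8, 0.1], [0.5, 0.5, 0.5]]])
exg = excess_green(patch)
```

Because the index depends only on chromaticity ratios, it provides a coarse vegetation/soil distinction that stays comparable between a UAV map and a UGV's ground-level imagery.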
Once trained, the classification models use radiometrically calibrated hyperspectral imagery, along with plant trait indicators extracted from a 3D point cloud of part of a crop field, to quickly and systematically infer the state of the crops with respect to the presence and severity of commonly occurring stress factors, such as nutrient deficiency and weed pressure. We demonstrate the effectiveness of the trained models by testing their predictions at different crop growth stages in a comprehensive greenhouse experiment following the growth of sugar beet plants exposed to a variety of combinations of these stress factors.
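To illustrate the kind of learned stress classifier described above, here is a minimal sketch using a random forest on synthetic per-region features (mean reflectance in a few spectral bands plus one point-cloud-derived canopy trait). The choice of classifier, feature layout, and all data here are our assumptions for illustration, not the thesis's actual framework:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in data: each row describes one field region by its
# mean reflectance in n_bands spectral bands plus a canopy-height trait
# derived from a 3D point cloud. Labels encode the stress class.
n, n_bands = 200, 5
healthy = rng.normal(0.5, 0.05, size=(n, n_bands + 1))
stressed = rng.normal(0.3, 0.05, size=(n, n_bands + 1))
X = np.vstack([healthy, stressed])
y = np.array([0] * n + [1] * n)  # 0 = healthy, 1 = nutrient-deficient

# Fit the classifier and predict the class of a new, healthy-looking region.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict(rng.normal(0.5, 0.05, size=(1, n_bands + 1)))
```

In practice the feature vector would come from calibrated hyperspectral bands and photogrammetric trait maps rather than synthetic draws, and the label set would cover multiple stress types and severity levels.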
In summary, this thesis focuses on enabling autonomous systems to create and maintain quantitative spatio-temporal spectral representations of the environment and on using these representations for pertinent inference tasks within the context of precision agriculture. Most of the contributions of this thesis are accompanied by open-source software and datasets to enable the community to benefit from and expand on our work.
Permanent link
https://doi.org/10.3929/ethz-b-000360967
Publication status
published
External links
Search print copy at ETH Library
Publisher
ETH Zurich
Subject
Robotics in Agriculture and Forestry; Calibration techniques; Inference drawing; Unmanned aerial systems
Organisational unit
03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
Funding
644227 - Aerial Data Collection and Analysis, and Automated Ground Intervention for Precision Farming (SBFI)