Mapping skips in sugarcane fields using object-based analysis of unmanned aerial vehicle (UAV) images
Introduction
In agriculture, the availability of reliable and timely information on crop condition and potential problems supports sound planning decisions, which in turn contribute to increased profitability and reduced costs (Zhang et al., 2002).
Because of the demand for biofuels and sugar production, Brazil currently has large areas of agricultural land covered by sugarcane fields (Martinelli and Filoso, 2008, Nassar et al., 2008, Martinelli et al., 2011). Thus, the development of techniques and technologies aimed at operational efficiency, cost reduction, and yield increase has attracted considerable attention because of the crop's economic and environmental importance (Abdel-Rahman and Ahmed, 2008, Bégué et al., 2010, Bocca et al., 2015).
Sugarcane is a semi-perennial crop with a growth cycle of approximately 12–18 months. After the first harvest, the ratoons are harvested annually for a period of about 5–7 years (Xavier et al., 2006, Rudorff et al., 2010). Therefore, to maintain the longevity and productivity of the crop field during this period, monitoring of the quality of the planting and harvesting operations is essential (Lebourgeois et al., 2010, Bocca et al., 2015). Matsuoka and Stolf (2012, p.148) defined “Gappy fields”, in terms of either plant canes or ratoons, as “a signal of bad crop management in most cases: bad soil preparation, inappropriate cultivar, improper seed cane, improper season for planting, unexpected disease or pest occurrence, herbicide damage, salinity, bad mechanical operations, mainly during harvest in the case of ratoons, if set aside climatic factors like freezing temperatures, lightening and extreme drought.”
The identification and quantification of skips in sugarcane fields are of great importance because they allow assessment of the uniformity of germination and tillering and of the consequent formation of stalks, which correlates directly with yield (Matsuoka and Stolf, 2012, Bocca et al., 2015). In addition to measuring the quality of the planting operation, quantification of skips supports decision making regarding the replanting or renovation of the field, in order to guarantee its continued profitability (Keerthipala and Dharmawardene, 2000, Matsuoka and Stolf, 2012). Currently, the presence of skips in planting rows is verified by in situ visual inspection. However, these inspections are generally conducted from the field borders and thus, heterogeneity across the entire crop field might affect the accuracy of such estimates (Bocca et al., 2015).
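Once gap lengths along the planting rows are measured (in situ or from imagery), skip incidence can be summarized as the fraction of row length occupied by gaps above a minimum length. The sketch below assumes a 0.5 m minimum gap, a threshold commonly used for sugarcane skip assessment; the function name and interface are illustrative, not from the original study.

```python
def skip_percentage(gap_lengths, total_row_length, threshold=0.5):
    """Percentage of total row length occupied by skips.

    gap_lengths: measured gap lengths (m) along the planting rows.
    total_row_length: summed length (m) of all surveyed rows.
    threshold: minimum gap length (m) counted as a skip; the 0.5 m
    default is an assumption based on common sugarcane practice.
    """
    # Only gaps at or above the threshold count as skips.
    skips = [g for g in gap_lengths if g >= threshold]
    return 100.0 * sum(skips) / total_row_length
```

For example, gaps of 0.3 m, 1.0 m, and 2.0 m over 100 m of row yield a 3% skip incidence, since the 0.3 m gap falls below the threshold.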
Recently, the use of remote sensing technologies for agricultural monitoring has been gaining ground, because they can provide spatially and temporally distributed information objectively and quickly over a variety of scales (Zhang et al., 2002, Ahamed et al., 2011, Mulla, 2013). The development of new technologies such as unmanned aerial vehicles (UAVs) as platforms for the acquisition of remote sensing imagery allows some of the limitations of orbital and airborne platforms that hinder real-time crop monitoring to be overcome, e.g., unsuitable revisit times, cloud cover, costs, operational complexity, and limited spatial resolution (Berni et al., 2009, Everaerts, 2009, Zhang and Kovacs, 2012, Colomina and Molina, 2014). These characteristics make UAV platforms suitable for a number of applications, including crop monitoring (Hunt et al., 2005, Hunt et al., 2010, Torres-Sánchez et al., 2014, Torres-Sánchez et al., 2015a, Comba et al., 2015), weed detection (Torres-Sánchez et al., 2013, Peña et al., 2015), water stress assessment (Berni et al., 2009, Zarco-Tejada et al., 2012), disease detection (Garcia-Ruiz et al., 2013), and yield estimation (Swain et al., 2010), i.e., applications where time-critical management is required.
UAVs can provide imagery with very high spatial resolution of only a few centimeters and they allow images to be acquired at optimal moments for the desired purposes, which makes them ideal for distinguishing crop plants during their first stages of development (Hengl, 2006, López-Granados, 2011, Torres-Sánchez et al., 2015a). However, very-high-resolution images require powerful image analysis procedures because, unlike lower resolution images, single pixels might no longer capture the characteristics of the classification targets. Additionally, these images show higher intra-class spectral variability and, consequently, a reduction in the degree of statistical separability among the classes when conventional pixel-based classification methods are applied (Yu et al., 2006, Castillejo-González et al., 2014, Peña et al., 2015, Torres-Sánchez et al., 2015a). To overcome this limitation and to attain a high level of automation and adaptability, object-based image analysis (OBIA) has been used successfully with high-resolution satellite imagery (Novack et al., 2010, de Castro et al., 2013, Castillejo-González et al., 2014) and UAV imagery (Laliberte and Rango, 2011, Peña et al., 2013, Peña et al., 2015, Diaz-Varela et al., 2014, Qin, 2014, Torres-Sánchez et al., 2015a, Torres-Sánchez et al., 2015b). The OBIA approach first identifies spatially and spectrally homogeneous units called “objects”, which are created by grouping adjacent pixels through a segmentation process, and then uses these “objects” as the basic elements for analysis. Thus, it is possible to create automated and auto-adaptive classification methods by combining the spectral, contextual, morphological, and hierarchical information of these elements (Blaschke, 2010).
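The segmentation step that underlies OBIA, grouping adjacent pixels into homogeneous objects, can be illustrated with a toy connected-components pass over a binary vegetation mask. This is only a minimal stand-in: operational OBIA software typically uses multiresolution segmentation on the full spectral bands, and the function below is a hypothetical helper, not the method of the article.

```python
from collections import deque

def segment_objects(mask):
    """Group adjacent foreground pixels into objects (4-connectivity).

    mask: 2-D list of 0/1 values (1 = vegetation pixel).
    Returns a dict mapping object id -> list of (row, col) pixels,
    so per-object statistics (mean spectra, shape) can then be computed.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    objects, next_id = {}, 1
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not labels[r][c]:
                # Breadth-first flood fill starting a new object.
                queue = deque([(r, c)])
                labels[r][c] = next_id
                objects[next_id] = []
                while queue:
                    y, x = queue.popleft()
                    objects[next_id].append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_id
                            queue.append((ny, nx))
                next_id += 1
    return objects
```

After segmentation, each object (rather than each pixel) is classified from its aggregated spectral and shape features, which is what reduces the intra-class variability problem noted above.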
This article presents an innovative procedure for the creation of skip maps in sugarcane fields that combines high-resolution images from a commercial UAV and the OBIA approach to generate useful decision-support data. Based on UAV images, this procedure first performs the identification of the sugarcane crop rows. It then identifies the existent sugarcane within the crop rows and finally, performs skip extraction and the creation of field-extent crop maps.
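The final step of the procedure, extracting skips along each identified crop row, amounts to finding runs of absent vegetation longer than a minimum length. The sketch below assumes the classified image has been sampled along a row at a known ground sampling distance (GSD) and uses a 0.5 m minimum gap; both the sampling scheme and the function are illustrative assumptions, not the article's exact implementation.

```python
def extract_skips(row_profile, gsd, min_gap=0.5):
    """Find skips (gaps) along one crop row.

    row_profile: sequence of booleans along the row (True = cane present),
    sampled from the classified image at ground sampling distance gsd (m).
    min_gap: minimum gap length (m) reported as a skip (0.5 m assumed).
    Returns a list of (start_position_m, length_m) tuples.
    """
    skips, start = [], None
    for i, present in enumerate(row_profile):
        if not present and start is None:
            start = i                      # a gap begins here
        elif present and start is not None:
            length = (i - start) * gsd     # gap just ended
            if length >= min_gap:
                skips.append((start * gsd, length))
            start = None
    if start is not None:                  # row ends inside a gap
        length = (len(row_profile) - start) * gsd
        if length >= min_gap:
            skips.append((start * gsd, length))
    return skips
```

Aggregating these per-row tuples across all detected rows yields the field-extent skip map and the summary statistics used for decision support.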
Section snippets
Study area
The fields used in this study are located near Euclides da Cunha Paulista in São Paulo state, Brazil (22°26′21″S, 52°35′46″W). They have two predominant classes of soils: Rhodic Hapludox (Typic Hapludox) and Quartzarenic Neosol (Typic Quartzipsamment) and they present a moderate slope (<12%) that allows mechanical green harvest. The region has an average ground elevation of approximately 400 m a.s.l. and a humid subtropical climate (Cfa) according to the Köppen classification. The average annual
Global features
The procedure allows the computation of multiple datasets and statistics derived from the outputs, which can help characterize the crop field in terms of crop rows and skip incidence. In addition, the data can be exported in different formats, e.g., vector, raster, or tables, allowing further integration with other data sources. The global features calculated for the study area are presented in Table 1. The dimensions of the field were obtained from the shapefiles, which indicated a total area
Conclusions
The use of UAV images allows the creation of skip maps of sugarcane fields. The method presented in this study proved efficient in the estimation of skip length when compared with information derived in situ. Such information is useful for decision making, agricultural monitoring, and the reduction of operational costs, and it can help maintain the longevity and productivity of the crop over successive cycles. The use of UAV technology optimized the surveying of skips in fields,
Acknowledgments
This study was supported by the São Paulo Research Foundation FAPESP (Fundação de Amparo à Pesquisa do Estado de São Paulo) and Odebrecht Agro-Industrial (Process Number. 12/50048-7). The first author was supported by a PhD scholarship from CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior). We also thank Prof. Mariana Abrantes Giannotti, coordinator of the Geoprocessing Laboratory (LABGEO) of the Polytechnic School of University of São Paulo, for allowing the use of the
References (50)
- A review of remote sensing methods for biomass feedstock production. Biomass Bioenerg. (2011)
- Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. (2010)
- When do I want to know and why? Different demands on sugarcane yield predictions. Agric. Syst. (2015)
- Evaluation of pixel- and object-based approaches for mapping wild oat (Avena sterilis) weed patches in wheat fields using QuickBird imagery for site-specific management. Eur. J. Agron. (2014)
- Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS J. Photogramm. Remote Sens. (2014)
- Vineyard detection from unmanned aerial systems images. Comput. Electron. Agric. (2015)
- Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle. J. Environ. Manage. (2014)
- Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. (2013)
- Finding the right pixel size. Comput. Geosci. (2006)
- Sugar and ethanol production as a rural development strategy in Brazil: evidence from the state of São Paulo. Agric. Syst. (2011)
- Twenty five years of remote sensing in precision agriculture: key advances and remaining knowledge gaps. Biosyst. Eng.
- Cost of boundary manoeuvres in sugarcane production. Biosyst. Eng.
- An automatic object-based method for optimal thresholding in UAV images: application for vegetation detection in herbaceous crops. Comput. Electron. Agric.
- Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric.
- Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ.
- Precision agriculture—a worldwide overview. Comput. Electron. Agric.
- The application of remote sensing techniques to sugarcane (Saccharum spp. hybrid) production: a review of the literature. Int. J. Remote Sens.
- Spatio-temporal variability of sugarcane fields and recommendations for yield forecast using NDVI. Int. J. Remote Sens.
- Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens.
- Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens.
- Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric.
- Flood duration and time of flood onset effects on recently planted sugarcane. Agron. J.
- Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens.
- Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric.
1 CNPq Researcher (process: 307362/2014-0).