Abstract:
In farming systems, harvesting operations are tedious, time- and resource-consuming tasks. Deploying a fleet of autonomous robots to work alongside farmworkers may provide substantial productivity and logistics benefits. In this context, an intelligent robotic system should monitor human behavior, identify the ongoing activities, and anticipate the worker's needs. Unlike other application areas, such as warehouses and factories, research on human behavior recognition in agriculture is still in its infancy and has few case studies. Thus, there is a need to develop a fully integrated human activity recognition (HAR) methodology for agricultural operations in production fields. The main contribution of this work is a benchmark framework for video-based detection of human pickers, classification of their activities, and estimation of their motion direction, intended to support harvesting operations in different agricultural scenarios. Our solution combines a Mask Region-based Convolutional Neural Network (Mask R-CNN) for object detection with optical flow for motion estimation, augmented by a newly introduced statistical attribute of flow motion descriptors, named Correlation Sensitivity (CS). A classification criterion is defined based on the analysis of the Kernel Density Estimation (KDE) technique and the K-Means clustering algorithm. Both methods are evaluated on in-house datasets collected from different environments, namely strawberry polytunnels and apple tree orchards. The proposed benchmark framework is quantitatively analyzed using measures of sensitivity, specificity, and accuracy, and shows satisfactory results despite various dataset challenges such as multiple foreground objects, lighting variation, blur, and occlusions.
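To make the described pipeline concrete, the following is a minimal sketch of the two front-end stages named in the abstract: person detection with Mask R-CNN and per-person motion estimation with dense optical flow. The specific models, thresholds, and flow parameters are assumptions for illustration (torchvision's pretrained Mask R-CNN and OpenCV's Farneback flow stand in for the authors' exact configuration, and the CS descriptor and KDE/K-Means classification stages are not reproduced here).

```python
# Hedged sketch: Mask R-CNN person detection + dense optical flow per detected mask.
# Assumes torchvision >= 0.13 (weights="DEFAULT") and OpenCV; not the authors' exact setup.
import cv2
import numpy as np
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect_people(frame_bgr, score_thresh=0.7):
    """Return boolean segmentation masks for detected persons (COCO label 1)."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        out = model([tensor])[0]
    keep = (out["labels"] == 1) & (out["scores"] > score_thresh)
    return (out["masks"][keep, 0] > 0.5).numpy()  # (N, H, W) boolean masks

def motion_per_person(prev_gray, curr_gray, masks):
    """Average dense optical flow over each person mask -> (direction_deg, magnitude)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    results = []
    for m in masks:
        fx, fy = flow[m, 0].mean(), flow[m, 1].mean()
        results.append((float(np.degrees(np.arctan2(fy, fx))), float(np.hypot(fx, fy))))
    return results
```

In the full framework, the per-mask flow statistics would feed the CS descriptor and the KDE/K-Means-based classification criterion to label each picker's activity and motion direction.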
Date of Conference: 16-18 July 2021
Date Added to IEEE Xplore: 26 August 2021