
A YOLO-based Separation of Touching-Pigs for Smart Pig Farm Applications


Abstract:

For livestock such as pigs in a pigsty, many surveillance applications have been reported that monitor animal health for efficient livestock management. For pig surveillance applications, separating touching-pigs in real time is an important step toward the final goal of 24-hour tracking of individual pigs. Although convolutional neural network (CNN)-based instance segmentation techniques can be applied to this separation problem, their combined accuracy-time performance may not meet the requirements. In this study, we improve the combined accuracy-time performance by coupling the fastest CNN-based object detection technique (i.e., You Only Look Once, YOLO) with image processing techniques. We first apply image processing techniques to detect touching-pigs, using both infrared and depth information acquired from an Intel RealSense camera, and then apply YOLO to separate them. In particular, when using YOLO as an object detector, we treat the target object as the boundary region between the touching-pigs rather than as the individual pigs themselves. Finally, we apply image processing techniques to determine the final boundary line from the YOLO output. Our experimental results show that this method separates touching-pigs effectively in terms of combined accuracy-time performance, compared to a recently reported CNN-based instance segmentation technique.
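The first stage of the pipeline described above — finding touching-pig candidates from the depth image before YOLO is applied — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the floor depth, the per-pig area threshold, and the function names are all hypothetical, and the subsequent YOLO boundary-region detection is assumed to run on crops of the flagged blobs.

```python
import numpy as np
from collections import deque

# Hypothetical threshold: foreground blobs larger than one pig's
# footprint (in pixels) are treated as touching-pig candidates.
SINGLE_PIG_MAX_AREA = 12

def foreground_mask(depth, floor_depth=200):
    """Pigs lie closer to a ceiling-mounted depth camera than the floor,
    so pixels with depth below the assumed floor depth are foreground."""
    return depth < floor_depth

def label_components(mask):
    """4-connected component labelling via BFS (pure NumPy, no OpenCV)."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue
        n += 1
        labels[sy, sx] = n
        queue = deque([(sy, sx)])
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = n
                    queue.append((ny, nx))
    return labels, n

def find_touching_candidates(labels, n):
    """Blobs whose area exceeds one pig's footprint are candidates
    for the YOLO boundary-region detector."""
    return [k for k in range(1, n + 1)
            if (labels == k).sum() > SINGLE_PIG_MAX_AREA]

# Synthetic depth frame: floor at 255, one oversized blob (two pigs
# touching) and one single-pig-sized blob.
depth = np.full((10, 10), 255)
depth[1:5, 1:7] = 100   # 24-pixel blob -> touching-pig candidate
depth[7:9, 7:9] = 120   # 4-pixel blob  -> single pig

mask = foreground_mask(depth)
labels, n = label_components(mask)
print(n)                                  # 2 foreground blobs
print(find_touching_candidates(labels, n))  # [1]
```

In the paper's pipeline, each flagged blob would then be cropped and passed to YOLO, which outputs a bounding box around the boundary region; a final image-processing step extracts the boundary line inside that box.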
Date of Conference: 17-20 February 2019
Date Added to IEEE Xplore: 02 May 2019
Conference Location: PyeongChang, Korea (South)
