
RefineDet++: Single-Shot Refinement Neural Network for Object Detection



Abstract:

Convolutional neural network based methods, which can be divided into the one-stage approach and the two-stage approach, have dominated object detection in recent years. In general, the two-stage approach (e.g., Faster R-CNN) achieves high accuracy, while the one-stage approach (e.g., SSD) has the advantage of high efficiency. To inherit the merits of both while overcoming their disadvantages, we propose a novel single-shot detector, namely RefineDet++, which achieves better accuracy than two-stage methods while maintaining efficiency comparable to that of one-stage methods. The proposed RefineDet++ consists of two inter-connected modules: the anchor refinement module and the alignment detection module. Specifically, the former module aims to (1) filter out negative anchors to reduce the search space for the subsequent classifier, and (2) coarsely adjust the locations and sizes of anchors to provide better initialization for the subsequent regressor. The latter module takes (1) the refined anchors from the former module as input and (2) applies a newly designed alignment convolution operation to further improve regression accuracy and predict multi-class labels. Meanwhile, we design a transfer connection block to transfer the features in the anchor refinement module for predicting the locations, sizes and class labels of objects in the alignment detection module. A multi-task loss function enables us to train the whole network in an end-to-end way. Extensive experiments on PASCAL VOC and MS COCO demonstrate that RefineDet++ achieves state-of-the-art detection accuracy with high efficiency.
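
The two-module design described in the abstract can be pictured with a minimal sketch. This is not the authors' implementation: the layer widths, anchors per cell, class count, and feature-map size below are assumptions for illustration, and the paper's alignment convolution is stood in for by a plain 3x3 convolution.

```python
# Illustrative sketch of the RefineDet++-style pipeline: an anchor refinement
# module (ARM) scores and coarsely regresses anchors, a transfer connection
# block (TCB) carries ARM features to the detection side, and a detection
# module predicts final multi-class labels and box offsets.
import torch
import torch.nn as nn

NUM_ANCHORS = 3    # anchors per feature-map cell (assumption)
NUM_CLASSES = 21   # e.g., PASCAL VOC: 20 classes + background (assumption)


class AnchorRefinementModule(nn.Module):
    """Binary objectness scores + coarse box offsets for each anchor."""
    def __init__(self, in_ch):
        super().__init__()
        self.obj = nn.Conv2d(in_ch, NUM_ANCHORS * 2, 3, padding=1)  # pos/neg score
        self.loc = nn.Conv2d(in_ch, NUM_ANCHORS * 4, 3, padding=1)  # coarse offsets

    def forward(self, feat):
        return self.obj(feat), self.loc(feat)


class TransferConnectionBlock(nn.Module):
    """Transfers features from the ARM branch to the detection branch."""
    def __init__(self, in_ch, out_ch=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, feat):
        return self.conv(feat)


class DetectionModule(nn.Module):
    """Final multi-class scores + fine box offsets on top of refined anchors.
    The alignment convolution is approximated by a plain 3x3 conv here."""
    def __init__(self, in_ch=256):
        super().__init__()
        self.cls = nn.Conv2d(in_ch, NUM_ANCHORS * NUM_CLASSES, 3, padding=1)
        self.loc = nn.Conv2d(in_ch, NUM_ANCHORS * 4, 3, padding=1)

    def forward(self, feat):
        return self.cls(feat), self.loc(feat)


if __name__ == "__main__":
    feat = torch.randn(1, 512, 40, 40)   # one backbone feature map (assumption)
    arm = AnchorRefinementModule(512)
    tcb = TransferConnectionBlock(512)
    det = DetectionModule()

    arm_obj, arm_loc = arm(feat)          # step 1: filter / coarsely adjust anchors
    det_cls, det_loc = det(tcb(feat))     # step 2: refine boxes, predict class labels
    print(arm_obj.shape, arm_loc.shape, det_cls.shape, det_loc.shape)
```

In the actual detector the two modules are trained jointly with a multi-task loss over the ARM and detection outputs; this sketch only shows the per-feature-map heads.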
Page(s): 674 - 687
Date of Publication: 07 April 2020
