
Saliency-Aware Convolution Neural Network for Ship Detection in Surveillance Video



Abstract:

Real-time detection of inshore ships plays an essential role in the efficient monitoring and management of maritime traffic and transportation for port management. Current ship detection methods, which are mainly based on remote sensing or radar images, hardly meet the real-time requirement because of the time needed to acquire such images. In this paper, we propose to use visual images captured by an on-land surveillance camera network to achieve real-time detection. However, due to the complex background of visual images and the diversity of ship categories, existing convolution neural network (CNN) based methods are either inaccurate or slow. To achieve high detection accuracy and real-time performance simultaneously, we propose a saliency-aware CNN framework for ship detection that combines complementary ship-discriminative features: deep features, a saliency map, and a coastline prior. The model uses a CNN to predict the category and position of ships and global contrast-based salient region detection to refine their locations. We also extract coastline information and incorporate it into both the CNN and the saliency detection to obtain more accurate ship locations. We implement our model on Darknet under CUDA 8.0 and cuDNN v5 and use a real-world visual image dataset for training and evaluation. The experimental results show that our model outperforms representative counterparts (Faster R-CNN, SSD, and YOLOv2) in both accuracy and speed.
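For illustration only, the NumPy sketch below mimics the two-stage idea summarized above: a detector proposes a rough box, a simple global-contrast saliency map highlights the ship against the water, and a coastline row is used to discard regions on land. The function names (global_contrast_saliency, refine_box_with_saliency), the fixed threshold, and the single-row coastline prior are hypothetical simplifications; the authors' actual model runs on Darknet and uses its own formulations.

import numpy as np

def global_contrast_saliency(image):
    # Per-pixel saliency as the distance to the mean image color:
    # a simplified global-contrast measure, not the paper's exact method.
    mean_color = image.reshape(-1, 3).mean(axis=0)
    sal = np.linalg.norm(image.astype(np.float32) - mean_color, axis=2)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-8)

def refine_box_with_saliency(box, saliency, coastline_y, thresh=0.5):
    # Shrink a detector box (x1, y1, x2, y2) to the salient pixels it
    # contains, ignoring everything above the coastline row (a crude
    # stand-in for the coastline prior described in the abstract).
    x1, y1, x2, y2 = box
    y1 = max(y1, coastline_y)            # ships only appear below the coastline
    roi = saliency[y1:y2, x1:x2] > thresh
    ys, xs = np.nonzero(roi)
    if ys.size == 0:                     # nothing salient: keep the CNN box
        return box
    return (int(x1 + xs.min()), int(y1 + ys.min()),
            int(x1 + xs.max() + 1), int(y1 + ys.max() + 1))

# Toy usage: a bright "ship" on dark water and a loose detector box around it.
img = np.zeros((120, 160, 3), dtype=np.uint8)
img[70:90, 60:110] = 255                 # synthetic ship blob
sal = global_contrast_saliency(img)
print(refine_box_with_saliency((40, 50, 130, 110), sal, coastline_y=55))

On this toy input the loose box (40, 50, 130, 110) is tightened to the extent of the salient blob below the coastline row, which is the kind of location correction the saliency step is meant to provide.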
Page(s): 781 - 794
Date of Publication: 07 February 2019


