
Target Classification for 3D-ISAR Using CNNs


Abstract:

In maritime surveillance, inverse synthetic aperture radar (ISAR) is a technique for imaging non-cooperative targets, with classification typically performed by the radar operator. Automating the target classification process significantly reduces the operator workload and can improve classification accuracy. Traditional classification approaches use geometric features extracted from images of known targets to form a training dataset that is later used to classify an unknown target. While these approaches work reasonably well, deep-learning-based techniques have recently demonstrated significant improvements over conventional processing schemes in many areas of radar. Classifying traditional 2D-ISAR imagery is difficult because the motion of the sea produces wide variation in the images of a given target. The 3D-ISAR technique was developed as an alternative representation in which the target is described by a three-dimensional point cloud. In this article, we investigate how 3D-ISAR can be used for the classification of maritime targets. The proposed scheme uses features extracted from the 3D-ISAR-generated point cloud of the target from different perspectives (i.e., side, top, and front views) to form three point-density images. These are then fed into a convolutional neural network to classify the target.
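
The view-projection step described in the abstract can be sketched as follows, assuming numpy and PyTorch are available. The image size, network layout, and the names point_density_images and ThreeViewCNN are illustrative assumptions for this sketch, not the authors' implementation: the point cloud is projected onto the three principal planes, point counts are binned into three density images, and the stacked images are fed to a small CNN as its three input channels.

# Hedged sketch: project a 3D-ISAR point cloud onto side, top, and front views
# as point-density images, then classify with a small CNN. Network shape and
# image size are assumptions, not the authors' exact configuration.
import numpy as np
import torch
import torch.nn as nn

def point_density_images(points, img_size=64):
    """Project an (N, 3) point cloud onto the three principal planes and
    accumulate point counts into img_size x img_size density images."""
    # Normalise each axis to [0, 1] so the histogram grid covers the target.
    mins, maxs = points.min(axis=0), points.max(axis=0)
    norm = (points - mins) / np.maximum(maxs - mins, 1e-9)
    views = []
    for ax_a, ax_b in [(0, 1), (0, 2), (1, 2)]:  # top, side, and front projections
        hist, _, _ = np.histogram2d(
            norm[:, ax_a], norm[:, ax_b],
            bins=img_size, range=[[0, 1], [0, 1]])
        views.append(hist / max(hist.max(), 1.0))  # scale densities to [0, 1]
    return np.stack(views).astype(np.float32)      # shape: (3, H, W)

class ThreeViewCNN(nn.Module):
    """Small CNN taking the three density images as its input channels."""
    def __init__(self, n_classes=5, img_size=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * (img_size // 4) ** 2, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Usage with a synthetic point cloud standing in for a 3D-ISAR output.
cloud = np.random.randn(2000, 3)
images = torch.from_numpy(point_density_images(cloud)).unsqueeze(0)
logits = ThreeViewCNN()(images)  # class scores for the unknown target

Stacking the three views as channels lets a single 2D CNN see all perspectives at once; an alternative would be three separate branches whose features are fused before the classifier.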
Published in: IEEE Transactions on Aerospace and Electronic Systems (Volume: 60, Issue: 1, February 2024)
Page(s): 94 - 105
Date of Publication: 27 April 2023

