
Joint Semi-supervised Learning and Re-ranking for Vehicle Re-identification


Abstract:

Vehicle re-identification (re-ID) remains a challenging problem due to the complicated variations in vehicle appearance across multiple camera views. Most existing algorithms for this problem are developed in the fully-supervised setting and require access to a large amount of labeled training data. However, it is impractical to expect large quantities of labeled data because of the high cost of annotation. Besides, when vehicle re-ID is treated as a retrieval process, re-ranking is an important way to improve its performance, yet limited effort has been devoted to re-ranking research in vehicle re-ID. To address these problems, in this paper we propose a semi-supervised learning system for vehicle re-ID based on a Convolutional Neural Network (CNN) and a re-ranking strategy. Specifically, we adopt a Generative Adversarial Network (GAN) to generate additional vehicle images and enrich the training set; a uniform label distribution is then assigned to these unlabeled samples according to Label Smoothing Regularization for Outliers (LSRO), which regularizes the supervised learning model and improves re-ID performance. To refine the re-ID results, an improved re-ranking method is exploited to optimize the initial rank list. Experimental results on the publicly available VeRi-776 and VehicleID datasets demonstrate that the method significantly outperforms the state-of-the-art.
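The abstract does not give the LSRO loss in closed form; the core idea (from the LSRO literature) is that a GAN-generated image, belonging to no real identity, is trained against a uniform distribution over all K classes, while labeled images use standard cross-entropy. A minimal sketch of that loss, with hypothetical function and argument names:

```python
import numpy as np

def lsro_loss(probs, label, is_generated):
    """Per-sample classification loss under the LSRO scheme.

    probs: softmax output over K identity classes, shape (K,)
    label: ground-truth class index (ignored for generated samples)
    is_generated: True for GAN-generated (unlabeled) images
    """
    eps = 1e-12  # numerical floor to avoid log(0)
    if is_generated:
        # Uniform label distribution: cross-entropy against 1/K on every
        # class reduces to the mean negative log-probability.
        return -np.mean(np.log(probs + eps))
    # Labeled (real) sample: ordinary cross-entropy on the true class.
    return -np.log(probs[label] + eps)
```

For a confident, correct prediction the labeled loss is small, while the generated-sample loss stays near log K, which is the regularizing pull toward uniformity that LSRO exploits.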
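The paper's "improved re-ranking method" is not specified in the abstract; re-ranking for re-ID is commonly built on k-reciprocal nearest neighbors, blending the initial distance with a Jaccard distance over mutual-neighbor sets. A simplified sketch of that general scheme (function name and parameters are illustrative, not the paper's):

```python
import numpy as np

def k_reciprocal_rerank(dist, k=3, lam=0.5):
    """Re-rank an initial pairwise distance matrix.

    dist: (n, n) symmetric distances over all query + gallery samples
    k:    neighborhood size for reciprocal-neighbor sets
    lam:  weight on the original distance in the final blend
    """
    n = dist.shape[0]
    knn = np.argsort(dist, axis=1)[:, :k]  # top-k neighbors of each sample
    # Keep only mutual (k-reciprocal) neighbors: j is kept for i
    # iff i also appears among j's top-k neighbors.
    recip = [set(j for j in knn[i] if i in knn[j]) for i in range(n)]
    jaccard = np.zeros_like(dist)
    for i in range(n):
        for j in range(n):
            inter = len(recip[i] & recip[j])
            union = len(recip[i] | recip[j])
            jaccard[i, j] = 1.0 - inter / union if union else 1.0
    # Final distance blends the initial metric with neighbor-set overlap.
    return lam * dist + (1.0 - lam) * jaccard
```

Samples sharing many reciprocal neighbors are pulled together in the final ranking even when their raw feature distance is mediocre, which is what lets re-ranking correct errors in the initial rank list.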
Date of Conference: 20-24 August 2018
Date Added to IEEE Xplore: 29 November 2018
Print on Demand(PoD) ISSN: 1051-4651
Conference Location: Beijing, China

