
Improved Part-aligned Deep Features Learning for Person Re-Identification



Abstract:

Person Re-IDentification (Re-ID) is the task of recognizing a person who has been seen before by different cameras, possibly in different scenes. Re-ID is one of the most difficult computer vision problems, owing to the enormous number of identities in a large-scale image pool, highly similar appearances among different people, low image resolution, possible occlusion, etc. Global features geared toward general object recognition and face recognition are far less adequate for re-identifying the same person across cameras. As such, more discriminative features are needed to identify people. In particular, part-based feature extraction methods, which learn local fine-grained features of different human body parts from detected persons, have proved effective for person Re-ID. To further improve the part-aligned spatial feature approach, this paper proposes an improved part-aligned feature (IPAF) deep learning framework to better characterize a person's complete information, with the following three highlights: part alignment, finer part segmentation, and a better backbone network. Our proposed solution has been trained and tested on the two most comprehensive Re-ID datasets, with performance comparable to reported state-of-the-art solutions: on Market1501 (DukeMTMC-reID), it achieves competitive results with an mAP of 85.96% (84.70%) and a CMC rank-1 of 94.30% (89.84%), respectively.
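The part-based idea the abstract describes is often approximated by splitting a backbone's feature map into horizontal stripes and pooling each stripe into its own descriptor. The sketch below is an illustrative assumption, not the paper's exact IPAF method; the toy 4x2 "feature map" and the function `stripe_pool` are hypothetical names for illustration.

```python
# Hedged sketch (assumption, not the paper's IPAF implementation):
# part-based features via horizontal stripe pooling of a feature map.

def stripe_pool(feature_map, num_parts):
    """Average-pool each of num_parts horizontal stripes of an HxW map,
    returning one scalar descriptor per body part (toy single-channel case)."""
    h = len(feature_map)
    assert h % num_parts == 0, "toy sketch assumes H divisible by num_parts"
    stripe_h = h // num_parts
    parts = []
    for p in range(num_parts):
        rows = feature_map[p * stripe_h:(p + 1) * stripe_h]
        vals = [v for row in rows for v in row]
        parts.append(sum(vals) / len(vals))
    return parts

# Toy 4x2 "feature map" split into 2 parts (e.g. upper body vs. legs).
fmap = [[1.0, 1.0],
        [3.0, 3.0],
        [5.0, 5.0],
        [7.0, 7.0]]
print(stripe_pool(fmap, 2))  # [2.0, 6.0]
```

In a real pipeline each stripe would yield a C-dimensional vector from a CNN feature map, and the per-part vectors would be concatenated or matched part-by-part; finer part segmentation, as highlighted in the abstract, corresponds to increasing `num_parts`.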
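The reported numbers use the two standard Re-ID metrics, mAP and CMC rank-1. As a minimal sketch of how they are conventionally computed per query (the identity labels and ranking below are toy values, not from the paper):

```python
# Hedged sketch: conventional per-query CMC@k and average precision for
# Re-ID evaluation. A query's gallery is sorted by ascending distance;
# a "match" is a gallery item sharing the query's identity.

def average_precision(ranked_matches):
    """AP for one query: ranked_matches is a list of booleans, True where
    the gallery item at that rank shares the query identity."""
    hits, precisions = 0, []
    for rank, is_match in enumerate(ranked_matches, start=1):
        if is_match:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / max(hits, 1)

def cmc_at_k(ranked_matches, k=1):
    """CMC@k for one query: 1 if any correct match appears in the top k."""
    return 1.0 if any(ranked_matches[:k]) else 0.0

# Toy example: gallery identities already sorted by distance to the query.
query_id = 7
gallery_ids_ranked = [3, 7, 7, 5]
matches = [g == query_id for g in gallery_ids_ranked]

print(cmc_at_k(matches, k=1))  # 0.0 -- the best match sits at rank 2
print(average_precision(matches))
```

mAP is then the mean of `average_precision` over all queries, and the CMC rank-1 score (the "CMC 1" in the abstract) is the mean of `cmc_at_k(..., k=1)`.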
Date of Conference: 18-21 September 2019
Date Added to IEEE Xplore: 25 November 2019

Conference Location: Taipei, Taiwan

