Abstract:
Three significant challenges have limited stable palmprint recognition on mobile devices: 1) rotations and inconsistent scales of the unconstrained hand; 2) noise generated in open imaging environments; and 3) low-quality images captured under low-illumination conditions. Current palmprint representation methods rely on rich prior knowledge and lack adaptability to the imaging environment. In this paper, we propose a multi-view hierarchical graph learning based palmprint recognition (MVHG_PR) method, which comprehensively represents discriminative palmprint features from multiple views. By fully exploiting different types of characteristics, it adaptively performs multi-view feature description and feature selection. To this end, a novel regularized heterogeneous graph learning strategy is proposed to construct the intra- and inter-class relationships, learning high-order structures for different views among four-tuples rather than merely pair-wise intrinsic structures. In the proposed model, the learned hierarchical graph is endowed with elastic power from the label information to precisely reflect the intra-class and inter-class relationships in each view, so that the projected structures can be aligned both locally and globally. In addition, we construct a mobile palmprint dataset that simulates as many open application circumstances as possible to verify the effectiveness of contactless palmprint recognition methods. Experimental results demonstrate the superiority of the proposed MVHG_PR, which achieves the best recognition performance on a number of real-world palmprint databases. The proposed mobile palmprint database and the code of the proposed MVHG_PR are available at https://github.com/ShupingZhao/MVHG_PR-for-contactless-palmprint-recognition.
Published in: IEEE Transactions on Information Forensics and Security (Volume: 20)
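The abstract does not give the full MVHG_PR formulation; the sketch below only illustrates the general idea of label-guided, graph-regularized multi-view projection learning, assuming simple pairwise intra-/inter-class graphs rather than the paper's hierarchical four-tuple graph. The function names (`build_label_graphs`, `learn_view_projection`, `fuse_views`) and the parameter `alpha` are hypothetical and not from the paper.

```python
import numpy as np

def build_label_graphs(labels):
    """Build plain pairwise intra-class and inter-class affinity matrices
    from label information (a simplification of the paper's graph)."""
    labels = np.asarray(labels)
    same = (labels[:, None] == labels[None, :]).astype(float)
    intra = same - np.eye(len(labels))   # same-class pairs, no self-loops
    inter = 1.0 - same                   # different-class pairs
    return intra, inter

def graph_laplacian(W):
    return np.diag(W.sum(axis=1)) - W

def learn_view_projection(X, intra, inter, dim, alpha=1.0):
    """Learn a linear projection for one view that compacts intra-class
    samples and separates inter-class samples, via a generalized
    eigenproblem on graph Laplacians (a standard graph-embedding recipe,
    not the paper's exact objective)."""
    L_intra = graph_laplacian(intra)
    L_inter = graph_laplacian(inter)
    A = X.T @ L_intra @ X + alpha * np.eye(X.shape[1])  # compactness + ridge
    B = X.T @ L_inter @ X + alpha * np.eye(X.shape[1])  # separability
    vals, vecs = np.linalg.eig(np.linalg.solve(A, B))   # maximize B relative to A
    order = np.argsort(-vals.real)[:dim]
    return vecs[:, order].real                          # d x dim projection

def fuse_views(views, labels, dim=64, alpha=1.0):
    """Project each view independently and concatenate the embeddings,
    as a crude stand-in for the paper's joint multi-view learning."""
    intra, inter = build_label_graphs(labels)
    return np.hstack([X @ learn_view_projection(X, intra, inter, dim, alpha)
                      for X in views])
```

In this simplified setting each view (e.g., texture, orientation, or gradient features of the same palmprint) shares the label-derived graphs but learns its own projection; the paper's method instead learns the hierarchical graph and the per-view structures jointly so that local and global alignments interact.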