Gait recognition based on Orthogonal view feature extraction

Published in Multimedia Tools and Applications

Abstract

Gait can be used for personal identification but is sensitive to covariates such as viewing angle and walking conditions. To reduce the influence of view on recognition accuracy, this paper proposes an Orthogonal-view Feature Decomposition Network based on GaitSet (OFD-GaitSet). The algorithm decomposes gait recognition into recognition along two orthogonal view components. First, it reorganizes the gait gallery so that each sample contains gait information from two views, 0° and 90°. Second, it employs two Feature Extraction Networks to extract gait sub-features of the silhouette sequence from the two views. A View Identification Network and a Distance Block then weight the Euclidean distances between the probe's gait sub-features and those in the gallery, and the recognition result is obtained by comparison. The network is trained with Cross-Entropy Loss and an improved Triplet Loss. Experiments on the CASIA-B dataset show that the average Rank-1 accuracy reaches 99.8% under normal walking (NM) conditions, 99.1% under walking with a bag (BG) conditions, and 88.2% under wearing a coat or jacket (CL) conditions, improvements of 4.8%, 11.9%, and 17.8% over GaitSet, respectively. Experiments on the OU-MVLP dataset achieve a Rank-1 accuracy of 89.8%, which is 2.7% higher than GaitSet.
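The matching step the abstract describes, weighting Euclidean distances between probe sub-features and gallery sub-features by view confidence, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, the additive fusion of per-view distances, and the function name `view_weighted_match` are all assumptions, and the view weights (produced by the View Identification Network in the paper) are supplied directly here.

```python
import numpy as np

def view_weighted_match(probe_feats, gallery, view_weights):
    """Rank gallery identities by a view-weighted Euclidean distance.

    probe_feats  : dict with keys '0' and '90', each a (D,) sub-feature
                   vector from the corresponding feature-extraction branch.
    gallery      : dict id -> {'0': (D,), '90': (D,)} registered features.
    view_weights : dict with keys '0' and '90'; confidence weights that the
                   paper's View Identification Network would predict.
    """
    scores = {}
    for gid, gfeats in gallery.items():
        d = 0.0
        for v in ('0', '90'):
            # Euclidean distance between probe and gallery sub-features,
            # scaled by the weight assigned to that view.
            d += view_weights[v] * np.linalg.norm(probe_feats[v] - gfeats[v])
        scores[gid] = d
    # Rank-1 prediction: the identity with the smallest fused distance.
    return min(scores, key=scores.get)

rng = np.random.default_rng(0)
gallery = {i: {'0': rng.normal(size=64), '90': rng.normal(size=64)}
           for i in range(5)}
# A probe that is a small perturbation of identity 3 in both views:
probe = {v: gallery[3][v] + 0.01 * rng.normal(size=64) for v in ('0', '90')}
print(view_weighted_match(probe, gallery, {'0': 0.6, '90': 0.4}))  # → 3
```

Because the probe is nearly identical to identity 3 in both views, its fused distance to that gallery entry is far smaller than to any other, so the weighted comparison returns identity 3 regardless of the exact weight split.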

Data availability

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

References

  1. Lee L, Grimson W E L (2002) Gait analysis for recognition and classification. Proceedings of Fifth IEEE International Conference on Automatic Face and Gesture Recognition. pp 155–162.

  2. Chen C, Liang J, Zhu X (2011) Gait recognition based on improved dynamic Bayesian networks. Pattern Recogn 44(4):988–995

  3. Yang Q. (2011) Gait recognition based on embedded Hidden Markov models. International Conference on Mechatronic Science, Electric Engineering and Computer (MEC). pp 1457–1460.

  4. Chen C, Liang J, Zhao H et al (2009) Frame difference energy image for gait recognition with incomplete silhouettes. Pattern Recogn Lett 30(11):977–984

  5. Han J, Bhanu B (2005) Individual recognition using gait energy image. IEEE Trans Pattern Anal Mach Intell 28(2):316–322

  6. Krizhevsky A, Sutskever I, Hinton G E (2012) ImageNet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems 25.

  7. Wu Z, Huang Y, Wang L, Wang X, Tan T (2017) A comprehensive study on cross-view gait based human identification with deep CNNs. IEEE Trans Pattern Anal Mach Intell 39(2):209–226

  8. Yu S, Chen H, Garcia Reyes E B, et al (2017) Gaitgan: Invariant gait feature extraction using generative adversarial networks. Proceedings of the IEEE conference on computer vision and pattern recognition workshops, pp 30–37.

  9. He Y, Zhang J, Shan H et al (2018) Multi-task GANs for view-specific feature learning in gait recognition. IEEE Trans Inf Forensics Secur 14(1):102–113

  10. Zhang E, Zhao Y, Xiong W (2010) Active energy image plus 2DLPP for gait recognition. Signal Process 90(7):2295–2302

  11. Wang C, Zhang J, Wang L et al (2011) Human identification using temporal information preserving gait template. IEEE Trans Pattern Anal Mach Intell 34(11):2164–2176

  12. Chao H, He Y, Zhang J et al (2019) Gaitset: Regarding gait as a set for cross-view gait recognition. Proceedings of the AAAI conference on artificial intelligence 33(01):8126–8133

  13. Song C, Huang Y, Huang Y et al (2019) Gaitnet: An end-to-end network for gait based human identification. Pattern Recogn 96:106988

  14. Fan C, Peng Y, Cao C, et al (2020) Gaitpart: Temporal part-based model for gait recognition. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, pp 14225–14233.

  15. Hou S, Cao C, Liu X, et al (2020) Gait lateral network: Learning discriminative and compact representations for gait recognition. European Conference on Computer Vision. Springer, Cham, pp 382–398.

  16. Lin B, Zhang S, Yu X (2021) Gait recognition via effective global-local feature representation and local temporal aggregation. Proceedings of the IEEE/CVF International Conference on Computer Vision, pp 14648–14656.

  17. Ghosh R (2022) A faster R-CNN and recurrent neural network based approach of gait recognition with and without carried objects. Expert Syst Appl 205:117730. https://doi.org/10.1016/j.eswa.2022.117730

  18. Liao R, Yu S, An W et al (2019) A Model-based Gait Recognition Method with Body Pose and Human Prior Knowledge. Pattern Recogn 98:107069

  19. Teepe T, Khan A, Gilg J, et al (2021) GaitGraph: Graph Convolutional Network for Skeleton-Based Gait Recognition. arXiv preprint arXiv:2101.11228.

  20. Yu S, Tan D, Tan T (2006) A framework for evaluating the effect of view angle, clothing and carrying condition on gait recognition. 18th International Conference on Pattern Recognition (ICPR'06). IEEE, 4: 441–444.

  21. He K, Zhang X, Ren S, et al (2016) Deep residual learning for image recognition. Proceedings of the IEEE conference on computer vision and pattern recognition. pp 770–778.

  22. Takemura N, Makihara Y, Muramatsu D et al (2018) Multi-view large population gait dataset and its performance evaluation for cross-view gait recognition. IPSJ Transactions on Computer Vision and Applications 10(1):1–14

Funding

No funds, grants, or other support was received.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization, Methodology: Qianping Fang, Na Ying, Huahua Chen, Miao Hu, Qin Shu, Jian Zhao, Xuewei Zhang; Writing: Qianping Fang, Na Ying; Software: Qianping Fang.

Corresponding author

Correspondence to Na Ying.

Ethics declarations

Declarations

This is the first submission of this manuscript and no parts of this manuscript are being considered for publication elsewhere. All authors have approved this manuscript. No author has financial or other contractual agreements that might cause conflicts of interest.

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

The authors declare that they agree to publish this paper.

Competing interests

The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Fang, Q., Ying, N., Chen, H. et al. Gait recognition based on Orthogonal view feature extraction. Multimed Tools Appl 83, 39051–39071 (2024). https://doi.org/10.1007/s11042-023-17031-z
