
Periocular recognition: how much facial expressions affect performance?

  • Industrial and Commercial Application
  • Published in Pattern Analysis and Applications

Abstract

Using information near the human eye to perform biometric recognition has been gaining popularity. Previous works in this area, termed periocular recognition, report remarkably low error rates and particularly high robustness when data are acquired under less controlled conditions. One factor in this field that remains to be studied is the effect of facial expressions on recognition performance, as expressions change the textural and shape information inside the periocular region. We collected a multisession dataset whose only variation is the subjects' facial expressions and analyzed the corresponding variations in performance, using a state-of-the-art periocular recognition strategy. The effectiveness of different strategies for handling the effects of facial expressions was compared: (1) single-sample enrollment; (2) multisample enrollment; and (3) multisample enrollment with facial expression recognition, with results also validated on the well-known Cohn–Kanade AU-Coded Expression dataset. Finally, the role of each type of facial expression in the biometric menagerie effect is discussed.
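In outline, the three enrollment strategies compared in the abstract can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the feature vectors, the Euclidean matcher, and the `predicted_expr` label (assumed to come from an external expression classifier) are all placeholders for the paper's actual pipeline.

```python
import numpy as np

def match_score(probe, template):
    """Euclidean distance between two periocular feature vectors (lower = better match)."""
    return float(np.linalg.norm(probe - template))

def single_sample_score(probe, enrolled):
    # Strategy (1): a single enrolled template per subject.
    return match_score(probe, enrolled[0])

def multisample_score(probe, enrolled):
    # Strategy (2): several enrolled templates (e.g., one per expression);
    # report the best (minimum) distance over all of them.
    return min(match_score(probe, t) for t in enrolled)

def expression_aware_score(probe, enrolled_by_expr, predicted_expr):
    # Strategy (3): classify the probe's facial expression first, then
    # match only against templates enrolled under that same expression.
    return min(match_score(probe, t) for t in enrolled_by_expr[predicted_expr])

# Toy demonstration: an expression shifts the features, so a probe acquired
# while smiling matches the "smiling" template far better than the "neutral" one.
rng = np.random.default_rng(0)
neutral = rng.normal(size=16)
smiling = neutral + 0.5
probe = smiling + rng.normal(scale=0.01, size=16)

enrolled = [neutral, smiling]
assert multisample_score(probe, enrolled) <= single_sample_score(probe, enrolled)
```

Under this toy model, multisample enrollment can never score worse than single-sample enrollment for a genuine probe, which mirrors the intuition behind comparing strategies (1) and (2).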


References

  1. Park U, Ross A, Jain A (2009) Periocular biometrics in the visible spectrum: a feasibility study. In: IEEE 3rd International Conference on Biometrics: Theory, Applications, and Systems, 2009. BTAS ’09, pp 1–6

  2. Park U, Jillela RR, Ross A, Jain AK (2011) Periocular biometrics in the visible spectrum. IEEE Trans Inf Forensics Secur 6(1):96–106

  3. Lyle J, Miller P, Pundlik S, Woodard D (2010) Soft biometric classification using periocular region features. In: Fourth IEEE International Conference on Biometrics: Theory Applications and Systems (BTAS), 2010, pp 1–7

  4. Woodard D, Pundlik S, Miller P, Jillela R, Ross A (2010) On the fusion of periocular and iris biometrics in non-ideal imagery. In: 20th International Conference on Pattern Recognition (ICPR), 2010, pp 201–204

  5. Bharadwaj S, Bhatt H, Vatsa M, Singh R (2010) Periocular biometrics: when iris recognition fails. In: Fourth IEEE International Conference on Biometrics: Theory Applications and Systems (BTAS), 2010, pp 1–6

  6. Park U, Ross A, Jain A (2012) Matching highly non-ideal ocular images: an information fusion approach. In: IEEE 5th International Conference on Biometrics, ICB2012

  7. Hollingsworth K, Darnell S, Miller P, Woodard D, Bowyer K, Flynn P (2012) Human and machine performance on periocular biometrics under near-infrared light and visible light. IEEE Trans Inf Forensics Secur 7(2):588–601

  8. Woodard D, Pundlik S, Miller P, Lyle J (2011) Appearance-based periocular features in the context of face and non-ideal iris recognition. Signal Image Video Process 5:443–455

  9. Crihalmeanu S, Ross A (2011) Multispectral scleral patterns for ocular biometric recognition. Pattern Recognit Lett

  10. Kanade T, Cohn J, Tian YL (2000) Comprehensive database for facial expression analysis. In: Proceedings of the 4th IEEE International Conference on Automatic Face and Gesture Recognition (FG’00), pp 46–53

  11. Lucey P, Cohn J, Kanade T, Saragih J, Ambadar Z, Matthews I (2010) The extended Cohn-Kanade dataset (CK+): a complete dataset for action unit and emotion-specified expression. In: Proceedings of the IEEE Computer Vision and Pattern Recognition Workshops (CVPRW’10), pp 94–101

  12. Ekman P (1999) Facial expressions. In: Dalgleish T, Power M (eds) Handbook of cognition and emotion, John Wiley & Sons, San Francisco, California, USA, pp 301–320

  13. Anitha M, Venkatesha K, Adiga BS (2010) A survey of facial expression databases. Int J Eng Sci Technol 2(10):5158–5174

  14. Langner O, Dotsch R, Bijlstra G, Wigboldus DHJ, Hawk ST, van Knippenberg A (2010) Presentation and validation of the Radboud Faces Database. Cognit Emot 24(8):1377–1388

  15. Ebner C, Riediger M, Lindenberger U (2010) FACES: a database of facial expressions in young, middle-aged, and older women and men: development and validation. Behav Res Methods 42(1):351–362

  16. Lyons M, Akamatsu S, Kamachi M, Gyoba J (1998) Coding facial expressions with Gabor wavelets. In: Third IEEE International Conference on Automatic Face Gesture Recognition. IEEE Computer Society, Nara, Japan, pp 200–205

  17. Pantic M, Valstar M, Rademaker R, Maat L (2005) Web-based database for facial expression analysis. In: 2005 IEEE International Conference on Multimedia and Expo, pp 317–321

  18. Bettadapura VK (2009) Face expression recognition and analysis: the state of the art. Emotion, pp 1–27

  19. Haq S, Jackson P (2010) Multimodal emotion recognition. In: Machine audition: principles, algorithms and systems. IGI Global, Hershey, PA, pp 398–423

  20. Sebe N, Lew M, Sun Y, Cohen I, Gevers T, Huang T (2007) Authentic facial expression analysis. Image Vis Comput 25(12):1856–1863

  21. Sim T, Baker S, Bsat M (2003) The CMU pose, illumination, and expression database. IEEE Trans Pattern Anal Mach Intell 25:1615–1618

  22. Gross R (2005) Face databases. In: Handbook of face recognition. Springer, ch 13, pp 301–327

  23. Phillips PJ, Moon H, Rizvi SA, Rauss PJ (2000) The FERET evaluation methodology for face-recognition algorithms. IEEE Trans Pattern Anal Mach Intell 22(10):1090–1104

  24. Hwang BW, Byun H, Roh MC, Lee SW (2003) Performance evaluation of face recognition algorithms on the Asian face database, KFDB. In: Proceedings of the 4th International Conference on Audio- and Video-based Biometric Person Authentication, ser. AVBPA'03. Springer-Verlag, Berlin, Heidelberg, pp 557–565

  25. O’Toole AJ, Harms J, Snow SL, Hurst DR, Pappas MR, Ayyad JH, Abdi H (2005) A video database of moving faces and people. IEEE Trans Pattern Anal Mach Intell 27:812–816

  26. Yin L, Wei X, Sun Y, Wang J, Rosato MJ (2006) A 3D facial expression database for facial behaviour research. In: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition (FGR06). IEEE Computer Society

  27. Guyon I, Makhoul J, Schwartz R, Vapnik V (1998) What size test set gives good error rate estimates? IEEE Trans Pattern Anal Mach Intell 20(1):52–64

  28. Cantor ABM (2002) Understanding logistic regression. Evid Oncol 3(2):52–53

  29. Yager N, Dunstone T (2010) The biometric menagerie. IEEE Trans Pattern Anal Mach Intell 32(2):220–230

  30. Poh N, Kittler J (2009) A biometric menagerie index for characterizing template/model-specific variation. In: Proceedings of the Third International Conference on Advances in Biometrics- BTAS 09, pp 816–827

Acknowledgments

This work was carried out in the scope of the research project UID/EEA/50008/2013, R&D Unit 50008, financed by the applicable financial framework (FCT/MEC) through national funds and co-funded by FEDER PT2020 partnership agreement.

Author information

Corresponding author

Correspondence to Hugo Proença.

About this article

Cite this article

Barroso, E., Santos, G., Cardoso, L. et al. Periocular recognition: how much facial expressions affect performance? Pattern Anal Applic 19, 517–530 (2016). https://doi.org/10.1007/s10044-015-0493-z
