
Moveable Facial Features in a Social Mediator

  • Conference paper
Intelligent Virtual Agents (IVA 2017)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10498)


Abstract

The human face and facial-feature-based behavior have a major impact on human-human communication. Creating face-based personality traits and their representations in a social robot is a challenging task. In this paper, we propose an approach to robotic face presentation based on moveable 2D facial features and present a comparative study in which a synthesized face is projected using three setups: 1) a 3D mask, 2) a 2D screen, and 3) our 2D moveable facial feature based visualization. We found that the robot's personality and character are highly influenced by the quality of the projected face as well as by the motion of its facial features.



Author information


Corresponding author

Correspondence to Shafiq ur Réhman.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Khan, M.S.L., ur Réhman, S., Mi, Y., Naeem, U., Beskow, J., Li, H. (2017). Moveable Facial Features in a Social Mediator. In: Beskow, J., Peters, C., Castellano, G., O'Sullivan, C., Leite, I., Kopp, S. (eds) Intelligent Virtual Agents. IVA 2017. Lecture Notes in Computer Science, vol 10498. Springer, Cham. https://doi.org/10.1007/978-3-319-67401-8_23


  • DOI: https://doi.org/10.1007/978-3-319-67401-8_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67400-1

  • Online ISBN: 978-3-319-67401-8

  • eBook Packages: Computer Science, Computer Science (R0)
