Abstract
In this work, a method for animating a mechatronic head with a realistic appearance is presented. The required actuators have been defined on the basis of the Facial Action Coding System (FACS). The generation of the six basic emotions is addressed: happiness, disgust, sadness, anger, fear and surprise. These expressions are generated by interpolating movements through a sequence of key poses. The voice is integrated in a similar way, using a viseme-based scheme that allows voice, lip and mouth movements to be synchronized. Implementation details and results showing the suitability of the approach are also given.
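As a rough illustration of the key-pose interpolation scheme described in the abstract, the Python sketch below linearly interpolates normalized actuator positions between consecutive key poses and streams the resulting frames at a fixed rate. The actuator names, pose values, frame rate and the send_to_actuators() stub are hypothetical placeholders; they are not the paper's actual actuator set or controller interface.

# Minimal sketch of key-pose interpolation for a FACS-driven head.
# Actuator names and values are illustrative only, with positions normalized to [0, 1].

from dataclasses import dataclass


@dataclass
class KeyPose:
    """Target actuator positions (normalized 0..1) to be reached at time t (seconds)."""
    t: float
    positions: dict[str, float]


# Neutral face and a "happiness" key pose, loosely inspired by FACS
# AU6 (cheek raiser) + AU12 (lip corner puller); values are made up for the example.
NEUTRAL = KeyPose(0.0, {"cheek_raiser": 0.0, "lip_corner_puller": 0.0, "jaw_drop": 0.0})
HAPPY = KeyPose(1.0, {"cheek_raiser": 0.7, "lip_corner_puller": 0.9, "jaw_drop": 0.1})


def interpolate(a: KeyPose, b: KeyPose, t: float) -> dict[str, float]:
    """Linearly interpolate actuator positions between two key poses at time t."""
    if t <= a.t:
        return dict(a.positions)
    if t >= b.t:
        return dict(b.positions)
    alpha = (t - a.t) / (b.t - a.t)
    return {k: (1 - alpha) * a.positions[k] + alpha * b.positions[k] for k in a.positions}


def send_to_actuators(frame: dict[str, float]) -> None:
    """Stand-in: a real implementation would map normalized positions to servo commands."""
    print(frame)


def play(sequence: list[KeyPose], rate_hz: float = 50.0) -> None:
    """Step through consecutive key poses, sending interpolated frames at rate_hz."""
    for a, b in zip(sequence, sequence[1:]):
        n_frames = max(1, int((b.t - a.t) * rate_hz))
        for i in range(n_frames + 1):
            t = a.t + (b.t - a.t) * i / n_frames
            send_to_actuators(interpolate(a, b, t))


if __name__ == "__main__":
    # Smile and relax back to neutral over two seconds.
    play([NEUTRAL, HAPPY, KeyPose(2.0, NEUTRAL.positions)])

Viseme-based lip synchronization could be handled in the same way, by inserting mouth key poses aligned with the phoneme timings of the synthesized speech; that part is omitted from this sketch.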
About this paper
Cite this paper
Loza, D., Marcos Pablos, S., Zalama Casanova, E., Gómez-García-Bermejo, J. (2014). Animation of Expressions in a Mechatronic Head. In: Armada, M., Sanfeliu, A., Ferre, M. (eds) ROBOT2013: First Iberian Robotics Conference. Advances in Intelligent Systems and Computing, vol 253. Springer, Cham. https://doi.org/10.1007/978-3-319-03653-3_2