Spontaneous Human-Robot Emotional Interaction Through Facial Expressions
Meghdari, A., Alemi, M., Pour, A.G., Taheri, A. (2016)

Abstract
A central challenge in social and cognitive robotics is the robot's ability to recognize human emotional states and to engage in emotional interaction with people. Through effective emotional interaction, robots will be able to perform many tasks in human society. In this research, we developed a robotic platform and a vision system that recognizes the user's emotional state from their facial expressions, leading to more realistic human-robot interaction (HRI). First, a set of features is extracted from facial points detected by the vision system. The user's emotional state is then analyzed with the help of these features. For the decision-making unit, a state machine is designed that uses the results of the emotional-state analysis to generate the robot's response. Finally, a fuzzy algorithm is applied to improve the quality of the emotional interaction, and the results are implemented on a commercial humanoid robot platform capable of producing facial expressions.
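As a minimal sketch of the pipeline summarized above (landmark-based feature extraction, emotion analysis, and a state machine that selects the robot's response), the Python example below is illustrative only: the landmark names, features, thresholds, and response labels are hypothetical and are not taken from the paper, and the fuzzy algorithm mentioned in the abstract is not reproduced here.

```python
import math
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


def dist(a: Point, b: Point) -> float:
    """Euclidean distance between two facial landmarks."""
    return math.hypot(a.x - b.x, a.y - b.y)


def extract_features(lm: dict) -> dict:
    """Compute simple geometric features from labeled landmarks.
    Landmark names and the feature set are hypothetical, not from the paper."""
    face_w = dist(lm["left_cheek"], lm["right_cheek"])  # normalization factor
    return {
        "mouth_open":  dist(lm["upper_lip"], lm["lower_lip"]) / face_w,
        "mouth_width": dist(lm["mouth_left"], lm["mouth_right"]) / face_w,
        "brow_raise":  dist(lm["left_brow"], lm["left_eye"]) / face_w,
    }


def classify_emotion(f: dict) -> str:
    """Toy threshold-based classifier standing in for the emotion analysis step."""
    if f["mouth_width"] > 0.55 and f["mouth_open"] < 0.15:
        return "happy"
    if f["brow_raise"] > 0.30 and f["mouth_open"] > 0.25:
        return "surprised"
    return "neutral"


class ResponseStateMachine:
    """Maps the recognized emotion to a robot facial response,
    keeping a small amount of interaction state between frames."""

    def __init__(self):
        self.state = "idle"

    def step(self, emotion: str) -> str:
        if emotion == "happy":
            self.state = "engaged"
            return "smile"
        if emotion == "surprised":
            self.state = "attentive"
            return "raise_eyebrows"
        self.state = "idle"
        return "neutral_face"


if __name__ == "__main__":
    # Hypothetical detector output (pixel coordinates) for one frame.
    landmarks = {
        "left_cheek": Point(100, 200), "right_cheek": Point(300, 200),
        "upper_lip": Point(200, 260), "lower_lip": Point(200, 275),
        "mouth_left": Point(160, 268), "mouth_right": Point(275, 268),
        "left_brow": Point(150, 120), "left_eye": Point(150, 165),
    }
    fsm = ResponseStateMachine()
    print(fsm.step(classify_emotion(extract_features(landmarks))))  # -> "smile"
```

In the paper itself the decision layer uses a fuzzy algorithm rather than fixed thresholds; the sketch only illustrates how recognized emotions can drive a response-selection state machine.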