Abstract
The face of a humanoid robot strongly affects the user experience, which makes detecting face preference particularly important. Preference detection is a branch of emotion recognition that has attracted considerable research attention, yet most previous preference-detection studies have relied on a single modality. In this paper, we detect preferences for humanoid robot faces from electroencephalogram (EEG) signals and eye movement signals, comparing a single-modality approach, canonical correlation analysis (CCA) fusion, and bimodal deep autoencoder (BDAE) fusion. By analyzing EEG preference patterns, we confirmed the theory of frontal asymmetry and found that participants showed higher alpha-band power for preferred faces. In addition, hidden preferences extracted from EEG signals were classified more accurately than the preferences reported in participants' subjective feedback, and the classification performance of the eye movement data also improved. Finally, experimental results showed that BDAE multimodal fusion using frontal alpha and beta power spectral densities together with eye movement features performed best, reaching the highest average accuracy of 83.13% with an SVM and 71.09% with a KNN.
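The abstract's EEG pipeline rests on two standard ingredients: band-limited power spectral density (PSD) features from frontal channels and a frontal alpha asymmetry index, fed to an SVM classifier. The sketch below illustrates that idea with Welch PSD estimates and scikit-learn; it uses synthetic two-channel data, and the sampling rate, window length, and asymmetry formula (log right minus log left alpha power) are common conventions assumed here, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def band_power(signal, fs, band=(8.0, 13.0)):
    """Mean Welch PSD within a frequency band (alpha, 8-13 Hz, by default)."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()


def frontal_alpha_asymmetry(left, right, fs):
    """Common asymmetry index: log(right alpha power) - log(left alpha power)."""
    return np.log(band_power(right, fs)) - np.log(band_power(left, fs))


# Synthetic demo: 40 four-second "trials" from a left/right frontal channel pair,
# where "preferred" trials (label 1) carry more 10 Hz alpha on the left channel.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(4 * fs) / fs
alpha_wave = np.sin(2 * np.pi * 10 * t)

X, y = [], []
for trial in range(40):
    label = trial % 2
    noise = rng.standard_normal((2, t.size))
    left = noise[0] + (1.0 + 0.5 * label) * alpha_wave
    right = noise[1] + alpha_wave
    X.append([band_power(left, fs),
              band_power(right, fs),
              frontal_alpha_asymmetry(left, right, fs)])
    y.append(label)

# 5-fold cross-validated SVM accuracy on the three PSD-derived features.
scores = cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5)
print(scores.mean())
```

The same feature vector could be extended with eye movement statistics (fixation duration, pupil diameter) before fusion, which is the step the CCA and BDAE variants in the paper address.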
Data availability
The datasets generated during and/or analyzed during the current study are not publicly available due to individual privacy but are available from the corresponding author on reasonable request.
Funding
Research supported by the National Key R&D Program of China (grant no. 2021YFC0122700); the National Natural Science Foundation of China (grant no. 61904038); Ji Hua Laboratory (grant no. X190021TB193); and the Shanghai Municipal Science and Technology Major Project (grant no. 2021SHZDZX0103).
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Ethical approval
The experiment was approved by the Medical Ethics Committee of Jing’an District Central Hospital of Shanghai (Ethics reference number: 2020–2029).
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Wang, P., Mu, W., Zhan, G. et al. Preference detection of the humanoid robot face based on EEG and eye movement. Neural Comput & Applic 36, 11603–11621 (2024). https://doi.org/10.1007/s00521-024-09765-0