Abstract
Users’ performance is known to be affected by their emotional states. To better understand this relationship, different situations can be simulated while users’ emotional reactions are analyzed through sensors such as eye tracking and EEG. Virtual reality environments additionally provide an immersive simulation context that induces high-intensity emotions such as excitement. Extracting excitement from EEG yields more precise measures than other methods; however, it is not always possible to wear an EEG headset in a virtual reality environment. In this paper we present an alternative to EEG for excitement detection that relies on eye movements alone. Results showed a correlation between eye movements and the excitement index extracted from EEG. Five machine learning algorithms were used to predict the excitement trend exclusively from eye tracking. Results revealed that the offline excitement trend can be detected directly from eye movements with a precision of 92% using a deep neural network.
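The pipeline described in the abstract — predict an EEG-derived excitement trend from eye-tracking features with a supervised classifier — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the feature names (fixation duration, saccade amplitude, pupil diameter change), the synthetic labeling rule, and the use of a plain logistic regression as a stand-in for the five classifiers compared in the paper are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic eye-movement features (illustrative, not the paper's dataset):
# columns = [mean fixation duration, saccade amplitude, pupil diameter change]
n = 400
X = rng.normal(size=(n, 3))

# Hypothetical labeling rule: larger pupil-diameter change and shorter
# fixations mark a rising excitement trend (binary target, plus noise).
y = (0.8 * X[:, 2] - 0.5 * X[:, 0] + rng.normal(scale=0.3, size=n) > 0).astype(float)

# Logistic regression trained by batch gradient descent; the paper instead
# compares five classifiers, with a deep neural network performing best.
Xb = np.hstack([X, np.ones((n, 1))])       # append a bias column
w = np.zeros(4)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))      # sigmoid probabilities
    w -= 0.1 * Xb.T @ (p - y) / n          # gradient step on the log-loss

acc = ((1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5) == y).mean()
```

In practice each feature vector would summarize a time window of gaze samples aligned with the EEG excitement index, and accuracy would be measured on held-out data rather than the training set used in this sketch.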
Acknowledgment
We acknowledge NSERC-CRD and Beam Me Up for funding this work.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Abdessalem, H.B., Chaouachi, M., Boukadida, M., Frasson, C. (2019). Toward Real-Time System Adaptation Using Excitement Detection from Eye Tracking. In: Coy, A., Hayashi, Y., Chang, M. (eds) Intelligent Tutoring Systems. ITS 2019. Lecture Notes in Computer Science(), vol 11528. Springer, Cham. https://doi.org/10.1007/978-3-030-22244-4_26
DOI: https://doi.org/10.1007/978-3-030-22244-4_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22243-7
Online ISBN: 978-3-030-22244-4