DOI: 10.1145/3627043.3659549

Research Article

Good GUIs, Bad GUIs: Affective Evaluation of Graphical User Interfaces

Published: 22 June 2024

Abstract

Affective computing has the potential to enrich the development lifecycle of Graphical User Interfaces (GUIs) and intelligent user interfaces by incorporating emotion-aware responses. Yet affect is seldom used to determine whether a GUI design will be perceived as good or bad. We study how physiological signals can serve as an early, effective, and rapid affective assessment method for GUI design, without asking for explicit user feedback. In a controlled experiment, 32 participants were exposed to 20 good and 20 bad GUI designs while we recorded their eye activity through eye tracking, facial expressions through video recordings, and brain activity through electroencephalography (EEG). We observed noticeable differences in the collected data, so we trained and compared different computational models to tell good and bad designs apart. Our results suggest that each modality has its own “performance sweet spot”, both in terms of model architecture and signal length. Taken together, our findings suggest that it is possible to distinguish between good and bad GUI designs using physiological signals. Ultimately, this research paves the way toward implicit evaluation of GUI designs through user modeling.
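The abstract describes training computational models that classify GUI designs as good or bad from windowed physiological signals. The following is a minimal, self-contained sketch of that general idea, not the paper's actual pipeline: the synthetic pupil-diameter data, the mean/standard-deviation features, and the nearest-centroid classifier are all illustrative assumptions.

```python
import random
import statistics

random.seed(42)

def make_window(label, n=50):
    """Simulate one viewing window of pupil-diameter samples (mm).
    Purely synthetic: 'bad' designs are given a slightly larger mean,
    a stand-in for higher arousal or cognitive load."""
    base = 3.0 if label == "good" else 3.6
    return [random.gauss(base, 0.4) for _ in range(n)]

def features(window):
    # Summarize a signal window by its mean and standard deviation.
    return (statistics.mean(window), statistics.stdev(window))

# Build a small labeled training set of feature vectors.
train = [(features(make_window(lbl)), lbl)
         for lbl in ("good", "bad") for _ in range(40)]

def centroid(label):
    pts = [f for f, l in train if l == label]
    return tuple(statistics.mean(p[i] for p in pts) for i in range(2))

centroids = {lbl: centroid(lbl) for lbl in ("good", "bad")}

def classify(window):
    # Assign the label whose class centroid is nearest in feature space.
    f = features(window)
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lbl])))

# Evaluate on held-out windows.
test = [(make_window(lbl), lbl) for lbl in ("good", "bad") for _ in range(20)]
accuracy = sum(classify(w) == lbl for w, lbl in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

A real pipeline in this vein would replace the synthetic windows with preprocessed EEG, gaze, or facial-expression features and the centroid rule with a learned model, but the train/classify/evaluate structure stays the same.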



Published In

UMAP '24: Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization
June 2024
338 pages
ISBN: 9798400704338
DOI: 10.1145/3627043
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Affective computing
  2. Neurophysiological and peripheral signals
  3. User Interface design

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • European Innovation Council
  • Horizon 2020 FET program

Conference

UMAP '24

Acceptance Rates

Overall Acceptance Rate 162 of 633 submissions, 26%
