Abstract
Human-centered computing is rapidly becoming a major research direction as new developments in sensor technology make it increasingly feasible to obtain signals from human beings. At the same time, the pervasiveness of computing devices is encouraging more research in human–computer interaction, especially toward personalized and adaptive user interfaces. Among the various research issues, affective computing, the ability of computers to understand and react to what a user “feels,” has been gaining importance. To recognize human affect (feeling), computers rely on the analysis of input signals captured by a multitude of means. This paper proposes the use of human physiological signals as a new modality for determining human affect in a non-intrusive manner. The principle of non-invasiveness is important, since it imposes no extra burden on the user, which improves accessibility and encourages adoption. This goal is realized via the physiological mouse, a first step toward supporting affective computing. A conventional mouse is augmented with a small optical component that captures the user’s photoplethysmographic (PPG) signal, from which human physiological signals can be computed and derived. A prototype of the physiological mouse was built and raw PPG readings were measured. The accuracy of the approach in deriving physiological signals from the mouse PPG data was evaluated through empirical studies. Finally, pilot experiments were conducted to correlate human physiological signals with two modes of human–computer interaction, namely gaming and video watching. Trends in the physiological signals could serve as feedback to the computer system, which in turn adapts to the needs or mood of the user, for instance by changing the volume or light intensity during video watching or gameplay based on the user’s current emotion.
The authors argue that this research will provide a new dimension for multimodal affective computing research, and the pilot study has already shed some light on this research goal.
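To illustrate the abstract’s claim that physiological signals can be derived from a raw PPG waveform, the sketch below estimates heart rate by detecting systolic peaks and averaging the inter-peak intervals. This is an illustrative example only, not the authors’ implementation: the sampling rate, the mean-based threshold, and the 0.4 s refractory period are assumed values, and the input here is a synthetic waveform rather than data from the physiological mouse.

```python
# Illustrative heart-rate estimation from PPG samples via peak detection.
# Assumptions (not from the paper): fs = 100 Hz sampling, peaks must lie
# above the signal mean and be at least 0.4 s apart.
import math

def detect_peaks(samples, fs, refractory_s=0.4):
    """Return indices of local maxima above the mean, separated by at
    least `refractory_s` seconds (a crude systolic-peak detector)."""
    mean = sum(samples) / len(samples)
    min_gap = int(refractory_s * fs)
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] > mean and samples[i - 1] < samples[i] >= samples[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks

def heart_rate_bpm(samples, fs):
    """Estimate heart rate (beats per minute) from the mean
    inter-peak interval."""
    peaks = detect_peaks(samples, fs)
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic 1.2 Hz (72 bpm) PPG-like waveform, 10 s at 100 Hz.
fs = 100
signal = [math.sin(2 * math.pi * 1.2 * n / fs) for n in range(fs * 10)]
print(round(heart_rate_bpm(signal, fs)))  # → 72
```

A real PPG trace from the mouse sensor would be noisier than this clean sinusoid, so a practical pipeline would add band-pass filtering and motion-artifact suppression (as discussed in the PPG literature) before peak detection.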
Notes
The funny video is extracted from the famous hidden-camera comedy show “Just For Laughs: Gags,” Season 9, Episode 8, between 9’38” and 10’58”, whereas the horror video is taken from the movie “Final Destination 5,” running from 22’05” to 26’35”.
Acknowledgments
We would like to thank the experiment subjects for their time and patience. We would also like to thank the reviewers for their valuable comments, which helped improve this paper. This research is supported in part by the Research Grants Council and The Hong Kong Polytechnic University under Grant Nos. PolyU 5235/11E and PolyU 5222/13E.
Cite this article
Fu, Y., Leong, H.V., Ngai, G. et al. Physiological mouse: toward an emotion-aware mouse. Univ Access Inf Soc 16, 365–379 (2017). https://doi.org/10.1007/s10209-016-0469-9