
Physiological mouse: toward an emotion-aware mouse

  • Long paper
  • Published in: Universal Access in the Information Society

Abstract

Human-centered computing is rapidly becoming a major research direction as advances in sensor technology make it increasingly feasible to obtain signals from human beings. At the same time, the pervasiveness of computing devices is encouraging more research in human–computer interaction, especially toward personalized and adaptive user interfaces. Among the various research issues, affective computing, the ability of computers to understand and react to what a user “feels,” has been gaining importance. To recognize human affect (feeling), computers rely on the analysis of input signals captured by a multitude of means. This paper proposes the use of human physiological signals, acquired in a non-intrusive manner, as a new modality for determining human affect. The principle of non-intrusiveness is important because it imposes no extra burden on the user, which improves accessibility and encourages adoption. This goal is realized via the physiological mouse, a first step toward supporting affective computing. A conventional mouse is augmented with a small optical component that captures the user’s photoplethysmographic (PPG) signal, from which human physiological signals can be derived. A prototype of the physiological mouse was built and raw PPG readings were measured. The accuracy of the approach was evaluated through empirical studies that derived human physiological signals from the mouse PPG data. Finally, pilot experiments were conducted to correlate physiological signals with two modes of human–computer interaction, namely gaming and video watching. The trend in the physiological signals can serve as feedback to the computer system, which in turn adapts to the needs or mood of the user, for instance by adjusting the volume or light intensity while the user watches a video or plays a game, based on the user’s current emotion. The authors argue that this research provides a new dimension for multimodal affective computing research, and the pilot study has already shed some light on this research goal.
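As context for the abstract's claim that physiological signals can be derived from the mouse's PPG readings, the following is a minimal sketch of one common way to estimate heart rate from a raw PPG trace: band-pass filter the signal and detect systolic peaks. The function name, sampling rate, filter band, and use of NumPy/SciPy here are illustrative assumptions and do not reflect the authors' actual implementation.

```python
# Minimal sketch (not the authors' implementation): estimating heart rate
# from a raw PPG trace via band-pass filtering and peak detection.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_heart_rate(ppg: np.ndarray, fs: float = 100.0) -> float:
    """Return heart rate in beats per minute from a raw PPG signal sampled at fs Hz."""
    # Band-pass 0.5-4 Hz (~30-240 bpm) to suppress baseline drift and high-frequency noise.
    b, a = butter(2, [0.5 / (fs / 2), 4.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)
    # Detect systolic peaks; enforce a refractory period of 0.25 s (caps detection at 240 bpm).
    peaks, _ = find_peaks(filtered, distance=int(0.25 * fs))
    if len(peaks) < 2:
        return float("nan")
    # Mean inter-beat interval (seconds) converted to beats per minute.
    ibi = np.diff(peaks) / fs
    return 60.0 / float(np.mean(ibi))

# Example: 10 s of synthetic PPG-like data at 100 Hz with a 1.2 Hz (72 bpm) pulse.
if __name__ == "__main__":
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    ppg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
    print(f"Estimated heart rate: {estimate_heart_rate(ppg, fs):.1f} bpm")
```

In practice, mouse-based PPG is susceptible to motion artifacts from hand movement, so a real system would need additional artifact handling before this kind of peak detection becomes reliable.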


Notes

  1. The funny video is extracted from the famous hidden camera comedy show “Just For Laughs: Gags,” Season 9, Episode 8, from 9’38” to 10’58”, whereas the horror video is taken from the movie “Final Destination 5,” running from 22’05” to 26’35”.


Acknowledgments

We would like to thank the experiment subjects for their time and patience. We would also like to thank the reviewers for their valuable comments, which helped improve this paper. This research is supported in part by the Research Grants Council and the Hong Kong Polytechnic University under Grant Nos. PolyU 5235/11E and PolyU 5222/13E.

Author information

Corresponding author

Correspondence to Yujun Fu.

About this article

Cite this article

Fu, Y., Leong, H.V., Ngai, G. et al. Physiological mouse: toward an emotion-aware mouse. Univ Access Inf Soc 16, 365–379 (2017). https://doi.org/10.1007/s10209-016-0469-9