Artificial Neural Networks Can Distinguish Genuine and Acted Anger by Synthesizing Pupillary Dilation Signals from Different Participants

  • Conference paper
  • Published in: Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11305)

Abstract

Previous research has revealed that people are generally poor at distinguishing genuine from acted anger facial expressions, achieving a mere 65% accuracy from verbal answers. We aim to investigate whether a group of feedforward neural networks can perform better using raw pupillary dilation signals from individuals. Our results show that a single neural network cannot accurately discern the veracity of an emotion from raw physiological signals, reaching only 50.5% accuracy. Nonetheless, separate neural networks trained on pupillary dilation signals from different individuals vary widely in how well they discern genuine anger, from 27.8% to 83.3% accuracy. By leveraging these differences, our novel Misaka neural networks compose predictions from different individuals’ pupillary dilation signals into an overall prediction that is more accurate than that of even the highest-performing single individual, reaching 88.9% accuracy. Further research will investigate the correlation between two groups of high-performing predictors: those based on verbal answers and those based on pupillary dilation signals.
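The per-participant networks and the Misaka combination step are not reproduced on this page. The sketch below is a minimal illustration only, assuming one small scikit-learn MLPClassifier per participant and a naive-Bayes style fusion of their output probabilities; the synthetic data, the function names (train_one_network, combine_predictions) and the conditional-independence assumption are hypothetical and not the authors' released code.

```python
# Minimal sketch (assumptions, not the paper's code): one feedforward network per
# participant's raw pupillary-dilation traces, fused with a naive-Bayes style
# combination of the per-network probabilities.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy setup: 5 participants watch the same 40 videos; label 1 = genuine, 0 = acted.
n_participants, n_videos, n_samples = 5, 40, 60
y = rng.integers(0, 2, size=n_videos)
# Synthetic dilation traces with a weak class-dependent shift (illustrative only).
signals = [rng.normal(size=(n_videos, n_samples)) + 0.3 * y[:, None]
           for _ in range(n_participants)]

def train_one_network(X, y):
    """Train one small feedforward network on a single participant's signals."""
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X, y)
    return net

def combine_predictions(nets, signals, prior=0.5):
    """Fuse per-participant probabilities with Bayes' rule, assuming the
    per-network outputs are conditionally independent given the true label
    and that each network's posterior was formed under a 0.5 class prior."""
    log_odds = np.log(prior / (1.0 - prior))
    for net, X in zip(nets, signals):
        p = np.clip(net.predict_proba(X)[:, 1], 1e-6, 1 - 1e-6)
        log_odds = log_odds + np.log(p / (1.0 - p))   # add each log-likelihood ratio
    return 1.0 / (1.0 + np.exp(-log_odds))            # posterior P(genuine)

nets = [train_one_network(X, y) for X in signals]
posterior = combine_predictions(nets, signals)
print("in-sample fused accuracy:", np.mean((posterior > 0.5) == y))  # sanity check only
```

On this toy data the fused posterior tends to match or beat any single network, which mirrors the effect the abstract reports on real signals (88.9% fused versus 83.3% for the best individual); the sketch does not reproduce the paper's actual architecture or numbers.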

Acknowledgments

The authors acknowledge Dongyang Li, Liang Zhang and Zihan Wang for suggesting the application of Bayes’ theorem in the probability calculation.
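The probability calculation itself is not spelled out on this page. A standard Bayes’-theorem combination of the per-participant network outputs $o_1,\dots,o_n$, assuming they are conditionally independent given the true class, would take the form

$$
P(\text{genuine} \mid o_1,\dots,o_n)
= \frac{P(\text{genuine}) \prod_{i=1}^{n} P(o_i \mid \text{genuine})}
       {\sum_{c \in \{\text{genuine},\,\text{acted}\}} P(c) \prod_{i=1}^{n} P(o_i \mid c)},
$$

though whether the paper uses exactly this independence assumption is not stated here.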

Author information

Corresponding author: Tom Gedeon.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Qin, Z., Gedeon, T., Chen, L., Zhu, X., Hossain, M.Z. (2018). Artificial Neural Networks Can Distinguish Genuine and Acted Anger by Synthesizing Pupillary Dilation Signals from Different Participants. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11305. Springer, Cham. https://doi.org/10.1007/978-3-030-04221-9_27

  • DOI: https://doi.org/10.1007/978-3-030-04221-9_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04220-2

  • Online ISBN: 978-3-030-04221-9

  • eBook Packages: Computer Science, Computer Science (R0)
