Analysis of Emotion in Socioenactive Systems

  • Conference paper
  • In: Human-Computer Interaction. Theory, Methods and Tools (HCII 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12762)


Abstract

Facial expressions are important data for understanding how systems in social environments affect the people in them. New technologies and new coupled forms of interaction, brought about by the ubiquity of computing and social networks, present challenges that require considering additional factors, such as emotion. Socioenactive systems represent a complex scenario in which technological aspects must be treated together with the social dynamics, informed by concepts such as affective computing and enactive systems. This work presents a proposal for facial expression recognition in the wild applied to the outputs of socioenactive systems. The results reinforce how the design of socioenactive systems can promote positive changes in the emotional state of children in an educational context and foster social interactions.
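To make the pipeline in the abstract concrete: facial expression recognition "in the wild" typically detects a face, encodes it as a feature vector, and classifies the vector into basic emotion categories. The sketch below is a minimal, hypothetical illustration of only the final classification step, using a toy softmax over hand-set weights; it is not the authors' method, and real systems learn the encoder and classifier weights from data with deep networks.

```python
# Hypothetical sketch of the classification stage of a facial
# expression pipeline: score a feature vector against one weight
# vector per emotion, then normalize scores with a softmax.
# The feature vector and weights here are illustrative placeholders.
import math

EMOTIONS = ["neutral", "happiness", "sadness", "surprise", "anger"]

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights):
    # features: floats from a (hypothetical) face encoder
    # weights: one weight vector per emotion class
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    probs = softmax(scores)
    best = max(range(len(EMOTIONS)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

# Toy example: identity weights, so the dominant feature decides.
weights = [[1.0 if j == i else 0.0 for j in range(5)] for i in range(5)]
label, prob = classify([0.1, 2.0, 0.3, 0.0, 0.2], weights)
print(label)  # happiness
```

In a trained system the weight matrix would be the last layer of a convolutional network, and the probabilities could be aggregated over video frames to track emotional state over time.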



Acknowledgement

This work was financially supported by the São Paulo Research Foundation (FAPESP) (grants #2015/16528-0, #2015/24300-9 and #2019/12225-3), and CNPq (grant #306272/2017-2). We thank the University of Campinas (UNICAMP) for making this research possible.

Author information

Correspondence to Diego Addan Gonçalves.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Gonçalves, D.A., Caceffo, R.E., Baranauskas, M.C.C. (2021). Analysis of Emotion in Socioenactive Systems. In: Kurosu, M. (ed.) Human-Computer Interaction. Theory, Methods and Tools. HCII 2021. Lecture Notes in Computer Science, vol. 12762. Springer, Cham. https://doi.org/10.1007/978-3-030-78462-1_41


  • DOI: https://doi.org/10.1007/978-3-030-78462-1_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-78461-4

  • Online ISBN: 978-3-030-78462-1

  • eBook Packages: Computer Science (R0)
