
Affective Guide for Museum: A System to Suggest Museum Paths Based on Visitors’ Emotions

  • Conference paper
  • In: Universal Access in Human-Computer Interaction. Design Methods and User Experience (HCII 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12768)

Abstract

This paper introduces a new recommendation system for museums, able to profile visitors and propose to them the most suitable exhibition path, with the aim of improving visitor satisfaction. The system consists of an interactive touch-screen totem equipped with a USB camera, which exploits Convolutional Neural Networks to perform facial coding, measuring visitors' emotions and estimating their age and gender. Based on the detected level of emotional valence, the system associates each visitor with a profile and suggests a selection of five works of art to visit, following a specific itinerary. An extensive two-month experimentation was carried out at the Modern Art Museum "Palazzo Buonaccorsi" in Macerata. Results show that the proposed system can create an interactive and emotional link with visitors, influencing their mood both in the Pre-Experience phase and in the subsequent Post-Experience phase. In particular, they highlight that the system, which is designed to act as an emotional lever, was able to improve the positiveness of the emotions experienced by the visitors.
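
The pipeline summarised above (CNN-based facial coding, valence estimation, profile assignment, itinerary suggestion) can be illustrated with a minimal sketch. The Python fragment below is not the authors' implementation: the emotion labels follow Ekman's basic set, while the valence weights, the decision threshold, the profile names and the itinerary contents are illustrative assumptions.

    # Minimal sketch (illustrative assumptions, not the authors' implementation):
    # map CNN facial-coding output to a valence score, then to one of several
    # five-artwork itineraries.
    from typing import Dict, List

    # Assumed valence weight per detected emotion (positive vs. negative affect).
    VALENCE_WEIGHTS: Dict[str, float] = {
        "happiness": 1.0,
        "surprise": 0.3,
        "neutral": 0.0,
        "sadness": -0.6,
        "fear": -0.7,
        "disgust": -0.8,
        "anger": -0.9,
    }

    # Hypothetical visitor profiles, each mapped to a five-work itinerary.
    ITINERARIES: Dict[str, List[str]] = {
        "uplifting": ["work_01", "work_07", "work_12", "work_18", "work_23"],
        "contemplative": ["work_03", "work_09", "work_14", "work_19", "work_25"],
    }

    def valence_score(emotion_probs: Dict[str, float]) -> float:
        """Weighted sum of the CNN's per-emotion probabilities, in [-1, 1]."""
        return sum(VALENCE_WEIGHTS.get(e, 0.0) * p for e, p in emotion_probs.items())

    def suggest_itinerary(emotion_probs: Dict[str, float]) -> List[str]:
        """Associate the visitor with a profile and return a five-work path."""
        score = valence_score(emotion_probs)
        # Illustrative rule: visitors with negative valence get the "uplifting"
        # path, intended to act as emotional leverage; the others get the
        # "contemplative" one.
        return ITINERARIES["uplifting"] if score < 0.0 else ITINERARIES["contemplative"]

    if __name__ == "__main__":
        # Example facial-coding output for one visitor (probabilities sum to 1).
        probs = {"happiness": 0.10, "neutral": 0.35, "sadness": 0.40, "anger": 0.15}
        print(suggest_itinerary(probs))  # -> the "uplifting" five-work itinerary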

Acknowledgement

We thank the Musei Civici di Macerata, and in particular the director, Dr. Giuliana Pascucci, for her support in the experimentation at Palazzo Buonaccorsi. This work is supported by the Marche Region under the POR-FESR 2014–2020 program, project C.O.M.E., involving the companies Cherry Merry Lab, Marchingegno and Grottini Lab.

Author information

Corresponding author

Correspondence to Silvia Ceccacci.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Altieri, A., Ceccacci, S., Giraldi, L., Leopardi, A., Mengoni, M., Talipu, A. (2021). Affective Guide for Museum: A System to Suggest Museum Paths Based on Visitors' Emotions. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Design Methods and User Experience. HCII 2021. Lecture Notes in Computer Science, vol 12768. Springer, Cham. https://doi.org/10.1007/978-3-030-78092-0_35

  • DOI: https://doi.org/10.1007/978-3-030-78092-0_35

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-78091-3

  • Online ISBN: 978-3-030-78092-0
