
Nature at Your Service - Nature Inspired Representations Combined with Eye-gaze Features to Infer User Attention and Provide Contextualized Support

  • Conference paper
Adaptive Instructional Systems (HCII 2020)

Abstract

The Internet of Things (IoT) enables the creation of sensing and computing machines that enhance the continuous adaptation and support intelligent systems provide to humans. Nevertheless, these systems still depend on human intervention, for example in maintenance and (re)configuration tasks. To this end, developing an Adaptive Instructional System (AIS) in the context of IoT allows for the creation of new, improved learning and training environments in which new approaches to improving human training and perception efficiency can be tested. Examples include the use of virtual and augmented reality, the inclusion of nature-inspired metaphors based on biophilic design and calm computing principles, and the design of technology that aims to change users’ behaviour through persuasion and social influence. In this work, we propose a nature-inspired visual representation concept, BioIoT, to communicate sensor information. Our results, based on an experiment with twelve participants over two weeks, show that this new representation contributes to users’ well-being and performance while remaining as easy to understand as traditional data representations. We present a use case that applies the BioIoT concept and demonstrates its benefits in an AR setting, in household and workplace scenarios. Furthermore, leveraging our previous experience in developing adaptive and supportive systems based on eye-tracking, we discuss applying this sensing technology to support users in machine interventions: user attention, i.e., eye gaze, on different machine parts is used to infer the user’s needs and adapt the system accordingly. In this way, a new level of continuous support can be provided in the form of contextualized instructions and action recommendations, tailored to each user’s skill level and individual needs.
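The gaze-based support loop described above, aggregating a user's fixations on machine parts and serving an instruction for the part that holds their attention, could be sketched roughly as follows. All part names, dwell thresholds, and instruction texts here are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of gaze-based contextualized support.
# Part names, instructions, and the dwell threshold are illustrative only.
from collections import defaultdict

# Areas of interest: machine parts mapped to support instructions (assumed).
INSTRUCTIONS = {
    "filter_unit": "Check the filter for clogging and replace it if saturated.",
    "control_panel": "Read the error code and consult the reset procedure.",
    "valve_block": "Inspect the valve seals and tighten the fittings.",
}

def infer_attention(fixations, dwell_threshold_ms=800):
    """Sum fixation durations per machine part; return the part with the
    longest total dwell time if it exceeds the threshold, else None."""
    dwell = defaultdict(float)
    for part, duration_ms in fixations:
        dwell[part] += duration_ms
    part, total = max(dwell.items(), key=lambda kv: kv[1])
    return part if total >= dwell_threshold_ms else None

def recommend(fixations):
    """Map the inferred attention target to a contextualized instruction."""
    part = infer_attention(fixations)
    if part is None:
        return "No sustained attention detected; no instruction shown."
    return INSTRUCTIONS.get(part, "No instruction available for this part.")

# Example gaze log: (part the fixation landed on, fixation duration in ms).
log = [("control_panel", 300), ("filter_unit", 500),
       ("filter_unit", 600), ("control_panel", 250)]
print(recommend(log))  # filter_unit accumulates 1100 ms of dwell time
```

A real system would additionally weight recency and fixation counts, and could scale the dwell threshold with the user's skill level so that novices receive instructions sooner than experts.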



Acknowledgements

This work was funded by the LiTech K-project and by Know Center GmbH, both funded by the Austrian Competence Centers for Excellent Technologies (COMET) program, under the auspices of the Austrian Federal Ministry of Transport, Innovation, and Technology; the Austrian Federal Ministry of Economy, Family, and Youth; and the Austrian state of Styria. COMET is managed by the Austrian Research Promotion Agency (FFG).

Author information


Correspondence to Carla Barreiros, Nelson Silva, Viktoria Pammer-Schindler or Eduardo Veas.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Barreiros, C., Silva, N., Pammer-Schindler, V., Veas, E. (2020). Nature at Your Service - Nature Inspired Representations Combined with Eye-gaze Features to Infer User Attention and Provide Contextualized Support. In: Sottilare, R.A., Schwarz, J. (eds) Adaptive Instructional Systems. HCII 2020. Lecture Notes in Computer Science(), vol 12214. Springer, Cham. https://doi.org/10.1007/978-3-030-50788-6_19


  • DOI: https://doi.org/10.1007/978-3-030-50788-6_19


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-50787-9

  • Online ISBN: 978-3-030-50788-6

  • eBook Packages: Computer Science, Computer Science (R0)
