
Multiparty gaze preservation through perspective switching for interactive elearning environments

Published in Multimedia Tools and Applications

Abstract

Existing live tele-teaching systems enable eye contact between interacting participants; however, they are often incomplete because they neglect finer aspects of gaze such as gaze awareness and gaze following. A multi-location eLearning classroom setting often does not preserve the relative neighborhood, i.e., the displays showing videos of remote participants at each location might not be congruent with their actual seating positions. This leads to incoherent gaze patterns during interactions. We present a media-rich distributed classroom architecture with multiple cameras and displays in each classroom. When the interaction changes, the cameras capturing the appropriate perspectives of participants are streamed to displays in the other classrooms. Hence, for all interactions, the physical participants in a classroom are presented with perspectives of remote participants that resemble the gaze patterns of conventional classroom interactions. We also present a framework to systematically analyze gaze patterns and their dependencies. The framework dictates the optimal placement of media devices, ensuring minimal deviation in capturing appropriate perspectives for a given set of resources. Evaluation results on a three-classroom test-bed indicate a marked reduction in the viewers' cognitive load in discerning the entity-at-focus in an eLearning classroom environment.
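To make the perspective-switching idea concrete, the following minimal sketch (in Python) shows how a change in the interaction state could be resolved to a camera/display pair and streamed to the remote classroom. The room, camera, and display names and the routing-table structure are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Sketch of interaction-triggered perspective switching (names are hypothetical).
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class InteractionState:
    speaker_room: str    # classroom of the participant currently at focus
    listener_room: str   # classroom whose participants should see the speaker

# Hypothetical camera-to-display routing table: for each (speaker room, listener room)
# pair, choose the camera whose viewpoint best matches the listeners' line of sight
# and the display that sits along that line of sight.
ROUTING: Dict[Tuple[str, str], Tuple[str, str]] = {
    ("roomA", "roomB"): ("camA_left",  "dispB_front"),
    ("roomA", "roomC"): ("camA_right", "dispC_front"),
    ("roomB", "roomA"): ("camB_front", "dispA_side"),
}

def switch_perspective(state: InteractionState) -> Tuple[str, str]:
    """Return the (camera, display) pair to stream when the interaction changes."""
    key = (state.speaker_room, state.listener_room)
    if key not in ROUTING:
        raise KeyError(f"no camera-display mapping for interaction {key}")
    return ROUTING[key]

if __name__ == "__main__":
    cam, disp = switch_perspective(InteractionState("roomA", "roomB"))
    print(f"stream {cam} -> {disp}")
```

In the paper's architecture this lookup would be re-evaluated on every interaction change, so each classroom always receives the perspective consistent with its own seating geometry.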



Author information


Corresponding author

Correspondence to Ramkumar Narayanan.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A: Camera-display mapping table with gaze quantization errors

Table 3 summarizes the camera-to-display mapping matrix together with the gaze quantization error values.

Table 3 Camera to display mapping and gaze quantization error values for each of the interaction states
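As an illustration of how such a mapping could be used at run time, the sketch below selects, for a required viewing direction, the camera with the smallest angular deviation and reports that deviation as the gaze quantization error. The camera names and angles are hypothetical and are not taken from Table 3.

```python
# Sketch: choosing the camera that minimizes gaze quantization error (hypothetical values).
from typing import Tuple

# Hypothetical camera placements: azimuth angles (degrees) around the classroom.
CAMERA_ANGLES = {"cam_front": 0.0, "cam_left": -45.0, "cam_right": 45.0}

def gaze_quantization_error(required_angle: float, camera_angle: float) -> float:
    """Angular deviation between the ideal viewpoint and a given camera, in degrees."""
    return abs((required_angle - camera_angle + 180.0) % 360.0 - 180.0)

def best_camera(required_angle: float) -> Tuple[str, float]:
    """Pick the camera whose perspective minimizes the gaze quantization error."""
    cam = min(CAMERA_ANGLES,
              key=lambda c: gaze_quantization_error(required_angle, CAMERA_ANGLES[c]))
    return cam, gaze_quantization_error(required_angle, CAMERA_ANGLES[cam])

if __name__ == "__main__":
    print(best_camera(30.0))  # -> ('cam_right', 15.0) with these example angles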
Fig. 24 Response charts for the pre-questionnaire

Appendix B: Pre-questionnaire

a) Have you taken live online lectures via any eLearning/video-conferencing tool? □ Yes □ No

b) If so, how did the online lecture compare to an in-classroom lecture? □ Horrible □ Bad □ Manageable but not as good □ Good, but I still prefer an in-classroom lecture □ As good as an in-classroom lecture

c) How many of your courses are delivered as online lectures? □ Zero □ One □ Two □ Three □ Four □ More than four

d) Do you have any online courses that are highly interactive? □ Yes □ No

e) Are you a highly interactive person in class? □ Yes □ No □ Somewhat interactive

f) How has your interaction with the distant teacher been? □ Very interactive □ Moderately interactive □ Not interactive

g) Have you had lecture sessions as a student where you had to interact with a distant student? □ Yes □ No

h) Have you had lecture sessions where the teacher had a set of local students and you were a remote student? □ Yes □ No

i) Have you had a lecture where you were a local student and a group of distant students participated from elsewhere? □ Yes □ No

j) Did it bother you when a distant participant (student/teacher) was not looking at you while you were interacting with him/her? □ Yes □ No


About this article


Cite this article

Narayanan, R., Rangan, V.P., Gopalakrishnan, U. et al. Multiparty gaze preservation through perspective switching for interactive elearning environments. Multimed Tools Appl 78, 17461–17494 (2019). https://doi.org/10.1007/s11042-018-7078-y

