Abstract
Mobile devices have significantly reshaped many aspects of our lives. Yet prolonged contact with such devices may cause eye and/or muscle fatigue, especially in young children. In this paper, we consider integrating the web cameras available as image sensors on most tablets and smartphones with a tracking algorithm that continuously monitors and analyzes learners’ responses through their facial orientations and eye movements, so as to build the Personalized Teaching And Learning (PETAL) platform for nurturing the academic development of young learners while protecting their eyesight. Through in-depth studies of various Android programming toolkits together with the Open Source Computer Vision (OpenCV) library, we explore possible ways to detect viewers’ responses to educational videos as a means of self-learning. By notifying learners of their possibly unconscious reactions to such educational videos, our platform aims to promote a truly personalized approach to developing next-generation e-learning systems.
Kelly Liu, Victoria Tam and Phoebe Tse contributed equally to this work.
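Although the tracking algorithm itself is described later in the paper, the core idea sketched in the abstract, using the device’s front-facing camera together with OpenCV’s cascade-based detectors to check whether a face and eyes are currently visible, can be illustrated briefly. The example below is a minimal sketch written against the OpenCV Java bindings (version 3.x or later assumed) for a desktop webcam rather than the PETAL Android implementation; the cascade file paths and the simple “looked away” message are placeholders, not the platform’s actual analysis.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;
import org.opencv.videoio.VideoCapture;

public class ViewerAttentionSketch {
    public static void main(String[] args) {
        // Load the native OpenCV library (the Java classes are thin wrappers around it).
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Haar cascades shipped with OpenCV; the paths are placeholders and must point
        // to wherever the XML files live on the test machine.
        CascadeClassifier faceDetector = new CascadeClassifier("haarcascade_frontalface_default.xml");
        CascadeClassifier eyeDetector  = new CascadeClassifier("haarcascade_eye.xml");

        VideoCapture camera = new VideoCapture(0);   // default webcam
        Mat frame = new Mat();
        Mat gray  = new Mat();

        while (camera.read(frame)) {
            // Detection runs on a grayscale copy of the captured frame.
            Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);

            MatOfRect faces = new MatOfRect();
            faceDetector.detectMultiScale(gray, faces);

            if (faces.empty()) {
                // Placeholder reaction: in PETAL such an event would feed the learner-response analysis.
                System.out.println("No face in view -- the learner may have looked away.");
                continue;
            }

            for (Rect face : faces.toArray()) {
                // Search for eyes only inside the detected face region.
                MatOfRect eyes = new MatOfRect();
                eyeDetector.detectMultiScale(gray.submat(face), eyes);
                System.out.println("Face at " + face + ", eyes visible: " + eyes.toArray().length);
            }
        }
        camera.release();
    }
}
```

On an Android device the same detectMultiScale calls would be applied to frames delivered by the camera preview, and the facial-orientation and eye-movement analysis described in the abstract would then be layered on top of these raw detections.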
References
S. Asteriadis, P. Tzouveli, K. Karpouzis and S. Kollias, “Estimation of behavioral user state based on eye gaze and head pose—application in an e-learning environment”, Multimedia Tools and Applications, 41:469–493, Springer, 2009.
V. Cantoni, M. Cellario and M. Porta, “Perspectives and challenges in e-learning: towards natural interaction paradigms”, Journal of Visual Languages and Computing, 15:333–345, Elsevier, 2004.
C. Hennessey, B. Noureddin and P. Lawrence, “A single camera eye-gaze tracking system with free head motion”, in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA 2006), San Diego, CA, USA, pp. 87–94, 2006.
S. Ioannou, G. Caridakis, K. Karpouzis and S. Kollias, “Robust feature detection for facial expression recognition”, Int. J. Image Video Process., Article ID 29081, 2007.
J. Jang, “ANFIS: adaptive-network-based fuzzy inference system”, IEEE Transactions on Systems, Man, and Cybernetics, 23(3):665–685, 1993.
T. Sabih, “PETAL: Learning the Android Way”, The HKU Journal of Technology (TecHKU), http://www.engineering.hku.hk/tecHKU/2013/11/11/petal/, last visited November 2013.
The Microsoft Kinect Development Team, “Kinect for Windows”, http://www.microsoft.com/en-us/kinectforwindows/, last visited April 2014.
The OpenCV Development Team, “OpenCV”, http://opencv.org/, last visited March 2014.
Acknowledgements
The support of MIT International Science and Technology Initiatives (MISTI) is gratefully acknowledged.
Copyright information
© 2015 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Liu, K., Tam, V., Tse, P., Lam, E.Y., Tam, V. (2015). Developing the PETAL e-Learning Platform for Personalized Teaching and Learning. In: Chen, G., Kumar, V., Kinshuk, Huang, R., Kong, S. (eds) Emerging Issues in Smart Learning. Lecture Notes in Educational Technology. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44188-6_17
DOI: https://doi.org/10.1007/978-3-662-44188-6_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-44187-9
Online ISBN: 978-3-662-44188-6