Abstract
The goal of this research is to develop an algorithm capable of recognizing gestures drawn in a visual, gesture-driven interface for teaching introductory programming concepts. Our system combines Google’s Blockly, a visual programming language with a drag-and-drop puzzle-piece interface, with Microsoft’s Xbox Kinect, which performs skeletal tracking. We focus on two supervised machine-learning algorithms, centroid matching and medoid matching, to detect gestures.
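The two template-matching schemes named above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: gestures are reduced to plain feature vectors (the real system would derive these from Kinect skeletal-tracking data), each class is summarized by either its centroid (the component-wise mean of its training examples) or its medoid (the training example closest to all the others), and an unknown gesture is assigned to the nearest template.

```python
# Hedged sketch of centroid matching and medoid matching for gesture
# classification. Feature vectors, class names, and training data are
# illustrative assumptions, not from the paper.
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def centroid(vectors):
    """Component-wise mean of a set of feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def medoid(vectors):
    """Training example with the smallest total distance to all the others."""
    return min(vectors, key=lambda v: sum(distance(v, w) for w in vectors))

def classify(sample, templates):
    """Assign the sample to the class whose template is nearest."""
    return min(templates, key=lambda label: distance(sample, templates[label]))

# Toy training data: two gesture classes as 2-D feature vectors.
training = {
    "swipe":  [[0.0, 1.0], [0.2, 0.9], [0.1, 1.1]],
    "circle": [[1.0, 0.0], [0.9, 0.2], [1.1, 0.1]],
}
centroids = {g: centroid(vs) for g, vs in training.items()}
medoids = {g: medoid(vs) for g, vs in training.items()}

print(classify([0.1, 1.0], centroids))  # classify by nearest centroid
print(classify([1.0, 0.1], medoids))    # classify by nearest medoid
```

Centroid matching is cheaper (one mean per class), while medoid matching keeps an actual observed gesture as the template, which can be more robust when the per-class mean is not itself a plausible gesture.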
© 2020 Springer Nature Switzerland AG
Cite this paper
Streeter, L., Gauch, J. (2020). Detecting Gestures Through a Gesture-Based Interface to Teach Introductory Programming Concepts. In: Kurosu, M. (eds) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science(), vol 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-49061-4
Online ISBN: 978-3-030-49062-1