Abstract
Personalized interactions in socially assistive robot (SAR) tutoring have shown promise with a wide variety of learners, especially when multiple interaction modalities are used. Most of that work, however, has focused on seated learning contexts, creating a need for multimodal personalization measures in kinesthetic (i.e., embodied) learning contexts. This paper proposes a multimodal measure of student kinesthetic curiosity (\(KC^S\)) that combines a student's movement and curiosity signals into a single, personalized measure. We evaluate the efficacy of \(KC^S\) in a SAR tutoring interaction through a within-subjects (\(n=9\)) pilot study in which participants completed kinesthetic mixed reality coding exercises alongside a curious robot tutor whose actions were determined by \(KC^S\). The results indicate that the stationarity assumptions required by \(KC^S\) were met and that the robot tutor successfully used \(KC^S\) to personalize its action policy, positively affecting short-term \(KC^S\); however, no significant effects were found on longer-term \(KC^S\) state changes for individual students. The mixed reality visual programming language created for this work, MoveToCode, has been released as open source. This work aims to inform future online features and measures for mixed reality human-robot interactions.
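To make the idea of a combined, personalized measure concrete, the sketch below z-normalizes each signal against the individual student's own running history and sums the results into a single score that gates a toy tutor action. This is a minimal illustration only: the function names, window size, equal weighting, signal definitions, and action threshold are all assumptions for this sketch and are not the paper's published \(KC^S\) formulation or action policy.

```python
import numpy as np
from collections import deque

WINDOW = 30  # per-student history length; the window size is an assumption


def z_score(x, history):
    """Z-normalize a sample against this student's own running history,
    making the score personalized rather than population-referenced."""
    if len(history) < 2:
        return 0.0
    mu, sigma = np.mean(history), np.std(history)
    return 0.0 if sigma == 0.0 else (x - mu) / sigma


def kc_s(movement, curiosity, move_hist, cur_hist, w=0.5):
    """Combine z-scored movement and curiosity signals into one score.
    The equal weighting (w=0.5) is illustrative, not the paper's formula."""
    return w * z_score(movement, move_hist) + (1 - w) * z_score(curiosity, cur_hist)


move_hist, cur_hist = deque(maxlen=WINDOW), deque(maxlen=WINDOW)
rng = np.random.default_rng(0)

for t in range(100):
    movement = rng.normal(1.0, 0.3)   # e.g., headset displacement this step (simulated)
    curiosity = rng.normal(0.5, 0.2)  # e.g., exploratory coding actions this step (simulated)
    score = kc_s(movement, curiosity, move_hist, cur_hist)
    move_hist.append(movement)
    cur_hist.append(curiosity)
    # Toy policy: prompt kinesthetic activity when the score dips below a
    # threshold; the -0.5 cutoff is illustrative only.
    if score < -0.5:
        print(f"t={t}: low KC^S ({score:.2f}) -> robot prompts movement")
```

Normalizing against each student's own history, rather than a cohort baseline, is what makes such a score personalized; it also motivates the stationarity check the abstract mentions, since a running z-score is only meaningful if the underlying signal statistics are reasonably stable over the interaction.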
Acknowledgement
This research was supported by National Science Foundation National Robotics Initiative 2.0 grant for “Communicate, Share, Adapt: A Mixed Reality Framework for Facilitating Robot Integration and Customization” (NSF IIS-1925083).
Cite this paper
Groechel, T. et al. (2021). Kinesthetic Curiosity: Towards Personalized Embodied Learning with a Robot Tutor Teaching Programming in Mixed Reality. In: Siciliano, B., Laschi, C., Khatib, O. (eds) Experimental Robotics. ISER 2020. Springer Proceedings in Advanced Robotics, vol 19. Springer, Cham. https://doi.org/10.1007/978-3-030-71151-1_22