Abstract
Virtual and augmented reality (VR/AR) systems are widely deployed in industrial, medical, educational, and entertainment settings, and the design of their interactive interfaces directly affects usability, comfort, and efficiency. Handheld controllers and mid-air hand gestures are the most common input modalities in VR/AR devices; however, users may suffer overload of the upper extremities when keeping a hand or controller raised. We therefore release a library of 19 microgestures designed by ergonomists. Because users rest the forearm on a table while performing them, the load on the upper extremity is reduced and the microgestures can be sustained for extended durations. We also collected a microgesture dataset of 2,900 samples and trained a C3D model to recognize the gestures, achieving a recognition accuracy of 93.4% on the dataset.
Supported by the Key-Area Research and Development Program of Guangdong Province (No. 2019B010149001), the National Natural Science Foundation of China (No. 61960206007), and the 111 Project (B18005).
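The C3D architecture referenced above classifies gestures by convolving small 3D kernels over stacked video frames, so a single filter responds to motion patterns across time as well as space. The toy function below is a hypothetical, dependency-free sketch of one such valid 3D convolution over a clip shaped frames × height × width; it is illustrative only and is not the authors' implementation.

```python
# Hypothetical sketch of the 3D spatiotemporal convolution underlying
# C3D-style gesture recognizers. Pure Python, single channel, no padding.

def conv3d(clip, kernel):
    """Valid 3D convolution of a clip (T x H x W) with a kernel (t x h x w)."""
    T, H, W = len(clip), len(clip[0]), len(clip[0][0])
    t, h, w = len(kernel), len(kernel[0]), len(kernel[0][0])
    out = []
    for i in range(T - t + 1):          # slide over time
        plane = []
        for j in range(H - h + 1):      # slide over rows
            row = []
            for k in range(W - w + 1):  # slide over columns
                s = 0.0
                for di in range(t):
                    for dj in range(h):
                        for dk in range(w):
                            s += clip[i + di][j + dj][k + dk] * kernel[di][dj][dk]
                row.append(s)
            plane.append(row)
        out.append(plane)
    return out

# A 4-frame, 4x4 clip in which a single bright pixel moves one step
# diagonally per frame — a minimal stand-in for a moving fingertip.
clip = [[[1.0 if (r, c) == (f, f) else 0.0 for c in range(4)]
         for r in range(4)] for f in range(4)]

# A 2x2x2 kernel that fires when a pixel shifts diagonally between frames.
kernel = [[[1.0, 0.0], [0.0, 0.0]],
          [[0.0, 0.0], [0.0, 1.0]]]

response = conv3d(clip, kernel)  # peaks along the motion trajectory
```

In a full C3D network many such filters are stacked with pooling layers and a softmax classifier; here the single motion-sensitive kernel yields its maximum response (2.0) exactly where the pixel's diagonal trajectory matches the kernel, which is the mechanism that lets 3D convolutions separate dynamic microgestures that identical static frames could not.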
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Li, G., Liu, Y., Song, W., Wang, C., Wang, Y. (2021). A New Dataset and Recognition for Egocentric Microgesture Designed by Ergonomists. In: Peng, Y., Hu, SM., Gabbouj, M., Zhou, K., Elad, M., Xu, K. (eds) Image and Graphics. ICIG 2021. Lecture Notes in Computer Science(), vol 12889. Springer, Cham. https://doi.org/10.1007/978-3-030-87358-5_7
Print ISBN: 978-3-030-87357-8
Online ISBN: 978-3-030-87358-5