
A New Dataset and Recognition for Egocentric Microgesture Designed by Ergonomists

  • Conference paper
Image and Graphics (ICIG 2021)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 12889)


Abstract

Virtual and augmented reality (VR/AR) are widely deployed in industrial, medical, educational, and entertainment fields. The design of the interactive interface affects usability, comfort, and efficiency. Hand controllers and mid-air gestures are popular input methods for VR/AR devices; however, users may suffer overloading of the upper extremities when they must keep a hand or controller raised. Therefore, we released a library of 19 microgestures designed by ergonomists. Users can perform these microgestures for extended durations by resting the forearm on a table, which reduces the load on the upper extremity. Additionally, we collected a microgesture dataset of 2,900 samples and applied the C3D model to recognize them, achieving a recognition accuracy of 93.4% on the microgesture dataset.
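The C3D recognizer used in the abstract is built from stacked 3D (spatiotemporal) convolutions over video clips. As a rough illustration only, and not the authors' implementation, the sketch below shows the single 3x3x3 convolution operation that such a network stacks; the 16-frame 112x112 clip size follows the original C3D paper, while the function name `conv3d_valid` and the random kernel values are illustrative assumptions.

```python
import numpy as np

def conv3d_valid(clip, kernel):
    """Naive single-channel 3D convolution (no padding, stride 1).

    clip:   (T, H, W) video volume (frames x height x width)
    kernel: (kt, kh, kw) spatiotemporal filter
    returns (T-kt+1, H-kh+1, W-kw+1) feature volume
    """
    T, H, W = clip.shape
    kt, kh, kw = kernel.shape
    out = np.zeros((T - kt + 1, H - kh + 1, W - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                # Correlate the filter with one spatiotemporal window.
                out[t, i, j] = np.sum(clip[t:t + kt, i:i + kh, j:j + kw] * kernel)
    return out

# A 16-frame, 112x112 clip, matching the original C3D input size.
clip = np.random.rand(16, 112, 112)
kernel = np.random.rand(3, 3, 3)
features = conv3d_valid(clip, kernel)
print(features.shape)  # (14, 110, 110)
```

A real C3D network stacks many such filters with learned weights, pooling, and fully connected layers; this sketch only shows why the filter sees motion across frames as well as spatial structure within each frame.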

Granted by the Key-Area Research and Development Program of Guangdong Province (No. 2019B010149001), the National Natural Science Foundation of China (No. 61960206007) and the 111 Project (B18005).



Author information

Corresponding author

Correspondence to Yue Liu.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Li, G., Liu, Y., Song, W., Wang, C., Wang, Y. (2021). A New Dataset and Recognition for Egocentric Microgesture Designed by Ergonomists. In: Peng, Y., Hu, S.-M., Gabbouj, M., Zhou, K., Elad, M., Xu, K. (eds.) Image and Graphics. ICIG 2021. Lecture Notes in Computer Science, vol. 12889. Springer, Cham. https://doi.org/10.1007/978-3-030-87358-5_7


  • DOI: https://doi.org/10.1007/978-3-030-87358-5_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-87357-8

  • Online ISBN: 978-3-030-87358-5

  • eBook Packages: Computer Science, Computer Science (R0)
