Prediction of Hand Kinematics in Grasping with Mamba-Based Graph Convolutional Networks

  • Conference paper
  • In: Intelligent Robotics and Applications (ICIRA 2024)

Abstract

Predicting hand kinematics during grasping contributes to accurate and efficient robotic grasping. In existing methods, the hand topology graph captures only local structural constraints of the hand, neglecting the global coordination imposed by the brain. Moreover, hand kinematics can be viewed as multivariate time series collected by sensors, so advanced sequence models can improve prediction accuracy. We therefore propose a virtual-node-optimized topology graph and design a Mamba-based graph convolutional network (MambaGCN). The topology graph models global coordination through virtual nodes, improving the aggregation of the multivariate time series, and the Mamba architecture strengthens the temporal modeling of those series. This study addresses the prediction of hand kinematics in grasping from sensor-collected multivariate time series, which can serve as a pretext task for denoising and for guiding robotic grasping. On the publicly available HANDdata dataset, the proposed method outperforms three advanced models in accuracy, and ablation studies validate the effectiveness of both the virtual-node-optimized topology graph and the improved Mamba architecture. The MambaGCN framework is open-sourced at https://github.com/White-oranges/MambaGCN.
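
The core idea of the topology graph described above, a hand skeleton graph augmented with a virtual node that connects to every joint to provide a one-hop path for global coordination, can be sketched as follows. This is a minimal illustration only, not the authors' implementation (the official MambaGCN code is in the linked repository); the joint count, edge list, layer design, and all identifiers below are assumptions made for this example.

```python
# Minimal sketch (assumed names and shapes, not the official MambaGCN code):
# a hand topology graph with an extra virtual node for global coordination,
# followed by one plain graph-convolution layer over per-joint time series.
import torch
import torch.nn as nn

NUM_JOINTS = 15                                   # assumed: 3 joints per finger, 5 fingers
FINGER_EDGES = [(3 * f + k, 3 * f + k + 1) for f in range(5) for k in range(2)]

def build_adjacency(num_joints, edges, add_virtual_node=True):
    """Symmetric adjacency with self-loops and D^-1/2 A D^-1/2 normalization.
    The optional virtual node (last index) links to every joint, so any two
    joints are at most two hops apart -- a simple stand-in for global coordination."""
    n = num_joints + (1 if add_virtual_node else 0)
    a = torch.eye(n)
    for i, j in edges:
        a[i, j] = a[j, i] = 1.0
    if add_virtual_node:
        a[-1, :num_joints] = 1.0
        a[:num_joints, -1] = 1.0
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)

class GraphConv(nn.Module):
    """One plain GCN layer: neighborhood aggregation followed by a linear projection."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, time, nodes, features), adj: (nodes, nodes)
        return torch.relu(self.proj(torch.einsum("ij,btjf->btif", adj, x)))

if __name__ == "__main__":
    adj = build_adjacency(NUM_JOINTS, FINGER_EDGES)   # 16 x 16 including the virtual node
    x = torch.randn(2, 50, NUM_JOINTS + 1, 8)         # toy multivariate time series per node
    print(GraphConv(8, 16)(x, adj).shape)             # torch.Size([2, 50, 16, 16])
```

In the paper's full pipeline, such graph aggregation would be combined with a Mamba sequence model along the time axis; that temporal component is omitted here for brevity.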

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 52188102.

Author information

Corresponding author

Correspondence to Zeyu Gong.

Ethics declarations

Disclosure of Interests

The authors have no competing interests to declare that are relevant to the content of this article.

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Peng, W., Tang, J., Gong, Z., Tao, B. (2025). Prediction of Hand Kinematics in Grasping with Mamba-Based Graph Convolutional Networks. In: Lan, X., Mei, X., Jiang, C., Zhao, F., Tian, Z. (eds) Intelligent Robotics and Applications. ICIRA 2024. Lecture Notes in Computer Science, vol. 15208. Springer, Singapore. https://doi.org/10.1007/978-981-96-0783-9_4

  • DOI: https://doi.org/10.1007/978-981-96-0783-9_4

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-96-0782-2

  • Online ISBN: 978-981-96-0783-9

  • eBook Packages: Computer Science, Computer Science (R0)
