Vision-Tactile Fusion Based Detection of Deformation and Slippage of Deformable Objects During Grasping

  • Conference paper
Cognitive Systems and Information Processing (ICCSIP 2022)

Abstract

Humans use visual and tactile information to grasp easily deformable objects without deforming or dropping them; replicating this ability remains a challenge for robotic grasping. The traditional CNN + LSTM network for visual-tactile fusion suffers from an inadequate feature-fusion representation and an overly simple determination of the object's grasp-state category. To address these problems, this paper proposes a new visual-tactile fusion deep neural network (RSEL) that improves on the traditional CNN + LSTM network for evaluating the grasp state of easily deformable objects during grasping. Specifically, we classify the states of easily deformable objects during grasping into five categories: no contact, moderate contact, excessive contact, no slip, and slip. In addition, training and testing datasets were built by conducting extensive grasping and lifting experiments on 15 deformable objects with different widths and forces. To evaluate the RSEL model, we compared it with the conventional CNN + LSTM network: our model achieved 80.50% classification accuracy, an improvement of up to 7.37%. This work contributes to adaptive force tuning and dexterous robot manipulation.
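For context on the baseline the abstract compares against, below is a minimal sketch of a conventional CNN + LSTM visual-tactile fusion classifier over the paper's five grasp states. All layer sizes, input shapes, the early-fusion strategy, and the last-step classification head are illustrative assumptions; this is not the authors' RSEL architecture.

# Hypothetical sketch of the conventional CNN + LSTM visual-tactile
# fusion baseline described in the abstract. Shapes, layer sizes, and
# the fusion strategy are illustrative assumptions, not the RSEL model.
import torch
import torch.nn as nn

GRASP_STATES = ["no contact", "moderate contact", "excessive contact",
                "no slip", "slip"]

class CnnLstmFusion(nn.Module):
    def __init__(self, tactile_dim=16, hidden_dim=128):
        super().__init__()
        # Per-frame CNN encoder for the visual stream (assumed 3x64x64 frames).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),  # -> (batch*time, 32*4*4)
        )
        # Early fusion: concatenate per-step visual features with the
        # tactile reading, then model the grasp sequence with an LSTM.
        self.lstm = nn.LSTM(32 * 4 * 4 + tactile_dim, hidden_dim,
                            batch_first=True)
        self.head = nn.Linear(hidden_dim, len(GRASP_STATES))

    def forward(self, images, tactile):
        # images: (batch, time, 3, 64, 64); tactile: (batch, time, tactile_dim)
        b, t = images.shape[:2]
        vis = self.cnn(images.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(torch.cat([vis, tactile], dim=-1))
        return self.head(out[:, -1])  # logits over the five grasp states

model = CnnLstmFusion()
logits = model(torch.randn(2, 10, 3, 64, 64), torch.randn(2, 10, 16))
print(logits.shape)  # torch.Size([2, 5])

The abstract's criticism of this style of baseline is that concatenation-plus-LSTM under-represents the interaction between the two modalities and reduces grasp-state assessment to a flat classification; RSEL is proposed as an improvement over it.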



Acknowledgment

This research was supported by the "New Generation Artificial Intelligence" major special project of the Guangdong Provincial Key Area R&D Program, "Multi-degree-of-freedom Intelligent Body Complex Skills Autonomous Learning, Key Components and 3C Manufacturing Demonstration Applications" (2021B010410002); the Guangdong Province Key Areas R&D Program project "Research and Development of Micron-level Real-time Vision Inspection Technology and System" (2020B0404030001); and the National Natural Science Foundation of China Youth Project "Research on Adaptation Problem and Update Mechanism of Online Learning of Data Stream in Visual Ash Measurement of Flotation Tailings" (62106048).

Author information


Corresponding author

Correspondence to Wenbo Zhu.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Ruan, W. et al. (2023). Vision-Tactile Fusion Based Detection of Deformation and Slippage of Deformable Objects During Grasping. In: Sun, F., Cangelosi, A., Zhang, J., Yu, Y., Liu, H., Fang, B. (eds) Cognitive Systems and Information Processing. ICCSIP 2022. Communications in Computer and Information Science, vol 1787. Springer, Singapore. https://doi.org/10.1007/978-981-99-0617-8_43

  • DOI: https://doi.org/10.1007/978-981-99-0617-8_43

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-0616-1

  • Online ISBN: 978-981-99-0617-8

  • eBook Packages: Computer Science, Computer Science (R0)
