Abstract
Humans use visual and tactile information to grasp easily deformable objects while preventing them from deforming and slipping; replicating this ability remains a challenge for robotic grasping. The traditional CNN + LSTM network for visual-tactile fusion suffers from an inadequate feature-fusion representation and an overly coarse determination of the object's grasp-state category. To address these problems, this paper proposes a new visual-tactile fusion deep neural network (RSEL) that improves on the traditional CNN + LSTM network for evaluating the state of easily deformable objects during grasping. Specifically, we classify the states of easily deformable objects during grasping into five categories: no contact, moderate contact, excessive contact, no slip, and slip. In addition, training and testing datasets were built by conducting extensive grasping and lifting experiments on 15 deformable objects with different grasp widths and forces. To evaluate the RSEL model, we compared it with the conventional CNN + LSTM network: our model achieved 80.50% classification accuracy, an improvement of up to 7.37%. This work contributes to adaptive force tuning and dexterous robotic manipulation.
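To make the visual-tactile CNN + LSTM pipeline described above concrete, the following is a minimal sketch, not the authors' released code or the exact RSEL architecture: all layer sizes, the tactile input dimensionality, the image resolution, and the sequence length are assumptions for illustration. It shows the general pattern of encoding per-timestep visual and tactile inputs, fusing them, passing the fused sequence through an LSTM, and classifying into the five grasp states.

```python
# Hypothetical visual-tactile CNN + LSTM grasp-state classifier (sketch only).
import torch
import torch.nn as nn


class VisualTactileLSTM(nn.Module):
    def __init__(self, num_classes=5, hidden_size=128):
        super().__init__()
        # Small CNN encoder for camera frames (assumed 3x64x64 input).
        self.visual_cnn = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (B*T, 32)
        )
        # Small MLP encoder for tactile readings (assumed 16-dim vector).
        self.tactile_mlp = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
        # LSTM over the fused per-timestep features.
        self.lstm = nn.LSTM(input_size=32 + 32, hidden_size=hidden_size,
                            batch_first=True)
        # Five grasp states: no contact, moderate contact,
        # excessive contact, no slip, slip.
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, images, tactile):
        # images: (B, T, 3, 64, 64), tactile: (B, T, 16)
        B, T = images.shape[:2]
        vis = self.visual_cnn(images.reshape(B * T, *images.shape[2:]))
        vis = vis.reshape(B, T, -1)
        tac = self.tactile_mlp(tactile)
        fused = torch.cat([vis, tac], dim=-1)   # per-timestep fusion
        out, _ = self.lstm(fused)
        return self.classifier(out[:, -1])      # logits at the last timestep


if __name__ == "__main__":
    model = VisualTactileLSTM()
    imgs = torch.randn(2, 8, 3, 64, 64)   # 2 sequences of 8 frames
    tact = torch.randn(2, 8, 16)
    print(model(imgs, tact).shape)        # torch.Size([2, 5])
```

Design note: fusing the visual and tactile features before the LSTM (early fusion) is one plausible reading of the abstract; the reported improvement of RSEL over the plain CNN + LSTM baseline comes from a richer fusion representation, which is not reproduced in this simplified sketch.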
Acknowledgment
This research was supported by the “New Generation Artificial Intelligence” major special project of Guangdong Provincial Key Area R&D Program, “Multi-degree-of-freedom Intelligent Body Complex Skills Autonomous Learning, Key Components and 3C Manufacturing Demonstration Applications” (2021B010410002), Research and Development of Micron-level Real-time Vision Inspection Technology and System of Guangdong Province Key Areas R&D Program (2020B0404030001) and National Natural Science Foundation of China - Youth Project “Research on Adaptation Problem and Update Mechanism of Online Learning of Data Stream in Visual Ash Measurement of Flotation Tailings” (62106048).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Ruan, W. et al. (2023). Vision-Tactile Fusion Based Detection of Deformation and Slippage of Deformable Objects During Grasping. In: Sun, F., Cangelosi, A., Zhang, J., Yu, Y., Liu, H., Fang, B. (eds) Cognitive Systems and Information Processing. ICCSIP 2022. Communications in Computer and Information Science, vol 1787. Springer, Singapore. https://doi.org/10.1007/978-981-99-0617-8_43
DOI: https://doi.org/10.1007/978-981-99-0617-8_43
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-0616-1
Online ISBN: 978-981-99-0617-8
eBook Packages: Computer Science (R0)