
A mask area based grasping force control strategy for force sensor-less robot

Published in: Multimedia Tools and Applications

Abstract

Service robots must keep the grasping force within a proper range when they manipulate objects with unknown physical properties (OUPP), which may be deformable. This is challenging for service robots equipped with a force sensor-less gripper. This paper therefore introduces a mask area based grasping force control (MAFC) strategy that controls the grasping force through vision. First, a semantic segmentation model monitors the deformation of the object, and this deformation serves as the criterion for evaluating the grasping state. The gripper is then adjusted according to that state. A cup grasping experiment was conducted with the MAFC strategy. The results show a grasping success rate of 90%, while the proportion of deformation was kept within 2%. Contrast experiments further indicate that the MAFC strategy increases the grasping success rate by 40% compared with the "Pick and Place" module of MoveIt. Overall, the MAFC strategy improves the grasping performance of service robots with a force sensor-less gripper.
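The control loop the abstract describes can be sketched in a few lines: compare the object's segmentation-mask area before and during the grasp, and use the relative shrinkage to decide whether to close, hold, or open the gripper. The sketch below is illustrative only; the function names, the three-state classification, and the 2% threshold (taken from the reported experimental bound) are assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of a mask-area-based grasping force control loop.
# All names here are hypothetical; the paper's segmentation model and
# gripper interface are not reproduced.

DEFORMATION_LIMIT = 0.02  # the abstract reports deformation held within 2%


def deformation_ratio(initial_area: float, current_area: float) -> float:
    """Proportion by which the object's mask area has shrunk under grasping."""
    if initial_area <= 0:
        raise ValueError("initial mask area must be positive")
    return max(0.0, (initial_area - current_area) / initial_area)


def grasp_state(initial_area: float, current_area: float,
                limit: float = DEFORMATION_LIMIT) -> str:
    """Classify the grasp from the observed mask-area deformation.

    'slipping'  -> no measurable deformation yet; close the gripper further
    'stable'    -> deformation within the allowed range; hold position
    'excessive' -> object deformed beyond the limit; open the gripper slightly
    """
    ratio = deformation_ratio(initial_area, current_area)
    if ratio == 0.0:
        return "slipping"
    if ratio <= limit:
        return "stable"
    return "excessive"
```

In use, `initial_area` would come from the segmentation mask captured before contact and `current_area` from the mask in the live camera stream, with the returned state driving small incremental gripper commands.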



Acknowledgments

We acknowledge the support received from the HUST & UBTECH Intelligent Service Robots Joint Lab and the National Natural Science Foundation of China (Grant No. 71771098).

Author information


Contributions

Conceptualization, S.Z. and S.Q.L.; methodology, S.Z.; software, S.Z.; validation, S.Z., Z.X. and Y.F.; formal analysis, S.Z.; investigation, S.Z.; resources, S.Z.; data curation, S.Z.; writing—original draft preparation, S.Z.; writing—review and editing, S.Z.; visualization, S.Z.; supervision, S.Q.L.; project administration, S.Q.L. and Y.J.X.; funding acquisition, Y.F. and Y.J.X. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Shiqi Li.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhang, S., Li, S., Yan, F. et al. A mask area based grasping force control strategy for force sensor-less robot. Multimed Tools Appl 81, 24849–24867 (2022). https://doi.org/10.1007/s11042-022-12016-w

