Abstract
The grasping force of service robots should be controlled within a proper range when they manipulate objects with unknown physical properties (OUPP), which may be deformable. This is challenging for service robots equipped with force sensor-less grippers. Therefore, this paper introduces a mask-area-based grasping force control strategy (MAFC) to control the grasping force in a visual mode. First, a semantic segmentation model is applied to monitor the deformation of the object, and this deformation is treated as the criterion for evaluating the grasping state. The gripper is then adjusted according to the grasping state. A cup-grasping experiment was conducted with the MAFC strategy. The experimental results show a grasping success rate of 90%, while the proportion of deformation was kept within 2%. Moreover, contrast experiments indicate that the MAFC strategy increases the grasping success rate by 40% compared with the "Pick and Place" module of MoveIt. Overall, the MAFC strategy can improve the grasping performance of service robots with force sensor-less grippers.
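The control loop described above (segment the object, measure the change of its mask area, and adjust the gripper accordingly) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the adjustment step size, and the use of mask area as the sole feedback signal are assumptions; only the 2% deformation limit comes from the reported results.

```python
def deformation_ratio(initial_area: float, current_area: float) -> float:
    """Relative change of the object's segmentation-mask area (in pixels)."""
    return abs(initial_area - current_area) / initial_area


def mafc_step(initial_area: float, current_area: float,
              grip_width: float,
              deform_limit: float = 0.02,  # paper keeps deformation within 2%
              step: float = 0.5) -> float:
    """One visual-feedback iteration: return an adjusted gripper width.

    If the mask has deformed beyond the limit, the grasp is too tight and
    the gripper is loosened; otherwise the gripper keeps closing until a
    slight, bounded deformation indicates a stable grasp.
    """
    d = deformation_ratio(initial_area, current_area)
    if d > deform_limit:
        return grip_width + step  # over-deformed: loosen
    return grip_width - step      # not yet stable: keep closing


# Example: the mask area shrank from 1000 to 960 px (4% deformation),
# which exceeds the 2% limit, so the gripper opens slightly.
new_width = mafc_step(initial_area=1000.0, current_area=960.0, grip_width=30.0)
```

In practice the mask area would be recomputed from the segmentation model at each iteration, and the loop would terminate once the deformation settles just inside the limit.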
Acknowledgments
We acknowledge the support received from the HUST & UBTECH Intelligent Service Robots Joint Lab and the National Natural Science Foundation of China (Grant No. 71771098).
Contributions
Conceptualization, S.Z. and S.Q.L.; methodology, S.Z.; software, S.Z.; validation, S.Z., Z.X. and Y.F.; formal analysis, S.Z.; investigation, S.Z.; resources, S.Z.; data curation, S.Z.; writing—original draft preparation, S.Z.; writing—review and editing, S.Z.; visualization, S.Z.; supervision, S.Q.L.; project administration, S.Q.L. and Y.J.X.; funding acquisition, Y.F. and Y.J.X. All authors have read and agreed to the published version of the manuscript.
Cite this article
Zhang, S., Li, S., Yan, F. et al. A mask area based grasping force control strategy for force sensor-less robot. Multimed Tools Appl 81, 24849–24867 (2022). https://doi.org/10.1007/s11042-022-12016-w