Abstract
This paper proposes a new approach to grasping novel objects while avoiding obstacles in real time. The general idea is to perform grasping of novel objects and collision avoidance simultaneously. There are two main contributions. First, we present a fast and robust method for real-time grasp detection based on morphological image processing and machine learning. Second, we integrate our grasping algorithm with existing collision prediction strategies, which enables a robot to grasp objects even when it is surrounded by obstacles. The resulting system is practical, runs in real time, and is easily adaptable to different robots and working conditions. We demonstrate our approach with a series of experiments using a Kinect sensor and a Baxter robot.
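The paper does not publish source code, so the sketch below is only an illustration, under our own assumptions, of how morphological image processing (via OpenCV) might be combined with a machine-learning classifier (here an SVM, one plausible choice) to propose and score grasp candidates from a depth image. All function names and parameters are hypothetical and are not taken from the authors' implementation.

```python
# Illustrative sketch only (assumed pipeline, not the authors' code):
# morphological opening cleans a segmented depth image, contours yield
# oriented grasp-rectangle candidates, and an SVM scores graspability.
import cv2
import numpy as np
from sklearn.svm import SVC


def propose_grasp_rects(depth_img, min_area=200):
    """Propose oriented grasp rectangles from a depth image (OpenCV >= 4)."""
    # Normalise depth to 8-bit and separate objects from the background/table.
    norm = cv2.normalize(depth_img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(norm, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes sensor speckle before contour extraction.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Each sufficiently large blob yields one oriented rectangle candidate.
    return [cv2.minAreaRect(c) for c in contours if cv2.contourArea(c) > min_area]


def rect_features(rect):
    """Hand-crafted features (size, aspect ratio, orientation) for one candidate."""
    (_, _), (w, h), angle = rect
    return [w, h, (w / h) if h > 0 else 0.0, angle]


def train_grasp_classifier(rects, labels):
    """Fit an SVM that ranks candidate rectangles as graspable or not."""
    clf = SVC(kernel="rbf", probability=True)
    clf.fit([rect_features(r) for r in rects], labels)
    return clf
```

In such a pipeline the morphological stage keeps candidate generation cheap enough for real-time use, while the learned classifier supplies robustness to novel object shapes; the resulting grasp pose would then be handed to the collision-prediction layer before execution.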
Acknowledgement
This work was partially supported by the National Natural Science Foundation of China (NSFC) under Grants 61473120, 61811530281 and 51705371, the Science and Technology Planning Project of Guangzhou under Grant 201607010006, the State Key Laboratory of Robotics and System (HIT) under Grant SKLRS-2017-KF-13, and the Fundamental Research Funds for the Central Universities under Grant 2017ZD057.