
Semantic grasping: planning task-specific stable robotic grasps

Published in: Autonomous Robots

Abstract

We present an example-based planning framework to generate semantic grasps: stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode task-related constraints, which we call semantic constraints. We introduce a semantic affordance map, which relates local geometry to a set of predefined semantic grasps that are appropriate to different tasks. Using this map, the pose of a robot hand with respect to the object can be estimated so that the hand is adjusted to achieve the ideal approach direction required by a particular task. A grasp planner then searches along this approach direction and generates a set of final grasps that have appropriate stability, tactile contacts, and hand kinematics. We demonstrate the framework by planning semantic grasps on everyday objects and executing them with a physical robot.
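The pipeline in the abstract can be sketched in a few lines: index predefined semantic grasps by a descriptor of local object geometry, retrieve the exemplar that best matches the observed partial geometry for a given task, then rank candidate grasps sampled along the exemplar's approach direction by stability and by similarity of their tactile and kinematic data to the exemplar. The sketch below is a hypothetical illustration, not the paper's implementation; the shape descriptor, the dictionary fields (`task`, `descriptor`, `tactile`, `joints`, `epsilon_quality`), and the scoring weights are all assumptions introduced here for clarity.

```python
import numpy as np

def shape_descriptor(points):
    """Toy descriptor: normalized histogram of point heights, a stand-in
    for the partial-geometry features used to index the affordance map."""
    z = points[:, 2]
    hist, _ = np.histogram(z, bins=8, range=(z.min(), z.max() + 1e-9))
    return hist / hist.sum()

def retrieve_semantic_grasp(affordance_map, points, task):
    """Return the stored exemplar grasp whose geometry descriptor best
    matches the observed partial point cloud, restricted to `task`."""
    d = shape_descriptor(points)
    candidates = [g for g in affordance_map if g["task"] == task]
    return min(candidates,
               key=lambda g: np.linalg.norm(d - g["descriptor"]))

def rank_grasps(candidates, exemplar, w_stability=1.0, w_semantic=1.0):
    """Score grasps sampled along the exemplar's approach direction:
    reward stability (epsilon quality) and penalize deviation of tactile
    contacts and joint angles from the task exemplar."""
    def score(g):
        semantic_err = (np.linalg.norm(g["tactile"] - exemplar["tactile"])
                        + np.linalg.norm(g["joints"] - exemplar["joints"]))
        return w_stability * g["epsilon_quality"] - w_semantic * semantic_err
    return sorted(candidates, key=score, reverse=True)
```

A grasp that is slightly less stable but far closer to the exemplar's tactile and kinematic signature can outrank a more stable but task-inappropriate one, which is the trade-off the semantic-constraint framing is meant to capture.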



Acknowledgments

This work is funded by NSF Grant IIS-0904514.

Author information


Corresponding author

Correspondence to Hao Dang.


About this article


Cite this article

Dang, H., Allen, P.K. Semantic grasping: planning task-specific stable robotic grasps. Auton Robot 37, 301–316 (2014). https://doi.org/10.1007/s10514-014-9391-2

