
Visual control of grasping

  • Conference paper
The confluence of vision and control

Part of the book series: Lecture Notes in Control and Information Sciences ((LNCIS,volume 237))

Abstract

Most robotic hands are either sensorless or lack the ability to accurately and robustly report position and force information relating to contact. This paper describes a system that integrates real-time computer vision with a sensorless gripper to provide closed-loop feedback control for grasping and manipulation tasks. Many hand-eye coordination skills can be thought of as sensory-control loops, where specialized reasoning has been embodied as a feedback or control path in the loop’s construction. This system captures the essence of these hand-eye coordination skills in simple visual control primitives, which can be used to perform higher-level grasping and manipulation tasks. Experimental results are shown for two typical robotics tasks: the positioning task of locating, picking up, and inserting a bolt into a nut under visual control, and the visual control of a bolt-tightening task.
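The abstract frames each hand-eye skill as a small sensory-control loop whose visual measurement closes the feedback path. The sketch below is not the authors' implementation; it is a minimal, hypothetical illustration (all names, the proportional gain, and the toy simulation are assumptions) of what one such visual control primitive might look like: observe an image-space feature, compare it against a goal, and command a small corrective motion until the visual error is negligible.

```python
"""Hypothetical sketch of a 'visual control primitive': a closed feedback
loop that drives a vision-measured feature error to zero by commanding
small gripper/arm motions. Names and values are illustrative assumptions,
not the authors' actual system."""

from dataclasses import dataclass
from typing import Callable, Tuple


@dataclass
class VisualPrimitive:
    """One sensory-control loop: observe -> compute error -> command motion."""
    observe: Callable[[], Tuple[float, float]]   # image-space feature position (px)
    target: Tuple[float, float]                  # desired feature position (px)
    command: Callable[[float, float], None]      # send a small Cartesian correction
    gain: float = 0.1                            # proportional gain (assumed value)
    tolerance: float = 2.0                       # stop when error falls below this (px)

    def run(self, max_steps: int = 500) -> bool:
        """Iterate the loop until the visual error is small or steps run out."""
        for _ in range(max_steps):
            fx, fy = self.observe()
            ex, ey = self.target[0] - fx, self.target[1] - fy
            if (ex * ex + ey * ey) ** 0.5 < self.tolerance:
                return True                      # primitive succeeded
            # Proportional control: move a fraction of the remaining error.
            self.command(self.gain * ex, self.gain * ey)
        return False                             # did not converge


if __name__ == "__main__":
    # Toy simulation standing in for the real camera tracker and gripper.
    state = [100.0, 40.0]                        # simulated feature location (px)

    def fake_tracker() -> Tuple[float, float]:
        return state[0], state[1]

    def fake_robot(dx: float, dy: float) -> None:
        state[0] += dx                           # pretend the commanded motion
        state[1] += dy                           # shifts the tracked feature

    align = VisualPrimitive(observe=fake_tracker, target=(320.0, 240.0),
                            command=fake_robot)
    print("aligned:", align.run())
```

Under this reading, a higher-level task such as the bolt-insertion experiment would be sequenced from several primitives of this kind (align over the nut, descend, close the gripper, tighten), each terminating on its own visually measured condition.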




Editor information

David J. Kriegman, PhD; Gregory D. Hager, PhD; A. Stephen Morse, PhD


Copyright information

© 1998 Springer-Verlag

About this paper

Cite this paper

Yoshimi, B.H., Allen, P.K. (1998). Visual control of grasping. In: Kriegman, D.J., Hager, G.D., Morse, A.S. (eds) The confluence of vision and control. Lecture Notes in Control and Information Sciences, vol 237. Springer, London. https://doi.org/10.1007/BFb0109673


  • DOI: https://doi.org/10.1007/BFb0109673


  • Publisher Name: Springer, London

  • Print ISBN: 978-1-85233-025-5

  • Online ISBN: 978-1-84628-528-8

  • eBook Packages: Springer Book Archive
