
Integrating sensor placement and visual tracking strategies

  • Conference paper
  • Section 3: Visual Servoing

Experimental Robotics III

Part of the book series: Lecture Notes in Control and Information Sciences (LNCIS, volume 200)


Abstract

Real-time visual feedback is an important capability that many robotic systems must possess if these systems are to operate successfully in dynamically varying and/or uncalibrated environments. An eye-in-hand system is a common technique for providing camera motion to increase the working region of a visual sensor. Although eye-in-hand robotic systems have been well-studied, several deficiencies in proposed systems make them inadequate for actual use. Typically, the systems fail if manipulators pass through singularities or joint limits. Objects being tracked can be lost if the objects become defocused, occluded, or if features on the objects lie outside the field of view of the camera. In this paper, a technique is introduced for integrating a visual tracking strategy with dynamically determined sensor placement criteria. This allows the system to automatically determine, in real-time, proper camera motion for tracking objects successfully while accounting for the undesirable, but often unavoidable, characteristics of camera-lens and manipulator systems. The sensor placement criteria considered include focus, field-of-view, spatial resolution, manipulator configuration, and a newly introduced measure called resolvability. Experimental results are presented.
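The abstract's central idea, blending a tracking command with sensor-placement criteria computed in real time, can be illustrated with a minimal numeric sketch. Everything below is an assumption for illustration only: the quadratic focus model, the field-of-view penalty, the weights, and all function names are hypothetical and are not the authors' actual formulation. The sketch treats each criterion as a differentiable cost on the camera state and perturbs the tracking velocity along the weighted descent directions of those costs.

```python
import numpy as np

# Hypothetical sketch of combining visual tracking with sensor-placement
# criteria. The cost models and weights are illustrative assumptions, not
# the formulation from the paper.

def focus_cost(z, z_star=1.0):
    """Quadratic penalty for deviating from the in-focus depth z_star (assumed model)."""
    return (z - z_star) ** 2

def fov_cost(u, half_width=0.5, margin=0.8):
    """Penalty that grows as image-plane features u approach the field-of-view edge."""
    u = np.asarray(u, dtype=float)
    return float(np.sum(np.maximum(0.0, np.abs(u) - margin * half_width) ** 2))

def numeric_grad(f, x, eps=1e-6):
    """Central-difference gradient of a scalar cost f at camera state x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def blended_velocity(v_track, grads, weights):
    """Tracking velocity minus weighted descent directions, one per criterion."""
    v = np.asarray(v_track, dtype=float).copy()
    for w, g in zip(weights, grads):
        v -= w * np.asarray(g, dtype=float)
    return v

# Camera state (x, y, z): the tracker requests pure x-translation, but the
# camera sits at depth 1.4 m while the assumed in-focus depth is 1.0 m.
state = np.array([0.0, 0.0, 1.4])
g_focus = numeric_grad(lambda s: focus_cost(s[2]), state)      # pushes z toward 1.0
g_fov = numeric_grad(lambda s: fov_cost(s[:2]), state)         # zero: features centered
v = blended_velocity([0.05, 0.0, 0.0], [g_focus, g_fov], [0.1, 0.1])
```

In this toy setup the focus criterion adds a negative z-component to the commanded velocity, moving the camera toward the focused depth without abandoning the tracking motion; the field-of-view term is inactive because the features are centered. The paper's actual method additionally weighs spatial resolution, manipulator configuration, and resolvability.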




Editor information

Tsuneo Yoshikawa (PhD), Fumio Miyazaki (PhD)


Copyright information

© 1994 Springer-Verlag London Limited

About this paper

Cite this paper

Nelson, B., Khosla, P.K. (1994). Integrating sensor placement and visual tracking strategies. In: Yoshikawa, T., Miyazaki, F. (eds) Experimental Robotics III. Lecture Notes in Control and Information Sciences, vol 200. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0027593




  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-19905-2

  • Online ISBN: 978-3-540-39355-9

