Abstract
Real-time visual feedback is an important capability that many robotic systems must possess if they are to operate successfully in dynamically varying and/or uncalibrated environments. An eye-in-hand system is a common way of providing camera motion to enlarge the working region of a visual sensor. Although eye-in-hand robotic systems have been well studied, several deficiencies in proposed systems make them inadequate for practical use. Typically, such systems fail if the manipulator passes through a singularity or reaches a joint limit, and tracked objects can be lost if they become defocused or occluded, or if their features leave the camera's field of view. In this paper, a technique is introduced for integrating a visual tracking strategy with dynamically determined sensor placement criteria. This allows the system to automatically determine, in real time, camera motion that tracks objects successfully while accounting for the undesirable, but often unavoidable, characteristics of camera-lens and manipulator systems. The sensor placement criteria considered include focus, field of view, spatial resolution, manipulator configuration, and a newly introduced measure called resolvability. Experimental results are presented.
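The abstract describes blending several sensor placement criteria into a single measure that guides camera motion in real time. The paper's actual formulations (e.g. the resolvability ellipsoid) are more involved; the sketch below only illustrates the general idea with two hypothetical, normalized criterion scores (focus and field of view) combined into a weighted objective, then used to rank candidate camera placements. All function names, parameters, and numeric values here are illustrative assumptions, not the authors' method.

```python
import math

def focus_score(depth, focal_depth=0.5, tolerance=0.2):
    """Hypothetical focus criterion in [0, 1]: penalize deviation of the
    target's depth from the lens's in-focus depth."""
    return math.exp(-((depth - focal_depth) / tolerance) ** 2)

def fov_score(offset, half_fov=0.4):
    """Hypothetical field-of-view criterion in [0, 1]: penalize feature
    offset from the optical axis; zero once the feature leaves the FOV."""
    return max(0.0, 1.0 - abs(offset) / half_fov)

def placement_objective(depth, offset, weights=(0.5, 0.5)):
    """Weighted combination of placement criteria into one scalar measure,
    in the spirit of the paper's integration of multiple criteria."""
    w_focus, w_fov = weights
    return w_focus * focus_score(depth) + w_fov * fov_score(offset)

# Rank candidate camera placements (depth, feature offset) and pick the
# one that best satisfies the combined criteria.
candidates = [(0.45, 0.05), (0.90, 0.00), (0.50, 0.35)]
best = max(candidates, key=lambda c: placement_objective(*c))
```

In a running system, a step like this would be evaluated at each control cycle so the camera motion continually trades off the competing criteria rather than optimizing any single one.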
Copyright information
© 1994 Springer-Verlag London Limited
Cite this paper
Nelson, B., Khosla, P.K. (1994). Integrating sensor placement and visual tracking strategies. In: Yoshikawa, T., Miyazaki, F. (eds) Experimental Robotics III. Lecture Notes in Control and Information Sciences, vol 200. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0027593
DOI: https://doi.org/10.1007/BFb0027593
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-19905-2
Online ISBN: 978-3-540-39355-9