
Convergence analysis for the uncalibrated robotic hand–eye coordination based on the unmodeled dynamics observer

Published online by Cambridge University Press:  11 August 2009

Jianbo Su*
Affiliation: Research Center of Intelligent Robotics & Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China
*Corresponding author. E-mail: jbsu@sjtu.edu.cn

Summary

The uncalibrated robotic hand–eye coordination problem is first modeled as a dynamic system in which the unknown hand–eye relationship is regarded as the system's unmodeled dynamics. A state observer is then designed to estimate the impact of this modeling error together with the system's external disturbances. Using the estimates as compensation, the system is controlled through a nonlinear combination of the state errors. Convergence analysis of the closed-loop system under the proposed control scheme is emphasized. Simulation and experimental results are presented to verify the performance of the proposed approach.
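The approach summarized above follows an extended-state-observer (ESO) pattern: the unknown hand–eye mapping and the external disturbances are lumped into a single "total disturbance" state, estimated online, and cancelled by a control law built from a nonlinear combination of the state errors. The following is a minimal single-axis sketch of that structure in Python, offered only as an illustration under assumed structure (first-order feature dynamics, a second-order ESO, a Han-style `fal` nonlinearity); the gains `beta1`, `beta2`, `b0`, `kp` and the toy plant in the demo are hypothetical placeholders, not values or models from the paper.

```python
import numpy as np

def fal(e, alpha, delta):
    """Han-style nonlinear gain: linear near the origin, power-law outside."""
    if abs(e) <= delta:
        return e / (delta ** (1.0 - alpha))
    return np.sign(e) * (abs(e) ** alpha)

class ExtendedStateObserver:
    """Second-order ESO for one image-feature axis.

    z1 tracks the measured feature error y; z2 estimates the lumped
    "total disturbance" (unmodeled hand-eye relationship + external disturbance).
    """
    def __init__(self, beta1, beta2, b0, dt):
        self.z1 = 0.0
        self.z2 = 0.0
        self.beta1, self.beta2 = beta1, beta2  # observer gains (assumed values)
        self.b0 = b0                           # nominal input gain (assumed)
        self.dt = dt

    def update(self, y, u):
        e = self.z1 - y                                          # observer innovation
        self.z1 += self.dt * (self.z2 - self.beta1 * e + self.b0 * u)
        self.z2 += self.dt * (-self.beta2 * fal(e, 0.5, self.dt))
        return self.z1, self.z2

def nlsef(ref, z1, z2, b0, kp=5.0):
    """Nonlinear state-error feedback plus compensation of the estimated disturbance."""
    u0 = kp * fal(ref - z1, 0.75, 0.01)  # nonlinear combination of the state error
    return (u0 - z2) / b0                # cancel the estimated total disturbance

if __name__ == "__main__":
    # Toy demo: the "true" feature dynamics y' = d(t) + b*u have an unknown gain b
    # (standing in for the uncalibrated hand-eye relationship) and a disturbance d(t).
    dt, b_true, b0 = 0.01, 1.5, 1.0
    eso = ExtendedStateObserver(beta1=100.0, beta2=300.0, b0=b0, dt=dt)
    y, u = 1.0, 0.0
    for k in range(2000):
        z1, z2 = eso.update(y, u)
        u = nlsef(ref=0.0, z1=z1, z2=z2, b0=b0)
        d = 0.5 * np.sin(0.02 * k)        # external disturbance
        y += dt * (d + b_true * u)        # plant step (Euler integration)
    print(f"final feature error: {y:.4f}")
```

The paper develops this structure for the full multi-axis hand–eye system and proves convergence of the resulting closed loop; the scalar sketch only illustrates how the observer estimate is folded back into the control as compensation.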

Type: Article
Copyright: © Cambridge University Press 2009

