
Real-time vision plus remote-brained design opens a new world for experimental robotics

  • Chapter 3 Autonomy Via Vision
  • Conference paper
Experimental Robotics IV

Part of the book series: Lecture Notes in Control and Information Sciences ((LNCIS,volume 223))

Abstract

We present an approach to experimental robotics based on real-time tracking vision and remote-brained design. A remote-brained robot does not carry its brain inside its body; instead, the brain remains in a "mother" environment, and the robot communicates with it over radio links. The brain is raised in the mother environment and inherited over generations. The key idea of the remote-brained approach is to interface intelligent software systems with real robot bodies through wireless technology. In this framework the robot system can draw on a powerful vision system hosted in the brain environment. We have applied this approach to create vision-based dynamic and intelligent behaviors in a variety of robot configurations. In this paper we introduce our robot vision system and the remote-brained approach, and describe the visual processes underlying vision-based behaviors in remote-brained robots.
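To make the architecture concrete, the following is a minimal sketch, not the authors' implementation, of the remote-brained split: an off-board "brain" process receives vision data from the robot body over a link and returns motor commands. The link is modeled here with in-process queues standing in for the radio channel, and all names (`BodyLink`, `brain_step`, the message format) are hypothetical.

```python
# Sketch of the remote-brained architecture: the brain runs off-board
# and exchanges messages with the robot body over a wireless link.
# Here the radio link is modeled as a pair of queues.
from queue import Queue

class BodyLink:
    """Stands in for the radio link between the brain and the robot body."""
    def __init__(self):
        self.sensor_channel = Queue()   # body -> brain (tracked vision features)
        self.motor_channel = Queue()    # brain -> body (motor commands)

def brain_step(link):
    """One cycle of the off-board brain: heavy vision processing
    and behavior selection live here, not on the robot."""
    frame = link.sensor_channel.get()        # position of a tracked target, image-centered
    x, _y = frame["target"]
    # A trivial visual-servoing rule: turn toward the tracked target.
    cmd = "turn_left" if x < 0 else "turn_right" if x > 0 else "forward"
    link.motor_channel.put(cmd)

link = BodyLink()
link.sensor_channel.put({"target": (-12, 3)})  # body reports target left of image center
brain_step(link)
print(link.motor_channel.get())                # -> turn_left
```

The point of the split is that the body only needs a camera, radio, and actuators, while the brain side can run arbitrarily heavy tracking and planning software and be reused ("inherited") across robot bodies.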



Editor information

Oussama Khatib, J. Kenneth Salisbury


Copyright information

© 1997 Springer-Verlag London Limited

About this paper

Cite this paper

Inaba, M., Kagami, S., Inoue, H. (1997). Real-time vision plus remote-brained design opens a new world for experimental robotics. In: Khatib, O., Salisbury, J.K. (eds) Experimental Robotics IV. Lecture Notes in Control and Information Sciences, vol 223. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0035201


  • DOI: https://doi.org/10.1007/BFb0035201

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76133-4

  • Online ISBN: 978-3-540-40942-7

