Abstract
Conventional TV cameras have a limited field of view. This paper presents real-time omnidirectional cameras that acquire a full 360-degree field of view at video rate and can be applied in a variety of fields, such as autonomous navigation, telepresence, virtual reality, and remote monitoring. We have developed three different types of omnidirectional image sensors and two different types of multiple-image sensing systems, each combining an omnidirectional image sensor with binocular vision. In this paper, we describe the outlines and fundamental optics of the developed sensors and show example applications in robot navigation.
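A mirror-based omnidirectional sensor projects the surrounding 360-degree scene into an annular region of the image, which is commonly unwrapped into a panoramic strip for navigation tasks. The sketch below illustrates this polar-to-panorama unwrapping in its simplest form (nearest-neighbor sampling); the function name and parameters are illustrative and do not reproduce the optics of the specific sensors described in the paper.

```python
import numpy as np

def unwrap_to_panorama(omni, center, r_min, r_max, out_h=64, out_w=256):
    """Unwrap the annulus of a circular omnidirectional image into a panorama.

    Each panorama column corresponds to an azimuth angle around the image
    center; each row corresponds to a radius between r_min and r_max.
    Nearest-neighbor sampling is used for simplicity.
    """
    cy, cx = center
    radii = np.linspace(r_min, r_max, out_h)
    angles = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    r, theta = np.meshgrid(radii, angles, indexing="ij")
    # Convert polar coordinates back to source-pixel positions.
    ys = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, omni.shape[0] - 1)
    xs = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, omni.shape[1] - 1)
    return omni[ys, xs]

# Tiny synthetic example: a radial-gradient image, whose unwrapped rows
# should each be roughly constant (one radius per row).
omni = np.fromfunction(lambda y, x: np.hypot(y - 50, x - 50), (101, 101))
pano = unwrap_to_panorama(omni, center=(50, 50), r_min=10, r_max=40)
print(pano.shape)  # (64, 256)
```

In practice the mapping from radius to elevation angle depends on the mirror profile (conic for COPIS, hyperboloidal for HyperOmni Vision), so a real implementation would replace the linear radius spacing with the mirror's projection model.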
Yagi, Y., Yachida, M. Real-Time Omnidirectional Image Sensors. International Journal of Computer Vision 58, 173–207 (2004). https://doi.org/10.1023/B:VISI.0000019684.35147.fc