
JRM Vol.17 No.1 pp. 36-43 (2005)
doi: 10.20965/jrm.2005.p0036

Paper:

Fusion of Multiple Ultrasonic Sensor Data and Image Data for Measuring an Object’s Motion

Kazunori Umeda*, Jun Ota**, and Hisayuki Kimura***

*Dept. Precision Mechanics, Faculty of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

**Dept. of Precision Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

***Kanagawa Prefectural Shoko Commercial and Technical High School, 743 Imai-cho, Hodogaya-ku, Yokohama 240-0035, Japan

Received: December 26, 2003
Accepted: December 14, 2004
Published: February 20, 2005
Keywords: multiple ultrasonic sensor, image, sensor fusion, measurement of mobile robot motion, Kalman filter
Abstract
Robot sensing requires two types of observation: intensive and wide-angle. To measure a moving object's motion, we use multiple ultrasonic sensors for intensive observation and an image sensor for wide-angle observation, and we perform two kinds of fusion: one fuses the data from the multiple ultrasonic sensors, and the other fuses the ultrasonic data with the image data. The fusion of multiple ultrasonic sensor data exploits the object's movement from the measurement range of one ultrasonic sensor into that of another. Both fusions are formulated in a Kalman filter framework. Simulation and experiments demonstrate the effectiveness of the methods and their applicability to an actual robot system.
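
As an illustration of the Kalman filter framework mentioned in the abstract, the following Python/NumPy sketch fuses a depth reading from whichever ultrasonic sensor currently covers the object with a lateral-position reading derived from the image, under a constant-velocity object model. It is an assumption-laden example, not the paper's implementation: the state layout, measurement models, and noise values are invented for the sketch.

# Illustrative sketch only: the constant-velocity model, measurement layout,
# and noise values are assumptions, not taken from the paper.
import numpy as np

class ObjectMotionKF:
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)               # state: [x, y, vx, vy]
        self.P = np.eye(4)                 # state covariance
        self.F = np.array([[1.0, 0.0, dt, 0.0],
                           [0.0, 1.0, 0.0, dt],
                           [0.0, 0.0, 1.0, 0.0],
                           [0.0, 0.0, 0.0, 1.0]])
        self.Q = np.eye(4) * 0.01          # process noise (assumed value)

    def predict(self):
        # Constant-velocity prediction step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def _update(self, z, H, R):
        # Standard Kalman measurement update with innovation y.
        y = z - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

    def update_ultrasonic(self, depth, sigma=0.02):
        # The ultrasonic sensor whose beam covers the object observes its depth (x).
        H = np.array([[1.0, 0.0, 0.0, 0.0]])
        self._update(np.array([depth]), H, np.array([[sigma ** 2]]))

    def update_image(self, lateral, sigma=0.05):
        # The image sensor observes the object's lateral position (y).
        H = np.array([[0.0, 1.0, 0.0, 0.0]])
        self._update(np.array([lateral]), H, np.array([[sigma ** 2]]))

# Usage: predict once per time step, then apply whichever measurements arrived.
kf = ObjectMotionKF(dt=0.1)
kf.predict()
kf.update_ultrasonic(1.20)   # range reading [m] from the active ultrasonic sensor
kf.update_image(0.35)        # lateral position [m] recovered from the image
print(kf.x)                  # estimated [x, y, vx, vy]

Because each sensor has its own observation model, readings can be incorporated in any order and at different rates, which is what makes the filter a convenient place to combine the intensive ultrasonic data with the wide-angle image data.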
Cite this article as:
K. Umeda, J. Ota, and H. Kimura, “Fusion of Multiple Ultrasonic Sensor Data and Image Data for Measuring an Object’s Motion,” J. Robot. Mechatron., Vol.17 No.1, pp. 36-43, 2005.
References
[1] Y. Yagi, H. Okumura, and M. Yachida, “Multiple visual sensing system for mobile robot,” Proc. 1994 IEEE Int. Conf. on Robotics and Automation, pp. 1679-1684, 1994.
[2] Y. Kuniyoshi, N. Kita, K. Sugimoto, S. Nakamura, and T. Suehiro, “A Foveated Wide Angle Lens for Active Vision,” Proc. 1995 IEEE Int. Conf. on Robotics and Automation, pp. 2982-2988, 1995.
[3] A. Ohya, Y. Nagashima, and S. Yuta, “Exploring Unknown Environment and Map Construction Using Ultrasonic Sensing of Normal Direction of Walls,” Proc. 1994 IEEE Int. Conf. on Robotics and Automation, Vol.1, pp. 485-492, 1994.
[4] M. Takano, S. Odaka, T. Tsukishima, and K. Sasaki, “Study on Mobile Robot Navigation Control by Internal and External Sensor Data with Ultrasonic Sensor,” Proc. IEEE/RSJ Int. Workshop on Intelligent Robots and Systems (IROS’89), pp. 456-463, 1989.
[5] H. Choset, K. Nagatani, and N. Lazar, “The Arc-Transversal Median Algorithm: A Geometric Approach to Increasing Ultrasonic Sensor Azimuth Accuracy,” IEEE Trans. on Robotics and Automation, Vol.19, No.3, pp. 513-523, 2003.
[6] O. Wijk, P. Jensfelt, and H. Christensen, “Triangulation Based Fusion of Ultrasonic Sensor Data,” Proc. 1998 IEEE Int. Conf. on Robotics and Automation, pp. 3419-3424, 1998.
[7] A. Ohya, A. Kosaka, and A. Kak, “Vision-Based Navigation by Mobile Robots with Obstacle Avoidance Using Single-Camera Vision and Ultrasonic Sensing,” IEEE Trans. on Robotics and Automation, Vol.14, No.6, pp. 969-978, 1998.
[8] K. Umeda, J. Ota, and H. Kimura, “Fusion of Multiple Ultrasonic Sensor Data and Imagery Data for Measuring Moving Obstacle’s Motion,” Proc. 1996 IEEE/SICE/RSJ Int. Conf. on Multisensor Fusion and Integration for Intelligent Systems, pp. 742-748, 1996.
[9] For example, iRobot, http://www.irobot.com
[10] J. Borenstein and Y. Koren, “Error eliminating rapid ultrasonic firing for mobile robot obstacle avoidance,” IEEE Trans. on Robotics and Automation, Vol.11, No.1, pp. 132-138, 1995.
[11] A. la Cour-Harbo and J. Stoustrup, “Using spread spectrum transform for fast and robust simultaneous measurement in active sensors with multiple emitters,” Proc. of IEEE IECON’02, pp. 2669-2674, 2002.
[12] Z. Zhang and O. Faugeras, “3D dynamic scene analysis,” Springer-Verlag, 1992.
[13] R. C. Luo and M. G. Kay, “Multisensor integration and fusion in intelligent systems,” IEEE Trans. on Systems, Man, and Cybernetics, Vol.19, No.5, pp. 901-931, 1989.
