
J. Robot. Mechatron., Vol.30 No.4, pp. 552-562, 2018
doi: 10.20965/jrm.2018.p0552

Paper:

Robust Road-Following Navigation System with a Simple Map

Yuki Hosoda, Ryota Sawahashi, Noriaki Machinaka, Ryota Yamazaki, Yudai Sadakuni, Kazuya Onda, Ryosuke Kusakari, Masaro Kimba, Tomotaka Oishi, and Yoji Kuroda

Meiji University
1-1-1 Higashimita, Tama-ku, Kawasaki, Kanagawa 214-8571, Japan

Received:
February 23, 2018
Accepted:
June 6, 2018
Published:
August 20, 2018
Keywords:
autonomous navigation system, electronic map, road-following
Abstract

This paper presents a novel autonomous navigation system based on a simple map (an Edge-Node Graph created from an electronic map). The system consists of “Localization,” which estimates which edge of the Edge-Node Graph the robot is currently on, “Environmental Recognition,” which recognizes the environment around the robot, and “Path Planning,” which avoids obstacles. Since the robot travels using the Edge-Node Graph, there is no need to prepare an environmental map in advance. In addition, the system is quite robust because it relies less on prior information. To show the effectiveness of our system, we conducted experiments on each elemental technology as well as several traveling tests.
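The map structure described above can be pictured as a plain graph of intersections (nodes) and road segments (edges). As a minimal illustrative sketch only — all class, method, and variable names below are hypothetical and are not taken from the paper — such a graph and a crude "which edge am I on" query might look like this:

```python
import math

class EdgeNodeGraph:
    """Toy Edge-Node Graph: nodes are intersections with planar
    coordinates, edges are road segments between two nodes."""

    def __init__(self):
        self.nodes = {}  # node_id -> (x, y) in meters
        self.edges = {}  # edge_id -> (node_id_a, node_id_b)

    def add_node(self, node_id, x, y):
        self.nodes[node_id] = (x, y)

    def add_edge(self, edge_id, a, b):
        self.edges[edge_id] = (a, b)

    def nearest_edge(self, x, y):
        """Return the id of the edge whose segment lies closest to
        (x, y) -- a crude stand-in for edge-level localization."""
        best_id, best_dist = None, float("inf")
        for edge_id, (a, b) in self.edges.items():
            d = self._point_segment_dist((x, y), self.nodes[a], self.nodes[b])
            if d < best_dist:
                best_id, best_dist = edge_id, d
        return best_id

    @staticmethod
    def _point_segment_dist(p, a, b):
        # Distance from point p to the line segment a-b.
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        seg_len2 = dx * dx + dy * dy
        if seg_len2 == 0.0:  # degenerate segment
            return math.hypot(px - ax, py - ay)
        # Clamp the projection parameter t to stay on the segment.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# Tiny example: an L-shaped road with two segments.
g = EdgeNodeGraph()
g.add_node("n1", 0.0, 0.0)
g.add_node("n2", 10.0, 0.0)
g.add_node("n3", 10.0, 10.0)
g.add_edge("e1", "n1", "n2")
g.add_edge("e2", "n2", "n3")
print(g.nearest_edge(5.0, 1.0))  # -> e1
```

The paper's actual edge estimation fuses onboard sensing rather than a bare geometric nearest-segment test; this sketch only conveys why such a lightweight graph can replace a dense environmental map.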

Navigation with edge-node graph

Cite this article as:
Y. Hosoda, R. Sawahashi, N. Machinaka, R. Yamazaki, Y. Sadakuni, K. Onda, R. Kusakari, M. Kimba, T. Oishi, and Y. Kuroda, “Robust Road-Following Navigation System with a Simple Map,” J. Robot. Mechatron., Vol.30 No.4, pp. 552-562, 2018.
References
  [1] R. Kümmerle et al., “Autonomous robot navigation in highly populated pedestrian zones,” J. of Field Robotics, Vol.32, No.4, pp. 565-589, 2015.
  [2] P. Ruchti et al., “Localization on OpenStreetMap data using a 3D Laser Scanner,” IEEE Int. Conf. on Robotics and Automation (ICRA), 2015.
  [3] M. Saito et al., “Pre-Driving Needless System for Autonomous Mobile Robots Navigation in Real World Robot Challenge 2013,” J. Robot. Mechatron., Vol.26, No.2, pp. 185-195, 2014.
  [4] S. Muramatsu et al., “Mobile Robot Navigation Utilizing the WEB Based Aerial Images Without Prior Teaching Run,” J. Robot. Mechatron., Vol.29, No.4, pp. 697-705, 2017.
  [5] Y. Hosoda et al., “Intersection Detection based on Shape Information with Semantic Information,” SICE System Integration Division Annual Conf. (SI2017), 2017.
  [6] S. Thrun et al., “Stanley: The robot that won the DARPA Grand Challenge,” J. of Field Robotics, Vol.23, No.9, pp. 661-692, 2006.
  [7] F. Neuhaus et al., “Terrain drivability analysis in 3D laser range data for autonomous robot navigation in unstructured environments,” IEEE Conf. on Emerging Technologies and Factory Automation (ETFA), pp. 1-4, 2009.
  [8] V. Badrinarayanan, A. Kendall, and R. Cipolla, “SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.39, No.12, 2017.
  [9] G. J. Brostow, J. Fauqueur, and R. Cipolla, “Semantic object classes in video: A high-definition ground truth database,” Pattern Recognition Letters, Vol.30, No.2, pp. 88-97, 2009.
  [10] C. Tongtong, D. Bin, L. Daxue, and L. Zhao, “Lidar-based long range road intersection detection,” 6th Int. Conf. on Image and Graphics (ICIG), pp. 754-759, 2011.
  [11] Y. Zhang et al., “3D LIDAR-based Intersection Recognition and Road Boundary Detection Method for Unmanned Ground Vehicle,” IEEE 18th Int. Conf. on Intelligent Transportation Systems (ITSC), 2015.
  [12] S. Thrun, W. Burgard, and D. Fox, “Probabilistic Robotics,” MIT Press, 2005.
  [13] D. Ferguson, T. M. Howard, and M. Likhachev, “Motion planning in urban environments,” J. of Field Robotics, Vol.25, pp. 939-960, 2008.
  [14] T. M. Howard and A. Kelly, “Optimal rough terrain trajectory generation for wheeled mobile robots,” Int. J. of Robotics Research, Vol.26, pp. 141-166, 2007.
  [15] T. M. Howard and C. J. Green, “State space sampling of feasible motions for high-performance mobile robot navigation in complex environments,” J. of Field Robotics, Vol.25, pp. 325-345, 2008.
  [16] P. Besl and H. McKay, “A method for registration of 3-D shapes,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.14, No.2, pp. 239-256, 1992.
  [17] Y. Chen and G. Medioni, “Object modeling by registration of multiple range images,” Proc. of IEEE Int. Conf. on Robotics and Automation (ICRA), Vol.3, pp. 2724-2729, 1991.
  [18] P. Biber and W. Straßer, “The normal distributions transform: A new approach to laser scan matching,” Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), pp. 2743-2748, 2003.
  [19] Y. Aotani et al., “Development of Autonomous Navigation System Using 3D Map with Geometric and Semantic Information,” J. Robot. Mechatron., Vol.29, No.4, pp. 639-648, 2017.
