Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

This paper presents task-oriented navigation algorithms for an outdoor environment. The goals of the navigation are to recognize colored border lines on both sides of a path, to avoid obstacles on the path, and to navigate the given path. To recognize the colored border lines with a single camera, we apply a support vector data description method that employs six color features extracted from two color models. To avoid collisions with obstacles on the path, we fuse the line data measured by the camera with the obstacle data measured by a laser range finder. These algorithms were applied to autonomous navigation of an approximately 100 m long curved track, and we demonstrate that a four-wheel skid-steering mobile robot successfully completes the mission.
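
As an illustration of the border-line recognition step, the sketch below trains a one-class color classifier on pixels sampled from labeled border-line patches and applies it to a camera frame. It is a minimal sketch only: the abstract does not state which six features or which two color models are used, so the example assumes the per-pixel RGB and HSV channel values, and it substitutes scikit-learn's OneClassSVM (RBF kernel), a close relative of support vector data description, for the paper's own SVDD formulation. File names are hypothetical.

    # Minimal sketch of SVDD-style border-color classification (assumptions noted above).
    import cv2
    import numpy as np
    from sklearn.svm import OneClassSVM

    def color_features(bgr_image):
        """Stack RGB and HSV channel values into a (num_pixels, 6) feature matrix."""
        rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB).reshape(-1, 3)
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).reshape(-1, 3)
        return np.hstack([rgb, hsv]).astype(np.float32) / 255.0

    # Train on pixels sampled from hand-labeled border-line regions (hypothetical file).
    border_pixels = color_features(cv2.imread("border_patch.png"))
    classifier = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(border_pixels)

    # Classify every pixel of a camera frame; +1 marks border-colored pixels.
    frame = cv2.imread("camera_frame.png")  # hypothetical file
    border_mask = (classifier.predict(color_features(frame)) == 1).reshape(frame.shape[:2])

The resulting binary mask would then feed the line-extraction stage; fusing the detected lines with laser-range-finder obstacle measurements, as described in the paper, is not shown here.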



Author information

Corresponding author

Correspondence to Byung-Ju Yi.


About this article

Cite this article

Jung, EJ., Yi, BJ. Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles. Intel Serv Robotics 6, 69–77 (2013). https://doi.org/10.1007/s11370-012-0114-2


Keywords

Navigation