Outdoor autonomous navigation using SURF features

Original Article · Artificial Life and Robotics

Abstract

In this article, we propose a speeded-up robust features (SURF)-based approach to outdoor autonomous navigation. We capture environmental images with an omni-directional camera and extract SURF features from them, treating these features as landmarks for estimating the robot's self-location and direction of motion. SURF features are invariant under scale changes and rotation, and are robust to image noise, changes in lighting conditions, and changes of viewpoint, which makes them well suited to robot self-location estimation and navigation. The navigation method consists of two modes: a teaching mode, in which we teach the robot a navigation course, and a navigation mode, in which the robot autonomously follows the taught course. In our experiment, the outdoor course was about 150 m long, the average speed was 2.9 km/h, and the maximum trajectory error was 3.3 m. The processing time of SURF was several times shorter than that of the scale-invariant feature transform (SIFT), which allowed the robot to navigate at a speed comparable to a person's walking speed.

References

  1. DARPA Urban Challenge (2007) http://archive.darpa.mil/grandchallenge/index.asp

  2. Real World Robot Challenge (2007) http://www.ntf.or.jp/challenge/

  3. Thrapp R, Westbrook C, Subramanian D (2001) Robust localization algorithms for an autonomous campus tour guide. Proceedings of the 2001 IEEE International Conference on Robotics and Automation, pp 2065–2071

  4. Ohno K, Tsubouchi T, Shigematsu B, et al (2004) Differential GPS and odometry-based outdoor navigation of a mobile robot. Adv Robotics 18(6):611–635

  5. Horswill I (1993) Polly: a vision-based artificial agent. Proceedings of the International Conference on AAAI’ 93, pp 824–829

  6. Matsumoto Y, Inaba M, Inoue H (2000) View-based approach to robot navigation. Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp 1702–1708

  7. Morita H, Hild M, Miura J, et al (2005) View-based localization in outdoor environments using support vector learning. Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp 3083–3088

  8. Bay H, Ess A, Tuytelaars T, et al (2008) Speeded-up robust features (SURF). Computer Vision and Image Understanding 110(3):346–359

  9. Lowe DG (1999) Object recognition from local scale-invariant features. Proceedings of the IEEE International Conference on Computer Vision, pp 1150–1157

Author information

Correspondence to Masayoshi Tabuse.

Additional information

This work was presented in part at the 16th International Symposium on Artificial Life and Robotics, Oita, Japan, January 27–29, 2011.

About this article

Cite this article

Tabuse, M., Kitaoka, T. & Nakai, D. Outdoor autonomous navigation using SURF features. Artif Life Robotics 16, 356–360 (2011). https://doi.org/10.1007/s10015-011-0950-8
