Autonomous Flights Through Image-Defined Paths

Chapter

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 2)

Abstract

This paper addresses the problem of autonomous quadrotor navigation through a previously mapped indoor area. In particular, we focus on the case where a user walks through a building and collects images. Subsequently, a visual map of the area, represented as a graph of linked images, is constructed and used for automatically determining visual paths (i.e., sequences of images connecting the user-specified start and end locations). The quadrotor follows the desired path by iteratively (i) determining the desired motion to the next reference frame, (ii) controlling its roll, pitch, yaw-rate, and thrust, and (iii) appropriately switching to a new reference image. For motion estimation and reference-image switching, we concurrently employ the results of the 2pt and the 5pt RANSAC to distinguish and handle both cases of sufficient and insufficient baseline (e.g., rotation in place). The accuracy and robustness of our algorithm are evaluated experimentally on two quadrotors navigating along lengthy corridors and through tight spaces inside a building, as well as in the presence of dynamic obstacles (e.g., people walking).
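
The mapping and path-selection step described in the abstract can be made concrete with a short sketch. The following Python fragment is a minimal illustration under stated assumptions, not the authors' implementation: the predicate `linked(i, j)` is hypothetical and stands for whatever criterion declares two images connected (e.g., a sufficient number of feature matches), and the visual path is found with a plain breadth-first search over the resulting graph (cf. the standard shortest-path algorithms in [10]).

    from collections import deque

    def build_image_graph(num_images, linked):
        """Undirected graph over image indices.

        `linked(i, j)` is a hypothetical predicate returning True when
        images i and j share enough feature matches to be adjacent.
        """
        graph = {i: [] for i in range(num_images)}
        for i in range(num_images):
            for j in range(i + 1, num_images):
                if linked(i, j):
                    graph[i].append(j)
                    graph[j].append(i)
        return graph

    def visual_path(graph, start, goal):
        """Shortest sequence of reference images from start to goal (BFS)."""
        parent = {start: None}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            if node == goal:                 # reconstruct the path
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            for nbr in graph[node]:
                if nbr not in parent:
                    parent[nbr] = node
                    queue.append(nbr)
        return None                          # start and goal are not connected

At flight time, each image along the returned path would serve in turn as the reference image for steps (i)-(iii) of the control loop.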

This work was supported by the AFOSR (FA 9550-10-1-0567).

Notes

  1. The 2pt RANSAC estimates the relative orientation \(^{I_1}_{I_2}\mathbf{R}\) between two images, \(I_1\) and \(I_2\), under the assumption of a very small baseline compared to the depth of the scene. A closed-form solution for the 2pt minimal case is provided in Sect. 6.2, while the analytical solution for the least-squares solver is presented in [21]. The 5pt RANSAC [26] estimates the relative orientation \(^{I_1}_{I_2}\mathbf{R}\) and the unit vector of translation \(^{I_1}\mathbf{t}_{I_2}\) between two images \(I_1\) and \(I_2\). (A code sketch of both estimators is given after these notes.)

  2. Under low-light conditions, the velocity measurements are reliable only for a fixed tilt angle of the vehicle. Note that, when in motion, the quadrotor changes its roll and pitch, which causes image blur (due to the increased exposure time) and, hence, large errors in the optical-flow estimates.

  3. Note that although both the embedded controller and the cell phone contain IMUs, which could be used in conjunction with the camera to form a vision-aided inertial navigation system [18], in this work we intentionally focus on a “light”, in terms of processing, vision-only approach, so as to assess its performance and use it as a baseline for future comparisons.

  4. Note that since all images were recorded at about the same height, the z component of the desired motion estimate is rather small after the first reference image, and we subsequently ignore it. Instead, we use the distance-to-the-ground measurements to maintain a constant-altitude flight.

  5. This threshold depends on the onboard camera's field of view (fov) and is selected so as to ensure a significant overlap (more than 80%) between the current camera image and the next reference image. (A sketch of this switching rule also follows these notes.)
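
To make Note 1 concrete, the sketch below shows one possible Python/OpenCV realization of the two estimators; it is an illustrative approximation, not the authors' code, and the RANSAC loop around the rotation-only solver is omitted. The rotation-only (2pt-style) least-squares fit uses the SVD form of the closed-form solution (Horn's quaternion-based solution [21] yields the same rotation), while the 5pt case [26] relies on OpenCV's essential-matrix routines.

    import numpy as np
    import cv2

    def rotation_only_estimate(b1, b2):
        """Least-squares rotation R with R @ b2[k] ~ b1[k] for unit bearing
        vectors b1, b2 (each Nx3), valid when the baseline is negligible
        compared to the scene depth (the 2pt assumption)."""
        H = b2.T @ b1                                   # 3x3 correlation matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
        return (U @ D @ Vt).T                           # rotation taking I2 vectors to I1

    def five_point_estimate(pts1, pts2, K):
        """Relative rotation and unit translation from Nx2 pixel
        correspondences pts1, pts2 and intrinsics K; OpenCV runs its own
        RANSAC while estimating the essential matrix."""
        E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
        return R, t.ravel() / np.linalg.norm(t), inliers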
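
Note 5's switching threshold can likewise be illustrated with a back-of-the-envelope rule. Assuming, purely for illustration, that the dominant motion toward the next reference image is a yaw rotation, the visible overlap shrinks roughly linearly with the yaw offset relative to the camera's fov, so the 80% overlap requirement translates into a maximum allowed yaw offset; the function name and the 60-degree fov in the example are hypothetical.

    def yaw_switch_threshold(fov_deg, min_overlap=0.8):
        """Largest yaw offset (degrees) that still keeps at least
        `min_overlap` of the next reference image in view, under a
        pure-yaw, linear-overlap approximation."""
        return (1.0 - min_overlap) * fov_deg

    # Example: with a hypothetical 60-degree horizontal fov, the switch to the
    # next reference image is allowed once the estimated yaw offset to it
    # falls below (1 - 0.8) * 60 = 12 degrees.
    print(yaw_switch_threshold(60.0))   # -> 12.0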

References

  1. Autonomous flights through image-defined paths (videos). http://mars.cs.umn.edu/research/quadrotor_project.php

  2. Alahi, A., Ortiz, R., Vandergheynst, P.: FREAK: Fast retina keypoint. In: Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition, Providence, RI, pp. 510–517, 16–21 June 2012

  3. Azrad, S., Kendoul, F., Nonami, K.: Visual servoing of quadrotor micro-air vehicle using color-based tracking algorithm. J. Syst. Design Dyn. 4(2), 255–268 (2010)

  4. Bebop Drone. http://www.parrot.com/products/bebop-drone/

  5. Bills, C., Chen, J., Saxena, A.: Autonomous MAV flight in indoor environments using single image perspective cues. In: Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, pp. 5776–5783, 9–13 May 2011

  6. Bourquardez, O., Mahony, R., Guenard, N., Chaumette, F., Hamel, T., Eck, L.: Image-based visual servo control of the translation kinematics of a quadrotor aerial vehicle. IEEE Trans. Robot. 25(3), 743–749 (2009)

  7. Chaumette, F., Hutchinson, S.: Visual servo control, Part I: Basic approaches. IEEE Robot. Autom. Mag. 13(4), 82–90 (2006)

  8. Chaumette, F., Hutchinson, S.: Visual servo control, Part II: Advanced approaches. IEEE Robot. Autom. Mag. 14(1), 109–118 (2007)

  9. Chen, Z., Birchfield, S.T.: Qualitative vision-based path following. IEEE Trans. Robot. 25(3), 749–754 (2009)

  10. Cormen, T.H., Leiserson, C.E., Rivest, R.L., Stein, C.: Introduction to Algorithms. MIT Press, Cambridge (2001)

  11. Courbon, J., Mezouar, Y., Guenard, N., Martinet, P.: Vision-based navigation of unmanned aerial vehicle. Control Eng. Pract. 18(7), 789–799 (2010)

  12. Courbon, J., Mezouar, Y., Martinet, P.: Indoor navigation of a non-holonomic mobile robot using a visual memory. Auton. Robot. 25(3), 253–266 (2008)

  13. Diosi, A., Šegvić, S., Remazeilles, A., Chaumette, F.: Experimental evaluation of autonomous driving based on visual memory and image-based visual servoing. IEEE Trans. Intell. Transp. Syst. 12(3), 870–883 (2011)

  14. Do, T., Carrillo-Arce, L.C., Roumeliotis, S.I.: Autonomous flights through image-defined paths, Technical Report (2015). http://mars.cs.umn.edu/publications.html

  15. Fischler, M., Bolles, R.: Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 24(6), 381–395 (1981)

  16. Franklin, G.F., Powell, J.D., Workman, M.L.: Digital Control of Dynamic Systems. Addison-Wesley, Reading (1997)

  17. Goedemé, T., Tuytelaars, T., Gool, L.V., Vanacker, G., Nuttin, M.: Feature based omnidirectional sparse visual path following. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Alberta, Canada, pp. 1806–1811, 2–6 August 2005

  18. Guo, C.X., Kottas, D.G., DuToit, R.C., Ahmed, A., Li, R., Roumeliotis, S.I.: Efficient visual-inertial navigation using a rolling-shutter camera with inaccurate timestamps. In: Proceedings of the Robotics: Science and Systems Conference, Berkeley, CA, 12–16 July 2014

  19. Hartley, R.I., Zisserman, A.: Multiple View Geometry in Computer Vision, 2nd edn. Cambridge University Press, Cambridge (2004). ISBN: 0521540518

  20. Honegger, D., Meier, L., Tanskanen, P., Pollefeys, M.: An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In: Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, pp. 1736–1741, 6–10 May 2013

  21. Horn, B.: Closed-form solution of absolute orientation using unit quaternions. J. Opt. Soc. America A 4(4), 629–642 (1987)

  22. Horn, B.: Relative orientation. Int. J. Comput. Vis. 4(1), 59–78 (1990)

  23. Lee, D., Ryan, T., Kim, H.J.: Autonomous landing of a VTOL UAV on a moving platform using image-based visual servoing. In: Proceedings of the IEEE International Conference on Robotics and Automation, Saint Paul, MN, pp. 971–976, 14–18 May 2012

  24. Mariottini, G.L., Roumeliotis, S.I.: Active vision-based robot localization and navigation in a visual memory. In: Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, pp. 6192–6198, 9–13 May 2011

  25. Nguyen, T., Mann, G.K.I., Gosine, R.G.: Vision-based qualitative path-following control of quadrotor aerial vehicle. In: Proceedings of the IEEE International Conference on Unmanned Aircraft Systems, Orlando, FL, pp. 412–417, 27–30 May 2014

  26. Nistér, D.: An efficient solution to the five-point relative pose problem. IEEE Trans. Pattern Anal. Mach. Intell. 26(6), 756–770 (2004)

  27. Nistér, D., Stewénius, H.: Scalable recognition with a vocabulary tree. In: Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition, New York, NY, pp. 2161–2168, 17–22 June 2006

  28. Vicon Motion Systems Ltd. http://www.vicon.com/

Author information

Corresponding author

Correspondence to Tien Do.

Copyright information

© 2018 Springer International Publishing AG

About this chapter

Cite this chapter

Do, T., Carrillo-Arce, L.C., Roumeliotis, S.I. (2018). Autonomous Flights Through Image-Defined Paths. In: Bicchi, A., Burgard, W. (eds) Robotics Research. Springer Proceedings in Advanced Robotics, vol 2. Springer, Cham. https://doi.org/10.1007/978-3-319-51532-8_3

  • DOI: https://doi.org/10.1007/978-3-319-51532-8_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-51531-1

  • Online ISBN: 978-3-319-51532-8

  • eBook Packages: Engineering (R0)
