
Efficient velocity estimation for MAVs by fusing motion from two frontally parallel cameras

  • Original Research Paper
  • Published in: Journal of Real-Time Image Processing

Abstract

Efficient velocity estimation is crucial for the robust operation of the navigation control loops of micro aerial vehicles (MAVs). Motivated by research on how animals exploit their visual topographies to perform rapid locomotion, we propose a bio-inspired method that applies the quasi-parallax technique to estimate the velocity of an MAV equipped with a forward-looking stereo camera, without GPS. Unlike existing optical flow-based methods, our method achieves efficient metric velocity estimation without requiring depth information from either additional distance sensors or stereopsis. In particular, the quasi-parallax technique, which is claimed to extract maximal benefit from the configuration of two frontally parallel cameras, leverages pairs of parallel visual rays to eliminate rotational flow for translational velocity estimation, and then iteratively and alternately refines the rotational and translational velocity estimates. Our method fuses the motion information from the two frontally parallel cameras without performing correspondence matching, achieving enhanced robustness and efficiency. Extensive experiments on synthetic and real scenes demonstrate the effectiveness and efficiency of our method.
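The rotation-cancellation idea at the core of the abstract admits a compact sketch. The code below illustrates the principle only and is not the authors' implementation: for a rigidly coupled, frontally parallel camera pair, pixels at the same normalized image coordinates in the two views look along parallel rays and therefore carry identical rotational flow, so subtracting the two flow fields cancels rotation; the residual is parallel to the translational flow direction, giving one homogeneous linear constraint on the translation per pixel. All function and variable names are illustrative, and metric scale (which the method recovers from the two-camera configuration) is omitted here:

```python
import numpy as np

def translation_direction_from_quasi_parallax(xy, flow_left, flow_right):
    """Sketch: translation direction of a rigid two-camera rig.

    xy         : (N, 2) normalized image coordinates shared by both views
    flow_left  : (N, 2) optical flow in the left camera at xy
    flow_right : (N, 2) optical flow in the right camera at xy
    returns    : unit 3-vector, the translation direction (up to sign)
    """
    x, y = xy[:, 0], xy[:, 1]
    du = flow_left - flow_right          # rotational component cancels

    # Translational flow at (x, y) points along A(x, y) @ t with
    # A = [[-1, 0, x], [0, -1, y]].  Requiring du to be parallel to it,
    # du_x * (A t)_y - du_y * (A t)_x = 0, gives one row per pixel:
    rows = np.stack([
        du[:, 1],                        # coefficient of t_x
        -du[:, 0],                       # coefficient of t_y
        du[:, 0] * y - du[:, 1] * x,     # coefficient of t_z
    ], axis=1)

    # Homogeneous least squares: the right singular vector with the
    # smallest singular value minimizes ||rows @ t|| subject to |t| = 1.
    _, _, vt = np.linalg.svd(rows)
    return vt[-1]
```

In the method described above, this translational estimate would then be alternated with a refinement of the rotational velocity; the sketch covers only the rotation-elimination step.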


Notes

  1. For our closed-loop trajectory, the ending location coincides with the starting location; thus, ideally, DiffRatio = 0 (see the sketch following these notes).

  2. We put several objects on the ground to make it non-planar, as shown in Fig. 10d.

  3. As a rule of thumb in the control community, high-frequency changes in the input can be unexpectedly amplified by the control units, resulting in unpredictable control performance (see the smoothing sketch following these notes).
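The DiffRatio metric in Note 1 is not defined in this excerpt; below is a minimal sketch under the assumption that it measures end-point drift normalized by total path length (the name diff_ratio and this definition are our guesses, not the paper's exact formula):

```python
import numpy as np

def diff_ratio(positions):
    """End-point drift of an estimated trajectory, normalized by its length.

    positions: (N, 3) array of positions obtained by integrating the
    velocity estimates.  For a closed-loop flight the final position
    should coincide with the first, so an ideal estimator yields 0.
    """
    path_length = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()
    drift = np.linalg.norm(positions[-1] - positions[0])
    return drift / path_length
```

Likewise, the rule of thumb in Note 3 is commonly addressed by low-pass filtering the estimate before it enters the control loop. A one-line exponential moving average illustrates the idea; it is not part of the paper's pipeline, and alpha is a hypothetical smoothing factor:

```python
def smooth_velocity(v_raw, v_prev, alpha=0.8):
    # Blend the new raw estimate with the previous smoothed value.
    # alpha closer to 1 suppresses more high-frequency jitter, at the
    # cost of added latency in the velocity fed to the controller.
    return alpha * v_prev + (1.0 - alpha) * v_raw
```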


Author information


Corresponding author

Correspondence to Zhi Gao.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4, 9087 KB)


About this article


Cite this article

Gao, Z., Ramesh, B., Lin, WY. et al. Efficient velocity estimation for MAVs by fusing motion from two frontally parallel cameras. J Real-Time Image Proc 16, 2367–2378 (2019). https://doi.org/10.1007/s11554-018-0752-5


Keywords

Navigation