Abstract
Efficient velocity estimation is crucial for the robust operation of the navigation control loops of micro aerial vehicles (MAVs). Motivated by research on how animals exploit their visual topographies for rapid locomotion, we propose a bio-inspired method that applies the quasi-parallax technique to estimate the velocity of a GPS-denied MAV equipped with a forward-looking stereo camera. Unlike existing optical-flow-based methods, our method achieves efficient metric velocity estimation without depth information from either additional distance sensors or stereopsis. Specifically, the quasi-parallax technique, which extracts maximal benefit from the configuration of two frontally parallel cameras, leverages pairs of parallel visual rays to eliminate rotational flow for translational velocity estimation, and then refines the rotational and translational velocity estimates iteratively and alternately. Our method fuses motion information from the two frontally parallel cameras without performing correspondence matching, achieving enhanced robustness and efficiency. Extensive experiments on synthetic and real scenes demonstrate the effectiveness and efficiency of our method.
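To make the elimination step concrete, here is a minimal sketch of rotation cancellation and translation-direction recovery, under the simplifying assumptions that the two calibrated, frontally parallel cameras have identical intrinsics and share both orientation and translational velocity (the omega-cross-baseline term and metric-scale recovery from the baseline are omitted); all function and variable names are illustrative, not the paper's implementation. Because rotational flow depends only on the direction of the visual ray and not on depth, flows sampled at the same normalized pixel in the two views have identical rotational components, so their difference is a purely translational quasi-parallax field, and the translation direction follows from a linear null-space solve.

```python
import numpy as np

def quasi_parallax_translation(pts, flow_left, flow_right):
    """Translation direction from flows sampled at parallel visual rays.

    pts        : (N, 2) normalized image coords (x, y), sampled at the same
                 pixels in both views, so the corresponding rays are parallel.
    flow_left  : (N, 2) optical flow in the left image at pts.
    flow_right : (N, 2) optical flow in the right image at pts.
    """
    # Rotational flow depends only on ray direction, so it is identical at
    # parallel rays; differencing cancels it, leaving a translational field.
    du, dv = (flow_left - flow_right).T
    x, y = pts.T
    # A translational field at (x, y) is parallel to (x*tz - tx, y*tz - ty),
    # so its 2D cross product with the measured difference must vanish:
    #   dv*tx - du*ty + (du*y - dv*x)*tz = 0,  one linear row per point.
    A = np.stack([dv, -du, du * y - dv * x], axis=1)
    # The translation direction is the smallest right singular vector of A.
    t = np.linalg.svd(A)[2][-1]
    return t / np.linalg.norm(t)

# Synthetic check: an identical rotational field cancels in the difference.
def rot_flow(p, w):                      # instantaneous rotational flow
    x, y = p.T
    return np.stack([x * y * w[0] - (1 + x**2) * w[1] + y * w[2],
                     (1 + y**2) * w[0] - x * y * w[1] - x * w[2]], axis=1)

def trans_flow(p, t, z):                 # instantaneous translational flow
    return (p * t[2] - t[:2]) / z[:, None]

rng = np.random.default_rng(0)
pts = rng.uniform(-0.5, 0.5, (100, 2))
t_true = np.array([0.2, -0.1, 1.0]) / np.linalg.norm([0.2, -0.1, 1.0])
w = np.array([0.03, -0.02, 0.01])        # shared rotational velocity
z_l, z_r = rng.uniform(2, 6, 100), rng.uniform(2, 6, 100)
f_l = trans_flow(pts, t_true, z_l) + rot_flow(pts, w)
f_r = trans_flow(pts, t_true, z_r) + rot_flow(pts, w)
t_est = quasi_parallax_translation(pts, f_l, f_r)
print(abs(t_est @ t_true))               # ~1.0: direction recovered up to sign
```

The synthetic check injects an identical rotational field into both views and verifies that it cancels in the difference, so the true translation direction is recovered up to sign without any depth input.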
Notes
For our closed-loop trajectory, the ending location coincides with the starting location, so ideally DiffRatio = 0 (see the sketch after these notes).
We placed several objects on the ground to make it non-planar, as shown in Fig. 10d.
As a rule of thumb in the control community, high-frequency changes in the input can be unexpectedly amplified by the controller, resulting in unpredictable control performance.
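For reference, a minimal sketch of the closed-loop check in the first note, under the assumption that DiffRatio denotes the start-to-end drift of the estimated trajectory normalized by total path length (a hypothetical reading; the exact definition is given in the paper body):

```python
import numpy as np

def diff_ratio(positions):
    """Endpoint drift over traveled path length, assuming that is what
    DiffRatio denotes. positions: (N, 3) estimated positions; on a closed
    loop the last sample returns to the first, so the ideal value is 0."""
    drift = np.linalg.norm(positions[-1] - positions[0])
    path = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()
    return drift / path
```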
Cite this article
Gao, Z., Ramesh, B., Lin, WY. et al. Efficient velocity estimation for MAVs by fusing motion from two frontally parallel cameras. J Real-Time Image Proc 16, 2367–2378 (2019). https://doi.org/10.1007/s11554-018-0752-5