Real-Time Multi-Car Localization and See-Through System

Published in: International Journal of Computer Vision

Abstract

In this paper, we propose a multi-vehicle localization approach relying exclusively on cameras installed on connected cars (i.e., vehicles with Internet access). The proposed method is designed to run in real time while requiring only a low-bandwidth connection, thanks to an efficient distributed architecture. As a result, our approach is compatible with both LTE Internet connections and local Wi-Fi networks. To reach this goal, the vehicles share small portions of their respective 3D maps to estimate their relative positions, and global consistency across multiple vehicles is enforced via a novel graph-based strategy. The efficiency of our system is demonstrated through a series of real-world experiments involving multiple vehicles. Moreover, the usefulness of our technique is highlighted by an innovative multi-car see-through system that resolves the inherent limitations of previous approaches. A video demonstration is available at https://youtu.be/GD7Z95bWP6k.
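For readers who want to prototype the core geometric step implied above, here is a minimal sketch, assuming vehicle A transmits a small set of 3D landmarks from its local map and vehicle B has already matched them against 2D keypoints in its own camera image; B's pose relative to A's map frame can then be recovered with a RANSAC PnP solver. This is an illustrative, assumption-laden sketch (the function name, thresholds, and OpenCV-based solver choice are ours), not the authors' pipeline, which additionally enforces multi-vehicle consistency through a graph-based optimization.

```python
# Illustrative sketch only (not the paper's code): estimate vehicle B's camera pose
# in vehicle A's map frame from a shared sub-map, using OpenCV's RANSAC PnP solver.
import numpy as np
import cv2


def relative_pose_from_shared_map(landmarks_A, keypoints_B, K, dist_coeffs=None):
    """landmarks_A: (N, 3) 3D points from A's shared local map, in A's frame.
    keypoints_B:  (N, 2) matched pixel observations in B's image.
    K:            (3, 3) intrinsic matrix of B's camera.
    Returns (R, t) transforming points from A's map frame into B's camera frame.
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(landmarks_A, dtype=np.float64),
        np.asarray(keypoints_B, dtype=np.float64),
        K, dist_coeffs,
        reprojectionError=3.0,      # pixel threshold for RANSAC inliers (assumed value)
        flags=cv2.SOLVEPNP_EPNP)
    if not ok or inliers is None or len(inliers) < 10:
        raise RuntimeError("Relative localization failed: too few consistent matches")
    R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 rotation matrix
    return R, tvec.ravel(), inliers.ravel()
```

In a complete system, pairwise poses obtained this way would become edges of a pose graph shared among the connected vehicles, which is where a graph-based consistency step such as the one described in the abstract comes into play.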

Notes

  1. Here, we follow the MATLAB semicolon notation (;) to conveniently denote a row change (see the example below).
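As an illustration of this convention (the example matrix is ours, not taken from the paper), a 4x4 rigid-body transformation built from a rotation $\mathbf{R}$ and a translation $\mathbf{t}$ would be written compactly as $[\mathbf{R}, \mathbf{t}; \mathbf{0}_{1\times 3}, 1]$, meaning

$$
\mathbf{T} = \begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \mathbf{0}_{1\times 3} & 1 \end{bmatrix}.
$$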

Acknowledgements

This research was supported by the Shared Sensing for Cooperative Cars Project funded by Bosch (China) Investment Ltd. The first author was supported by the Korea Research Fellowship (KRF) Program through the NRF, funded by the Ministry of Science, ICT and Future Planning (2015H1D3A1066564). The third author (Jinsun Park) was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1I1A1A01060267). The fourth author (Kyungdon Joo) was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2020-0-01336, Artificial Intelligence Graduate School Program (UNIST)) and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2021R1C1C1005723).

Author information

Corresponding author

Correspondence to Francois Rameau.

Additional information

Communicated by Jun Sato.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Oleksandr Bailo: The work was done at KAIST.

Cite this article

Rameau, F., Bailo, O., Park, J. et al. Real-Time Multi-Car Localization and See-Through System. Int J Comput Vis 130, 384–404 (2022). https://doi.org/10.1007/s11263-021-01558-5
