DOI: 10.1145/3459066.3459075
Research Article

Standard and Event Cameras Fusion for Feature Tracking

Published: 26 July 2021

ABSTRACT

Standard cameras are frame-based sensors that capture the scene at a fixed rate. They provide no information between two frames and suffer from motion blur in high-speed robotic and vision applications. By contrast, event-based cameras are a novel type of sensor that generates asynchronous "events" whenever the intensity at a pixel changes. The data produced by these two sensors are fundamentally different. In this paper, we leverage the complementarity of event-based and standard frame-based cameras and propose a fusion strategy for feature tracking. Features are extracted from frames, tracked using the event stream, and updated when new frames arrive. The event camera tracks features and predicts their locations on new frames, and the standard camera then corrects them. This paradigm differs from existing fusion-based tracking methods, which use only the first frame for initialization and discard the following frames. We evaluate our method on the Event-Camera Dataset [17] and show that the feature update process improves tracking quality.
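As a rough illustration of the paradigm described above, the sketch below outlines the fusion loop: detect features on a frame, propagate them asynchronously with the event stream, and correct them when the next frame arrives. The helpers detect_corners, track_with_events, and refine_on_frame are hypothetical placeholders, not the authors' code, and the event buffering is simplified for readability.

    # Minimal sketch of the frame/event fusion loop (assumed structure, not the paper's implementation).
    # detect_corners(img), track_with_events(feature, events), and refine_on_frame(feature, img)
    # are hypothetical stand-ins for the detection, event-based tracking, and frame-based update steps.

    from dataclasses import dataclass

    @dataclass
    class Feature:
        x: float
        y: float
        alive: bool = True  # set to False when the frame-based update rejects the track

    def fuse_track(frames, events):
        """frames: iterable of (timestamp, image); events: time-sorted list of (t, x, y, polarity)."""
        frames = iter(frames)
        t_prev, img_prev = next(frames)
        features = detect_corners(img_prev)              # extract features on the first frame
        for t_cur, img_cur in frames:
            window = [e for e in events if t_prev <= e[0] < t_cur]
            for f in features:
                track_with_events(f, window)             # predict feature motion from the event stream
                refine_on_frame(f, img_cur)              # correct the prediction on the new frame
            features = [f for f in features if f.alive]  # drop features the update step rejects
            t_prev, img_prev = t_cur, img_cur
        return features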

References

1. I. Alzugaray and M. Chli. 2018. Asynchronous Corner Detection and Tracking for Event Cameras in Real Time. IEEE Robotics and Automation Letters 3, 4 (2018), 3177–3184. https://doi.org/10.1109/LRA.2018.2849882
2. I. Alzugaray and M. Chli. 2019. Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras. In 2019 International Conference on 3D Vision (3DV). 269–278. https://doi.org/10.1109/3DV.2019.00038
3. C. Brandli, R. Berner, M. Yang, S. Liu, and T. Delbruck. 2014. A 240 × 180 130 dB 3 µs Latency Global Shutter Spatiotemporal Vision Sensor. IEEE Journal of Solid-State Circuits 49, 10 (2014), 2333–2341. https://doi.org/10.1109/JSSC.2014.2342715
4. C. Brändli, J. Strubel, S. Keller, D. Scaramuzza, and T. Delbruck. 2016. ELiSeD — An event-based line segment detector. In 2016 Second International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP). 1–7. https://doi.org/10.1109/EBCCSP.2016.7605244
5. J. Canny. 1986. A Computational Approach to Edge Detection. IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-8, 6 (1986), 679–698. https://doi.org/10.1109/TPAMI.1986.4767851
6. A. Censi, J. Strubel, C. Brandli, T. Delbruck, and D. Scaramuzza. 2013. Low-latency localization by active LED markers tracking using a dynamic vision sensor. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. 891–898. https://doi.org/10.1109/IROS.2013.6696456
7. G. D. Evangelidis and E. Z. Psarakis. 2008. Parametric Image Alignment Using Enhanced Correlation Coefficient Maximization. IEEE Transactions on Pattern Analysis and Machine Intelligence 30, 10 (2008), 1858–1865. https://doi.org/10.1109/TPAMI.2008.113
8. L. Everding and J. Conradt. 2018. Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors. Frontiers in Neurorobotics 12 (2018), 4. https://doi.org/10.3389/fnbot.2018.00004
9. G. Gallego, T. Delbruck, G. M. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. Davison, J. Conradt, K. Daniilidis, et al. 2020. Event-based Vision: A Survey. IEEE Transactions on Pattern Analysis and Machine Intelligence (2020), 1–1. https://doi.org/10.1109/tpami.2020.3008413
10. D. Gehrig, H. Rebecq, G. Gallego, and D. Scaramuzza. 2020. EKLT: Asynchronous Photometric Feature Tracking Using Events and Frames. International Journal of Computer Vision 128, 3 (2020), 601–618. https://doi.org/10.1007/s11263-019-01209-w
11. B. Kueng, E. Mueggler, G. Gallego, and D. Scaramuzza. 2016. Low-latency visual odometry using event-based feature tracks. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 16–23. https://doi.org/10.1109/IROS.2016.7758089
12. K. Li, D. Shi, Y. Zhang, R. Li, W. Qin, and R. Li. 2019. Feature Tracking Based on Line Segments With the Dynamic and Active-Pixel Vision Sensor (DAVIS). IEEE Access 7 (2019), 110874–110883. https://doi.org/10.1109/ACCESS.2019.2933594
13. R. X. Li, D. X. Shi, Y. J. Zhang, K. Y. Li, and R. H. Li. 2019. FA-Harris: A Fast and Asynchronous Corner Detector for Event Cameras. arXiv:1906.10925 [cs.CV]
14. H. Liu, D. P. Moeys, G. Das, D. Neil, S. Liu, and T. Delbrück. 2016. Combined frame- and event-based detection and tracking. In 2016 IEEE International Symposium on Circuits and Systems (ISCAS). 2511–2514. https://doi.org/10.1109/ISCAS.2016.7539103
15. B. D. Lucas and T. Kanade. 1981. An Iterative Image Registration Technique with an Application to Stereo Vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence - Volume 2 (IJCAI '81). Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 674–679.
16. E. Mueggler, B. Huber, and D. Scaramuzza. 2014. Event-based, 6-DOF pose tracking for high-speed maneuvers. In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. 2761–2768. https://doi.org/10.1109/IROS.2014.6942940
17. E. Mueggler, H. Rebecq, G. Gallego, T. Delbruck, and D. Scaramuzza. 2017. The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM. The International Journal of Robotics Research 36, 2 (2017), 142–149. https://doi.org/10.1177/0278364917691115
18. E. Mueggler, C. Bartolozzi, and D. Scaramuzza. 2017. Fast event-based corner detection. In British Machine Vision Conference (BMVC), London. 1–8. https://doi.org/10.5167/uzh-138925
19. P. J. Besl and N. D. McKay. 1992. Method for registration of 3-D shapes. In Sensor Fusion IV: Control Paradigms and Data Structures, Paul S. Schenker (Ed.), Vol. 1611. International Society for Optics and Photonics, SPIE, 586–606. https://doi.org/10.1117/12.57955
20. C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbruck. 2014. Retinomorphic Event-Based Vision Sensors: Bioinspired Cameras With Spiking Output. Proc. IEEE 102, 10 (2014), 1470–1484. https://doi.org/10.1109/JPROC.2014.2346153
21. T. Serrano-Gotarredona and B. Linares-Barranco. 2013. A 128 × 128 1.5% Contrast Sensitivity 0.9% FPN 3 µs Latency 4 mW Asynchronous Frame-Free Dynamic Vision Sensor Using Transimpedance Preamplifiers. IEEE Journal of Solid-State Circuits 48, 3 (2013), 827–838. https://doi.org/10.1109/JSSC.2012.2230553
22. J. Shi and C. Tomasi. 1994. Good features to track. In 1994 Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. 593–600. https://doi.org/10.1109/CVPR.1994.323794
23. D. R. Valeiras, X. Clady, S. H. Ieng, and R. Benosman. 2018. Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor. IEEE Transactions on Neural Networks and Learning Systems (September 2018). https://doi.org/10.1109/tnnls.2018.2807983
24. A. R. Vidal, H. Rebecq, T. Horstschaefer, and D. Scaramuzza. 2018. Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios. IEEE Robotics and Automation Letters 3, 2 (2018), 994–1001. https://doi.org/10.1109/LRA.2018.2793357
25. A. Z. Zhu, N. Atanasov, and K. Daniilidis. 2017. Event-based feature tracking with probabilistic data association. In 2017 IEEE International Conference on Robotics and Automation (ICRA). 4465–4470. https://doi.org/10.1109/ICRA.2017.7989517

Published in

ICMVA '21: Proceedings of the 2021 International Conference on Machine Vision and Applications
February 2021, 75 pages
ISBN: 9781450389556
DOI: 10.1145/3459066
Copyright © 2021 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
Published: 26 July 2021
