DOI: 10.1145/3650400.3650427
Research article

UCED-Detector: An Ultra-fast Corner Event Detector for Event Camera in Complex Scenes

Published: 17 April 2024

ABSTRACT

The asynchronous event stream produced by an event camera overcomes the exposure problems caused by dramatic changes in ambient light and the motion blur caused by high-speed motion, both common challenges for traditional cameras. As the number of events delivered per second by event cameras grows, faster feature extraction methods are needed to process these large event volumes and exploit event cameras in computer vision tasks. We propose UCED-Detector, an event-frame-based corner event detector that detects features in event streams at three times the speed of the state-of-the-art (SOTA) method. First, events captured in the past are used to remove noise events from the current event stream. Next, the events in a circular mask around the event under test are grouped into event pairs, and an event is marked when its timestamp exceeds that of the other event in its pair by a threshold. Finally, marked adjacent events are connected into arcs, and the event under test is judged to be a feature corner event according to the arc length. To evaluate the proposed approach, we conducted extensive experiments on event-camera datasets. The results show that our method reduces detection time to one-third of the SOTA method's, cutting the average processing time per event from 0.15 milliseconds to 0.04 milliseconds.
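The three-stage pipeline described above (past-event noise filtering, timestamp-pair marking on a circular mask, arc-length classification) can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's implementation: the radius-3 circle layout, the diametrically-opposite pairing scheme, and all parameters (`noise_window`, `pair_thresh`, `min_arc`, `max_arc`) are assumptions modeled on eFAST/Arc*-style detectors.

```python
import numpy as np

# Offsets of a radius-3 Bresenham circle (16 pixels) in angular order.
# ASSUMPTION: this mask and its size are modeled on eFAST/Arc*-style
# detectors; the paper's exact mask is not given in the abstract.
CIRCLE3 = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
           (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

class UCEDSketch:
    """Hypothetical sketch of the abstract's three stages, operating on a
    Surface of Active Events (latest event timestamp per pixel)."""

    def __init__(self, h, w, noise_window=0.01, pair_thresh=0.005,
                 min_arc=3, max_arc=6):
        self.sae = np.full((h, w), -np.inf)  # latest event timestamp per pixel
        self.noise_window = noise_window     # s: support window for denoising
        self.pair_thresh = pair_thresh       # s: timestamp gap that marks an event
        self.min_arc, self.max_arc = min_arc, max_arc

    def process(self, x, y, t):
        """Return True if the incoming event (x, y, t) is a corner event."""
        # Stage 1: noise filter using past events -- require at least one
        # recent event in the 3x3 patch around the incoming event.
        patch = self.sae[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        is_supported = bool(np.any(t - patch < self.noise_window))
        self.sae[y, x] = t
        if not is_supported:
            return False                     # isolated event: treated as noise

        h, w = self.sae.shape
        if not (3 <= x < w - 3 and 3 <= y < h - 3):
            return False                     # circular mask would leave the frame

        # Stage 2: build event pairs on the circle (here: diametrically
        # opposite pixels, an assumed pairing) and mark events whose
        # timestamp exceeds the other pair member's by the threshold.
        ts = np.array([self.sae[y + dy, x + dx] for dx, dy in CIRCLE3])
        n = len(ts)
        marked = [ts[i] - ts[(i + n // 2) % n] > self.pair_thresh for i in range(n)]

        # Stage 3: connect marked adjacent circle pixels into arcs and
        # accept the event if the longest arc has a plausible length.
        best = run = 0
        for m in marked * 2:                 # doubled list handles wrap-around arcs
            run = run + 1 if m else 0
            best = max(best, min(run, n))
        return self.min_arc <= best <= self.max_arc
```

With these assumed thresholds, a fresh half-arc of recent timestamps on the circle is accepted as a corner, while a full half-circle (a moving edge) or an isolated event is rejected; per-event cost is a constant number of timestamp comparisons, consistent with the per-event timing the abstract reports.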



• Published in

  EITCE '23: Proceedings of the 2023 7th International Conference on Electronic Information Technology and Computer Engineering
  October 2023, 1809 pages
  ISBN: 9798400708305
  DOI: 10.1145/3650400

      Copyright © 2023 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Acceptance Rates

Overall acceptance rate: 508 of 972 submissions, 52%
