DOI: 10.1145/3458305.3463373

Motion segmentation and tracking for integrating event cameras

Published: 15 July 2021

ABSTRACT

Integrating event cameras are asynchronous sensors wherein incident light values may be measured directly through continuous integration, with individual pixels' light sensitivity being adjustable in real time, allowing for extremely high frame rate and high dynamic range video capture. This paper builds on lessons learned from previous attempts to compress event data and presents a new scheme for event compression that has many analogues to traditional framed video compression techniques. We show how traditional video can be transcoded to an event-based representation, and describe the direct encoding of motion data in our event-based representation. Finally, we present experimental results showing that our simple scheme already approaches state-of-the-art compression performance for slow-motion object tracking. This system introduces an application "in the loop" framework, where the application dynamically informs the camera how sensitive each pixel should be, based on the efficacy of the most recent data received.
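The core transcoding idea can be illustrated with a minimal sketch: each pixel integrates incident light over time and fires an event each time the running integral crosses a threshold. The function name, the threshold value, and the frame-granularity timestamps below are illustrative assumptions, not the paper's actual encoder.

```python
def transcode_pixel(intensities, threshold=255.0, frame_dt=1.0):
    """Convert one pixel's per-frame intensities from a framed video
    into a list of event timestamps, integrate-and-fire style.

    Each frame's intensity is added to a running accumulator; every
    time the accumulator crosses `threshold`, one event is emitted
    and the threshold amount is subtracted (the residual carries over,
    so no light is lost between events).
    """
    events = []
    accumulated = 0.0
    t = 0.0
    for value in intensities:
        accumulated += value   # integrate incident light over this frame
        t += frame_dt
        while accumulated >= threshold:
            accumulated -= threshold   # one event per threshold crossing
            events.append(t)
    return events

# A bright pixel fires events frequently; a dim one may fire none at all,
# which is what makes the representation compressible for static regions.
bright = transcode_pixel([200, 200, 200], threshold=255.0)  # -> [2.0, 3.0]
dim = transcode_pixel([20, 20, 20], threshold=255.0)        # -> []
```

In the paper's application-in-the-loop framework, the per-pixel `threshold` would not be fixed as it is here: the consuming application could raise or lower each pixel's sensitivity based on how useful its recent events were.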


Published in:

MMSys '21: Proceedings of the 12th ACM Multimedia Systems Conference
June 2021, 254 pages
ISBN: 9781450384346
DOI: 10.1145/3458305
              Copyright © 2021 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

MMSys '21 paper acceptance rate: 18 of 55 submissions (33%). Overall acceptance rate: 176 of 530 submissions (33%).
