
Efficient Spatial-Temporal Normalization of SAE Representation for Event Camera


Abstract:

Event-based cameras are a new type of vision sensor that encodes spatial-temporal context in a pixel-level event stream. Their appealing properties offer great potential for applications requiring low processing latency and low power consumption. As an effective representation of events, the surface of active events (SAE) has become a favorable choice for tasks such as corner detection and object classification. These tasks apply normalization as an essential preprocessing step to extract time-invariant features from SAEs. However, previous normalization methods suffer from drawbacks such as low efficiency and the need for parameter tuning, which largely limit their performance in practical tasks. In this work, we propose a highly efficient normalization method, chain normalization, to overcome the limitations of the previous state of the art. Our design leverages the inherent properties of the SAE. First, we propose a novel SAE implementation that exploits these characteristics; compared with previous works, our method efficiently captures the spatial and temporal relationships of events and enables robust normalization. Second, we further increase efficiency with a novel stacking strategy. We compare our method to the state of the art in extensive experiments, showing high classification accuracy and a significant improvement in runtime performance. We also release the source code for future distribution and improvement by the community.
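For context, the sketch below illustrates the general idea behind an SAE: each pixel stores the timestamp of its most recent event, and a local patch of timestamps is normalized into time-invariant values before feature extraction. The exponential-decay normalization shown here is a common baseline in the time-surface literature, not the chain normalization proposed in this paper; the function names and parameters (update_sae, exp_normalized_patch, tau) are illustrative assumptions.

import numpy as np

def update_sae(sae, event):
    """Update a surface of active events (SAE): keep the latest
    timestamp observed at each pixel, separately per polarity."""
    x, y, t, p = event
    sae[p, y, x] = t
    return sae

def exp_normalized_patch(sae, x, y, p, t_now, radius=3, tau=50e-3):
    """Extract a (2*radius+1)^2 patch around (x, y) and map raw
    timestamps to values in (0, 1] via exponential decay.
    tau (seconds) is an illustrative decay constant that would
    normally require tuning, one of the drawbacks the paper targets."""
    patch = sae[p,
                y - radius:y + radius + 1,
                x - radius:x + radius + 1]
    return np.exp(-(t_now - patch) / tau)

# Example: a 2-polarity SAE for a 240x180 sensor (DAVIS240-like resolution).
sae = np.zeros((2, 180, 240), dtype=np.float64)
sae = update_sae(sae, (120, 90, 0.010, 1))       # (x, y, t in seconds, polarity)
patch = exp_normalized_patch(sae, 120, 90, 1, t_now=0.012)
print(patch.shape)                               # (7, 7)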
Published in: IEEE Robotics and Automation Letters (Volume: 5, Issue: 3, July 2020)
Page(s): 4265 - 4272
Date of Publication: 18 May 2020