
Denoising for Neuromorphic Cameras Based on Graph Spectral Features


Abstract:

Neuromorphic cameras, also known as event-based cameras, detect changes in environmental brightness asynchronously and independently for each pixel. They output the changes, i.e., events, as 3-D (2-D pixel coordinates + time) streaming data. While event-based cameras are used in many applications because of their desirable characteristics, e.g., high temporal resolution, low latency, low power consumption, and high dynamic range, their measurements contain considerable noise due to their high sensitivity. In this paper, we propose a simple yet effective denoising method for event-based cameras based on graph spectral features. We utilize the fact that real captured events are often densely distributed in the streaming data, while noise events are spatiotemporally sparse. In the proposed method, we first construct a graph whose nodes represent events and whose edges represent the spatiotemporal distances between events. Next, we calculate the Fiedler vector, which is the eigenvector of the graph operator associated with the second smallest eigenvalue. The obtained Fiedler vector is used directly to extract the real events. In the calculation of the Fiedler vector, we leverage a power method instead of naive eigenvalue decomposition and thereby reduce the computational complexity. In experiments, we demonstrate that the proposed method removes noise events from the raw events more effectively than alternative methods.
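The pipeline the abstract describes (spatiotemporal graph construction, Fiedler vector via a power method, sign-based event extraction) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the Gaussian edge weights, the connection radius, the shift used to make the power iteration converge to the smallest nonzero eigenpair, and the rule that keeps the larger sign partition are all assumptions; the dense N×N distance matrix is only suitable for small event batches.

```python
import numpy as np

def fiedler_denoise(events, radius=3.0, iters=200):
    """Sketch of Fiedler-vector event denoising.

    events : (N, 3) array of (x, y, t), with time pre-scaled so it is
             comparable to pixel distances (an assumed preprocessing step).
    Returns a boolean mask marking events kept as "real".
    """
    n = len(events)
    # Adjacency: connect events within a spatiotemporal radius, weighted by
    # a Gaussian of the distance (assumed kernel; O(N^2) memory, demo only).
    d = np.linalg.norm(events[:, None, :] - events[None, :, :], axis=-1)
    W = np.exp(-(d / radius) ** 2) * (d < radius)
    np.fill_diagonal(W, 0.0)
    deg = W.sum(axis=1)
    L = np.diag(deg) - W  # combinatorial graph Laplacian

    # Power method on the shifted operator (c*I - L): its dominant
    # eigenvectors correspond to the smallest eigenvalues of L. Deflating
    # the constant vector (eigenvalue 0) makes the iteration converge
    # toward the Fiedler vector.
    c = 2.0 * deg.max()               # shift keeping the spectrum positive
    ones = np.ones(n) / np.sqrt(n)
    v = np.random.default_rng(0).standard_normal(n)
    for _ in range(iters):
        v -= (v @ ones) * ones        # project out the constant mode
        v = c * v - L @ v             # apply the shifted operator
        v /= np.linalg.norm(v)

    # The sign of the Fiedler vector bipartitions the graph; we assume the
    # larger side is the dense cluster of real events.
    return v > 0 if (v > 0).sum() >= n / 2 else v < 0
```

Each power iteration costs one sparse matrix-vector product, which is why the paper's use of a power method is cheaper than a full eigenvalue decomposition; with a sparse adjacency structure the per-iteration cost drops from O(N^2) to O(|E|).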
Date of Conference: 02-04 October 2024
Date Added to IEEE Xplore: 12 November 2024
Conference Location: West Lafayette, IN, USA
