Abstract:
This demo presents a method of visual tracking using the output of an asynchronous neuromorphic event-based camera. The approach is itself event-based and is thus adapted to the scene-driven properties of these sensors. The method tracks multiple visual features in real time at frequencies of several hundred kilohertz. It adapts to scene content, combining spatial and temporal correlations of events in an asynchronous iterative framework. Various kernels can be used to track features from incoming events, such as Gaussians, Gabor functions, combinations of Gabor functions, and arbitrary hand-made kernels with very weak constraints. The proposed feature tracking method handles feature variations in position, scale, and orientation. The tracking performance is evaluated experimentally for each kernel to demonstrate the robustness of the proposed solution.
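The idea of updating a tracker per event with a kernel-weighted correction can be illustrated with a minimal sketch. This is not the authors' algorithm, only a hypothetical toy version under simplifying assumptions: a single feature, a Gaussian kernel, synthetic events drawn around a drifting feature position, and a fixed per-event learning rate (all names and parameters below are invented for illustration).

```python
import math
import random

class GaussianTracker:
    """Toy event-driven tracker: each incoming event (x, y) nudges the
    tracker position toward the event, scaled by the Gaussian kernel
    response at the event location. Events far from the tracker have
    near-zero weight, so only spatially correlated events contribute."""

    def __init__(self, x, y, sigma=5.0, rate=0.1):
        self.x, self.y = x, y
        self.sigma = sigma  # kernel width in pixels (assumed)
        self.rate = rate    # per-event step size (assumed)

    def kernel(self, ex, ey):
        d2 = (ex - self.x) ** 2 + (ey - self.y) ** 2
        return math.exp(-d2 / (2.0 * self.sigma ** 2))

    def update(self, ex, ey):
        w = self.kernel(ex, ey)
        self.x += self.rate * w * (ex - self.x)
        self.y += self.rate * w * (ey - self.y)
        return w

# Synthetic asynchronous event stream: events scatter around a feature
# that slowly drifts to the right; the tracker follows it event by event.
random.seed(0)
tracker = GaussianTracker(10.0, 10.0)
true_x, true_y = 10.0, 10.0
for _ in range(2000):
    true_x += 0.01  # feature drift per event
    ex = true_x + random.gauss(0.0, 1.5)
    ey = true_y + random.gauss(0.0, 1.5)
    tracker.update(ex, ey)

print(f"tracker at ({tracker.x:.1f}, {tracker.y:.1f}), "
      f"feature at ({true_x:.1f}, {true_y:.1f})")
```

A real neuromorphic pipeline would process timestamped events from the sensor rather than a synthetic stream, and the paper's method additionally handles multiple features, scale and orientation, and other kernels such as Gabor functions; none of that is modeled here.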
Date of Conference: 22-24 October 2014
Date Added to IEEE Xplore: 11 December 2014
Electronic ISBN: 978-1-4799-2346-5
Print ISSN: 2163-4025