Abstract:
Spiking Neural Networks (SNNs) offer a promising avenue for developing adaptive intelligence that can operate effectively under energy constraints. This is especially true for emerging gradient-based SNNs designed for continuous learning. However, hardware efficiency and complex SNN functionality are at odds with each other, leading to a large design space with many possible trade-offs. This paper discusses the co-optimization of SNNs and hardware architectures. We compare digital designs and emerging compute-in-memory architectures in terms of their operation. We discuss how different SNN layer types affect memory organization, and how different memory organization schemes are, in turn, impacted by the layer type. Finally, we outline open problems that must be solved to design the next generation of intelligent hardware systems.
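For readers unfamiliar with the models the abstract refers to, the sketch below shows the discrete-time leaky integrate-and-fire (LIF) dynamics that gradient-based SNNs commonly build on. It is an illustrative assumption, not taken from the paper; the parameter names (v_th, beta) and values are placeholders.

```python
import numpy as np

def lif_step(v, spikes_in, w, v_th=1.0, beta=0.9):
    """One discrete-time step of a leaky integrate-and-fire (LIF) neuron layer.

    v         : membrane potentials, shape (n_out,)
    spikes_in : binary input spikes, shape (n_in,)
    w         : synaptic weights, shape (n_out, n_in)
    v_th      : firing threshold (illustrative value)
    beta      : membrane leak factor (illustrative value)
    """
    # Leaky integration of the weighted input spikes.
    v = beta * v + w @ spikes_in
    # Emit a spike wherever the potential crosses the threshold.
    spikes_out = (v >= v_th).astype(v.dtype)
    # Soft reset: subtract the threshold from neurons that fired.
    v = v - spikes_out * v_th
    return v, spikes_out
```

In gradient-based training, the non-differentiable threshold is typically replaced by a surrogate gradient during the backward pass; the trade-offs between such functionality and hardware efficiency are the subject of the paper.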
Date of Conference: 09-12 August 2020
Date Added to IEEE Xplore: 02 September 2020