HTNN: Deep Learning in Heterogeneous Transform Domains with Sparse-Orthogonal Weights


Abstract:

Convolutional neural networks (CNNs) have achieved great success on various tasks in recent years. Their application to low-power and low-cost hardware platforms, however, has often been limited by the extensive complexity of convolution layers. We present a new class of transform-domain deep neural networks (DNNs), where convolution operations are replaced by element-wise multiplications in heterogeneous transform domains. To further reduce network complexity, we propose a framework to learn sparse-orthogonal weights in heterogeneous transform domains, co-optimized with a hardware-efficient accelerator architecture to minimize the overhead of handling sparse weights. Furthermore, the sparse-orthogonal weights are non-uniformly quantized with canonical-signed-digit (CSD) representations to substitute multiplications with simpler additions. The proposed approach reduces complexity by a factor of 4.9–6.8× without compromising DNN accuracy, compared to equivalent CNNs that employ sparse (pruned) weights. The code is available at https://github.com/unchenyu/HTNN.
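
The sketch below is a rough illustration, not the paper's implementation, of two ideas mentioned in the abstract: computing a convolution as an element-wise product in a transform domain (the convolution theorem, shown here with a plain DFT as a stand-in for the paper's heterogeneous transforms) and encoding a weight in canonical-signed-digit (CSD) form so that multiplication reduces to shifts and additions. All function names are illustrative.

import numpy as np

def transform_domain_conv(x, w):
    """Circular 1-D convolution computed as an element-wise product in the
    DFT domain (convolution theorem). A stand-in for the heterogeneous
    transforms used by HTNN, not the paper's actual layer."""
    n = len(x)
    X = np.fft.fft(x, n)
    W = np.fft.fft(w, n)                  # transform the kernel (zero-padded to length n)
    return np.real(np.fft.ifft(X * W))    # element-wise product, then inverse transform

def to_csd(value):
    """Convert a non-negative integer to canonical-signed-digit form: a list of
    digits in {-1, 0, +1}, least-significant digit first, with no two adjacent
    non-zero digits."""
    digits = []
    v = value
    while v != 0:
        if v % 2 == 0:
            digits.append(0)
            v //= 2
        else:
            d = 2 - (v % 4)               # +1 if v mod 4 == 1, -1 if v mod 4 == 3
            digits.append(d)
            v = (v - d) // 2
    return digits

def csd_multiply(x, csd_digits):
    """Multiply integer x by a CSD-encoded weight using only shifts and adds/subtracts."""
    acc = 0
    for shift, d in enumerate(csd_digits):
        if d == 1:
            acc += x << shift
        elif d == -1:
            acc -= x << shift
    return acc

if __name__ == "__main__":
    x = np.array([1.0, 2.0, 3.0, 4.0])
    w = np.array([1.0, -1.0, 0.0, 0.0])
    print(transform_domain_conv(x, w))    # matches the circular convolution of x with w

    digits = to_csd(7)                    # 7 = +8 - 1 -> digits [-1, 0, 0, +1]
    print(digits, csd_multiply(5, digits))  # 5 * 7 via one shift, one add, one subtract

In the same spirit as the abstract, the CSD weight 7 costs one shift and one subtraction instead of a full multiplier, which is the kind of saving the paper exploits in hardware.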
Date of Conference: 26-28 July 2021
Date Added to IEEE Xplore: 04 August 2021
ISBN Information:
Conference Location: Boston, MA, USA
