
Group Action Equivariance and Generalized Convolution in Multi-layer Neural Networks


Abstract:

Convolutional neural networks have achieved great success in speech, image, and video signal processing tasks in recent years. There have been several attempts to justify the convolutional architecture and to generalize the convolution operation to other data types such as graphs and manifolds. Based on group representation theory and noncommutative harmonic analysis, it has recently been shown that the so-called group equivariance requirement of a feed-forward neural network necessitates a convolutional architecture. In this paper, based on the familiar concepts of linear time-invariant systems, we develop an elementary proof of the same result. The nonlinear activation function, a necessary component of practical deep neural networks, has been glossed over in previous analyses of the connection between equivariance and convolution. We identify sufficient conditions under which nonlinear activation functions preserve equivariance, and hence the necessity of the group convolution structure. Our analysis method is simple and intuitive, and holds the potential to be applied to more challenging scenarios such as non-transitive domains and multiple simultaneous equivariances.
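The abstract's central claim can be illustrated with a minimal sketch (not taken from the paper): on the cyclic group Z_n, group convolution specializes to ordinary circular convolution, which commutes with cyclic shifts, and a pointwise nonlinearity such as ReLU preserves that shift equivariance.

```python
import numpy as np

def circ_conv(x, h):
    """Circular convolution on Z_n: (x * h)[i] = sum_j x[(i-j) mod n] h[j]."""
    n = len(x)
    return np.array([sum(x[(i - j) % n] * h[j] for j in range(n))
                     for i in range(n)])

def shift(x, s):
    """Cyclic shift (the group action of Z_n on signals)."""
    return np.roll(x, s)

def relu(v):
    """A pointwise nonlinearity; acting coordinate-wise, it commutes with shifts."""
    return np.maximum(v, 0)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)   # input signal on Z_8
h = rng.standard_normal(8)   # convolution filter
s = 3                        # shift amount

# Equivariance of the linear layer: conv(shift(x)) == shift(conv(x))
lhs = circ_conv(shift(x, s), h)
rhs = shift(circ_conv(x, h), s)
assert np.allclose(lhs, rhs)

# The pointwise nonlinearity also commutes with the shift, so the
# full layer relu(conv(.)) remains shift-equivariant.
assert np.allclose(relu(shift(lhs, s)), shift(relu(lhs), s))
print("shift equivariance verified")
```

This is only the simplest instance (translation equivariance on a cyclic domain); the paper treats general group actions, where the same commutation structure forces the group-convolution form.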
Date of Conference: 12-17 May 2019
Date Added to IEEE Xplore: 17 April 2019
Conference Location: Brighton, UK
