
On the Stability of Analog ReLU Networks



Abstract:

Rectified linear unit (ReLU) networks have become widely used in machine learning and automated inference using neural networks. Various forms of hardware accelerators based on ReLU networks have also been under development. In this brief, the stability problem in analog ReLU networks is addressed. Using the Lyapunov stability theory, it is shown that the origin of an unforced, analog ReLU dynamical system is globally asymptotically stable if the induced Euclidean norm of its connectivity matrix is less than one. An example is given to demonstrate that this upper bound is the best that can be achieved. In particular, the stability result holds for the case of a nonsymmetric connectivity matrix as may occur in some mathematical models of neurobiology.
Page(s): 2426 - 2430
Date of Publication: 03 December 2020
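The abstract does not reproduce the system equations, so the following is a minimal numerical sketch under the assumption that the unforced analog ReLU dynamics take the common additive form x_dot = -x + W * ReLU(x), with W the connectivity matrix. It illustrates the stated condition: when the induced Euclidean (spectral) norm of W is below one, a simulated trajectory decays toward the origin, even for a nonsymmetric W.

```python
# Sketch only: the additive model x_dot = -x + W @ relu(x) is an assumption,
# not taken from the brief itself. It illustrates the stated claim that
# ||W||_2 < 1 implies global asymptotic stability of the origin.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)

# Nonsymmetric connectivity matrix, rescaled so that its induced
# Euclidean (spectral) norm is 0.9 < 1.
W = rng.standard_normal((4, 4))
W *= 0.9 / np.linalg.norm(W, 2)
assert np.linalg.norm(W, 2) < 1.0

# Forward-Euler simulation of x_dot = -x + W @ relu(x) from a random start.
x = 10.0 * rng.standard_normal(4)
dt = 1e-2
for _ in range(20000):
    x = x + dt * (-x + W @ relu(x))

# The final state is near zero, consistent with asymptotic stability
# of the origin under the norm condition.
print("||x(T)|| =", np.linalg.norm(x))
```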
