Abstract:
We consider a nonconvex optimization problem arising in the theory of global asymptotic stability of discrete-time recurrent neural networks. Under certain weak conditions on the transfer functions, we prove that the corresponding nonconvex cost function has at most one point of local maximum over every hyperplane. We derive sufficient conditions for the existence of a point of local maximum of the cost function over planes of lower dimension, and show that for standard transfer functions the number of points of local maximum of the cost function over a plane may exceed the co-dimension of the plane.
Published in: 2015 54th IEEE Conference on Decision and Control (CDC)
Date of Conference: 15-18 December 2015
Date Added to IEEE Xplore: 11 February 2016