Abstract
A special class of recurrent neural networks (RNNs) has recently been proposed by Zhang et al. for solving online time-varying matrix problems. Unlike conventional gradient-based neural networks (GNNs), such RNNs (termed Zhang neural networks, ZNN) are designed on the basis of matrix-valued error functions rather than scalar-valued norm-based energy functions. In this paper, we generalize and further investigate the ZNN model for time-varying matrix square root finding. For possible hardware (e.g., digital-circuit) realization, a discrete-time ZNN model is constructed and developed, which incorporates Newton iteration as a special case. In addition, a line-search algorithm is employed to obtain an appropriate step-size value in each iteration of the proposed discrete-time ZNN model. Computer-simulation results substantiate the effectiveness of the proposed ZNN model aided with a line-search algorithm, as well as its connection to, and explanation of, Newton iteration for matrix square root finding.
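The abstract's discrete-time scheme can be sketched as the iteration X_{k+1} = X_k − (h/2) X_k⁻¹ (X_k² − A), whose residual function E = X² − A plays the role of the matrix-valued error function; with step size h = 1 and X₀ = A this coincides with the simplified Newton iteration X_{k+1} = (X_k + X_k⁻¹A)/2 for the principal square root. The sketch below is illustrative only: function names, defaults, and the simple backtracking rule standing in for the paper's line-search algorithm are all assumptions, not taken from the paper.

```python
import numpy as np

def znn_sqrt(A, h=1.0, iters=30):
    """Fixed-step discrete-time iteration for the principal square root of A.

    Update: X_{k+1} = X_k - (h/2) X_k^{-1} (X_k^2 - A), with X_0 = A.
    With h = 1 this is the simplified Newton iteration; starting from
    X_0 = A keeps every iterate commuting with A (a hedged sketch, not
    the paper's exact model).
    """
    X = np.array(A, dtype=float)
    for _ in range(iters):
        R = X @ X - A                       # residual E = X^2 - A
        X = X - 0.5 * h * np.linalg.solve(X, R)
    return X

def znn_sqrt_ls(A, iters=30):
    """Same update, but with a simple backtracking line search on the
    residual norm ||X^2 - A||_F standing in for the paper's line-search
    algorithm (whose details are not given in the abstract)."""
    X = np.array(A, dtype=float)
    for _ in range(iters):
        R = X @ X - A
        D = -0.5 * np.linalg.solve(X, R)    # search direction
        h, base = 1.0, np.linalg.norm(R)
        # shrink h until the trial point reduces the residual norm
        while h > 1e-4 and np.linalg.norm((X + h * D) @ (X + h * D) - A) >= base:
            h *= 0.5
        X = X + h * D
    return X

# Usage on a small symmetric positive-definite matrix
A = np.array([[4.0, 1.0],
              [1.0, 9.0]])
X = znn_sqrt(A)       # X @ X ~= A
Y = znn_sqrt_ls(A)    # Y @ Y ~= A
```

For well-conditioned symmetric positive-definite inputs both variants converge quadratically near the solution; the line-search variant merely guards the fixed-step update against overshooting, mirroring the abstract's motivation for choosing the step size per iteration.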
References
Iannazzo B (2003) A note on computing the matrix square root. Calcolo 40:273–283. doi:10.1007/s10092-003-0079-9
Johnson CR, Okubo K, Reams R (2001) Uniqueness of matrix square roots and an application. Linear Algebra Appl 323(1):51–60. doi:10.1016/S0024-3795(00)00243-3
Higham NJ (2008) Functions of matrices: theory and computation. Society for Industrial and Applied Mathematics, USA
Mohammed AH, Ali AH, Syed R (2000) Fixed point iterations for computing square roots and the matrix sign function of complex matrices. In: Proceedings of IEEE Conference on Decision and Control, pp 4253–4258. doi:10.1109/CDC.2001.914567
Denman E, Beavers N (1976) The matrix sign function and computations in systems. Appl Math Comput 2:63–94. doi:10.1016/0096-3003(76)90020-5
Meini B (2003) The matrix square root from a new functional perspective: theoretical results and computational issues. Technical Report 1455, Dipartimento di Matematica, Universitá di Pisa, Pisa
Ortega JM (1972) Numerical analysis: a second course. Academic Press, New York
Long J, Hu X, Zhang L (2008) Newton’s method with exact line search for the square root of a matrix. J Phys Conf Ser 96:1–5. doi:10.1088/1742-6596/96/1/012034
Tank DW, Hopfield JJ (1986) Simple ‘neural’ optimization networks: an A/D converter, signal decision circuit, and a linear programming circuit. IEEE Trans Circuits Syst 33(5):533–541. doi:10.1109/TCS.1986.1085953
Dan Z, Huang WZ, Huang Y (2010) Chaos and rigorous verification of horseshoes in a class of Hopfield neural networks. Neural Comput Appl 19(1):159–166. doi:10.1007/s00521-009-0269-8
Zhang Y, Jiang D, Wang J (2002) A recurrent neural network for solving Sylvester equation with time-varying coefficients. IEEE Trans Neural Netw 13(5):1053–1063. doi:10.1109/TNN.2002.1031938
Yildiz N (2005) Layered feedforward neural network is relevant to empirical physical formula construction: A theoretical analysis and some simulation results. Phys Lett A 345(1–3):69–87. doi:10.1016/j.physleta.2005.06.116
Zhang Y, Li Z (2009) Zhang neural network for online solution of time-varying convex quadratic program subject to time-varying linear-equality constraints. Phys Lett A 373:1639–1643. doi:10.1016/j.physleta.2009.03.011
Zhang Y, Ge SS (2005) Design and analysis of a general recurrent neural network model for time-varying matrix inversion. IEEE Trans Neural Netw 16(6):1477–1490. doi:10.1109/TNN.2005.857946
Zhang Y, Ma W, Cai B (2009) From Zhang neural network to Newton iteration for matrix inversion. IEEE Trans Circuits Syst I 56(7):1405–1415. doi:10.1109/TCSI.2008.2007065
Zhang Y (2005) Revisit the analog computer and gradient-based neural system for matrix inversion. In: Proceedings of IEEE international symposium on intelligent control, pp 1411–1416. doi:10.1109/.2005.1467221
Benner P, Byers R (1998) An exact line search method for solving generalized continuous-time algebraic Riccati equations. IEEE Trans Automat Control 43(1):101–107. doi:10.1109/9.654908
Shi ZJ (2004) Convergence of line search methods for unconstrained optimization. Appl Math Comput 157(2):393–405. doi:10.1016/j.amc.2003.08.058
Diniz-Ehrhardt MA, Martínez JM, Raydan M (2008) A derivative-free nonmonotone line-search technique for unconstrained optimization. J Comput Appl Math 219(2):383–397. doi:10.1016/j.cam.2007.07.017
Mead C (1989) Analog VLSI and neural systems. Addison-Wesley, Reading
The MathWorks Inc (2008) Simulink 7 Getting Started Guide. Natick, MA
Carneiro NCF, Caloba LP (1995) A new algorithm for analog matrix inversion. In: Proceedings of the 38th midwest symposium on circuits systems, vol 1, pp 401–404. doi:10.1109/MWSCAS.1995.504461
Acknowledgments
This work is supported by the National Natural Science Foundation of China under Grants 60935001 and 60775050, by the Fundamental Research Funds for the Central Universities of China, and also by the Opening Fund of the Laboratory of Sun Yat-sen University.
Zhang, Y., Yang, Y., Cai, B. et al. Zhang neural network and its application to Newton iteration for matrix square root estimation. Neural Comput & Applic 21, 453–460 (2012). https://doi.org/10.1007/s00521-010-0445-x