
Zhang neural network and its application to Newton iteration for matrix square root estimation

  • Original Article
  • Neural Computing and Applications

Abstract

A special class of recurrent neural networks (RNN) has recently been proposed by Zhang et al. for solving online time-varying matrix problems. Unlike conventional gradient-based neural networks (GNN), these RNNs (termed Zhang neural networks, ZNN) are designed on the basis of matrix-valued error functions rather than scalar-valued norm-based energy functions. In this paper, we generalize and further investigate the ZNN model for time-varying matrix square root finding. With a view toward possible hardware (e.g., digital circuit) realization, a discrete-time ZNN model is constructed and developed, which incorporates Newton iteration as a special case. In addition, to obtain an appropriate step-size value at each iteration, a line-search algorithm is employed in the proposed discrete-time ZNN model. Computer-simulation results substantiate the effectiveness of the proposed ZNN model aided by the line-search algorithm, and illustrate its connection to Newton iteration for matrix square root finding.
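The abstract describes a discrete-time ZNN update that reduces to Newton iteration for the matrix square root, with the step size chosen by a line search. The sketch below is a minimal illustration of that idea, not the paper's actual model: it uses the classical simplified Newton iteration for X² = A (update direction −½ X⁻¹(X² − A), which with unit step and X₀ = I gives X ← (X + X⁻¹A)/2), combined with a hypothetical backtracking line search on the residual norm ‖X² − A‖_F as a stand-in for the paper's line-search algorithm. The function name `znn_sqrtm` and all parameters are assumptions for illustration.

```python
import numpy as np

def znn_sqrtm(A, max_iter=50, tol=1e-10):
    """Newton-type iteration for the principal square root of A.

    Illustrative sketch only: the update X <- X + h * step with
    step = -0.5 * X^{-1} (X^2 - A) recovers the classical simplified
    Newton iteration X <- (X + X^{-1} A) / 2 when h = 1 (valid because
    X_0 = I commutes with A); h is shrunk by backtracking whenever the
    residual norm ||X^2 - A||_F would not decrease.
    """
    n = A.shape[0]
    X = np.eye(n)  # X_0 = I commutes with A, so the simplified iteration applies
    res = lambda M: np.linalg.norm(M @ M - A, "fro")
    for _ in range(max_iter):
        r = res(X)
        if r < tol:
            break
        # Newton direction: -0.5 * X^{-1} (X^2 - A), via a linear solve
        step = -0.5 * np.linalg.solve(X, X @ X - A)
        h = 1.0
        # Backtracking line search on the Frobenius residual norm
        while res(X + h * step) >= r and h > 1e-8:
            h *= 0.5
        X = X + h * step
    return X

A = np.array([[4.0, 1.0],
              [0.0, 9.0]])
X = znn_sqrtm(A)
print(np.allclose(X @ X, A))  # True: X is a square root of A
```

For this upper-triangular example the iteration converges to the principal square root [[2, 0.2], [0, 3]]; the backtracking step is actually exercised on the first iteration, where the full Newton step would increase the residual.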



Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 60935001 and 60775050, by the Fundamental Research Funds for the Central Universities of China, and by the Opening Fund of Laboratory, Sun Yat-sen University.

Author information

Correspondence to Yunong Zhang.


Cite this article

Zhang, Y., Yang, Y., Cai, B. et al. Zhang neural network and its application to Newton iteration for matrix square root estimation. Neural Comput & Applic 21, 453–460 (2012). https://doi.org/10.1007/s00521-010-0445-x
