
Nonlinearly Activated Recurrent Neural Network for Computing the Drazin Inverse

Published in: Neural Processing Letters

Abstract

Four gradient-based recurrent neural networks for computing the Drazin inverse of a square real matrix are developed. Theoretical analysis shows that any monotonically increasing odd activation function ensures global convergence of the defined neural network models. Computer simulation results further confirm that the proposed neural networks compute the Drazin inverse accurately and efficiently. Moreover, the presented networks converge faster when power-sigmoid activation functions are used than their linearly activated counterparts.
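The ingredients the abstract names can be illustrated with a minimal sketch. The code below is not the paper's Drazin-inverse model; it applies the same components (a gradient-based neural dynamics with a monotonically increasing odd power-sigmoid activation) to the simpler, classical task of inverting a nonsingular matrix via the dynamics dX/dt = -γ Aᵀ F(AX - I), integrated with forward Euler. The parameter choices (γ, dt, ξ, p, step count) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def power_sigmoid(x, xi=4.0, p=3):
    """Power-sigmoid activation: odd and monotonically increasing.
    Odd power x**p for |x| >= 1, a scaled sigmoid for |x| < 1."""
    c = (1 + np.exp(-xi)) / (1 - np.exp(-xi))          # matches x**p at |x| = 1 approximately
    sig = c * (1 - np.exp(-xi * x)) / (1 + np.exp(-xi * x))
    return np.where(np.abs(x) >= 1, x**p, sig)

def gnn_inverse(A, gamma=10.0, dt=1e-3, steps=20000):
    """Gradient neural network dX/dt = -gamma * A^T F(A X - I),
    integrated by forward Euler; X(t) -> inv(A) for nonsingular A."""
    n = A.shape[0]
    X = np.zeros((n, n))
    I = np.eye(n)
    for _ in range(steps):
        E = A @ X - I                                   # residual error
        X = X - dt * gamma * (A.T @ power_sigmoid(E))   # gradient descent step
    return X

A = np.array([[2.0, 1.0], [0.0, 4.0]])                  # nonsingular test matrix
X = gnn_inverse(A)
```

The nonlinearity only reshapes the descent direction entrywise; because it is odd and increasing, each entry of the error is still driven toward zero, which is the intuition behind the convergence claim for the Drazin-inverse models in the paper.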





Acknowledgements

The author would like to thank the editor and three referees for their detailed comments which greatly improved the presentation of the paper.

Author information

Corresponding author

Correspondence to Haifeng Ma.

Additional information

Xue-Zhong Wang: This author is supported by the Headmaster Foundation of Hexi University under grant XZ2014-18, university research funding projects in Gansu province under grant 2014A-110, and the National Natural Science Foundation of China under grants 11171371 and 11461020.

Haifeng Ma: This author is supported by the National Natural Science Foundation of China under grant 11401143, the Oversea Returning Foundation of Hei Long Jiang Province under grant LC201402, and the Scientific Research Foundation of Hei Long Jiang Province Education Department under grant 12541232.

Predrag S. Stanimirović: This author gratefully acknowledges support from Research Project 174013 of the Serbian Ministry of Science.


About this article


Cite this article

Wang, XZ., Ma, H. & Stanimirović, P.S. Nonlinearly Activated Recurrent Neural Network for Computing the Drazin Inverse. Neural Process Lett 46, 195–217 (2017). https://doi.org/10.1007/s11063-017-9581-y

