
Neural Networks

Volume 18, Issue 10, December 2005, Pages 1293-1300

A functional neural network computing some eigenvalues and eigenvectors of a special real matrix

https://doi.org/10.1016/j.neunet.2005.04.008

Abstract

How to quickly compute the eigenvalues and eigenvectors of a matrix, especially a general real matrix, is significant in engineering. Since a neural network runs in an asynchronous, concurrent manner and can therefore achieve high speed, this paper designs a concise functional neural network (FNN) to extract some eigenvalues and eigenvectors of a special real matrix. After the FNN is equivalently transformed into a complex differential equation and its analytic solution is obtained, the convergence properties of the FNN are analyzed. If the eigenvalue whose imaginary part is nonzero and the largest of all eigenvalues is unique, the FNN converges, from a general nonzero initial vector, to the eigenvector corresponding to this special eigenvalue. If all eigenvalues are real, or if more than one eigenvalue attains the largest imaginary part, the FNN converges to the zero point or falls into a cyclic trajectory. Compared with other neural networks designed for the same task, the restriction on the matrix is much looser. Finally, three examples are employed to illustrate the performance of the FNN.

Introduction

Quick extraction of the eigenvalues and eigenvectors of a matrix, especially a general real matrix, is very important in engineering fields such as real-time signal processing (Luo et al., 1997, Ziegaus and Lang, 2004) and principal component analysis (Luo, Unbehauen, & Li, 1995). Among rapid computing methods, the neural network based approach is one of the most important, and many works on this technique have been reported (Cichocki, 1992, Cichocki and Unbehauen, 1992, Helmke and Moore, 1994, Kakeya and Kindo, 1997, Kobayashi et al., 2001, Li, 1997, Liu et al., 2005a, Liu et al., 2005b, Luo and Li, 1995, Perfetti and Massarelli, 1997, Reddy et al., 1995, Samardzija and Waterland, 1991, Song and Yam, 1998, Zhang et al., 2004). But these works mainly concentrate on extracting eigenvalues and eigenvectors of a real symmetric, or anti-symmetric, matrix, which is a very strict restriction. So this paper proposes a new functional neural network (FNN) to perform the computation, with a looser restriction on the matrix. The FNN is expressed as

\[
\frac{dv(t)}{dt} = [A' - B(t)]\,v(t),
\tag{1}
\]

where t ≥ 0, v(t) ∈ R^{2n},

\[
A' = \begin{pmatrix} I_0 & A \\ -A & I_0 \end{pmatrix},
\qquad
B(t) = \begin{pmatrix} (U_n v(t))\,I & -(\bar{U}_n v(t))\,I \\ (\bar{U}_n v(t))\,I & (U_n v(t))\,I \end{pmatrix},
\]

I denotes the n×n identity matrix, I_0 denotes the n×n zero matrix, and

\[
U_n = (\underbrace{1,\dots,1}_{n},\underbrace{0,\dots,0}_{n}),
\qquad
\bar{U}_n = (\underbrace{0,\dots,0}_{n},\underbrace{1,\dots,1}_{n}).
\]

A is the real matrix whose eigenvalues and eigenvectors are to be extracted. When v(t) is viewed as the states of the neurons, [A′ − B(t)] as the synaptic connection weights, and the activation functions as pure linear functions, formula (1) describes a continuous-time functional neural network.

Let v(t) = (x^T(t), y^T(t))^T with x(t) = (x_1(t), x_2(t), …, x_n(t))^T ∈ R^n and y(t) = (y_1(t), y_2(t), …, y_n(t))^T ∈ R^n. Then formula (1) is equivalent to

\[
\begin{cases}
\dfrac{dx(t)}{dt} = A y(t) - \sum_{j=1}^{n} x_j(t)\,x(t) + \sum_{j=1}^{n} y_j(t)\,y(t),\\[1ex]
\dfrac{dy(t)}{dt} = -A x(t) - \sum_{j=1}^{n} y_j(t)\,x(t) - \sum_{j=1}^{n} x_j(t)\,y(t).
\end{cases}
\tag{2}
\]

Denote z(t) = x(t) + i y(t), where i is the imaginary unit. Formula (2) is equivalent to

\[
\frac{dx(t)}{dt} + i\,\frac{dy(t)}{dt} = -A[x(t) + i y(t)]\,i - \sum_{j=1}^{n}[x_j(t) + i y_j(t)]\,[x(t) + i y(t)],
\tag{3}
\]

i.e.

\[
\frac{dz(t)}{dt} = -A z(t)\,i - \sum_{j=1}^{n} z_j(t)\,z(t).
\tag{4}
\]
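The complex dynamics of formula (4) can be illustrated with a simple forward-Euler integration. The sketch below is in Python/NumPy rather than the paper's Matlab, and the test matrix, step size, and initial vector are illustrative choices, not taken from the paper:

```python
import numpy as np

# Forward-Euler integration of the complex form (4):
#   dz/dt = -i*A*z - (sum_j z_j) * z
# Illustrative 2x2 matrix with eigenvalues +/-2i; +2i is the unique
# eigenvalue with the largest imaginary part.
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])
z = np.array([0.3 + 0.1j, 0.2 - 0.4j])   # arbitrary nonzero initial vector
dt, steps = 1e-3, 20_000                 # integrate up to t = 20

for _ in range(steps):
    z = z + dt * (-1j * (A @ z) - z.sum() * z)

# At equilibrium, A*xi = i*(xi_1 + ... + xi_n)*xi, so the dominant
# eigenvalue is recovered as i times the component sum.
print(1j * z.sum())    # approximately 2i
```

The state settles at a scaled copy of the eigenvector for the eigenvalue +2i, and i times the component sum recovers that eigenvalue, matching the convergence behaviour claimed in the abstract.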

Since the transformations from formula (1) to formula (4) are identities, analyzing the convergence properties of formula (4) is equivalent to studying those of FNN (1).
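This equivalence can also be checked numerically. Assuming the block structure A′ = (I₀, A; −A, I₀) and B(t) built from the row vectors U_n and Ū_n as reconstructed above, the right-hand side of the real 2n-dimensional system (1) should coincide, component by component, with the real and imaginary parts of the complex right-hand side of (4). A minimal Python/NumPy sketch (matrix and state are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))   # arbitrary real test matrix
x = rng.standard_normal(n)        # real part of the state
y = rng.standard_normal(n)        # imaginary part of the state
I, I0 = np.eye(n), np.zeros((n, n))

# Real 2n-dimensional form (1): dv/dt = [A' - B(t)] v
v = np.concatenate([x, y])
Ap = np.block([[I0, A], [-A, I0]])
sx, sy = x.sum(), y.sum()         # U_n v and U_n-bar v
B = np.block([[sx * I, -sy * I], [sy * I, sx * I]])
dv = (Ap - B) @ v

# Complex n-dimensional form (4): dz/dt = -i*A*z - (sum_j z_j) z
z = x + 1j * y
dz = -1j * (A @ z) - z.sum() * z

# The two right-hand sides agree component by component.
assert np.allclose(dv[:n], dz.real) and np.allclose(dv[n:], dz.imag)
```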


Some preliminaries

All eigenvalues of A are denoted by λ_1^R + λ_1^I i, λ_2^R + λ_2^I i, …, λ_n^R + λ_n^I i (λ_k^R, λ_k^I ∈ R, 1 ≤ k ≤ n); the corresponding eigenvectors and eigensubspaces are denoted by μ₁,…,μₙ and V₁,…,Vₙ.

Let ξ ∈ C^n denote the equilibrium vector. If ξ exists, there must hold

\[
\xi = \lim_{t\to\infty} z(t).
\tag{5}
\]

When the FNN reaches its equilibrium state, it follows from formula (4) that

\[
A\xi i = -\sum_{j=1}^{n}\xi_j\,\xi.
\tag{6}
\]

When ξ is an eigenvector of A, let \(\bar{\lambda}^R + \bar{\lambda}^I i\) (\(\bar{\lambda}^R, \bar{\lambda}^I \in R\)) denote the corresponding eigenvalue; then

\[
A\xi = (\bar{\lambda}^R + \bar{\lambda}^I i)\,\xi.
\tag{7}
\]

From formulas (6) and (7), it follows that \(\bar{\lambda}^R + \bar{\lambda}^I i = i\sum_{j=1}^{n}\xi_j\); that is, once the FNN settles at an eigenvector ξ, the corresponding eigenvalue can be read off as i times the component sum of ξ.

Analytic solution of FNN (1)

Theorem 1

Let S_j denote μ_j/|μ_j|, and let z_j(t) denote the projection of z(t) onto S_j. Then the analytic solution of FNN (1) is

\[
z(t) = \frac{\sum_{j=1}^{n} z_j(0)\exp[(\lambda_j^I - i\lambda_j^R)t]\,S_j}{1 + \sum_{k=1}^{n} z_k(0)\int_0^t \exp[(\lambda_k^I - i\lambda_k^R)\tau]\,d\tau}
\tag{9}
\]

for t ≥ 0.
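Theorem 1 can be sanity-checked numerically on the projection coefficients: each z_j(t) obeys dz_j/dt = [(λ_j^I − iλ_j^R) − Σ_k z_k] z_j, whose solution is the quotient above with \(\int_0^t e^{a\tau}d\tau = (e^{at}-1)/a\) for a ≠ 0. A Python sketch with illustrative eigenvalues and initial projections (not from the paper):

```python
import numpy as np

lam = np.array([1 + 2j, 1 - 2j])          # illustrative eigenvalues lam^R + i*lam^I
a = lam.imag - 1j * lam.real              # exponents (lam^I - i*lam^R)
z0 = np.array([0.5 + 0.0j, 0.4 + 0.3j])   # illustrative initial projections z_j(0)
t_end, dt = 1.0, 1e-5

# Closed form of Theorem 1 at t = t_end, using int_0^t e^{a tau} d tau = (e^{a t} - 1)/a.
closed = z0 * np.exp(a * t_end) / (1 + np.sum(z0 * (np.exp(a * t_end) - 1) / a))

# Direct forward-Euler integration of dz_j/dt = (a_j - sum_k z_k) z_j.
z = z0.copy()
for _ in range(int(t_end / dt)):
    z = z + dt * (a - z.sum()) * z

assert np.allclose(z, closed, atol=1e-2)
```

The direct integration and the closed form agree to within the Euler discretization error, which supports the quotient structure of the solution.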

Proof

Obviously, S₁, S₂, …, Sₙ constitute a basis of the n-dimensional complex vector space C^n, so

\[
z(t) = \sum_{k=1}^{n} z_k(t)\,S_k.
\tag{10}
\]

Since S_k is an eigenvector of A with corresponding eigenvalue λ_k^R + λ_k^I i, substituting formula (10) into formula (4) gives

\[
\sum_{j=1}^{n}\frac{dz_j(t)}{dt}S_j
= -A i\sum_{j=1}^{n} z_j(t)S_j - \sum_{k=1}^{n} z_k(t)\sum_{j=1}^{n} z_j(t)S_j
= \sum_{j=1}^{n} z_j(t)(\lambda_j^I - i\lambda_j^R)S_j - \sum_{k=1}^{n} z_k(t)\sum_{j=1}^{n} z_j(t)S_j,
\]

so that, for each j, \(dz_j(t)/dt = [(\lambda_j^I - i\lambda_j^R) - \sum_{k=1}^{n} z_k(t)]\,z_j(t)\).

Convergence analysis

Let K̄ denote the index set {k₁, k₂, …, k_N}, and let the sign '⊕' denote the direct-sum operator.

Theorem 2

If λ_1^I = λ_2^I = ⋯ = λ_n^I = 0, the FNN cannot obtain ξ for any initial complex vector z(0) ≠ 0.

Proof

When λ_1^I = λ_2^I = ⋯ = λ_n^I = 0, Theorem 1 gives

\[
z(t) = \frac{\sum_{j=1}^{n} z_j(0)\exp(-i\lambda_j^R t)\,S_j}{1 + \sum_{k=1}^{n} z_k(0)\int_0^t \exp(-i\lambda_k^R\tau)\,d\tau}.
\]

Depending upon the values of λ_1^R, λ_2^R, …, λ_n^R, there are two cases:

  • (1)

    If λ_j^R ≠ 0 for j = 1, 2, …, n, we will prove that z(t) falls into a cyclic trajectory. Assume the period is T; then

\[
z(t) = z(t+T)
= \frac{\sum_{j=1}^{n} z_j(0)\exp[-i\lambda_j^R(t+T)]\,S_j}{1 + \sum_{k=1}^{n} z_k(0)\,\dfrac{1}{-i\lambda_k^R}\left\{\exp[-i\lambda_k^R(t+T)] - 1\right\}}
\]
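The cyclic behaviour in this case can be observed numerically: for a real symmetric test matrix (all eigenvalues real and nonzero), the trajectory of formula (4) returns to its starting point after one period rather than settling at an equilibrium. A Python sketch assuming eigenvalues ±1, so the expected period is T = 2π (matrix and initial vector are illustrative choices, not from the paper):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # symmetric: eigenvalues +1 and -1, all real
z0 = np.array([0.05 + 0.02j, 0.03j])    # small nonzero initial vector
dt = 1e-4
T = 2 * np.pi                           # expected period for eigenvalues +/-1
steps = int(round(T / dt))

z = z0.copy()
half = None
for k in range(steps):
    z = z + dt * (-1j * (A @ z) - z.sum() * z)
    if k == steps // 2:
        half = z.copy()                 # state near t = T/2

# After one full period the state returns to z0, while at t = T/2 it is
# far from z0 -- a cycle, not convergence to an equilibrium vector.
print(np.linalg.norm(z - z0), np.linalg.norm(half - z0))
```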

Examples and discussion

In this section, we give three examples to illustrate the performance of the proposed FNN. The simulation platform is Matlab.

Example 1

A is randomly generated as

\[
A = \begin{pmatrix}
0.6353 & 2.0046 & 0.6313 & 0.3792 & 1.0181\\
0.6014 & 0.4931 & 2.3252 & 0.9442 & 0.1821\\
0.5512 & 0.4620 & 1.2316 & 2.1204 & 1.5210\\
1.0998 & 0.3210 & 1.0556 & 0.6447 & 0.0384\\
0.0860 & 1.2366 & 0.1132 & 0.7043 & 1.2274
\end{pmatrix}.
\]

When A is input into the FNN, the equilibrium vector is

\[
\xi = \begin{pmatrix}
0.7803 + 4.7813i\\
2.0183 + 5.9634i\\
3.9010 - 4.3270i\\
4.7281 - 2.4656i\\
1.8874 - 3.6058i
\end{pmatrix},
\]

and i(ξ₁ + ⋯ + ξ₅) = 0.3463 + 2.2983i, which is the eigenvalue estimate produced by the FNN.

Conclusions

This paper proposes a functional neural network model adapted to computing some eigenvalues of a special real matrix. The analytic solution of the model is obtained in complex vector space, and from this solution the convergence properties are analyzed. The network is applicable to any real matrix with the following features: not all of its eigenvalues are real, the eigenvalue whose imaginary part is the largest is unique, and the algebraic multiplicity of that eigenvalue is not restricted.

Acknowledgements

The authors would like to thank the referees for their helpful comments.

