Improved Zhang neural network with finite-time convergence for time-varying linear system of equations solving

https://doi.org/10.1016/j.ipl.2019.03.012

Highlights

  • An improved ZNN for time-varying linear system of equations solving is proposed.

  • The proposed neural network is proven to have finite-time convergence.

  • The upper bound of the convergence time is also analyzed and presented.

  • The proposed neural network outperforms other existing neural models.

Abstract

This paper proposes an improved Zhang neural network (IZNN) for time-varying linear system of equations solving. The neural network is activated by an array of continuous sign-bi-power functions. Theoretical analysis is provided to show the desired finite-time convergence property of the proposed IZNN. Compared with the Zhang neural network activated by an array of discontinuous signum functions, the solution synthesized by the proposed neural network converges to the theoretical solution, while the solution synthesized by the latter oscillates to some extent around the equilibrium point. Moreover, the remarkable finite-time convergence of the proposed IZNN model is corroborated by a simulative example. Simulation results also demonstrate that the proposed neural network is more suitable for engineering applications than the Zhang neural network activated by the array of discontinuous signum functions.

Introduction

Being regarded as a fundamental issue that often arises in scientific and engineering fields, linear system of equations solving has a wide range of applications, such as robotics [1], electromagnetic analysis [2], optical flow [3], power networks [4] and so on. Many efforts have thus been devoted to solving this problem online. Recent works [5], [6], [7], [8], [9], [10] show that approaches based on neural networks, which are of a parallel and distributed-storage nature, are highly efficient and powerful for the online solution of linear systems of equations and other matrix problems. Therefore, many neural network models have been proposed and investigated [1], [3], [6]. For example, in [1], a gradient-based neural network (GNN) was proposed by Wang to solve constant linear systems of equations in real time. It is known that time-varying problems are difficult to solve and occur frequently in practice. However, GNN is designed for solving static (time-invariant) problems and generally cannot be utilized to solve time-varying ones. In 2002, Zhang et al. proposed a neural network (termed OZNN, original Zhang neural network), which can solve both time-invariant and time-varying matrix problems (including linear systems of equations). OZNN and its variants are viewed as the first systematic research on time-varying problem solving [11].

It is well known that convergence properties play a significant role in neural networks. Unfortunately, the aforementioned GNN and OZNN can only achieve infinite-time convergence [3], [5], [6], [9], which means that they need infinite time to obtain the exact (theoretical) solution to the problem. This cannot meet the requirements of some real-time applications. Recently, the finite-time convergence of neural networks has become a hot topic, and many neural models have thus been developed, investigated and exploited to solve multifarious matrix-related and optimization problems. For example, [12] proposed a recurrent neural network with finite-time convergence for real-time matrix square root finding. Qiao et al. [13] proposed two finite-time convergent ZNN models for the solution of the time-varying complex matrix Drazin inverse. Some dual neural network models were developed to solve quadratic programming problems [14], [15]. In addition, based on a deep study of neural networks, Xiao et al. designed a novel evolution formula and accordingly proposed the FTZNN (finite-time Zhang neural network) to solve matrix problems [16], [17]. To pursue finite-time convergence for linear system of equations solving, a new type of Zhang neural network (termed NTZNN) adopting a signum activation-function array was proposed [18]. The NTZNN has been proven to possess finite-time convergence for both static and time-varying linear system of equations solving. However, because the signum function is discontinuous, the solution synthesized by the NTZNN model may oscillate to some extent around the equilibrium point. To eliminate (or alleviate) this phenomenon, we generalize our previous work [19] to the time-varying situation and propose a novel ZNN model for time-varying linear system of equations solving in this paper.
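The contrast between the discontinuous signum activation and a continuous one can be sketched numerically. The toy Python script below is an illustration, not code from this Letter (the gain, step size, initial error, and exponent are arbitrary choices): it integrates the scalar ZNN error dynamics ė(t) = −κΦ(e(t)) with forward Euler, once with Φ = sgn and once with a continuous sign-bi-power activation. The signum version chatters around zero at roughly the step-size level, while the sign-bi-power version settles essentially at the equilibrium.

```python
import numpy as np

def sgn(e):
    """Discontinuous signum activation (as used in the NTZNN)."""
    return np.sign(e)

def sbp(e, q=0.25):
    """Continuous sign-bi-power activation: sgn^q(e) + sgn^(1/q)(e),
    with sgn^p(e) = |e|^p * sign(e)."""
    return np.abs(e) ** q * np.sign(e) + np.abs(e) ** (1.0 / q) * np.sign(e)

kappa, dt, n = 1.0, 7e-4, 5000      # gain, Euler step, step count (arbitrary)
e_sgn, e_sbp = 0.3, 0.3             # same initial scalar error for both runs
tail_sgn, tail_sbp = [], []         # |error| over the last 1000 steps
for k in range(n):
    e_sgn -= dt * kappa * sgn(e_sgn)
    e_sbp -= dt * kappa * sbp(e_sbp)
    if k >= n - 1000:
        tail_sgn.append(abs(e_sgn))
        tail_sbp.append(abs(e_sbp))

chatter = max(tail_sgn)   # persistent oscillation, on the order of dt
settled = max(tail_sbp)   # essentially at the equilibrium
```

Both runs reach a neighborhood of zero quickly, but only the sign-bi-power error stays there: the signum error keeps jumping by a fixed ±κ·dt once it crosses zero, which is the discrete-time face of the oscillation discussed above.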

Before ending this introductory section, it is worth pointing out here that the main contributions and novelties of this Letter lie in the following facts.

  • 1)

The main objective of this paper is to solve the time-varying linear system of equations online, instead of the easier-to-solve static one.

  • 2)

    This paper proposes and investigates a novel Zhang neural network for time-varying linear system of equations solving. By use of Lyapunov theory, the proposed neural network is theoretically proven to have finite-time convergence performance. Moreover, the upper bound of the convergence time is also analyzed and presented.

  • 3)

The proposed neural network outperforms the NTZNN owing to the fact that the solution generated by the former converges well to the theoretical solution, while that generated by the latter oscillates to some extent around the equilibrium point.

  • 4)

    A simulation example is successfully performed to demonstrate the effectiveness and finite-time convergence of the proposed neural network for solving time-varying linear system of equations.

Section snippets

Problem formulation and related neural-network models

In this section, for completeness and further discussion, the problem formulation is first given. Then, two related neural network models, i.e., the OZNN and the NTZNN, are presented for the online solution of time-varying linear systems of equations.

Improved Zhang neural network

Although the NTZNN model (4) has the desired finite-time convergence, the resultant solution may oscillate to some extent due to the discontinuity of the sgn function (see the ensuing section for details). To eliminate (or alleviate) this problem, the following improved ZNN (IZNN) is proposed:

A(t)ẋ(t) = −Ȧ(t)x(t) − κ SBP(A(t)x(t) − b(t)) + ḃ(t),

where SBP(·): R^n → R^n represents a vector-form nonlinear activation function and is defined as

SBP(e(t)) = [sbp(e_i(t))]^n := [sgn^q(e_i(t)) + sgn^{1/q}(e_i(t))]^n,

where e_i(t) is the ith element of the error vector e(t) = A(t)x(t) − b(t).
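In code, the IZNN dynamics can be sketched as follows (a minimal NumPy illustration, not code from this Letter; the function names and default κ, q values are our choices). Solving the model for ẋ(t) gives ẋ(t) = A(t)⁻¹(−Ȧ(t)x(t) − κ SBP(A(t)x(t) − b(t)) + ḃ(t)), assuming A(t) is nonsingular:

```python
import numpy as np

def sbp(e, q=0.25):
    """Elementwise sign-bi-power activation:
    sbp(e_i) = sgn^q(e_i) + sgn^(1/q)(e_i), with sgn^p(e_i) = |e_i|^p * sign(e_i)."""
    return np.abs(e) ** q * np.sign(e) + np.abs(e) ** (1.0 / q) * np.sign(e)

def iznn_rhs(t, x, A, dA, b, db, kappa=1.0, q=0.25):
    """Right-hand side of the IZNN solved for x'(t):
    A(t) x'(t) = -A'(t) x(t) - kappa * SBP(A(t) x(t) - b(t)) + b'(t).
    A, dA, b, db are callables returning the coefficients at time t."""
    e = A(t) @ x - b(t)                            # residual error e(t)
    rhs = -dA(t) @ x - kappa * sbp(e, q) + db(t)
    return np.linalg.solve(A(t), rhs)              # assumes A(t) nonsingular
```

Feeding `iznn_rhs` into any standard ODE integrator (e.g. `scipy.integrate.solve_ivp`) then evolves the state x(t) toward the time-varying theoretical solution.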

Illustrative example

In this section, for illustrative and comparative purposes, the following smoothly time-varying matrix A(t) and vector b(t) are taken into account:

A(t) = [ 0.3cos(t)+0.2, 0.3sin(t); −0.3sin(t), 0.3cos(t)−0.2 ]  and  b(t) = [ 0.5cos(t); −0.5sin(t) ].

Besides, the theoretical (exact) solution x*(t) of the linear system of equations (1) with the above coefficients is provided below for verification:

x*(t) = [ x1(t); x2(t) ] = [ −2cos(t)+3; −2sin(t) ].

Without loss of generality, in the simulation, we set κ = 1, q = 0.25, and
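With the sign conventions read off above (under which det A(t) = 0.05 for all t, so A(t) is always nonsingular, and A(t)x*(t) = b(t) holds identically), the example can be reproduced with a simple forward-Euler sketch. The zero initial state, step size, and time horizon below are our own choices for illustration, not values from this Letter:

```python
import numpy as np

# Time-varying coefficients of the example (det A(t) = 0.05 for all t).
def A(t):
    return np.array([[0.3 * np.cos(t) + 0.2, 0.3 * np.sin(t)],
                     [-0.3 * np.sin(t), 0.3 * np.cos(t) - 0.2]])

def dA(t):  # elementwise time derivative of A(t)
    return np.array([[-0.3 * np.sin(t), 0.3 * np.cos(t)],
                     [-0.3 * np.cos(t), -0.3 * np.sin(t)]])

def b(t):
    return np.array([0.5 * np.cos(t), -0.5 * np.sin(t)])

def db(t):  # time derivative of b(t)
    return np.array([-0.5 * np.sin(t), -0.5 * np.cos(t)])

def x_star(t):  # theoretical solution: A(t) @ x_star(t) == b(t)
    return np.array([-2.0 * np.cos(t) + 3.0, -2.0 * np.sin(t)])

def sbp(e, q=0.25):  # elementwise sign-bi-power activation
    return np.abs(e) ** q * np.sign(e) + np.abs(e) ** (1.0 / q) * np.sign(e)

kappa, q, dt, n = 1.0, 0.25, 1e-4, 100_000  # gain, exponent, Euler step, steps
x = np.zeros(2)                              # arbitrary initial state
for k in range(n):
    t = k * dt
    e = A(t) @ x - b(t)                      # residual error e(t)
    x += dt * np.linalg.solve(A(t), -dA(t) @ x - kappa * sbp(e, q) + db(t))

T = n * dt                                   # final time (10 s)
err = np.linalg.norm(x - x_star(T))          # distance to theoretical solution
```

After the short finite-time transient, the state tracks x*(t) up to discretization error, so `err` stays far below the oscillation amplitudes observed with a signum activation.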

Conclusions

This Letter puts forward an improved ZNN (IZNN) for time-varying linear system of equations solving. The OZNN model and NTZNN model are also presented. In addition, two propositions and one theorem are provided to show the convergence properties of the three presented models (i.e., OZNN, NTZNN and IZNN). It is shown that both the NTZNN and the IZNN achieve finite-time convergence, while the OZNN can only achieve infinite-time convergence. Moreover, the IZNN is much superior to the NTZNN for time-varying linear system of equations solving.

References (24)

  • J. Wang

Electronic realisation of recurrent neural network for solving simultaneous linear equations

    Electron. Lett.

    (1992)
  • T. Mifune et al.

    Folded preconditioner: a new class of preconditioners for Krylov subspace methods to solve redundancy-reduced linear systems of equations

    IEEE Trans. Magn.

    (2009)