Neurocomputing

Volume 116, 20 September 2013, Pages 22-29

The stability analysis for a novel feedback neural network with partial connection

https://doi.org/10.1016/j.neucom.2011.10.044

Abstract

This paper develops a new feedback neural network with partial connection, called the "Partially Connected Feedback Neural Network" (PCFNN). Because the connections between neurons are random and there can be more than one layer, the information capacity improves and partially connected systems carry more hidden information. A proof of the convergence of the PCFNN is provided. Owing to the complexity of partially connected systems, two stability theorems are proved theoretically by constructing a novel expectation of an energy function. Three examples are provided to simulate various stability conditions by constructing different activation functions and weight matrices. The simulation results show that this novel neural network is stable under different conditions. Moreover, the expressive space of the partially connected network architecture is much larger than that of the original Hopfield neural network architecture.

Introduction

In the past decade, Hopfield neural networks (HNNs) [1] have been found useful in areas such as associative memory, image processing, pattern recognition, optimization, model identification and automatic control. A recurrent (or feedback) neural network (RNN) with one or more hidden layers is used to build the model. Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs. An HNN is not a general RNN, as it is not designed to process sequences of patterns; instead, it requires stationary inputs. It is an RNN in which all connections are symmetric. Many researchers are interested in HNNs and have investigated their stability, as in [2], [3], [4], [5], [6], [7], [8], [9], [10]. The authors in [2] have studied the global exponential stability of HNNs with time delays. Some new and sufficiently simple conditions have been derived in [3] for the existence, uniqueness, stability, and instability of the equilibrium state of a continuous-time HNN. The advantages of HNNs basically include massive parallelism, convenient hardware implementation of the neural network architecture, and a general approach for solving various optimization problems [4]. Global exponential stability of discrete-time HNNs has been analyzed and a new global exponential stability result has been provided in [5]. An energy function in [6] is used to prove, under a mild hypothesis, that any trajectory converges to a fixed point for the sequential iteration, and to a cycle of length 2 or a fixed point for the parallel iteration.

It is known that time delays often arise in many fields because of the finite switching speed of amplifiers. Therefore, many researchers have studied the stability of HNNs with time delays [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23]. The authors in [11] have designed feedback control for the synchronization of two multiple-time-delayed chaotic HNNs and simulated all the conclusions. In [12], the author has studied the dynamical behavior of high-order HNNs with time delays by employing the Lyapunov method and the linear matrix inequality technique, and has obtained some simple sufficient conditions ensuring global exponential stability and the existence of a periodic solution of high-order HNNs. New exponential stability criteria are obtained in [13] for discrete-time neural networks with variable delays. The authors in [14] have presented new conditions ensuring a large upper bound on the time delay, based on a new class of Lyapunov–Krasovskii functionals combined with the descriptor model transformation. The globally asymptotic stability of neutral-type neural networks with time delays has been investigated in [15]. The author in [16] has used an H∞ approach to derive a tuning algorithm for delayed HNNs. A model of impulsive delayed HNNs is studied in [17]. Sufficient conditions have been obtained in [18] for the existence of the unique equilibrium of the system, an impulsive Hopfield-type neural network with a piecewise constant argument of generalized type. The authors in [19] have been concerned with the boundedness and convergence of solutions of a class of non-autonomous discrete-time delayed HNNs and have obtained some sufficient conditions ensuring the boundedness of solutions of discrete-time delayed HNNs. Some sufficient conditions for the existence and uniqueness of the equilibrium and for global exponential stability of delayed HNNs have been obtained in [20]. The authors in [21] have applied a new kind of Lyapunov–Krasovskii functional to derive a new criterion for the asymptotic stability of delayed HNNs. The authors in [22] presented HNN models for optimization with different dynamics, using discrete/continuous activation functions and discrete/continuous dynamics in simulation.

As a neural network can be stable or unstable under certain stochastic inputs, stochastic HNNs were also studied in [24], [25], [26]. Reference [24] investigated the exponential stability of the Euler method and the semi-implicit Euler method for stochastic HNNs with time delays and proved the stability of the Euler scheme. Reference [25] investigated the problem of stability analysis for Markovian jumping HNNs with constant and distributed delays and applied a novel Lyapunov–Krasovskii functional to present new delay-dependent stochastic stability criteria. The authors in [26] have obtained sufficient conditions to guarantee the exponential stability in the mean square of an equilibrium solution for stochastic delayed HNNs with Markovian switching. Based on these studies, various modifications of HNNs have extended the convergence of Hopfield neural networks. However, HNNs still have some basic problems, and no satisfactory refinement has been found according to [27], [28].

In HNNs, the weights of connections between neurons are denoted by wij, and these weights are subject to certain restrictions, i.e.

  • wii=0 (no unit has a connection with itself).

  • wij=wji (connections between neurons are symmetric).

The convergence of Hopfield neural networks has been proved when wij meets the above conditions for monotonic "sigmoid" neurons [29]. Hopfield nets have a scalar value associated with each state (si) of the network, referred to as the "energy" E of the network, where

E = -\frac{1}{2}\sum_{i,j} w_{ij} s_i s_j + \sum_{i} \theta_i s_i

This value is called the “energy” because the definition ensures that if units are randomly chosen to update their activation values, the network will converge to states which are local minima in the energy function.
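To make the energy descent concrete, the following is a minimal sketch, in C++ (the language the authors later use for their experiments), of the classical Hopfield energy and a random asynchronous update. It illustrates the formula above only; it is not the paper's PCFNN implementation, and all names, the example weights, and the threshold update rule si = sign(Σj wij sj − θi) are standard textbook choices rather than details taken from this paper.

```cpp
#include <cstdio>
#include <cstdlib>
#include <vector>

// Classical Hopfield network: bipolar states s_i in {-1, +1},
// symmetric weights w_ij = w_ji, and no self-connections (w_ii = 0).
// Energy: E = -1/2 * sum_{i,j} w_ij * s_i * s_j + sum_i theta_i * s_i.

double energy(const std::vector<std::vector<double> >& w,
              const std::vector<double>& theta,
              const std::vector<int>& s) {
    double e = 0.0;
    for (std::size_t i = 0; i < s.size(); ++i) {
        for (std::size_t j = 0; j < s.size(); ++j)
            e -= 0.5 * w[i][j] * s[i] * s[j];
        e += theta[i] * s[i];
    }
    return e;
}

// One asynchronous (serial) step: a randomly chosen unit adopts the sign of
// its local field minus its threshold. With symmetric weights and w_ii = 0,
// such a step can never increase E.
void updateRandomUnit(const std::vector<std::vector<double> >& w,
                      const std::vector<double>& theta,
                      std::vector<int>& s) {
    std::size_t i = std::rand() % s.size();
    double field = 0.0;
    for (std::size_t j = 0; j < s.size(); ++j)
        field += w[i][j] * s[j];
    s[i] = (field >= theta[i]) ? 1 : -1;
}

int main() {
    // Tiny 3-unit example with symmetric weights and zero thresholds.
    std::vector<std::vector<double> > w = {{0, 1, -1}, {1, 0, 1}, {-1, 1, 0}};
    std::vector<double> theta(3, 0.0);
    std::vector<int> s = {1, -1, 1};
    for (int t = 0; t < 10; ++t) updateRandomUnit(w, theta, s);
    std::printf("final energy: %f\n", energy(w, theta, s));
    return 0;
}
```

Because each asynchronous step can only keep or lower E, repeated updates drive the state toward a local minimum of the energy function, which is exactly the convergence behavior described above.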

From the above-mentioned papers, it is clear that many researchers have studied many aspects of HNNs. However, to the best of our knowledge, surprisingly few have studied a Partially Connected Feedback Neural Network so far. The purpose of this paper is to propose a novel Partially Connected Feedback Neural Network (PCFNN) architecture in which the connections between neurons are random and there can be more than one layer. Therefore, the information capacity improves and there is more hidden information. The PCFNN provides better expressive ability than HNNs. The connectionism of the PCFNN is inspired by the Partially Connected Neural Evolutionary model [30], [31], [32], [33], [34]. A partially connected Hopfield neural network model in [30] has been studied under the condition that w, the ratio of connections per site to the size of the system, remains finite as the size N tends to infinity, with the connection structure at each site being the same. The results show that the information capacity per connection improves for partially connected systems. Reference [31] has introduced the "CAM-Brain Machine", an FPGA-based piece of hardware that implements a genetic algorithm to evolve a cellular automata-based partially connected neural network circuit module (about 1000 neurons) in seconds. In addition, reference [32] has introduced a partially connected neural network comprising 75 million neurons to control a life-sized kitten robot. Reference [33] has presented the design of fully and partially connected random neural networks for pattern completion; the experimental results examine the influence of connectivity on the convergence of the learning algorithm and on the recognition rates. Reference [34] has described the "partially connected neural evolutionary" model (PARCONE) that serves as the basis for the construction of China's first artificial brain. This paper further extends the concept of the partially connected neural network, proposes a new PCFNN, and studies the stability and convergence of the PCFNN model by constructing a new energy function. On the other hand, radial basis probabilistic neural networks have high performance for pattern recognition [35], [36], [37]. However, this paper focuses on the partial connection among neurons in different layers to enhance the interpretability and stability of HNNs.

The remainder of the paper is organized as follows: Section 2 describes the architecture of the PCFNN and provides some basic theories and properties of the PCFNN model. Section 3 proves the convergence of the PCFNN model and provides sufficient conditions for parallel and serial convergence. In Section 4, three examples are provided to demonstrate the convergence of the PCFNN model. Finally, conclusions are given in Section 5.

Section snippets

Problem statement and preliminaries

This section describes the architecture of a Partially Connected Feedback Neural Network in detail. In order to prove the convergence of the PCFNN, some basic notations are defined as follows:

Stability analysis of PCFNN

This section first simplifies the PCFNN model and states the hypotheses used to prove the convergence of the model.

Illustrative examples

The testing procedure of the PCFNN is depicted in Fig. 2. W and ϕ(·) are designed to demonstrate the effectiveness of these theorems. Three examples with different combinations of WJ and ϕ(·) are provided to verify the convergence of the PCFNN model. Additionally, the system is implemented in Microsoft Visual C++ 6.0.

In computer programming, the generation of pseudo-random numbers is an important and common task. Pseudo-random number generators (PRNGs) are applied to create long runs of
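As one concrete illustration of how a PRNG can produce the random partial connectivity that characterizes a PCFNN, the sketch below builds a random, not necessarily symmetric, weight matrix in C++. The connection probability p, the weight range [-1, 1], and the function name randomPartialWeights are hypothetical choices made for illustration; they are not taken from the paper's actual generation procedure.

```cpp
#include <cstdlib>
#include <ctime>
#include <vector>

// Hypothetical sketch: build an n-by-n weight matrix in which each ordered
// pair of distinct neurons is connected with probability p. Absent
// connections keep weight 0. Unlike an HNN, the matrix need not be symmetric.
std::vector<std::vector<double> > randomPartialWeights(int n, double p) {
    std::vector<std::vector<double> > w(n, std::vector<double>(n, 0.0));
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
            if (i == j) continue;                        // no self-connection
            double u = std::rand() / (double)RAND_MAX;   // uniform in [0, 1]
            if (u < p)                                   // connect with probability p
                w[i][j] = 2.0 * std::rand() / (double)RAND_MAX - 1.0;  // weight in [-1, 1]
        }
    }
    return w;
}

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));  // seed the PRNG once
    std::vector<std::vector<double> > w = randomPartialWeights(10, 0.3);
    return 0;
}
```

Storing absent connections as zero weights keeps the update rule unchanged while letting the connection density be tuned through p.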

Conclusions

We have proposed a new connectionist network, i.e., a Partially Connected Feedback Neural Network. It enlarges the expressive space of HNNs because the connections between neurons in the PCFNN are random and there can be hidden layers in this network. The proposed architecture is based on the Partially Connected Neural Evolutionary model and Hopfield neural networks (HNNs). We have proved the stability of the simplified PCFNN model by constructing a novel expectation of an energy function. In order to

Acknowledgment

This work was supported in part by the National Natural Science Foundation of China under Grant nos. 60970060 and 61033013, and by the Natural Science Foundation of Jiangsu Province of China under Grant no. BK2011284.

Didi Wang received his Master degree from the Cognitive Science Department, Fujian Key Laboratory of the Brain-like Intelligent Systems, Xiamen University in China. He has been an exchange student in the Department of Information Management of Yuan Ze University in Taiwan. His research interests are in neural networks, soft computing applied in manufacturing problems and development of genetic algorithms.

References (37)

  • G. Joya et al.

    Hopfield neural networks for optimization: study of the different dynamics and identification applications

    Neurocomputing

    (2002)
  • Z. Wang et al.

    Robust decentralized adaptive control for stochastic delayed Hopfield neural networks

    Neurocomputing

    (2011)
  • H.Y. Liu et al.

    Stochastic stability of Markovian jumping Hopfield neural networks with constant and distributed delays

    Neurocomputing

    (2009)
  • H. de Garis et al.

The CAM-brain machine (CBM): an FPGA-based hardware tool which evolves a 1000-neuron net circuit module in seconds and updates a 75 million neuron artificial brain for real-time robot control

    Neurocomputing

    (2002)
  • J.J. Hopfield

    Neural networks and physical systems with emergent collective computational abilities

    Proc. Natl. Acad. Sci.

    (1982)
  • X.L. Liu et al.

    Global exponential stability of equilibrium point of Hopfield neural network with delay

    Adv. Neural Networks

    (2010)
  • Z.H. Guan et al.

    On equilibria, stability, and instability of Hopfield neural networks

    IEEE Trans. Neural Networks

    (2000)
  • X.C. Zeng et al.

    A new relaxation procedure in the Hopfield neural networks for solving optimization problems

Neural Process. Lett.

    (1999)

    Pei-Chann Chang received his MS and Ph.D. degrees from the Department of Industrial Engineering of Lehigh University in 1985 and 1989, respectively. He is a Chair Professor in Yuan-Ze University in Taiwan. His research interests include fuzzy neural networks, production scheduling, time series data forecasting, evolutionary computation and applications of soft computing. He has published his research works in international journals, such as IEEE Transactions on SMC part C, Neurocomputing, Decision Support Systems, Applied Soft Computing, European Journal of Operational Research, International Journal of Production Economics, Computers and Industrial Engineering and Computers and Operations Research.

Li Zhang received the B.S. degree in 1997 and the Ph.D. degree in 2002 in electronic engineering from Xidian University, Xi'an, China. From 2003 to 2005, she was a postdoctoral researcher at the Institute of Automation of Shanghai Jiao Tong University, Shanghai, China. From 2005 to 2010, she worked at Xidian University, Xi'an, China. Now she is a professor at Soochow University in Suzhou, China. Her research interests have been in the areas of machine learning, pattern recognition, neural networks and intelligent information processing.

    Jheng-Long Wu currently is a Ph.D. student in the Department of Information Management of Yuan Ze University in Taiwan. His research interests are in emotional computation, neural networks, financial time series data forecasting, soft computing applied in manufacturing problems and development of genetic algorithms.

Changle Zhou is the Dean of the School of Information and Technology at Xiamen University, chief editor of Mind and Computation, and director of the Mind-Art-Computation Laboratory. He received his Ph.D. from Peking University in 1990. His research interests lie in the areas of artificial intelligence, especially computational linguistics, brain theory and computational musicology, with an emphasis on computational models for metaphor, visual awareness and music creativity, and intelligent techniques for Traditional Chinese Medicine information processing. In addition, he is conducting pioneering research on computational Chinese Lutics, aesthetic cognition and robot dancing. His philosophical work concerns ancient Chinese thought, such as Zen, Tao, and Yi, viewed from a scientific perspective.
