Neurocomputing

Volume 73, Issues 10–12, June 2010, Pages 2196–2202

A fast multi-output RBF neural network construction method

https://doi.org/10.1016/j.neucom.2010.01.014

Abstract

This paper investigates center selection for multi-output radial basis function (RBF) networks and proposes a multi-output fast recursive algorithm (MFRA). The method not only reveals the significance of each candidate center, measured by the reduction in the trace of the error covariance matrix, but also estimates the network weights simultaneously using a back-substitution approach. The main contribution is that the center selection procedure and the weight estimation are performed within a well-defined regression context, leading to significantly reduced computational complexity. The efficiency of the algorithm is confirmed by a computational complexity analysis, and simulation results demonstrate its effectiveness.

Introduction

Radial basis function (RBF) neural networks have been widely used in nonlinear function approximation, data classification, systems modelling and control [1], [2], [3], [4], [5], [6], [7], [8], [9], [10]. An RBF network is a two-layered network with a nonlinear hidden layer and a linear output layer. Training an RBF neural network involves selecting the RBF centers in the hidden layer and estimating the linear weights connecting the hidden layer and the output layer. Each hidden node in an RBF network produces a radially symmetric response around a node parameter vector called a center. The performance of RBF networks critically relies on the choice of RBF centers. The conventional strategy is to randomly select some input data as the RBF centers, with the output weights estimated using the least-squares method [11]. This simple hybrid method, however, may produce a network with poor performance. An alternative is clustering techniques [12], [13], which determine the center locations using both input and output data. The RBF centers can also be optimized using multi-objective evolutionary algorithms [14], and the Fisher ratio class separability measure has also been used to choose the RBF centers [15].

In contrast to the above approaches, stepwise selection approaches formulate the construction of an RBF network as a linear-in-the-parameters problem, where all training samples are often used as the candidate RBF centers. These methods can be categorized into two groups: backward selection methods [16], [17] and forward selection methods. Forward selection algorithms are generally considered superior to backward methods in terms of computational efficiency, since they do not need to solve the equations explicitly for the full set of initial candidate centers. Orthogonal least squares (OLS) [18], [19], [20], [21], [22], [23], [24] is a popular approach in the literature for RBF network construction; it selects centers (regressors) based on their contributions to maximizing the model error reduction ratio for single-output RBF neural networks. This algorithm has been extended to multi-output cases [25], [26]. Furthermore, a recursive OLS algorithm has been proposed for the construction of multi-output RBF neural networks [27].

Unlike OLS, which applies a QR decomposition to the regression matrix, the recently proposed fast recursive algorithm (FRA) [28], [29], [30], [31] requires less computational effort and has been shown to be numerically more stable. However, the FRA was originally proposed for single-output cases in the system identification domain. The main objective of this paper is to extend the FRA to the construction of multi-output RBF networks. Unlike the OLS-based multi-output RBF network construction approaches [26], the new multi-output fast recursive algorithm (MFRA) relies on the regression context introduced in the FRA methods [28], [29], [30], [31]. The proposed MFRA not only selects the centers of a multi-output RBF network and estimates the network weights simultaneously, but also offers significant computational savings.

The paper is organized as follows. Section 2 formulates the problem, and Section 3 presents the MFRA method for multi-output RBF network construction. Section 4 gives a computational complexity analysis of the proposed method, and Section 5 presents three numerical examples to illustrate the effectiveness of the proposed algorithm. The paper is concluded in Section 6.


Problem formulation

Consider a multi-output nonlinear system to be modelled by a multi-output RBF network with M hidden nodes [26], [32]:
$$\hat{y}_i=\sum_{j=1}^{M}w_{j,i}\,\varphi_j\big(\lVert x-c_j\rVert;\sigma\big),\qquad 1\le i\le p,\qquad(1)$$
where $x$ is the input vector, $c_j$ and $\sigma$ are the RBF centers and width, $\varphi_j(\cdot;\sigma)$ is a nonlinear mapping from $\mathbb{R}^{+}$ to $\mathbb{R}$ with a radially symmetric shape, $\hat{y}_i$ is the neural network output, $w_{j,i}$ is a linear output weight, and $p$ is the number of outputs.

Eq. (1) has two types of adjustable parameters, including the centers $c_j$ and the width $\sigma$ of the RBF nodes.
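As a concrete illustration of Eq. (1), the following minimal sketch evaluates a multi-output RBF model for a batch of inputs, assuming a Gaussian basis $\varphi(r;\sigma)=\exp(-r^2/\sigma^2)$; the basis choice, array shapes, and function names are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of the multi-output RBF model in Eq. (1), assuming a
# Gaussian basis phi(r; sigma) = exp(-r^2 / sigma^2). Names are illustrative.
import numpy as np

def design_matrix(X, centers, sigma):
    """Build Phi with Phi[n, j] = phi(||x_n - c_j||; sigma)."""
    # Pairwise Euclidean distances between the N inputs and the M centers.
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(dists ** 2) / sigma ** 2)

def rbf_predict(X, centers, sigma, W):
    """Multi-output prediction: y_hat[n, i] = sum_j W[j, i] * phi_j(x_n)."""
    Phi = design_matrix(X, centers, sigma)   # N x M
    return Phi @ W                           # N x p

# Example shapes: N samples of dimension d, M candidate centers, p outputs.
N, d, M, p = 200, 2, 20, 3
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(N, d))
centers = X[rng.choice(N, size=M, replace=False)]   # centers drawn from the data
W = rng.standard_normal((M, p))
Y_hat = rbf_predict(X, centers, 0.5, W)             # N x p outputs
```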

Multi-output fast recursive algorithm

In the forward subset selection procedure, the model size $k$ increases by one if a new regressor term is added. Suppose $k$ RBF basis vectors $\varphi_1,\ldots,\varphi_k$ from the full regression matrix $\Phi=[\varphi_1,\ldots,\varphi_M]$ have been selected, and the remaining ones in $\Phi$ are denoted $\varphi_{k+1},\ldots,\varphi_M$. From (5), (6), for a multi-output RBF network with $k$ nodes, it follows that
$$\hat{W}_k=(\Phi_k^{T}\Phi_k)^{-1}\Phi_k^{T}Y,$$
$$J_2(\hat{W}_k)=\mathrm{tr}\big((Y-\Phi_k\hat{W}_k)^{T}(Y-\Phi_k\hat{W}_k)\big)=\mathrm{tr}\big(Y^{T}(I-\Phi_k(\Phi_k^{T}\Phi_k)^{-1}\Phi_k^{T})Y\big),$$
where $\Phi_k=[\varphi_1,\ldots,\varphi_k]$.

If a new RBF basis vector $\varphi\in\{\varphi_{k+1},\ldots,\varphi_M\}$ is now chosen, the
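As a rough illustration of the selection criterion above, the sketch below computes $\hat{W}_k$ and $J_2(\hat{W}_k)$ directly and runs a brute-force greedy forward selection that refits the least-squares weights for every candidate column. This shows what the criterion measures, but not the fast recursive updates that make the MFRA efficient; the function names and the use of numpy.linalg.lstsq are assumptions made for this example.

```python
# Naive reference implementation of the trace-of-error criterion: for each
# candidate column of Phi, refit the least-squares weights and keep the column
# that most reduces J2. This is NOT the recursive MFRA, only the criterion it optimizes.
import numpy as np

def ls_weights(Phi_k, Y):
    """W_hat_k = (Phi_k^T Phi_k)^{-1} Phi_k^T Y, via a numerically safer solver."""
    W_k, *_ = np.linalg.lstsq(Phi_k, Y, rcond=None)
    return W_k

def j2_cost(Phi_k, Y):
    """J2 = tr((Y - Phi_k W_k)^T (Y - Phi_k W_k))."""
    W_k = ls_weights(Phi_k, Y)
    E = Y - Phi_k @ W_k
    return np.trace(E.T @ E)

def greedy_forward_selection(Phi, Y, m):
    """Pick m columns of Phi (candidate centers) that greedily minimise J2."""
    selected, remaining = [], list(range(Phi.shape[1]))
    for _ in range(m):
        costs = [j2_cost(Phi[:, selected + [j]], Y) for j in remaining]
        best = remaining[int(np.argmin(costs))]
        selected.append(best)
        remaining.remove(best)
    return selected, ls_weights(Phi[:, selected], Y)
```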

Computational complexity

Since the computation time is mainly spent on multiplication/division operations, only these are counted in the following. The computation in the algorithm is dominated by the selection of the centers of the multi-output RBF network. Suppose there are initially $M$ candidate RBF centers, from which only $m$ hidden nodes are eventually chosen ($m\ll M$), and $N$ data samples are used for training. Then the total number of multiplication/division operations for the OLS is
$$C_{\mathrm{OLS}}=(2p+2)mNM-(2p+2)NM-(p+1)Nm^{2}+(2p$$

Simulation examples

Example 1

Consider the following single-input two-output nonlinear system [26], [34]:
$$y_1(k)=0.5y_1(k-1)+u(k-1)+0.4\tanh(u(k-2))+0.1\sin(\pi y_1(k-2))\,y_2(k-1)+\varepsilon_1(k),$$
$$y_2(k)=0.3y_2(k-1)+0.1y_2(k-2)\,y_1(k-1)+0.4\exp(-u^{2}(k-1))\,y_1(k-2)+\varepsilon_2(k),$$
where $[\varepsilon_1(k),\varepsilon_2(k)]^{T}$ is zero-mean Gaussian white noise with covariance $0.01I_2$, and the system input $u(k)$ is uniformly distributed within $(-0.5, 0.5)$. Initial conditions were set as $y_1(0)=y_1(-1)=y_2(0)=y_2(-1)=0$, $u(0)=u(-1)=0$, and 2000 data points were generated to train the multi-output
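For readers who want to reproduce the setup, the sketch below simulates this benchmark system to generate training data. The negative sign inside $\exp(\cdot)$ and the variable names are assumptions made for this illustration; the noise standard deviation 0.1 corresponds to the stated covariance $0.01I_2$.

```python
# Sketch: generate training data from the two-output system of Example 1.
import numpy as np

def simulate_example1(n_samples=2000, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # u(0) = u(-1) = 0, then u(k) uniform in (-0.5, 0.5).
    u = np.concatenate(([0.0, 0.0], rng.uniform(-0.5, 0.5, size=n_samples)))
    y1 = np.zeros(n_samples + 2)   # y1(0) = y1(-1) = 0
    y2 = np.zeros(n_samples + 2)   # y2(0) = y2(-1) = 0
    for k in range(2, n_samples + 2):
        e1, e2 = noise_std * rng.standard_normal(2)   # covariance 0.01 * I_2
        y1[k] = (0.5 * y1[k-1] + u[k-1] + 0.4 * np.tanh(u[k-2])
                 + 0.1 * np.sin(np.pi * y1[k-2]) * y2[k-1] + e1)
        y2[k] = (0.3 * y2[k-1] + 0.1 * y2[k-2] * y1[k-1]
                 + 0.4 * np.exp(-u[k-1] ** 2) * y1[k-2] + e2)
    # Return the input sequence and the two output sequences (N x 2).
    return u[2:], np.column_stack([y1[2:], y2[2:]])

u_train, Y_train = simulate_example1()
```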

Conclusion

A multi-output fast recursive algorithm has been proposed for the construction of multi-output RBF networks. The proposed algorithm not only reveals the significance of each candidate center based on the reduction in the trace of the error covariance matrix, but also estimates the network weights simultaneously using back substitution. It has also been shown that, with the introduction of a proper regression context, the computational complexity can be significantly reduced. Simulation

References (37)

  • K.Z. Mao et al., Neuron selection for RBF neural network classifier based on data structure preserving criterion, IEEE Trans. Neural Networks (2005)
  • R.J. Schilling et al., Approximation of nonlinear systems with radial basis function neural networks, IEEE Trans. Neural Networks (2001)
  • D.S. Broomhead et al., Multivariable functional interpolation and adaptive networks, Complex Syst. (1988)
  • W. Pedrycz, Conditional fuzzy clustering in the design of radial basis function neural networks, IEEE Trans. Neural Networks (1998)
  • J. Gonzales et al., Multiobjective evolutionary optimization of the size, shape and position parameters of radial basis function networks for function approximation, IEEE Trans. Neural Networks (2003)
  • K.Z. Mao, RBF neural network center selection based on Fisher ratio class separability measure, IEEE Trans. Neural Networks (2002)
  • X. Hong et al., Givens rotation based fast backward elimination algorithm for RBF neural network pruning, Proc. Inst. Elect. Eng. Control Theory Appl. (1997)
  • X. Hong et al., Backward elimination methods for associative memory network pruning, Int. J. Hybrid Intell. Syst. (2004)

Dajun Du received the B.Sc. and M.Sc. degrees, both from Zhengzhou University, China, in 2002 and 2005, respectively. From September 2008 to September 2009, he was a visiting PhD student with the Intelligent Systems and Control (ISAC) Research Group at Queen's University Belfast, UK. He is currently a PhD student at Shanghai University. His main research interests include neural networks, system modelling and identification, and networked control systems.

Kang Li is a Reader in Intelligent Systems and Control at Queen's University Belfast. His research interests include advanced algorithms for the training and construction of neural networks, fuzzy systems and support vector machines, as well as advanced evolutionary algorithms, with applications to nonlinear system modelling and control, microarray data analysis, systems biology, environmental modelling and monitoring, and polymer extrusion. He has produced over 150 research papers and co-edited seven conference proceedings in his field. He is a Chartered Engineer, a member of the IEEE and the InstMC, and the current Secretary of the IEEE UK and Republic of Ireland Section.

Minrui Fei received his B.S. and M.S. degrees in Industrial Automation from the Shanghai University of Technology in 1984 and 1992, respectively, and his PhD degree in Control Theory and Control Engineering from Shanghai University in 1997. Since 1998, he has been a Professor and Doctoral Supervisor at Shanghai University. His current research interests are in the areas of intelligent control, complex system modelling, networked control systems, field control systems, etc.

The work of D. Du was supported by the Shanghai University "11th Five-Year Plan" 211 Construction Project and Innovation Fund. The work of K. Li was supported by EPSRC, UK under Grants EP/G042594/1 and EP/F021070/1. The work of M. Fei was supported by the National Science Foundation of China under Grants No. 60774059 and No. 60834002, and the Excellent Discipline Leader Plan Project of Shanghai under Grant No. 08XD14018.
