
Neurocomputing

Volume 7, Issue 3, April 1995, Pages 311-317

Letter
Second derivative dependent placement of RBF centers

https://doi.org/10.1016/0925-2312(94)00082-4

Abstract

To improve the approximation capabilities of RBF networks, centers of adequate widths should be placed more densely in regions where the absolute second derivative of the function to be approximated is high than in regions where it is low. This new concept is called second derivative dependent center placement. Using a test function especially designed to study this relationship, several RBF networks are designed and compared.
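The paper's exact construction is not reproduced here, but the core idea in the abstract — center density proportional to the absolute second derivative — can be sketched with a standard density-matching heuristic: integrate |f''| into a cumulative distribution and place centers at its equally spaced quantiles. The target function `f(x) = x**4` and the interval `[0, 1]` below are illustrative choices, not the paper's specially designed test function.

```python
import numpy as np

def place_centers(f2, a, b, n_centers, n_grid=1000):
    """Place RBF centers on [a, b] with density proportional to |f''(x)|.

    f2: callable returning the second derivative of the target function.
    Centers are the equally spaced quantiles of the cumulative
    distribution of |f''| (a density-matching heuristic; the paper's
    exact scheme may differ).
    """
    x = np.linspace(a, b, n_grid)
    density = np.abs(f2(x)) + 1e-12      # floor keeps the CDF strictly increasing
    cdf = np.cumsum(density)
    cdf /= cdf[-1]
    # Invert the CDF at the midpoints of n equal probability bins.
    q = (np.arange(n_centers) + 0.5) / n_centers
    return np.interp(q, cdf, x)

# Illustration: for f(x) = x**4, f''(x) = 12*x**2 grows toward x = 1,
# so the centers crowd toward the right end of [0, 1].
f2 = lambda x: 12.0 * x**2
centers = place_centers(f2, 0.0, 1.0, 10)
```

With this choice of `f2`, the gap between the first two centers is noticeably wider than the gap between the last two, matching the abstract's prescription of denser placement where |f''| is large.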


Cited by (15)

  • An improved radial basis function neural network for object image retrieval

    2015, Neurocomputing
    Citation Excerpt :

    This makes particles find the global optimal solution. Although the setting of the basis function centers has been highly addressed by the previous works on RBFNN learning [25–27], the learning of the basis function widths has not been much studied. The existing previous works discussed the effect of widths of radial basis functions on performances of classification and function approximation [4,28,29].

  • Generalized multiscale radial basis function networks

    2007, Neural Networks
    Citation Excerpt :

    Unlike the separate learning procedure, combined learning approaches, which are often implemented by means of supervised learning algorithms, aim to simultaneously estimate all the three kinds of unknown parameters by performing appropriate nonlinear optimization techniques including gradient descent search (Karayiannis, 1999; McLoone, Brown, Irwin, & Lightbody, 1998), expectation-maximization (EM) estimation (Lazaro, Santamaria, & Pantaleon, 2003), and evolutionary algorithms (Billings & Zheng, 1995; Gonzalez et al., 2003; Whitehead & Choate, 1996). While several efficient algorithms have been introduced to determine kernel centres (Billings & Chen, 1998; Haykin, 1999; Moody & Darken, 1989; Orr, 1995; Sanchez, 1995; Schwenker et al., 2001), few algorithms are available to effectively determine the kernel widths of the basis functions in the network for the general purpose of nonlinear system identification problems. In fact, the optimization of kernels is often coupled with the estimation of other parameters in most existing learning strategies.

  • On the design of a class of neural networks

    1996, Journal of Network and Computer Applications