
Adaptive radial basis function networks with kernel shape parameters

  • Original Article
  • Published in Neural Computing and Applications

Abstract

The radial basis function network (RBFN), commonly used in classification applications, has two kernel parameters, the center and the radius, which can be determined by unsupervised or supervised learning. However, it has the disadvantage of treating all independent variables as equally weighted. In that case the contour lines of the kernel function are circular, yet in practice the independent variables influence the model so differently that oval contour lines are more reasonable. To overcome this disadvantage, this paper presents an adaptive radial basis function network (ARBFN) with kernel shape parameters and derives its learning rules from supervised learning. To verify that this architecture is superior to that of the traditional RBFN, we compare the two on three artificial and fifteen real-world examples. The results show that ARBFN is considerably more accurate than the traditional RBFN, illustrating that the shape parameters can indeed improve the accuracy of RBFN.



Abbreviations

C : Center of the kernel of a Gaussian unit

E : Error function (energy function)

f : Transfer function

f′ : Derivative of the transfer function

h : Output of a hidden unit

net : Net input of a processing unit

N : Number of processing units

Q : Reciprocal of the kernel radius of a Gaussian unit

t : Target output of an output unit

V : Weight of an input variable in a Gaussian unit

W : Connection weight from a Gaussian unit to an output unit

x : Input

y : Inference output of an output unit

δ : Gap between the target output and the inference output

Δ(•) : Modification of a parameter

η : Learning rate

i : Index of a processing unit in the input layer

j : Index of a processing unit in the output layer

k : Index of a processing unit in the hidden layer
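The core idea can be illustrated with the symbols above. The exact kernel formulation and learning rules are given in the full paper; the sketch below only shows, under a plausible reading of the notation (center C, radius reciprocal Q, per-variable shape weights V), how shape parameters turn the circular contours of a traditional Gaussian unit into oval ones:

```python
import numpy as np

# Hypothetical sketch, not the paper's exact formulation: a traditional
# Gaussian unit weights all input variables equally (circular contours),
# while a shape-weighted unit scales each variable by V before computing
# the distance to the center (oval contours).

def rbf_unit(x, C, Q):
    """Traditional Gaussian unit: h = exp(-Q^2 * ||x - C||^2)."""
    return np.exp(-(Q ** 2) * np.sum((x - C) ** 2))

def arbfn_unit(x, C, Q, V):
    """Gaussian unit with shape parameters V: h = exp(-Q^2 * sum_i (V_i (x_i - C_i))^2)."""
    return np.exp(-(Q ** 2) * np.sum((V * (x - C)) ** 2))

x = np.array([1.0, 0.5])
C = np.array([0.0, 0.0])
Q = 1.0

# With all shape weights equal to 1, the two units coincide.
assert np.isclose(arbfn_unit(x, C, Q, np.ones(2)), rbf_unit(x, C, Q))

# Down-weighting the second variable (V_2 = 0.1) stretches the kernel
# contour along that axis, so deviations in x_2 are penalized less and
# the unit's activation rises relative to the circular kernel.
assert arbfn_unit(x, C, Q, np.array([1.0, 0.1])) > rbf_unit(x, C, Q)
```

In a full network, V would be adapted by gradient descent on the error function E alongside C, Q, and W, which is what the paper's supervised learning rules provide.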


Author information

Correspondence to I-Cheng Yeh.


About this article

Cite this article

Yeh, IC., Chen, CC., Zhang, X. et al. Adaptive radial basis function networks with kernel shape parameters. Neural Comput & Applic 21, 469–480 (2012). https://doi.org/10.1007/s00521-010-0485-2
