
Plane-Gaussian artificial neural network

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Multilayer perceptrons (MLPs) and radial basis function networks (RBFNs) have received wide attention in recent years. In this paper, building on the k-plane clustering (kPC) algorithm, we propose a novel artificial neural network model, termed the Plane-Gaussian network, to enlarge the arsenal of neural networks. This network adopts a so-called Plane-Gaussian activation function (PGF) in its hidden neurons. By replacing the traditional center point of the Gaussian radial basis function (RBF) with a central hyperplane, the PGF forms a band-shaped rather than sphere-shaped receptive field, which gives it distinctive geometric characteristics: locality (in the direction normal to the hyperplane) and globality (along the hyperplane). Importantly, it is also proved that a PGF network (PGFN) with one hidden layer is a universal approximator. As such, the PGFN offers an informal way of bridging the gap between the MLP and the RBFN. Experiments compare training times and classification accuracies on artificial and UCI datasets and show that (1) the PGFN trains significantly faster than the MLP and (2) the PGFN achieves comparable or better classification performance than the MLP and RBFN, especially on subspace-distributed datasets.
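The exact functional form of the PGF is defined in the full paper; as a rough illustration of the idea summarized above, the sketch below contrasts a standard Gaussian RBF centered at a point with a hypothetical plane-centered variant in which the point-to-center distance is replaced by the point-to-hyperplane distance. The function names, the parameters w, b, sigma, and the squared-distance form are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def gaussian_rbf(x, c, sigma):
    """Standard Gaussian RBF: response decays with the distance from x to a center point c."""
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

def plane_gaussian(x, w, b, sigma):
    """Plane-Gaussian activation (sketch): response decays with the distance from x
    to the hyperplane {z : w.z + b = 0}, giving a band-shaped receptive field
    instead of a spherical one."""
    dist = (np.dot(w, x) + b) / np.linalg.norm(w)  # signed point-to-plane distance
    return np.exp(-dist ** 2 / (2.0 * sigma ** 2))

# Points lying on the hyperplane give maximal response no matter how far they are
# from the origin (globality along the plane); the response falls off quickly in
# the normal direction (locality).
w, b, sigma = np.array([1.0, -1.0]), 0.0, 0.5
print(plane_gaussian(np.array([3.0, 3.0]), w, b, sigma))  # on the plane -> 1.0
print(plane_gaussian(np.array([1.0, 0.0]), w, b, sigma))  # off the plane -> ~0.37
```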


Notes

  1. To distinguish it from the aforementioned parameter k used in kPC and k-means, we let k_1 denote the number of nearest neighbors to a given sample.


Acknowledgments

We thank the anonymous reviewers for their valuable comments and suggestions. We are grateful to the Neural Computing Research Group of Aston University for allowing us to freely use the Netlab software. This research was supported by the Natural Science Foundation of China (60773061, 60903130), the Jiangsu Science Foundation (BK2009393), and the Science Foundation of Nanjing Forestry University (163070053 and 163070657).

Author information

Corresponding author

Correspondence to Songcan Chen.


About this article

Cite this article

Yang, X., Chen, S. & Chen, B. Plane-Gaussian artificial neural network. Neural Comput & Applic 21, 305–317 (2012). https://doi.org/10.1007/s00521-011-0546-1
