
Neural network learning with generalized-mean based neuron model

Published in Soft Computing

Abstract

Advances in the biophysics of computation and in neurocomputing models have brought the importance of the dendritic structure of the neuron to the foreground. These structures are assumed to be the basic computational units of the neuron, capable of realizing various mathematical operations. Well-structured higher-order neurons have shown improved computational power and generalization ability. However, these models are difficult to train because of a combinatorial explosion of higher-order terms as the number of inputs to the neuron increases. In this paper we present a neural network based on a new neuron architecture, the generalized mean neuron (GMN) model. This neuron model uses an aggregation function based on the generalized mean of all the inputs applied to it. The resulting neuron model has the same number of parameters as the existing multilayer perceptron (MLP) model, but with improved computational power. The capability of this model has been tested on classification and time-series prediction problems.
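To make the aggregation idea concrete: a generalized mean of order p interpolates between familiar means (p = 1 gives the arithmetic mean, p = -1 the harmonic mean, p → ∞ the maximum). The abstract does not give the paper's exact formulation, so the sketch below — a weighted generalized mean followed by a sigmoid activation, and every name in it — is an assumption for illustration only, not the authors' implementation.

```python
import numpy as np

def generalized_mean_neuron(x, w, p, bias=0.0):
    """Hypothetical GMN-style aggregation (assumed form, not the paper's):
    weighted generalized mean of order p, passed through a sigmoid.
    Inputs are assumed positive and p nonzero."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    # weighted generalized mean: ((sum_i w_i * x_i^p) / (sum_i w_i))^(1/p)
    agg = (np.sum(w * x ** p) / np.sum(w)) ** (1.0 / p)
    # standard sigmoid activation on the aggregated value plus a bias
    return 1.0 / (1.0 + np.exp(-(agg + bias)))
```

With p = 1 and equal weights this reduces to an ordinary mean-then-sigmoid unit, so the parameter count matches a conventional neuron while p adds a single extra degree of nonlinearity per unit.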



Corresponding author

Correspondence to R. N. Yadav.


Cite this article

Yadav, R., Kalra, P. & John, J. Neural network learning with generalized-mean based neuron model. Soft Comput 10, 257–263 (2006). https://doi.org/10.1007/s00500-005-0479-7
