Abstract
Advances in the biophysics of computation and in neurocomputing models have brought the importance of the dendritic structure of the neuron to the foreground. These structures are regarded as the basic computational units of the neuron, capable of realizing various mathematical operations. Well-structured higher-order neurons have shown improved computational power and generalization ability. However, these models are difficult to train because of the combinatorial explosion of higher-order terms as the number of inputs to the neuron increases. In this paper we present a neural network based on a new neuron architecture, the generalized mean neuron (GMN) model. This neuron model uses an aggregation function based on the generalized mean of all the inputs applied to it. The resulting neuron model has the same number of parameters as the existing multilayer perceptron (MLP) model but improved computational power. The capability of this model has been tested on classification and time series prediction problems.
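The abstract does not reproduce the exact formulation of the GMN aggregation, so the following is a minimal sketch assuming the unit computes a weighted generalized (power) mean of its inputs followed by a standard sigmoid activation; the function names, the weight normalization, and the handling of the p → 0 limit are illustrative assumptions rather than the authors' definition.

```python
import numpy as np

def generalized_mean(x, w, p, eps=1e-12):
    """Weighted generalized (power) mean of the inputs x.

    Assumes non-negative inputs, as the power mean requires for
    non-integer p. Reduces to the weighted arithmetic mean at p = 1
    and approaches the weighted geometric mean as p -> 0.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                          # normalize weights to sum to 1
    if abs(p) < eps:                         # limiting case: geometric mean
        return float(np.exp(np.sum(w * np.log(np.maximum(x, eps)))))
    return float(np.sum(w * np.power(x, p)) ** (1.0 / p))

def gmn_neuron(x, w, p):
    """One hypothetical GMN unit: generalized-mean aggregation of the
    inputs followed by a sigmoid activation, in place of the usual
    weighted-sum aggregation of an MLP neuron."""
    net = generalized_mean(x, w, p)
    return 1.0 / (1.0 + np.exp(-net))

# Example: p = 1 recovers an ordinary weighted-average aggregation
print(gmn_neuron([0.2, 0.8, 0.5], w=[1.0, 1.0, 1.0], p=1.0))
```

Varying p sweeps the aggregation from min-like behaviour (p → −∞) through the geometric mean (p → 0) and arithmetic mean (p = 1) towards max-like behaviour (p → +∞), which is what lets such a unit realize a richer family of operations than a fixed weighted sum while keeping the same number of weights per input.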
Cite this article
Yadav, R., Kalra, P. & John, J. Neural network learning with generalized-mean based neuron model. Soft Comput 10, 257–263 (2006). https://doi.org/10.1007/s00500-005-0479-7