
On the Computational Power of Max-Min Propagation Neural Networks

Published in Neural Processing Letters.

Abstract

We investigate the computational power of max-min propagation (MMP) neural networks, composed of neurons that apply a maximum (Max) or minimum (Min) activation function to the weighted sums of their inputs. The main results are that a single-layer MMP network can exactly represent any pseudo-Boolean function F: {0,1}^n → [0,1], and that two-layer MMP neural networks are universal approximators. In addition, we show that several well-known fuzzy min-max (FMM) neural networks, such as Simpson's FMM, can be represented by MMP neural networks.
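To make the neuron model concrete, the following is a minimal, illustrative Python sketch of a two-layer MMP network. The parameterization (plain products w_j·x_j, Min units in the hidden layer feeding a single Max output unit) is an assumption for illustration; the paper's exact formulation of the weighted sums and the layer arrangement may differ.

```python
# Illustrative sketch of max-min propagation (MMP) units. Assumed form:
# each unit takes the max or min over the weighted terms w_j * x_j
# (the paper's exact parameterization may include biases).

def mmp_unit(inputs, weights, mode="max"):
    """One MMP neuron: max (or min) over the weighted input terms."""
    terms = [w * x for w, x in zip(weights, inputs)]
    return max(terms) if mode == "max" else min(terms)

def mmp_two_layer(x, hidden_weights, output_weights):
    """Two-layer MMP net: hidden Min units feeding one Max output unit."""
    hidden = [mmp_unit(x, w, mode="min") for w in hidden_weights]
    return mmp_unit(hidden, output_weights, mode="max")

# Example: a max-of-mins over two hidden Min units.
y = mmp_two_layer([0.2, 0.9], [[1.0, 1.0], [0.5, 1.0]], [1.0, 1.0])
# y = max(min(0.2, 0.9), min(0.1, 0.9)) = 0.2
```

A max-of-mins of linear terms like this is a continuous piecewise-linear function of the inputs, which is the structural property underlying universal-approximation arguments for such networks.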


References

  1. Baturone, I., Huertas, J. L., Barriga, A. and Sánchez-Solano, S.: Current-mode multiple-input max circuit, Electronics Lett. 30 (1994), 678–680.

  2. Cheney, E. W.: Introduction to Approximation Theory, Chap. 6, Sec. 4, McGraw-Hill, New York, 1966.

  3. Estévez, P. A. and Okabe, Y.: Max-min propagation nets: Learning by delta rule for the Chebyshev norm, In: Proceedings of the IEEE-INNS International Joint Conference on Neural Networks, Vol. 1, pp. 524–527, Nagoya, Japan, 1993.

  4. Estévez, P. A.: Max-Min Propagation Neural Networks: Representation Capabilities, Learning Algorithms and Evolutionary Structuring, Ph.D. Thesis, The University of Tokyo, Tokyo, Japan, 1995.

  5. Estévez, P. A. and Nakano, R.: Hierarchical mixture of experts and max-min propagation neural networks, In: Proceedings of the IEEE International Conference on Neural Networks, Vol. 1, pp. 651–656, Perth, Australia, 1995.

  6. Gabrys, B. and Bargiela, A.: General fuzzy min-max neural network for clustering and classification, IEEE Trans. Neural Networks 11 (3) (2000), 769–783.

  7. Gallant, S. I.: Neural Network Learning and Expert Systems, MIT Press, Cambridge, MA, 1993.

  8. Hajnal, A., Maass, W., Pudlák, P., Szegedy, M. and Turán, G.: Threshold circuits of bounded depth, J. Comput. and Syst. Sci. 46 (1993), 129–154.

  9. Hassoun, M. H. and Nabha, A. M.: Implementation of O(n) complexity max/min circuits for fuzzy and connectionist computing, In: Proceedings of the IEEE International Conference on Neural Networks, pp. 998–1003, March 1993.

  10. Leshno, M., Lin, V. Y., Pinkus, A. and Schocken, S.: Multilayer feedforward networks with a non-polynomial activation function can approximate any function, Neural Networks 6 (1993), 861–867.

  11. Likas, A.: Reinforcement learning using the stochastic fuzzy min-max neural network, Neural Process. Lett. 13 (3) (2001), 213–220.

  12. Lippmann, R. P.: An introduction to computing with neural nets, IEEE ASSP Magazine 4 (1987), 4–22.

  13. Maass, W.: Bounds for the computational power and learning complexity of analog neural nets, SIAM J. Comput. 26 (1997), 708–732.

  14. Maass, W.: On the computational power of winner-take-all, Neural Comput. 12 (2000), 2519–2535.

  15. Maass, W.: Neural computation with winner-take-all as the only nonlinear operation, In: S. A. Solla, T. K. Leen and K.-R. Müller (eds), Adv. Neural Inf. Process. Syst. 12, The MIT Press, Cambridge, MA, pp. 293–299, 2000.

  16. Machado, R. J., Barbosa, V. C. and Neves, P. A.: Learning in the combinatorial neural model, IEEE Trans. Neural Networks 9 (5) (1998), 831–847.

  17. Rizzi, A., Panella, M. and Mascioli, F. M. F.: Adaptive resolution min-max classifiers, IEEE Trans. Neural Networks 13 (2) (2002), 402–414.

  18. Scarselli, F. and Tsoi, A. C.: Universal approximation using feedforward neural networks: A survey of some existing methods and some new results, Neural Networks 11 (1) (1998), 15–37.

  19. Síma, J.: The Computational Theory of Neural Networks, Technical Report No. 823, Institute of Computer Science, Academy of Sciences of the Czech Republic, 2000.

  20. Simpson, P. K.: Fuzzy min-max neural networks – Part 1: Classification, IEEE Trans. Neural Networks 3 (5) (1992), 776–786.

  21. Simpson, P. K.: Fuzzy min-max neural networks – Part 2: Clustering, IEEE Trans. Fuzzy Syst. 1 (1) (1993), 32–45.

  22. Siu, K. Y., Roychowdhury, V. P. and Kailath, T.: Depth-size tradeoffs for neural computation, IEEE Trans. Computers 40 (1991), 1402–1412.

  23. Siu, K. Y., Roychowdhury, V. P. and Kailath, T.: Discrete Neural Computation: A Theoretical Foundation, Prentice Hall, Englewood Cliffs, NJ, 1995.

  24. Teow, L. N. and Loe, K. F.: Effective learning in recurrent max-min neural networks, Neural Networks 11 (3) (1998), 535–547.

  25. Urahama, K. and Nagao, T.: K-winners-take-all circuit with O(n) complexity, IEEE Trans. Neural Networks 6 (1995), 776–778.

  26. Yu, A. J., Giese, M. A. and Poggio, T. A.: Biophysiologically plausible implementations of the maximum operation, Neural Comput. 14 (2002), 2857–2881.

  27. Zhang, X. and Hang, C. C.: The min-max function differentiation and training of fuzzy neural networks, IEEE Trans. Neural Networks 7 (5) (1996), 1139–1149.


Author information

Correspondence to Pablo A. Estévez.

Cite this article

Estévez, P.A., Okabe, Y. On the Computational Power of Max-Min Propagation Neural Networks. Neural Processing Letters 19, 11–23 (2004). https://doi.org/10.1023/B:NEPL.0000016837.13436.d3
