The upper bound of the minimal number of hidden neurons for the parity problem in binary neural networks

  • Research Paper
  • Published in: Science China Information Sciences

Abstract

Binary neural networks (BNNs) are valuable in many application areas. They adopt linearly separable structures, which are simple and easy to implement in hardware. For a BNN with a single hidden layer, the problem of determining the upper bound of the number of hidden neurons has not been satisfactorily solved. This paper defines a special structure in the Boolean space called most isolated samples (MIS). We prove that if the hidden neurons of a BNN and its output neuron form an AND/OR logic structure, at least 2^(n−1) hidden neurons are needed to express the MIS logical relationship in the Boolean space. The paper then shows that the n-bit parity problem is exactly equivalent to the MIS structure. Furthermore, by introducing the new concept of a restraining neuron and using it in the hidden layer, we reduce the number of hidden neurons to n. This result explains the important role restraining neurons play in some cases. Finally, on the basis of the Hamming sphere and the SP function, both the restraining neuron and the n-bit parity problem are given a clear logical meaning and can be described by a series of logical expressions.
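The n-hidden-neuron result can be illustrated with the classical threshold-gate construction for parity (a sketch for intuition only; the paper's own restraining-neuron construction may differ in detail, and the function name below is ours): hidden unit k fires when at least k inputs are 1, and alternating ±1 output weights let the even-indexed units act as inhibitory, "restraining"-style neurons that cancel the odd-indexed ones.

```python
from itertools import product

def parity_net(x):
    """n-bit parity computed by a threshold network with n hidden neurons."""
    n = len(x)
    s = sum(x)  # every hidden unit sees all inputs with weight +1
    # Hidden unit k (k = 1..n) fires when at least k inputs are 1.
    hidden = [1 if s >= k else 0 for k in range(1, n + 1)]
    # Output neuron: alternating weights +1, -1, +1, ...; the even-indexed
    # units suppress the odd-indexed ones, so the net input is 1 iff s is odd.
    net = sum((-1) ** k * h for k, h in enumerate(hidden))
    return 1 if net >= 1 else 0  # output threshold of 1

# Exhaustive check over all 4-bit inputs:
assert all(parity_net(x) == sum(x) % 2 for x in product([0, 1], repeat=4))
```

Note the contrast with a pure AND/OR (DNF) realization, which needs one hidden AND gate per isolated positive sample, i.e. 2^(n−1) of them; allowing inhibitory output weights collapses this to n.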



Author information

Correspondence to Yang Lu.


Cite this article

Lu, Y., Yang, J., Wang, Q. et al. The upper bound of the minimal number of hidden neurons for the parity problem in binary neural networks. Sci. China Inf. Sci. 55, 1579–1587 (2012). https://doi.org/10.1007/s11432-011-4405-6
