
Solving parity-N problems with feedforward neural networks


Abstract:

Several neural network architectures for computing parity problems are described. Feedforward networks with one hidden layer require N neurons in the hidden layer. If fully connected feedforward networks are considered, the number of neurons in the hidden layer is reduced to N/2. In the case of fully connected networks with neurons connected in cascade, the minimum number of hidden neurons is between log₂(N+1)−1 and log₂(N+1). This paper also describes hybrid neuron architectures with linear and threshold-like activation functions. These hybrid architectures require the fewest weights. The described architectures are suitable for hardware implementation since the majority of weights equal +1 and weight multiplication is not required. The simplest network structures are pipeline architectures where all neurons and their weights are identical. All presented architectures and equations were verified with MATLAB code for parity-N problems as large as N=100.
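As an illustration of the first result above (N hidden threshold neurons suffice for parity-N, with weights of +1), here is a minimal sketch of the classical one-hidden-layer threshold construction. This is an assumed reconstruction for clarity, not necessarily the exact network given in the paper: hidden unit i fires when at least i inputs are on, and the output layer combines them with alternating ±1 weights.

```python
import itertools

def step(x):
    """Hard-threshold activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def parity_net(x):
    """One-hidden-layer threshold network computing parity of a 0/1 vector x.

    Uses N hidden neurons; all input-to-hidden weights are +1 and
    hidden-to-output weights alternate +1, -1, so no multiplications
    beyond sign flips are needed.
    """
    N = len(x)
    s = sum(x)  # every input-to-hidden weight is +1
    # Hidden unit i (i = 1..N) fires when at least i inputs are on (bias -i + 0.5).
    h = [step(s - i + 0.5) for i in range(1, N + 1)]
    # If k inputs are on, the first k hidden units fire; the alternating
    # +1/-1 output weights then sum to 1 when k is odd and 0 when k is even.
    return step(sum((-1) ** i * h[i] for i in range(N)) - 0.5)

# Exhaustive check for small N.
for N in range(1, 9):
    for x in itertools.product([0, 1], repeat=N):
        assert parity_net(x) == sum(x) % 2
```

The sketch also reflects the hardware-friendliness claim: every weight is ±1, so evaluation needs only additions, subtractions, and comparisons.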
Date of Conference: 20-24 July 2003
Date Added to IEEE Xplore: 26 August 2003
Print ISBN:0-7803-7898-9
Print ISSN: 1098-7576
Conference Location: Portland, OR, USA
