Abstract
This paper describes an optimal synthesis method for binary neural networks for pattern recognition. Our objective is to minimize the number of connections and the number of hidden-layer neurons by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for multilayered neural networks. The synthesis method in NETLA uses the Expanded Sum of Products (ESP) of Boolean expressions and is based on the multilayer perceptron. It can optimize a given binary neural network in the binary space without the iterative learning required by the conventional Error Back Propagation (EBP) algorithm. Furthermore, NETLA reduces the number of required hidden-layer neurons and connections, and this learning algorithm can therefore speed up training for pattern recognition problems. The superiority of NETLA over other learning algorithms is demonstrated by applying it to the problem of approximating a circular region.
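The NETLA algorithm itself is not reproduced in this preview. As a minimal sketch of the underlying idea, the following (hypothetical, not the authors' code) shows how a sum-of-products Boolean expression maps onto a two-layer network of hard-limiting binary neurons: one hidden neuron per product term and an output neuron that ORs them. Under this mapping, reducing the number of product terms in the expression directly reduces the number of hidden neurons and connections.

```python
def hard_limit(s, threshold):
    # Hard-limiting binary activation: output 1 iff the weighted sum
    # reaches the threshold. This is the unit type used in binary
    # (hard-limiter) multilayer perceptrons.
    return 1 if s >= threshold else 0

def term_neuron(term, inputs):
    # One hidden neuron realizes one product term. `term` maps an input
    # index to its literal: True for x_i, False for NOT x_i. Positive
    # literals get weight +1, negated literals -1; the threshold equals
    # the number of positive literals, so the neuron fires only when
    # every literal in the term is satisfied.
    s = sum((1 if positive else -1) * inputs[i] for i, positive in term.items())
    threshold = sum(1 for positive in term.values() if positive)
    return hard_limit(s, threshold)

def sop_network(terms, inputs):
    # Output neuron ORs the hidden layer: weight +1 per hidden unit,
    # threshold 1 (fires if any product term is satisfied).
    hidden = [term_neuron(t, inputs) for t in terms]
    return hard_limit(sum(hidden), 1)

# Example: XOR as the two-term sum of products x0*~x1 + ~x0*x1,
# realized with exactly two hidden neurons.
xor_terms = [{0: True, 1: False}, {0: False, 1: True}]
for a in (0, 1):
    for b in (0, 1):
        print((a, b), sop_network(xor_terms, (a, b)))
```

This construction only illustrates why minimizing an Expanded Sum of Products expression minimizes the hidden layer; NETLA's contribution, per the abstract, is finding such a reduced expression in binary space without iterative weight updates.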
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Sung, S.K., Jung, J.W., Lee, J.T., Choi, W.J., Ji, S.J. (2002). Optimal Synthesis Method for Binary Neural Network Using NETLA. In: Pal, N.R., Sugeno, M. (eds) Advances in Soft Computing — AFSS 2002. Lecture Notes in Computer Science, vol 2275. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45631-7_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43150-3
Online ISBN: 978-3-540-45631-5