Abstract
The recursive branching network (RBN) was proposed in [1] to solve linearly non-separable problems using output-coded perceptrons. It relies on splitting the training patterns, at random, between parallel perceptrons. However, the random splitting mechanism can trap a perceptron with conflicting patterns. Here we propose optimizing the splitting step through clustering, so that patterns are distributed between the perceptrons in a meaningful way. We introduce four splitting methods that use different similarity measures between patterns and evaluate them on five standard data sets. In general, these methods improve the performance of RBN and, in many cases, reduce the network complexity compared with random-splitting RBN.
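As a rough illustration of the clustered-splitting idea (a minimal sketch, not the paper's actual RBN procedure or any of its four proposed similarity measures), the following Python snippet assumes a k-means split of the training patterns followed by training one independent perceptron per group; all function and variable names here are hypothetical.

```python
# Sketch: cluster-then-split training of parallel perceptrons (illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Perceptron

def clustered_split_train(X, y, n_groups=2, seed=0):
    """Cluster the training patterns, then train one perceptron per cluster."""
    groups = KMeans(n_clusters=n_groups, n_init=10, random_state=seed).fit_predict(X)
    perceptrons = []
    for g in range(n_groups):
        mask = groups == g
        clf = Perceptron(random_state=seed)
        clf.fit(X[mask], y[mask])   # each perceptron sees only similar patterns
        perceptrons.append(clf)
    return perceptrons, groups

# Toy XOR-like layout: globally non-separable, but each spatial cluster is separable.
rng = np.random.default_rng(0)
centers = [(0, 0), (0, 1), (3, 0), (3, 1)]
labels = [0, 1, 1, 0]
X = np.vstack([rng.normal(c, 0.15, size=(20, 2)) for c in centers])
y = np.repeat(labels, 20)
models, groups = clustered_split_train(X, y)
for g, clf in enumerate(models):
    acc = clf.score(X[groups == g], y[groups == g])
    print(f"perceptron {g}: accuracy on its cluster = {acc:.2f}")
```

The point of the sketch is only that a similarity-based split hands each perceptron a subproblem it can actually solve, whereas a purely random split may leave a perceptron with conflicting patterns.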
References
Al-Mashouq, K.: Recursive branching net, Proceedings of the IEEE Conference on Systems, Man and Cybernetics, Orlando, Florida.
Cover, T. M.: Geometrical and statistical properties of linear threshold devices, Ph.D. thesis, Tech. Rep. 6107-1, Stanford Electron. Labs, Stanford, CA.
Murphy, P. and Aha, D.: UCI repository of machine learning databases [machine-readable data repository], Technical Report, University of California, Irvine.
Strang, G.: Linear algebra and its applications, Academic Press, Inc., 1976.
Al-Mashouq, K. and Abu-Mostafa, Yaser: VC dimension of error tolerant neural net, World Congress on Neural Networks (WCNN-93), Portland, Oregon.
Baum, E. B. and Haussler, D.: What size net gives valid generalization? Neural Computation, 1 (1989), 151-160.
Haykin, S.: Neural Networks, Prentice Hall, Inc., New Jersey, 1999.
Cite this article
Al-Mashouq, K.A. Clustered Recursive Branching Network. Neural Processing Letters 12, 59–69 (2000). https://doi.org/10.1023/A:1009665713438