Abstract
We study various ensemble methods for hybrid neural networks. The hybrid networks are composed of radial and projection units and are trained using a deterministic algorithm that completely defines the parameters of the network for a given data set. Thus, there is no random selection of the initial (and final) parameters as in other training algorithms. Network independence is achieved by using bootstrap and boosting methods as well as random input sub-space sampling. The fusion methods are evaluated on several classification benchmark data sets. A novel MDL-based fusion method appears to reduce the variance of the classification scheme and is sometimes superior in its overall performance.
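The ensemble construction the abstract describes (bootstrap resampling combined with random input sub-space sampling, followed by a weighted fusion of the members' votes) can be sketched generically. The snippet below is a minimal illustration, not the authors' method: the hybrid radial/projection network is replaced by a placeholder nearest-mean classifier, and the MDL-based fusion is approximated by a hypothetical training-error weight; all names and weighting choices are assumptions made for this sketch.

```python
# Illustrative sketch only: bootstrap + random sub-space ensemble with weighted voting.
# The paper's hybrid radial/projection network and its MDL-based fusion rule are not
# reproduced here; a nearest-mean classifier and an error-based weight stand in for them.
import numpy as np

rng = np.random.default_rng(0)


class NearestMeanClassifier:
    """Placeholder base learner standing in for the hybrid radial/projection network."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Assign each sample to the class with the nearest class mean.
        d = ((X[:, None, :] - self.means_[None, :, :]) ** 2).sum(axis=2)
        return self.classes_[d.argmin(axis=1)]


def build_ensemble(X, y, n_members=10, feature_frac=0.7):
    """Train each member on a bootstrap sample restricted to a random feature subset."""
    members = []
    n, d = X.shape
    k = max(1, int(feature_frac * d))
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)            # bootstrap resample of the data
        cols = rng.choice(d, size=k, replace=False)  # random input sub-space
        clf = NearestMeanClassifier().fit(X[rows][:, cols], y[rows])
        # Hypothetical weight: members with fewer training errors get a larger vote
        # (a crude stand-in for a description-length-based score).
        err = (clf.predict(X[rows][:, cols]) != y[rows]).mean()
        members.append((clf, cols, 1.0 - err + 1e-6))
    return members


def predict_ensemble(members, X):
    """Fuse the members by a weighted vote over their class predictions."""
    votes = [dict() for _ in range(len(X))]
    for clf, cols, w in members:
        for i, c in enumerate(clf.predict(X[:, cols])):
            votes[i][c] = votes[i].get(c, 0.0) + w
    return np.array([max(v, key=v.get) for v in votes])
```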
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Cohen, S., Intrator, N. (2003). A Study of Ensemble of Hybrid Networks with Strong Regularization. In: Windeatt, T., Roli, F. (eds) Multiple Classifier Systems. MCS 2003. Lecture Notes in Computer Science, vol 2709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44938-8_23
DOI: https://doi.org/10.1007/3-540-44938-8_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40369-2
Online ISBN: 978-3-540-44938-6
eBook Packages: Springer Book Archive