Abstract
The competitive associative net CAN2 has proven effective in many applications, such as function approximation, control, rainfall estimation, and time-series prediction, but its learning method is designed essentially to reduce the training (empirical) error. To reduce the prediction (generalization) error, this article applies an ensemble scheme to the CAN2 and presents a method for selecting an effective number of units for the ensemble. We report numerical experiments and examine the effectiveness of the present method.
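The paper's own algorithmic details are not available in this preview, but the idea stated in the abstract (bagging-style ensembling of piecewise-linear competitive nets to reduce generalization error) can be illustrated with a minimal sketch. The class `PiecewiseLinearNet` below is a hypothetical stand-in for a CAN2, not the authors' method: it crudely quantizes the input space into units and fits a local linear model per unit; the ensemble then averages bootstrap-trained nets.

```python
import numpy as np

rng = np.random.default_rng(0)

class PiecewiseLinearNet:
    """Hypothetical stand-in for a CAN2: units quantize the input
    space, and each unit fits a local linear model."""

    def __init__(self, n_units, rng):
        self.n_units = n_units
        self.rng = rng

    def fit(self, X, y):
        # Pick unit centers from the training data (crude vector quantization).
        idx = self.rng.choice(len(X), self.n_units, replace=False)
        self.centers = X[idx]
        labels = np.argmin(np.abs(X[:, None] - self.centers[None, :]), axis=1)
        self.coefs = []
        for j in range(self.n_units):
            Xj, yj = X[labels == j], y[labels == j]
            if len(Xj) < 2 or np.ptp(Xj) == 0:
                # Degenerate unit: fall back to a constant model.
                self.coefs.append((0.0, yj.mean() if len(yj) else y.mean()))
            else:
                a, b = np.polyfit(Xj, yj, 1)   # local linear fit
                self.coefs.append((a, b))
        return self

    def predict(self, X):
        labels = np.argmin(np.abs(X[:, None] - self.centers[None, :]), axis=1)
        return np.array([self.coefs[j][0] * x + self.coefs[j][1]
                         for x, j in zip(X, labels)])

def ensemble_predict(nets, X):
    # Bagging: average the predictions of all nets in the ensemble.
    return np.mean([net.predict(X) for net in nets], axis=0)

# Noisy 1-D regression problem.
X = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * X) + rng.normal(0.0, 0.1, len(X))

# Train each net on a bootstrap resample of the training data.
nets = []
for _ in range(10):
    idx = rng.integers(0, len(X), len(X))
    nets.append(PiecewiseLinearNet(8, rng).fit(X[idx], y[idx]))

y_hat = ensemble_predict(nets, X)
mse = float(np.mean((y_hat - np.sin(2 * np.pi * X)) ** 2))
```

Averaging over bootstrap-trained nets smooths out the variance each individual piecewise-linear fit contributes, which is the generalization-error reduction the ensemble scheme targets; the number of units per net (here fixed at 8) is the quantity the paper proposes to select.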
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Kurogi, S., Tanaka, S., Koyama, R., Nishida, T. (2006). Ensemble of Competitive Associative Nets and a Method to Select an Effective Number of Units. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_78
DOI: https://doi.org/10.1007/11893028_78
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46479-2
Online ISBN: 978-3-540-46480-8