Abstract
In this paper, we propose a novel distributed learning algorithm built upon the Frontier Vector Quantization based on Information Theory (FVQIT) method. FVQIT is very effective in classification problems, but its training time is poor; distributed learning is therefore an appropriate way to speed up training. One of the most promising lines of research on learning from distributed data sets is separated learning with model integration: separated learning avoids moving raw data between the distributed nodes, and in this work the local models are integrated using a genetic algorithm. Results on twelve classification data sets demonstrate the efficacy of the proposed method: on average, the distributed FVQIT trains 13.56 times faster than FVQIT and improves classification accuracy by 5.25%.
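The separated-learning pattern the abstract describes can be sketched in a few lines: each node trains a local model on its own partition (raw data is never pooled), and a genetic algorithm then searches for combination weights for the local models. The sketch below is a hypothetical illustration, not the paper's FVQIT or GA implementation; the nearest-class-mean local model, the weighted vote, and all function names are stand-in assumptions.

```python
# Hypothetical sketch of separated learning plus GA-based model integration.
# The local model here is a per-class mean classifier, standing in for an
# FVQIT local model; the GA evolves ensemble weights, not model parameters.
import random

random.seed(0)

def make_data(n):
    # 1-D two-class toy data: class 0 centred at 0.0, class 1 at 1.0.
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        data.append((y + random.gauss(0, 0.3), y))
    return data

def train_local(partition):
    # Local model: the mean of each class in this node's partition.
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in partition:
        sums[y] += x
        counts[y] += 1
    return {c: sums[c] / max(counts[c], 1) for c in (0, 1)}

def predict(models, weights, x):
    # Weighted vote of the local models (nearest class mean per model).
    score = 0.0
    for m, w in zip(models, weights):
        score += w * (1 if abs(x - m[1]) < abs(x - m[0]) else -1)
    return 1 if score >= 0 else 0

def accuracy(models, weights, data):
    return sum(predict(models, weights, x) == y for x, y in data) / len(data)

def integrate_ga(models, val, generations=30, pop_size=20):
    # Minimal GA: keep the fitter half of the population, mutate it.
    pop = [[random.random() for _ in models] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: -accuracy(models, w, val))
        survivors = pop[: pop_size // 2]
        children = [[max(0.0, g + random.gauss(0, 0.1)) for g in w]
                    for w in survivors]
        pop = survivors + children
    return max(pop, key=lambda w: accuracy(models, w, val))

data = make_data(300)
partitions = [data[i::3] for i in range(3)]   # three distributed nodes
models = [train_local(p) for p in partitions]  # raw data never leaves a node
weights = integrate_ga(models, data[:100])     # GA integrates local models
acc = accuracy(models, weights, data)
print(round(acc, 2))
```

The design point the sketch makes is the one in the abstract: only the small local models (and here, their weights) cross node boundaries, so communication cost is independent of the data set size.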
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Peteiro-Barral, D., Guijarro-Berdiñas, B. (2013). A Distributed Learning Algorithm Based on Frontier Vector Quantization and Information Theory. In: Mladenov, V., Koprinkova-Hristova, P., Palm, G., Villa, A.E.P., Appollini, B., Kasabov, N. (eds) Artificial Neural Networks and Machine Learning – ICANN 2013. ICANN 2013. Lecture Notes in Computer Science, vol 8131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40728-4_16
Print ISBN: 978-3-642-40727-7
Online ISBN: 978-3-642-40728-4