
A Distributed Learning Algorithm Based on Frontier Vector Quantization and Information Theory

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2013

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8131)


Abstract

In this paper, we propose a novel distributed learning algorithm built upon the Frontier Vector Quantization based on Information Theory (FVQIT) method. FVQIT is very effective in classification problems, but its training times are long, which makes distributed learning an appropriate way to speed up training. One of the most promising lines of research for learning from distributed data sets is separated learning followed by model integration: each node trains a model on its own data, so raw data never has to be moved between the distributed nodes, and the resulting local models are then combined, in this work by means of a genetic algorithm. The results obtained on twelve classification data sets demonstrate the efficacy of the proposed method: on average, the distributed FVQIT trains 13.56 times faster than FVQIT and improves classification accuracy by 5.25%.
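
The page carries no code, but the separated-learning-plus-integration pattern described in the abstract can be sketched in a few lines. The toy Python script below is a minimal illustration, not the authors' method: it replaces the FVQIT local models with per-class centroids, and the genetic algorithm shown (truncation selection plus Gaussian mutation over combination weights, evaluated on the training data rather than a held-out validation set) is an assumption made purely for illustration; the data set, node count, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data set, shuffled and then split across three "nodes".
def make_data(n=600):
    x0 = rng.normal(loc=-1.0, scale=1.0, size=(n // 2, 2))
    x1 = rng.normal(loc=+1.0, scale=1.0, size=(n // 2, 2))
    X = np.vstack([x0, x1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    idx = rng.permutation(n)
    return X[idx], y[idx]

X, y = make_data()
node_indices = np.array_split(np.arange(len(X)), 3)

# Separated learning: each node fits a local model on its own partition only,
# so raw data never leaves the node. Per-class centroids are a crude stand-in
# for the FVQIT local models (hypothetical simplification).
def fit_local(Xp, yp):
    return {c: Xp[yp == c].mean(axis=0) for c in np.unique(yp)}

local_models = [fit_local(X[idx], y[idx]) for idx in node_indices]

def predict(models, weights, Xq):
    # Weighted soft vote of the local models' nearest-centroid decisions.
    votes = np.zeros((len(Xq), 2))
    for model, w in zip(models, weights):
        for c, centroid in model.items():
            dist = np.linalg.norm(Xq - centroid, axis=1)
            votes[:, int(c)] += w * np.exp(-dist)
    return votes.argmax(axis=1)

# Model integration: a tiny genetic algorithm (truncation selection plus
# Gaussian mutation) searches for non-negative combination weights.
def fitness(weights, Xv, yv):
    # For simplicity the fitness is training accuracy; a validation split
    # would normally be used instead.
    return float((predict(local_models, weights, Xv) == yv).mean())

population = rng.random((20, len(local_models)))
for generation in range(30):
    scores = np.array([fitness(w, X, y) for w in population])
    parents = population[np.argsort(scores)[-10:]]        # keep the fittest half
    children = parents[rng.integers(0, 10, size=10)] \
        + rng.normal(0.0, 0.1, size=(10, len(local_models)))
    population = np.vstack([parents, np.clip(children, 0.0, None)])

best = population[np.argmax([fitness(w, X, y) for w in population])]
print("integration weights:", np.round(best, 3))
print("training accuracy of the integrated model:", fitness(best, X, y))
```

In the actual method, each node would train an FVQIT model on its own partition and only the resulting models, never the raw data, would be shipped to the integration step.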




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Peteiro-Barral, D., Guijarro-Berdiñas, B. (2013). A Distributed Learning Algorithm Based on Frontier Vector Quantization and Information Theory. In: Mladenov, V., Koprinkova-Hristova, P., Palm, G., Villa, A.E.P., Appollini, B., Kasabov, N. (eds) Artificial Neural Networks and Machine Learning – ICANN 2013. ICANN 2013. Lecture Notes in Computer Science, vol 8131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40728-4_16


  • DOI: https://doi.org/10.1007/978-3-642-40728-4_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-40727-7

  • Online ISBN: 978-3-642-40728-4

  • eBook Packages: Computer Science, Computer Science (R0)
