
Maximizing the Ratio of Information to Its Cost in Information Theoretic Competitive Learning

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3697)

Abstract

In this paper, we introduce costs into the framework of information maximization and attempt to maximize the ratio of information to its associated cost. We have previously shown that competitive learning can be realized by maximizing the mutual information between input patterns and competitive units. One shortcoming of that method is that maximizing information does not necessarily produce representations faithful to the input patterns, because information maximization focuses primarily on those parts of the input patterns that distinguish one pattern from another. We therefore introduce a cost that represents the distance between input patterns and connection weights, and maximize the ratio of information to this cost. By maximizing the ratio, the final connection weights come to reflect the input patterns well. We applied the unsupervised version of information maximization to a voting-attitude problem and the supervised version to a chemical data analysis. Experimental results confirmed that maximizing the ratio decreases the cost while improving generalization performance.
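The two quantities in the abstract can be made concrete with a minimal sketch. The functional forms below are assumptions for illustration only (the paper's exact activation and cost definitions are not given on this page): competitive-unit firing probabilities are modeled as a Gaussian softmax over input–weight distances, mutual information is estimated between patterns and units, and the cost is the expected squared distance between inputs and connection weights.

```python
import numpy as np

def info_cost_ratio(X, W, sigma=1.0):
    """Hypothetical sketch: ratio of mutual information to cost.

    X: (S, d) input patterns; W: (M, d) connection weights of the M
    competitive units. The Gaussian-softmax activation below is an
    assumed form, not necessarily the one used in the paper.
    """
    # squared distances between every pattern and every weight vector
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)   # (S, M)
    # conditional firing probabilities p(j|s); larger sigma -> softer
    logits = -d2 / (2.0 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)               # numerical stability
    p_js = np.exp(logits)
    p_js /= p_js.sum(axis=1, keepdims=True)
    p_j = p_js.mean(axis=0)                                   # marginal p(j)
    # mutual information I = (1/S) sum_s sum_j p(j|s) log(p(j|s)/p(j))
    info = (p_js * np.log((p_js + 1e-12) / (p_j + 1e-12))).mean(axis=0).sum()
    # cost: expected squared distance between inputs and weights
    cost = (p_js * d2).sum(axis=1).mean()
    return info / cost

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # toy input patterns
W = rng.normal(size=(4, 3))    # toy connection weights
r = info_cost_ratio(X, W)
```

Gradient ascent on this ratio would then favor weights that both separate patterns (high information) and stay close to the inputs (low cost), which is the trade-off the abstract describes.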

An erratum to this chapter can be found at http://dx.doi.org/10.1007/11550907_163.





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kamimura, R., Aida-Hyugaji, S. (2005). Maximizing the Ratio of Information to Its Cost in Information Theoretic Competitive Learning. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds) Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005. ICANN 2005. Lecture Notes in Computer Science, vol 3697. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550907_35


  • DOI: https://doi.org/10.1007/11550907_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28755-1

  • Online ISBN: 978-3-540-28756-8

  • eBook Packages: Computer Science (R0)
