
Feature Discovery by Enhancement and Relaxation of Competitive Units

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2008 (IDEAL 2008)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5326)

Abstract

In this paper, we introduce the new concept of enhancement and relaxation to discover features in input patterns in competitive learning. We have previously introduced mutual information to realize competitive processes. Because mutual information is an average over all input patterns and competitive units, it cannot be used to examine feature extraction in detail. To examine in more detail how a network is organized, we introduce the enhancement and relaxation of competitive units through some elements of the network. With this procedure, we can estimate in greater detail how the elements are organized. We applied the method to a simple artificial data set and the well-known Iris problem to show how well it can extract the main features in input patterns. Experimental results showed that the method could extract the main features in input patterns more explicitly than conventional SOM techniques.
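The quantities the abstract refers to can be made concrete with a small sketch. The following is a minimal, hypothetical illustration, not the paper's actual formulation: competitive unit activations p(j|s) are modeled as a softmax over negative squared distances with a width parameter sigma, and the mutual information between input patterns and competitive units is the marginal entropy of the unit firing rates minus the average conditional entropy. Shrinking sigma plays the role of enhancing competition, while growing sigma relaxes it; the Gaussian activation form and all function names are assumptions made for illustration.

```python
import numpy as np

def unit_activations(X, W, sigma):
    """Competitive unit activations p(j|s): a softmax over negative
    squared distances between each input x_s and each weight w_j.
    (A common information-theoretic competitive-learning form; the
    paper's exact formulation may differ.)"""
    # squared distances, shape (num_patterns, num_units)
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    logits = -d2 / (2.0 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def mutual_information(p_js):
    """I = H(p(j)) - (1/S) * sum_s H(p(j|s)): marginal entropy of the
    unit firing rates minus the mean conditional entropy per pattern."""
    eps = 1e-12
    p_j = p_js.mean(axis=0)  # marginal firing rate of each unit
    H_marginal = -(p_j * np.log(p_j + eps)).sum()
    H_conditional = -(p_js * np.log(p_js + eps)).sum(axis=1).mean()
    return H_marginal - H_conditional

# Toy data: two well-separated clusters, one competitive unit per cluster.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.1, (20, 2)), rng.normal(2, 0.1, (20, 2))])
W = np.array([[-2.0, -2.0], [2.0, 2.0]])

# "Enhancement" (small sigma -> sharper competition) versus
# "relaxation" (large sigma -> softer, more diffuse competition).
I_enhanced = mutual_information(unit_activations(X, W, sigma=0.5))
I_relaxed = mutual_information(unit_activations(X, W, sigma=5.0))
print(I_enhanced > I_relaxed)  # → True
```

With sharp competition each unit responds almost exclusively to one cluster, so the mutual information approaches its maximum of ln 2 for two units; relaxing the competition blurs the responses and drives the mutual information toward zero, which is the averaging effect the abstract describes.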





Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kamimura, R. (2008). Feature Discovery by Enhancement and Relaxation of Competitive Units. In: Fyfe, C., Kim, D., Lee, SY., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2008. IDEAL 2008. Lecture Notes in Computer Science, vol 5326. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88906-9_19


  • DOI: https://doi.org/10.1007/978-3-540-88906-9_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-88905-2

  • Online ISBN: 978-3-540-88906-9

  • eBook Packages: Computer Science (R0)
