
Efficient Vector Quantization Using the WTA-Rule with Activity Equalization


Abstract

We propose a new algorithm for vector quantization, Activity Equalization Vector Quantization (AEV). It is based on the winner-takes-all (WTA) rule with an additional supervision of the average node activities over a training interval and a subsequent re-positioning of those nodes with low average activities. The re-positioning aims both at an exploration of the data space and at a better approximation of already discovered data clusters through an equalization of the node activities. We introduce a learning scheme for AEV that requires only the bounding box of the data as prior knowledge. Using an example of Martinetz et al. [1], AEV is compared with the Neural Gas, Frequency Sensitive Competitive Learning (FSCL) and other standard algorithms. It turns out to converge much faster and to require less computational effort.
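
To make the idea concrete, here is a minimal Python/NumPy sketch of the kind of procedure the abstract outlines. It is an illustration based only on the abstract, not the paper's algorithm: the function name, the parameters interval, activity_threshold and lr, and the exact re-positioning rule (moving under-active nodes next to the most active node with a small random offset) are assumptions.

    # A minimal, assumption-based sketch of the procedure described in the abstract
    # (not the authors' exact algorithm): plain WTA updates, win counts monitored
    # over a fixed training interval, and re-positioning of under-active nodes.
    import numpy as np

    def aev_sketch(data, n_nodes=10, epochs=20, interval=100,
                   activity_threshold=0.5, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = data.min(axis=0), data.max(axis=0)   # bounding box: the only prior knowledge used
        nodes = rng.uniform(lo, hi, size=(n_nodes, data.shape[1]))
        wins = np.zeros(n_nodes)                      # node activities (win counts) in the current interval
        step = 0
        for _ in range(epochs):
            for x in rng.permutation(data):
                w = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))  # winner takes all
                nodes[w] += lr * (x - nodes[w])       # move only the winner towards the sample
                wins[w] += 1
                step += 1
                if step % interval == 0:              # supervise average activities over the interval
                    low = wins < activity_threshold * wins.mean()
                    if low.any() and not low.all():
                        # re-position under-active nodes near the most active one, with a
                        # small random offset: explore the space and refine found clusters
                        target = nodes[int(np.argmax(wins))]
                        offset = rng.normal(scale=0.05 * (hi - lo),
                                            size=(int(low.sum()), data.shape[1]))
                        nodes[low] = target + offset
                    wins[:] = 0
        return nodes

With activity_threshold set to 0 this reduces to plain WTA learning; the interval-wise check and the re-positioning step stand in for the activity equalization described above.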


References

  1. Martinetz, T., Berkovich, S. G. and Schulten, K.: 'Neural-gas' network for vector quantization and its application to time-series prediction, IEEE Transactions on Neural Networks 4(4) (1993), 558–569.

  2. Buhmann, J. and Kühnel, H.: Vector quantization with complexity costs, IEEE Transactions on Information Theory 39(4) (1993), 1133–1145.

  3. Kohonen, T.: Self-organized formation of topologically correct feature maps, Biol. Cybernetics 43 (1982), 59–69.

  4. Kohonen, T.: Self-Organizing Maps, Springer-Verlag, Berlin, Heidelberg, New York, 1995.

  5. Kohonen, T.: Learning vector quantization, In: M. A. Arbib (ed.), The Handbook of Brain Theory and Neural Networks, MIT Press (1995), pp. 537–540.

  6. Grossberg, S.: Competitive learning: From interactive activation to adaptive resonance, Cognitive Sci. 11 (1987), 23–63.

  7. Choy, C. S.-T. and Siu, W.-C.: Fast Sequential Implementation of 'Neural-Gas' Network for Vector Quantization, IEEE Transactions on Communications 46(3) (1998), 301–304.

  8. MacQueen, J.: Some methods for classification and analysis of multivariate observations, In: Proc. 5th Berkeley Symp. Math. Stat. Probab., Vol. 1 (1965), pp. 281–297.

  9. Rose, K., Gurewitz, E. and Fox, G.: Statistical mechanics and phase transitions in clustering, Phys. Rev. Lett. 65(8) (1990), 945–948.

  10. Heidemann, G.: Ein flexibel einsetzbares Objekterkennungssystem auf Basis neuronaler Netze, PhD thesis, Universität Bielefeld, Technische Fakultät, Infix, DISKI 190 (1998).

  11. Heidemann, G. and Ritter, H.: Combining multiple neural nets for visual feature selection and classification, In: Proceedings of ICANN 99 (1999), pp. 365–370.

  12. Ahalt, S. C., Krishnamurthy, A. K., Chen, P. and Melton, D. E.: Competitive learning algorithms for vector quantization, Neural Networks 3 (1990), 277–290.

  13. Galanopoulos, A. S. and Ahalt, S. C.: Codeword Distribution for Frequency Sensitive Competitive Learning with One-Dimensional Input Data, IEEE Trans. on Neural Networks 7(3) (1996), 752–756.

  14. Galanopoulos, A. S., Moses, R. L. and Ahalt, S. C.: Diffusion approximation of Frequency Sensitive Competitive Learning, IEEE Trans. on Neural Networks 8(5) (1997), 1026–1030.

  15. Scheunders, P. and De Backer, S.: High-dimensional clustering using frequency sensitive competitive learning, Pattern Recognition 32 (1999), 193–202.

  16. Wang, J.-H. and Hsiao, C.-P.: Representation-burden Conservation Network Applied to Learning VQ, Neural Processing Letters 5(3) (1997), 209–217.

  17. Wang, J.-H. and Sun, W.-D.: Improved Representation-burden Conservation Network for Learning Non-stationary VQ, Neural Processing Letters 8(1) (1998), 41–53.

Cite this article

Heidemann, G., Ritter, H. Efficient Vector Quantization Using the WTA-Rule with Activity Equalization. Neural Processing Letters 13, 17–30 (2001). https://doi.org/10.1023/A:1009678928250
