Abstract
We propose a new algorithm for vector quantization, Activity Equalization Vector quantization (AEV). It is based on the winner-takes-all rule, augmented by a supervision of the average node activities over a training interval and a subsequent re-positioning of nodes with low average activity. The re-positioning aims both at exploring the data space and at approximating already discovered data clusters more closely by equalizing the node activities. We introduce a learning scheme for AEV that requires only the bounding box of the data as prior knowledge. On an example of Martinetz et al. [1], AEV is compared with the Neural Gas, Frequency Sensitive Competitive Learning (FSCL) and other standard algorithms; it converges much faster and requires less computational effort.
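The mechanism described in the abstract can be illustrated with a minimal sketch: winner-takes-all updates, activity counters accumulated over a training interval, and re-positioning of under-active nodes. This is only an assumption-laden approximation for illustration; the precise AEV update and re-positioning rules are defined in the paper itself, and the function name, parameters, and re-positioning heuristic below are inventions of this sketch.

```python
import numpy as np

def aev_sketch(data, n_nodes=10, steps=5000, interval=200,
               lr=0.05, min_share=0.5, seed=0):
    """Illustrative sketch of WTA vector quantization with activity
    equalization. As in the abstract, only the bounding box of the data
    is assumed known; everything else here is a hypothetical choice."""
    rng = np.random.default_rng(seed)
    lo, hi = data.min(axis=0), data.max(axis=0)   # bounding box = only prior knowledge
    nodes = rng.uniform(lo, hi, size=(n_nodes, data.shape[1]))
    wins = np.zeros(n_nodes)                      # node activity counters
    for t in range(1, steps + 1):
        x = data[rng.integers(len(data))]
        j = int(np.argmin(((nodes - x) ** 2).sum(axis=1)))  # WTA winner
        nodes[j] += lr * (x - nodes[j])                     # winner update
        wins[j] += 1
        if t % interval == 0:
            # Nodes whose average activity over the interval falls below
            # a fraction of the equal share get re-positioned near the
            # most active node: jitter gives exploration, the placement
            # refines an already discovered cluster.
            low = wins < min_share * interval / n_nodes
            busiest = int(np.argmax(wins))
            for k in np.flatnonzero(low):
                nodes[k] = nodes[busiest] + rng.normal(0, 0.05, nodes.shape[1]) * (hi - lo)
            wins[:] = 0
    return nodes
```

The interval-wise reset of the counters is one plausible way to realize "average activities over a training interval"; a running average would serve the same purpose.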
References
Martinetz, T., Berkovich, S. G. and Schulten, K.: `Neural-gas' network for vector quantization and its application to time-series prediction, IEEE Transactions on Neural Networks 4(4) (1993), 558–569.
Buhmann, J. and Kühnel, H.: Vector quantization with complexity costs, IEEE Transactions on Information Theory 39(4) (1993), 1133–1145.
Kohonen, T.: Self-organized formation of topologically correct feature maps, Biol. Cybernetics 43 (1982), 59–69.
Kohonen, T.: Self-Organizing Maps, Springer-Verlag, Berlin, Heidelberg, New York, 1995.
Kohonen, T.: Learning vector quantization, In: M. A. Arbib (ed.), The Handbook of Brain Theory and Neural Networks, MIT Press (1995), pp. 537–540.
Grossberg, S.: Competitive learning: From interactive activation to adaptive resonance, Cognitive Sci. 11 (1987), 23–63.
Choy, C. S.-T. and Siu, W.-C.: Fast Sequential Implementation of `Neural-Gas' Network for Vector Quantization, IEEE Transactions on Communications 46(3) (1998), 301–304.
MacQueen, J.: Some methods for classification and analysis of multivariate observations, In: Proc. 5th Berkeley Symp. Math. Stat. Probab., Vol. 1 (1967), pp. 281–297.
Rose, K., Gurewitz, E. and Fox, G.: Statistical mechanics and phase transitions in clustering, Phys. Rev. Lett. 65(8) (1990), 945–948.
Heidemann, G.: Ein flexibel einsetzbares Objekterkennungssystem auf der Basis neuronaler Netze, PhD thesis, Universität Bielefeld, Technische Fakultät, Infix, DISKI 190 (1998).
Heidemann, G. and Ritter, H.: Combining multiple neural nets for visual feature selection and classification, In: Proceedings of ICANN 99 (1999), pp. 365–370.
Ahalt, S. C., Krisnamurthy, A. K., Chen, P. and Melton, D. E.: Competitive learning algorithms for vector quantization, Neural Networks 3 (1990), 277–290.
Galanopoulos, A. S. and Ahalt, S. C.: Codeword Distribution for Frequency Sensitive Competitive Learning with One-Dimensional Input Data, IEEE Trans. on Neural Networks 7(3) (1996), 752–756.
Galanopoulos, A. S., Moses, R. L. and Ahalt, S. C.: Diffusion approximation of Frequency Sensitive Competitive Learning, IEEE Trans. on Neural Networks 8(5) (1997), 1026–1030.
Scheunders, P. and De Backer, S.: High-dimensional clustering using frequency sensitive competitive learning, Pattern Recognition 32 (1999), 193–202.
Wang, J.-H. and Hsiao, C.-P.: Representation-burden Conservation Network Applied to Learning VQ, Neural Processing Letters 5(3) (1997), 209–217.
Wang, J.-H. and Sun, W.-D.: Improved Representation-burden Conservation Network for Learning Non-stationary VQ, Neural Processing Letters 8(1) (1998), 41–53.
Cite this article
Heidemann, G., Ritter, H. Efficient Vector Quantization Using the WTA-Rule with Activity Equalization. Neural Processing Letters 13, 17–30 (2001). https://doi.org/10.1023/A:1009678928250