Representation-burden Conservation Network Applied to Learning VQ (NPL270)

Abstract

A self-creating network effective in learning vector quantization, called the Representation-burden Conservation Network (RCN), is developed. Each neuron in RCN is characterized by a measure of representation-burden. Conservation is achieved by constraining the summed representation-burden of all neurons to the constant value 1 as the representation-burden values of all neurons are updated after each input presentation. We show that RCN effectively fulfills the conscience principle [1] and achieves a biologically plausible self-development capability. In addition, conservation of representation-burden facilitates systematic derivation of the learning parameters, including an adaptive learning-rate control that accelerates convergence and improves node utilization. Because its learning is smooth and incremental, RCN overcomes the stability-plasticity dilemma. Simulation results show that RCN outperforms other competitive learning networks in minimizing the quantization error.
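The abstract describes the mechanism only at a high level, and the paper's exact update equations are not reproduced here. The following is a minimal sketch, under assumptions, of a conscience-style competitive vector quantizer in which each codeword carries a "burden" value that is renormalized to keep the total at 1, biases winner selection against over-used codewords, and scales the per-codeword learning rate. The function name `rcn_like_vq`, the distance-bias rule, and the burden update are illustrative choices of ours, not the authors' RCN equations.

```python
import numpy as np

def rcn_like_vq(data, n_codewords, epochs=10, eta0=0.5, seed=0):
    """Illustrative conscience-style competitive VQ with a conserved 'burden'.

    Each codeword carries a burden value; the burdens are renormalised after
    every update so they always sum to 1 (conservation). Winner selection is
    biased against heavily-burdened codewords, and a codeword's step size
    shrinks as its burden grows, which spreads usage across all codewords.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)

    # Start the codebook from randomly chosen training vectors.
    codebook = data[rng.choice(len(data), size=n_codewords, replace=False)].copy()
    burden = np.full(n_codewords, 1.0 / n_codewords)   # sums to 1 by construction

    for epoch in range(epochs):
        eta = eta0 * (1.0 - epoch / epochs)             # globally decaying base rate
        for x in rng.permutation(data):
            # Conscience-style bias: inflate the distance of over-used codewords.
            dist = np.linalg.norm(codebook - x, axis=1)
            winner = np.argmin(dist * (1.0 + n_codewords * burden))

            # Per-codeword adaptive rate: the larger its burden, the smaller the step.
            codebook[winner] += eta * (1.0 - burden[winner]) * (x - codebook[winner])

            # Shift burden toward the winner, then renormalise to keep the sum at 1.
            burden[winner] += eta
            burden /= burden.sum()

    return codebook, burden
```

On synthetic data such as a 2-D Gaussian mixture, a call like `rcn_like_vq(X, n_codewords=16)` should return a codebook spread roughly in proportion to data density and a burden vector close to uniform, which is the node-utilization behaviour the conscience principle is meant to enforce.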


References

  1. S.C. Ahalt, A.K. Krishnamurthy, P. Chen and D.E. Melton, “Competitive learning algorithms for vector quantization,” Neural Networks, Vol. 3, pp. 277–290, 1990.

  2. Y. Matsuyama, “Harmonic competition: a self-organizing multiple criteria optimization,” IEEE Trans. Neural Networks, Vol. 7, No. 3, pp. 652–668, May 1996.

  3. Y. Linde, A. Buzo and R.M. Gray, “An algorithm for vector quantizer design,” IEEE Trans. Comm., Vol. COM-28, No. 1, pp. 84–95, 1980.

  4. T. Kohonen, Self-Organization and Associative Memory, Springer Series in Information Sciences, Vol. 8. Springer-Verlag, New York, 1984.

  5. S.A. Galanopoulos and S.C. Ahalt, “Codeword distribution for frequency sensitive competitive learning with one-dimensional input data,” IEEE Trans. Neural Networks, Vol. 7, No. 3, pp. 752–756, May 1996.

  6. S. Haykin, Neural Networks, Macmillan College, 1994.

  7. D.I. Choi and S.H. Park, “Self-creating and organizing neural networks,” IEEE Trans. Neural Networks, Vol. 5, No. 4, pp. 561–575, 1994.


Cite this article

Wang, JH., Hsiao, CP. Representation-burden Conservation Network Applied to Learning VQ (NPL270). Neural Processing Letters 5, 209–217 (1997). https://doi.org/10.1023/A:1009651012418
