Vector quantization and projection neural network

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 686)

Abstract

Classical data analysis techniques are generally linear and fail to reduce the dimension of data sets in which the dependence between observed variables is non-linear. Yet in many scientific, industrial and economic areas it would be desirable to obtain a low-dimensional parametric representation of the data set. Model fitting is one way to obtain a usable representation of an observed phenomenon, but it requires expert knowledge about the phenomenon, and hidden relations between observables may remain unrevealed. Kohonen maps have been shown to be an alternative technique, able to map even strongly non-linear data sets [1]. Unfortunately, they have an a priori fixed shape and neighbourhood structure, so their use requires some information about the shape and dimension of the underlying parameter space. We propose here a new self-organizing neural network composed of two connection layers: the first quantizes the input data set, and the second progressively constructs the projected shape and neighbourhood in an output space of any chosen dimension. We illustrate the algorithm on various applications.
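
The two-layer scheme outlined in the abstract can be mocked up in a few lines. The Python sketch below is not the authors' algorithm: the quantization layer is approximated by a plain k-means loop, and the projection layer by stochastic updates that pull output-space distances between prototypes toward their input-space distances, with a weighting that favours short (local) distances. The cost, weighting and annealing schedule are illustrative assumptions, and the function names (quantize, project) are hypothetical.

import numpy as np


def quantize(data, n_units, n_iter=50, seed=0):
    """First layer: vector quantization of the input set (k-means stand-in)."""
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), n_units, replace=False)].copy()
    for _ in range(n_iter):
        # assign every sample to its nearest prototype, then recompute prototypes
        labels = np.argmin(((data[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
        for k in range(n_units):
            members = data[labels == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
    return centroids


def project(centroids, out_dim=2, n_epochs=100, lr=0.05, seed=0):
    """Second layer: place one output unit per prototype so that output-space
    distances match input-space distances, favouring short (local) distances."""
    rng = np.random.default_rng(seed)
    n = len(centroids)
    d_in = np.linalg.norm(centroids[:, None] - centroids[None, :], axis=-1)
    y = rng.normal(size=(n, out_dim))
    # neighbourhood width shrinks over time (illustrative schedule, not from the paper)
    for sigma in np.linspace(d_in.max(), d_in.max() / 20.0, n_epochs):
        for i in rng.permutation(n):
            diff = y - y[i]
            d_out = np.linalg.norm(diff, axis=1)
            mask = np.arange(n) != i
            w = np.exp(-d_out[mask] / sigma)      # weight: local output distances dominate
            step = (d_out[mask] - d_in[i, mask]) * w / np.maximum(d_out[mask], 1e-12)
            y[mask] -= lr * step[:, None] * diff[mask]   # pull/push units around pivot i
    return y


# Usage: unfold a noisy 3-D curved sheet onto 2 output dimensions.
rng = np.random.default_rng(0)
t = rng.uniform(np.pi, 3 * np.pi, 1000)
X = np.column_stack([t * np.cos(t), rng.uniform(0, 5, 1000), t * np.sin(t)])
Y = project(quantize(X, n_units=100), out_dim=2)
print(Y.shape)   # (100, 2): one 2-D coordinate per prototype

Updating all units around one randomly chosen pivot keeps each step O(n), which is why the sketch iterates over pivots rather than over all pairs of prototypes at once.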

References

  1. Blayo F., Demartines P.: Data analysis: How to compare Kohonen neural networks to other techniques? Artificial Neural Networks, International Workshop IWANN'91. A. Prieto Ed. Lecture Notes in Computer Science, Vol. 540, pp. 469–476. Springer-Verlag, 1991.

  2. Cherkassky V., Lari-Najafi H.: Constrained Topological Mapping for Nonparametric Regression Analysis. Neural Networks, Vol. 4, pp. 27–40, 1991.

  3. Demartines P.: Organization measures and representations of the Kohonen maps. Proc. of the First IFIP Working Group-10.6 Workshop. J. Hérault Ed. Grenoble, 1992.

  4. Demartines P., Blayo F.: Kohonen Self-Organizing Maps: Is the Normalization Necessary? Complex Systems, Vol. 6, No. 2, pp. 105–123, 1992.

  5. DeSieno D.: Adding a Conscience to Competitive Learning. Proc. of the IEEE International Conference on Neural Networks, Vol. 1, pp. 117–124, San Diego, 1988.

  6. Hertz J., Krogh A., Palmer R. G.: Introduction to the Theory of Neural Computation. Santa Fe Institute Lecture Notes Volume I, Addison-Wesley Publishing Company, 1991.

  7. Kohonen T.: Self-Organization and Associative Memory (3rd ed.). Springer-Verlag, Berlin, 1989.

  8. Kohonen T.: The Self-Organizing Map. Proc. of the IEEE, Vol. 78, No. 9, pp. 1464–1480, 1990.

  9. Linde Y., Buzo A., Gray R. M.: An algorithm for vector quantizer design. IEEE Trans. Commun., Vol. COM-28, pp. 84–95, 1980.

  10. Oja E.: A Simplified Neuron Model as a Principal Component Analyzer. Journal of Mathematical Biology, Vol. 15, pp. 267–273, 1982.

  11. Samardzija N., Waterland R. L.: A neural network for computing eigenvectors and eigenvalues. Biological Cybernetics 65, pp. 211–214, 1991.

  12. Sanger T. D.: Optimal Unsupervised Learning in a Single-Layer Linear Feedforward Neural Network. Neural Networks, Vol. 2, pp. 459–473, 1989.

  13. Shepard R. N., Carroll J. D.: Parametric Representation of Nonlinear Data Structures. Proc. of an International Symposium on Multivariate Analysis, pp. 561–592. P. R. Krishnaiah Ed. Academic Press, New York and London, 1965.

  14. Zeger K., Vaisey J., Gersho A.: Globally Optimal Vector Quantizer Design by Stochastic Relaxation. IEEE Transactions on Signal Processing, Vol. 40, No. 2, pp. 310–322, 1992.

Editor information

José Mira, Joan Cabestany, Alberto Prieto

Copyright information

© 1993 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Demartines, P., Hérault, J. (1993). Vector quantization and projection neural network. In: Mira, J., Cabestany, J., Prieto, A. (eds) New Trends in Neural Computation. IWANN 1993. Lecture Notes in Computer Science, vol 686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-56798-4_168

  • DOI: https://doi.org/10.1007/3-540-56798-4_168

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-56798-1

  • Online ISBN: 978-3-540-47741-9

  • eBook Packages: Springer Book Archive
