Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers

  • Conference paper
Learning Theory (COLT 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3559)


Abstract

Earlier results show that the minimax expected (test) distortion redundancy of empirical vector quantizers with three or more levels, designed from n independent and identically distributed data points, is at least \(\Omega(1/\sqrt n)\) for the class of distributions on a bounded set. In this paper, a much simpler construction and proof for this bound are given, with much better constants. Similar bounds hold for the training distortion of the empirically optimal vector quantizer with three or more levels; these rates, however, do not hold for a one-level quantizer. Here the two-level quantizer case is clarified, showing that it already exhibits the behavior of the general case. Since the minimax bounds are proved using a construction involving discrete distributions, one might suspect that for the class of distributions with uniformly bounded continuous densities, the expected distortion redundancy decreases as \(o(1/\sqrt n)\) uniformly. It is shown that this is not so: the lower bound on the expected test distortion remains valid for these subclasses.
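To make the quantities in the abstract concrete, here is a minimal, hypothetical sketch (not the paper's construction or proof): Lloyd's algorithm designs a two-level scalar quantizer from n i.i.d. training samples, and the quantizer's mean-squared distortion on held-out test data exceeds the optimum achievable distortion; that gap is the distortion redundancy whose worst-case rate the paper bounds. The function names and the Gaussian source are illustrative choices only.

```python
import random

def lloyd_2level(data, iters=50):
    # Lloyd's algorithm for an empirically designed 2-level (k = 2)
    # scalar quantizer: alternate nearest-codepoint partitioning and
    # centroid (conditional mean) updates.
    c = sorted(random.sample(data, 2))
    for _ in range(iters):
        cells = ([], [])
        for x in data:
            cells[0 if abs(x - c[0]) <= abs(x - c[1]) else 1].append(x)
        c = [sum(cell) / len(cell) if cell else c[i]
             for i, cell in enumerate(cells)]
    return c

def distortion(data, codebook):
    # Mean squared error of quantizing `data` with the given codepoints.
    return sum(min((x - y) ** 2 for y in codebook) for x in data) / len(data)

random.seed(0)
gauss_source = lambda m: [random.gauss(0.0, 1.0) for _ in range(m)]
train, test = gauss_source(200), gauss_source(10000)
codebook = lloyd_2level(train)

# Training distortion is optimistically biased downward, while test
# distortion exceeds the optimum D*; for a standard Gaussian the optimal
# 2-level quantizer has codepoints near +/-0.798 and D* = 1 - 2/pi.
print(round(distortion(train, codebook), 3), round(distortion(test, codebook), 3))
```

The test distortion lands near the Gaussian optimum \(1 - 2/\pi \approx 0.363\) plus a redundancy term that, per the paper's lower bounds, cannot shrink faster than order \(1/\sqrt n\) uniformly over the distribution class.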

This research was supported in part by the NATO Science Fellowship.




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Antos, A. (2005). Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers. In: Auer, P., Meir, R. (eds) Learning Theory. COLT 2005. Lecture Notes in Computer Science, vol 3559. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11503415_36

  • DOI: https://doi.org/10.1007/11503415_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26556-6

  • Online ISBN: 978-3-540-31892-7

  • eBook Packages: Computer Science (R0)
