Abstract
Earlier results show that the minimax expected (test) distortion redundancy of empirical vector quantizers with three or more levels, designed from n independent and identically distributed data points, is at least \(\Omega(1/\sqrt n)\) for the class of distributions on a bounded set. In this paper, a much simpler construction and proof of this bound are given, with much better constants. Similar bounds hold for the training distortion of the empirically optimal vector quantizer with three or more levels; these rates, however, fail for a one-level quantizer. Here the two-level case is clarified, showing that it already exhibits the behavior of the general case. Since the minimax bounds are proved via a construction involving discrete distributions, one might suspect that for the class of distributions with uniformly bounded continuous densities the expected distortion redundancy decreases as \(o(1/\sqrt n)\) uniformly. It is shown that this is not so: the lower bound on the expected test distortion remains valid for these subclasses.
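The setting of the abstract can be illustrated with a minimal sketch (hypothetical and illustrative, not the paper's construction): a k-level quantizer is designed from n i.i.d. training samples by Lloyd's algorithm (the "k-means" method of the references below), and its training distortion is compared with its test distortion on fresh data; the gap between test distortion and the optimum is the redundancy the paper bounds from below.

```python
import numpy as np

def lloyd(train, k, iters=50, seed=0):
    """Empirically design a k-level scalar quantizer (codebook) from training data."""
    rng = np.random.default_rng(seed)
    # initialize the codebook with k distinct training points
    codebook = train[rng.choice(len(train), size=k, replace=False)]
    for _ in range(iters):
        # nearest-neighbor partition: assign each sample to its closest code point
        d = np.abs(train[:, None] - codebook[None, :])
        assign = d.argmin(axis=1)
        # centroid step: move each code point to the mean of its cell
        for j in range(k):
            cell = train[assign == j]
            if len(cell) > 0:
                codebook[j] = cell.mean()
    return codebook

def distortion(x, codebook):
    """Mean squared error of quantizing x with the given codebook."""
    return (np.min(np.abs(x[:, None] - codebook[None, :]), axis=1) ** 2).mean()

rng = np.random.default_rng(1)
train = rng.uniform(0.0, 1.0, size=500)      # n = 500 samples from a bounded source
test = rng.uniform(0.0, 1.0, size=100_000)   # held-out data for the test distortion

cb = lloyd(train, k=3)
# The training distortion is optimistically biased; the test distortion
# exceeds the optimal 3-level distortion by the redundancy in question.
print(distortion(train, cb), distortion(test, cb))
```

All names here (`lloyd`, `distortion`, the sample sizes) are illustrative choices, and the uniform source is just one member of the bounded-support class considered in the paper.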
This research was supported in part by the NATO Science Fellowship.
References
Pollard, D.: Strong consistency of k-means clustering. Annals of Statistics 9, 135–140 (1981)
Pollard, D.: Quantization and the method of k-means. IEEE Transactions on Information Theory 28, 199–205 (1982)
Linder, T., Lugosi, G., Zeger, K.: Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding. IEEE Transactions on Information Theory 40, 1728–1740 (1994)
Linder, T.: On the training distortion of vector quantizers. IEEE Transactions on Information Theory 46, 1617–1623 (2000)
Linder, T.: Learning-theoretic methods in vector quantization. In: Györfi, L. (ed.) Principles of Nonparametric Learning. CISM Courses and Lectures, vol. 434, pp. 163–210. Springer, Wien (2002)
Linder, T., Lugosi, G., Zeger, K.: Empirical quantizer design in the presence of source noise or channel noise. IEEE Transactions on Information Theory 43, 612–623 (1997)
Merhav, N., Ziv, J.: On the amount of side information required for lossy data compression. IEEE Transactions on Information Theory 43, 1112–1121 (1997)
Zeevi, A.J.: On the performance of vector quantizers empirically designed from dependent sources. In: Storer, J., Cohn, M. (eds.) Proceedings of the Data Compression Conference, DCC 1998, pp. 73–82. IEEE Computer Society Press, Los Alamitos (1998)
Bartlett, P., Linder, T., Lugosi, G.: The minimax distortion redundancy in empirical quantizer design. IEEE Transactions on Information Theory 44, 1802–1813 (1998)
Antos, A., Györfi, L., György, A.: Improved convergence rates in empirical vector quantizer design. In: Proceedings of the 2004 IEEE International Symposium on Information Theory, Chicago, IL, June 28–July 2, p. 301. IEEE Information Theory Society, Los Alamitos (2004); full paper submitted
Chou, P.A.: The distortion of vector quantizers trained on n vectors decreases to the optimum as O_P(1/n). In: Proceedings of the 1994 IEEE International Symposium on Information Theory, Trondheim, Norway, June 27–July 1, p. 457. IEEE Information Theory Society, Los Alamitos (1994)
Pollard, D.: A central limit theorem for k-means clustering. Annals of Probability 10, 919–926 (1982)
Devroye, L., Györfi, L., Lugosi, G.: A Probabilistic Theory of Pattern Recognition. Applications of Mathematics: Stochastic Modelling and Applied Probability, vol. 31. Springer, New York (1996)
Slud, E.V.: Distribution inequalities for the binomial law. Annals of Probability 5, 404–412 (1977)
Kolmogorov, A.N., Tikhomirov, V.M.: ε-entropy and ε-capacity of sets in function spaces. Translations of the American Mathematical Society 17, 277–364 (1961)
Devroye, L., Györfi, L.: Nonparametric Density Estimation: The L1 View. John Wiley, New York (1985)
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Antos, A. (2005). Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers. In: Auer, P., Meir, R. (eds) Learning Theory. COLT 2005. Lecture Notes in Computer Science, vol. 3559. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11503415_36
DOI: https://doi.org/10.1007/11503415_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26556-6
Online ISBN: 978-3-540-31892-7