New Lower Bounds for Statistical Query Learning

  • Conference paper

Computational Learning Theory (COLT 2002)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2375)

Abstract

We prove two lower bounds for the Statistical Query (SQ) learning model. The first lower bound concerns weak learning: we prove that for a concept class of SQ-dimension d, a running time of Ω(d/log d) is needed. The SQ-dimension of a concept class is the maximum number of concepts that are “uniformly correlated”, in that every pair of them has nearly the same correlation. This lower bound matches the upper bound in [BFJ+94] up to a logarithmic factor. We prove the lower bound against an “honest SQ-oracle”, which yields a stronger result than those against the more frequently used “adversarial SQ-oracles”. The second lower bound is more general: it gives a continuous trade-off between the “advantage” of an algorithm in learning the target function and the number of queries it needs to make, where the advantage of an algorithm is the probability that it correctly predicts a label minus the probability that it does not. Both lower bounds extend and/or strengthen previous results, and solve an open problem left in [Y01].
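The “uniformly correlated” condition can be illustrated with a minimal sketch (not from this paper): parity functions, the standard example of a concept class with large SQ-dimension, since any two distinct parities have identical pairwise correlation (namely zero) under the uniform distribution.

```python
from itertools import product

def parity(subset, x):
    # Parity of the bits of x indexed by `subset`, as a +/-1 value.
    return 1 - 2 * (sum(x[i] for i in subset) % 2)

def correlation(s, t, n):
    # E[chi_s(x) * chi_t(x)] under the uniform distribution on {0,1}^n.
    points = list(product([0, 1], repeat=n))
    return sum(parity(s, x) * parity(t, x) for x in points) / len(points)

n = 4
# Distinct nonempty index sets define distinct parity concepts on {0,1}^4.
subsets = [(0,), (1,), (0, 1), (2,), (0, 2)]
corrs = [correlation(s, t, n) for s in subsets for t in subsets if s != t]

# Every pair of distinct parities is uncorrelated, so the family is
# "uniformly correlated" with common pairwise correlation 0 -- a witness
# that the SQ-dimension of parities is at least the size of the family.
assert all(abs(c) < 1e-9 for c in corrs)
```

Under the uniform distribution, chi_s * chi_t equals the parity on the symmetric difference of s and t, whose expectation is 0 whenever s ≠ t, which is what the assertion checks.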

Partially supported by the CMU SCS Alumni Fellowship and the NSF Aladdin center, Grant CCR-0122581.



References

  1. Javed A. Aslam and Scott E. Decatur, Specification and Simulation of Learning Algorithms for Efficiency and Noise-Tolerance. In COLT 1995, pages 437–446. ACM Press, July 1995.

  2. Christopher Bishop, Neural Networks for Pattern Recognition. Oxford University Press, 1995.

  3. Avrim Blum, Merrick Furst, Jeffrey Jackson, Michael Kearns, Yishay Mansour, and Steven Rudich, Weakly Learning DNF and Characterizing Statistical Query Learning Using Fourier Analysis. In STOC 1994, pages 253–262, 1994.

  4. Christopher Chatfield and Alexander Collins, Introduction to Multivariate Analysis. Chapman and Hall, 1980.

  5. Roger Horn and Charles Johnson, Matrix Analysis. Cambridge University Press, 1985.

  6. S. Kullback, Information Theory and Statistics. New York: Dover Publications, 1959.

  7. S. Kullback and R. A. Leibler, On Information and Sufficiency. Annals of Mathematical Statistics 22, pp. 79–86, 1951.

  8. Jeff Jackson, On the Efficiency of Noise-Tolerant PAC Algorithms Derived from Statistical Queries. In Proceedings of the 13th Annual Workshop on Computational Learning Theory, 2000.

  9. Michael Kearns, Efficient Noise-Tolerant Learning from Statistical Queries. In Proceedings of the 25th Annual ACM Symposium on Theory of Computing, pp. 392–401, 1993.

  10. Gilbert Strang, Linear Algebra and Its Applications, Third Edition. Harcourt Brace Jovanovich, 1988.

  11. Leslie Valiant, A Theory of the Learnable. Communications of the ACM, 27(11): 1134–1142, November 1984.

  12. Ke Yang, On Learning Correlated Boolean Concepts Using Statistical Query. In Proceedings of the 12th International Conference on Algorithmic Learning Theory (ALT’01), LNAI 2225, pp. 59–76, 2001.

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yang, K. (2002). New Lower Bounds for Statistical Query Learning. In: Kivinen, J., Sloan, R.H. (eds) Computational Learning Theory. COLT 2002. Lecture Notes in Computer Science(), vol 2375. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45435-7_16

  • DOI: https://doi.org/10.1007/3-540-45435-7_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43836-6

  • Online ISBN: 978-3-540-45435-9
