DOI: 10.1145/1374376.1374466

On agnostic boosting and parity learning

Published: 17 May 2008

Abstract

The motivating problem is agnostically learning parity functions, i.e., learning parities with arbitrary or adversarial noise. Specifically, given random labeled examples from an *arbitrary* distribution, we would like to produce a hypothesis whose accuracy nearly matches the accuracy of the best parity function. Our algorithm runs in time 2^{O(n/log n)}, which matches the best known running time for the easier problems of learning parities with random classification noise (Blum et al., 2003) and of agnostically learning parities over the uniform distribution on inputs (Feldman et al., 2006).
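
For concreteness, the guarantee can be phrased in standard agnostic-learning notation (our paraphrase; the symbols below do not appear verbatim in the paper). Writing χ_S(x) for the parity of the input bits indexed by a subset S of coordinates:

```latex
% Standard agnostic-learning notation (our paraphrase, not the paper's exact statement).
% \chi_S(x) = \bigoplus_{i \in S} x_i is the parity over the coordinates in S.
\[
  \mathrm{err}_D(h) \;=\; \Pr_{(x,y)\sim D}\bigl[h(x) \neq y\bigr],
  \qquad
  \mathrm{opt} \;=\; \min_{S \subseteq \{1,\dots,n\}} \mathrm{err}_D(\chi_S),
\]
\[
  \text{and the learner outputs } h \text{ such that }
  \mathrm{err}_D(h) \;\le\; \mathrm{opt} + \varepsilon
  \quad\text{in time } 2^{O(n/\log n)}.
\]
```
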
Our approach is as follows. We give an agnostic boosting theorem that is capable of nearly achieving optimal accuracy, improving upon earlier studies (starting with Ben-David et al., 2001). To achieve this, we circumvent previous lower bounds by altering the boosting model. We then show that the (random-noise) parity learning algorithm of Blum et al. (2000) fits our new model of an agnostic weak learner. Our agnostic boosting framework is completely general and may be applied to other agnostic learning problems. Hence, it also sheds light on the actual difficulty of agnostic learning by showing that full agnostic boosting is indeed possible.
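
To make the target concrete, the sketch below is a naive 2^n-time baseline for the same problem; it is *not* the paper's 2^{O(n/log n)} algorithm and does not use its boosting framework. It simply enumerates every parity χ_S and returns the one with the smallest empirical error on a sample; all names in the code are ours, chosen for illustration.

```python
import itertools
import random

def parity(S, x):
    """Evaluate the parity chi_S(x): XOR of the bits of x indexed by S."""
    return sum(x[i] for i in S) % 2

def empirical_error(S, samples):
    """Fraction of labeled examples (x, y) on which chi_S disagrees with y."""
    return sum(parity(S, x) != y for x, y in samples) / len(samples)

def brute_force_agnostic_parity(samples, n):
    """Naive baseline: try all 2^n subsets S and return the parity with the
    lowest empirical error (exponential time; the paper does better)."""
    best_S, best_err = None, float("inf")
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            err = empirical_error(S, samples)
            if err < best_err:
                best_S, best_err = S, err
    return best_S, best_err

# Toy usage: labels are the parity on coordinates {0, 2}, flipped with probability 0.1.
random.seed(0)
n = 6
target = (0, 2)
samples = []
for _ in range(500):
    x = [random.randint(0, 1) for _ in range(n)]
    y = parity(target, x) ^ (random.random() < 0.1)
    samples.append((x, y))
print(brute_force_agnostic_parity(samples, n))  # expect S close to (0, 2), error near 0.1
```

With enough samples, the empirically best parity is, by standard uniform-convergence arguments, nearly as accurate under the true distribution as the best parity overall; the point of the paper is to approach that guarantee without paying for the 2^n enumeration.
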

References

[1] M. Ajtai, R. Kumar, and D. Sivakumar. A sieve algorithm for the shortest lattice vector problem. In Proceedings of the 33rd Annual ACM Symposium on Theory of Computing, 2001.
[2] S. Ben-David, P. M. Long, and Y. Mansour. Agnostic boosting. In Proceedings of the 14th Annual Conference on Computational Learning Theory, pages 507--516, 2001.
[3] A. Blum, M. Furst, M. Kearns, and R. Lipton. Cryptographic primitives based on hard learning problems. In Advances in Cryptology -- CRYPTO '93, pages 278--291, 1993.
[4] A. Blum, A. Kalai, and H. Wasserman. Noise-tolerant learning, the parity problem, and the statistical query model. Journal of the ACM, 50(4):506--519, 2003.
[5] V. Feldman, P. Gopalan, S. Khot, and A. K. Ponnuswami. New results for learning noisy parities and halfspaces. In Proceedings of the 47th IEEE Symposium on Foundations of Computer Science, 2006.
[6] D. Gavinsky. Optimally-smooth adaptive boosting and application to agnostic learning. Journal of Machine Learning Research, 4:101--117, 2003.
[7] O. Goldreich, R. Rubinfeld, and M. Sudan. Learning polynomials with queries: the highly noisy case. SIAM Journal on Discrete Mathematics, 13(4):535--570, 2000.
[8] D. Haussler, M. J. Kearns, N. Littlestone, and M. K. Warmuth. Equivalence of models for polynomial learnability. Information and Computation, 95(2):129--161, 1991.
[9] N. J. Hopper and M. Blum. Secure human identification protocols. In ASIACRYPT, pages 52--66, 2001.
[10] A. Juels and S. A. Weis. Authenticating pervasive devices with human protocols. In CRYPTO, pages 293--308, 2005.
[11] A. Kalai, A. Klivans, Y. Mansour, and R. Servedio. Agnostically learning halfspaces. In Proceedings of the 46th IEEE Symposium on Foundations of Computer Science, 2005.
[12] A. Kalai and R. Servedio. Boosting in the presence of noise. To appear in Journal of Computer and System Sciences, 2005.
[13] M. Kearns. Efficient noise-tolerant learning from statistical queries. Journal of the ACM, 45(6):983--1006, 1998.
[14] M. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning algorithms. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing, pages 459--468, 1996.
[15] M. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning algorithms. Journal of Computer and System Sciences, 58(1):109--128, 1999.
[16] M. Kearns, R. Schapire, and L. Sellie. Toward efficient agnostic learning. Machine Learning, 17(2/3):115--141, 1994.
[17] V. Lyubashevsky. The parity problem in the presence of noise, decoding random linear codes, and the subset sum problem. In APPROX--RANDOM, pages 378--389, 2005.
[18] Y. Mansour and D. McAllester. Boosting using branching programs. Journal of Computer and System Sciences, 64(1):103--112, 2002.
[19] E. Mossel and S. Roch. Learning nonsingular phylogenies and hidden Markov models. In Proceedings of the 37th Annual ACM Symposium on Theory of Computing (STOC), 2005.
[20] O. Regev. On lattices, learning with errors, random linear codes, and cryptography. In Proceedings of the 37th Annual ACM Symposium on Theory of Computing (STOC), pages 84--93, 2005.
[21] R. Schapire. The strength of weak learnability. Machine Learning, 5(2):197--227, 1990.
[22] L. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134--1142, 1984.



    Published In

STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing
    May 2008
    712 pages
    ISBN:9781605580470
    DOI:10.1145/1374376
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 17 May 2008


    Author Tags

    1. agnostic boosting
    2. agnostic learning
    3. learning parity with noise
    4. sub-exponential algorithms

    Qualifiers

    • Research-article

    Conference

STOC '08: Symposium on Theory of Computing
May 17 - 20, 2008
Victoria, British Columbia, Canada

    Acceptance Rates

STOC '08 Paper Acceptance Rate: 80 of 325 submissions, 25%
Overall Acceptance Rate: 1,469 of 4,586 submissions, 32%


    Cited By

    • (2023) Omnipredictors for constrained optimization. Proceedings of the 40th International Conference on Machine Learning, pages 13497-13527. DOI: 10.5555/3618408.3618956. Online publication date: 23-Jul-2023.
    • (2022) Online agnostic multiclass boosting. Proceedings of the 36th International Conference on Neural Information Processing Systems, pages 25908-25920. DOI: 10.5555/3600270.3602149. Online publication date: 28-Nov-2022.
    • (2022) NP-Hardness of Learning Programs and Partial MCSP. 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science (FOCS), pages 968-979. DOI: 10.1109/FOCS54457.2022.00095. Online publication date: Oct-2022.
    • (2022) Physically Unclonable Functions and AI. Security and Artificial Intelligence, pages 85-106. DOI: 10.1007/978-3-030-98795-4_5. Online publication date: 8-Apr-2022.
    • (2020) Online agnostic boosting via regret minimization. Proceedings of the 34th International Conference on Neural Information Processing Systems, pages 644-654. DOI: 10.5555/3495724.3495779. Online publication date: 6-Dec-2020.
    • (2020) Algorithms and lower bounds for de Morgan formulas of low-communication leaf gates. Proceedings of the 35th Computational Complexity Conference, pages 1-41. DOI: 10.4230/LIPIcs.CCC.2020.15. Online publication date: 28-Jul-2020.
    • (2020) An Efficient, Parallelized Algorithm for Optimal Conditional Entropy-Based Feature Selection. Entropy, 22(4):492. DOI: 10.3390/e22040492. Online publication date: 24-Apr-2020.
    • (2019) Multiaccuracy. Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, pages 247-254. DOI: 10.1145/3306618.3314287. Online publication date: 27-Jan-2019.
    • (2019) Learning from Outcomes: Evidence-Based Rankings. 2019 IEEE 60th Annual Symposium on Foundations of Computer Science (FOCS), pages 106-125. DOI: 10.1109/FOCS.2019.00016. Online publication date: Nov-2019.
    • (2016) On Public Key Encryption from Noisy Codewords. Proceedings, Part II, of the 19th IACR International Conference on Public-Key Cryptography --- PKC 2016, Volume 9615, pages 417-446. DOI: 10.1007/978-3-662-49387-8_16. Online publication date: 6-Mar-2016.
