
Trial and error

A new approach to space-bounded learning


Abstract

A pac-learning algorithm is d-space bounded if it stores at most d examples from the sample at any time. We characterize the d-space learnable concept classes. For this purpose we introduce the compression parameter of a concept class B and design our Trial and Error Learning Algorithm. We show:

B is d-space learnable if and only if the compression parameter of B is at most d. Unlike previous approaches, e.g. by Floyd, who presents consistent space-bounded learning algorithms but has to restrict herself to very special concept classes, our learning algorithm does not produce a hypothesis consistent with the whole sample. On the other hand, our algorithm needs large samples; the compression parameter appears as an exponent in the sample size.
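To make the definition concrete, here is a minimal sketch of the overall shape of a d-space bounded learner: it streams through the sample, never stores more than d examples, and revises its hypothesis only when the current one errs on an incoming example (the "trial and error"). All names here (space_bounded_learn, hypothesis_from, select) are hypothetical illustrations of the setting, not the paper's actual algorithm; the instance at the bottom learns initial segments (-∞, t] on the line with d = 1.

```python
def space_bounded_learn(stream, d, hypothesis_from, select):
    """Consume labeled examples one by one, storing at most d of them.

    stream          -- iterable of (point, label) pairs
    d               -- the space bound on stored examples
    hypothesis_from -- maps the stored examples to a classifier
    select          -- given the stored examples and one counterexample,
                       returns the at most d examples to keep
    """
    stored = []                      # invariant: len(stored) <= d
    h = hypothesis_from(stored)
    for x, y in stream:
        if h(x) != y:                # hypothesis errs: revise from memory
            stored = select(stored, (x, y))
            assert len(stored) <= d  # the space bound must be respected
            h = hypothesis_from(stored)
    return h

# Instance: initial segments (-inf, t] over the reals, d = 1. It suffices
# to remember the largest positive example seen so far.
def hyp(stored):
    t = max((x for x, y in stored if y), default=float("-inf"))
    return lambda q: q <= t

def keep_largest_positive(stored, new):
    positives = [e for e in stored + [new] if e[1]]
    return [max(positives)] if positives else []

sample = [(0.3, True), (2.0, True), (5.0, False), (1.1, True)]
h = space_bounded_learn(sample, 1, hyp, keep_largest_positive)
print(h(1.5), h(3.0))   # True False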

We present several examples of polynomial-time space-bounded learnable concept classes:

  • all intersection-closed concept classes with finite VC-dimension (a toy sketch for this case follows the list).

  • convex n-gons in ℝ².

  • halfspaces in ℝⁿ.

  • unions of triangles in ℝ².
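Below is a toy illustration for the first item: axis-parallel rectangles in ℝ² form an intersection-closed class, and the smallest rectangle containing the positive examples is determined by at most four of them (those attaining a minimum or maximum in some coordinate). The space-bounded closure learner sketched here keeps only those extreme positives; it is a standard textbook-style illustration under assumed names (rect_learner is hypothetical), not the paper's construction.

```python
def rect_learner(stream):
    """Space-bounded closure learner for axis-parallel rectangles in R^2."""
    stored = []   # at most 4 positive examples: coordinate-wise extremes
    for (px, py), label in stream:
        if not label:
            continue                  # the closure hypothesis uses positives only
        stored.append((px, py))
        # Prune before the next example arrives: keep only points attaining
        # a coordinate-wise min or max; these determine the smallest rectangle.
        keep = set()
        for i in range(2):
            keep.add(min(stored, key=lambda p: p[i]))
            keep.add(max(stored, key=lambda p: p[i]))
        stored = list(keep)           # |stored| <= 4
    if not stored:
        return lambda q: False        # no positives yet: empty hypothesis
    lo = [min(p[i] for p in stored) for i in range(2)]
    hi = [max(p[i] for p in stored) for i in range(2)]
    return lambda q: all(lo[i] <= q[i] <= hi[i] for i in range(2))

h = rect_learner([((1, 1), True), ((3, 2), True), ((0, 5), False)])
print(h((2, 1.5)), h((4, 4)))   # True False
```

The same recipe extends to any intersection-closed class: maintain a small set of examples that determines the closure (the smallest consistent concept) of all positives seen so far.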

We further relate the compression parameter to the VC-dimension, and discuss variants of this parameter.


References

  1. Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth: Learnability and the Vapnik-Chervonenkis dimension. Journal of the Association for Computing Machinery 36(4): 929–965, 1989


  2. Stéphane Boucheron, Jean Sallantin: Some remarks about space-complexity of learning, and circuit complexity of recognizing. In Proceedings of the 1st Annual Workshop on Computational Learning Theory, pp. 125–138, 1988

  3. Sally Floyd: On Space-bounded Learning and the Vapnik-Chervonenkis Dimension. Technical report, ICSI Berkeley, 1989

  4. David Haussler: Space Efficient Learning Algorithms. Technical report, UCSC, 1988

  5. David Haussler: Generalizing the pac model: Sample size bounds from metric-dimension based uniform convergence results. In Proceedings of the 30th Annual Symposium on Foundations of Computer Science, pp. 40–46, 1989

  6. David Helmbold, Robert Sloan, Manfred K. Warmuth: Learning nested differences of intersection-closed concept classes. In Proceedings of the 2nd Annual Workshop on Computational Learning Theory, pp. 41–56, 1989

  7. Michael J. Kearns, Robert E. Schapire, Linda M. Sellie: Toward efficient agnostic learning. In Proceedings of the 5th Annual Workshop on Computational Learning Theory, pp. 341–353, 1992

  8. Nick Littlestone, Manfred Warmuth: Relating Data Compression and Learnability. Technical report, UCSC, 1987

  9. Balas K. Natarajan: On learning Boolean functions. In Proceedings of the 19th ACM Symposium on Theory of Computing, pp. 296–304, 1987

  10. Robert E. Schapire: The strength of weak learnability. Machine Learning 5: 197–227, 1990



Author information

Correspondence to Foued Ameur.

Additional information

Supported in part by the ESPRIT Basic Research Action No 7141 (ALCOM II) and by the DFG grant Di 412-1.

Supported by the DFG grant We 1066/6-1 and by Bundesministerium für Forschung und Technologie grant 01IN102C/2.



Cite this article

Ameur, F., Fischer, P., Höffgen, K.U. et al. Trial and error. Acta Informatica 33, 621–630 (1996). https://doi.org/10.1007/BF03036467
