Abstract
We continue the study of learning an approximation to a desired function. Rather than measuring the variance between the desired function and the approximation (the set of points on which the two disagree), we account for the difficulty of deciding membership in that set. Our results indicate that the more complex a decision procedure is allowed to be, the larger the class of functions that becomes inferrible.
A preliminary version of this work appeared at the Workshop on Computational Learning Theory, Cambridge, MA, 1988.
Supported, in part, by National Science Foundation Grant CCR 8803641.
Much of this work was done while the second author was affiliated with the University of Maryland Department of Computer Science.
Supported, in part, by National Science Foundation Grant CCR 870110. Much of this work was done while the third author was on leave at the National Science Foundation. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the National Science Foundation.
© 1989 Springer-Verlag Berlin Heidelberg
Cite this paper
Gasarch, W.I., Sitaraman, R.K., Smith, C.H., Velauthapillai, M. (1989). Learning programs with an easy to calculate set of errors. In: Jantke, K.P. (eds) Analogical and Inductive Inference. AII 1989. Lecture Notes in Computer Science, vol 397. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-51734-0_55
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-51734-4
Online ISBN: 978-3-540-46798-4