Abstract
In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces (cf. [19]). In this paper, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. In particular, we deal with the question of how much “additional information” provided by oracles is necessary to compensate for the additional power of probabilistic learning.
If the oracle machines have access to the oracle K, then they can completely compensate for the power of monotonic (conservative) probabilistic machines, provided the probability p is greater than 2/3 (1/2). Furthermore, we show that for every recursively enumerable oracle A, there exists a learning problem which is strong-monotonically learnable by an oracle machine having access to A, but not conservatively or monotonically learnable with any probability p > 0. A similar result holds for Peano-complete oracles. However, probabilistic learning under monotonicity constraints is “rich” enough to encode every recursively enumerable set in a characteristic learning problem, i.e., for every recursively enumerable set A and every p < 2/3, there exists a learning problem L_A which is monotonically learnable with probability p, and monotonically learnable with oracle B if and only if A is Turing-reducible to B. The same result holds for conservative probabilistic learning with p < 1/2, and for strong-monotonic learning with probability p = 2/3. In particular, it follows that probabilistic learning under monotonicity constraints cannot be characterized in terms of oracle identification. Moreover, we resolve an open problem posed in [19] by showing that the probabilistic hierarchies of class-preserving monotonic and conservative probabilistic learning are dense.
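Stated formally, the characterization result can be sketched as follows; the class names MONprob_p (monotonic learning with probability p) and MON[B] (monotonic learning with oracle B) are assumed shorthand introduced here for illustration, in the spirit of [19]:

    % Sketch of the characterization result (assumed notation).
    % L_A ranges over indexed families; \leq_T denotes Turing reducibility.
    \forall A \text{ r.e. } \; \forall p < \tfrac{2}{3} \;
      \exists \mathcal{L}_A : \quad
      \mathcal{L}_A \in \mathrm{MONprob}_p
      \;\wedge\;
      \bigl( \mathcal{L}_A \in \mathrm{MON}[B] \iff A \leq_T B \bigr)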
Finally, we show that these probability bounds are strict, i.e., for monotonic probabilistic learning with probability p = 2/3, conservative probabilistic learning with probability p = 1/2, and strong-monotonic probabilistic learning with probability p = 1/2, the oracle K is not sufficient to compensate for the power of probabilistic learning under monotonicity constraints.
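For reference, the monotonicity constraints underlying these results follow the standard notions of Jantke [13, 14] and Wiehagen [28], and conservativeness goes back to Angluin [1]; a minimal sketch in LaTeX notation, assuming a learner that outputs hypotheses h_1, h_2, ... on a text for a target language L:

    % L(h) denotes the language described by hypothesis h; here i <= j.
    \text{strong-monotonic:} \quad L(h_i) \subseteq L(h_j)
    \text{monotonic:}        \quad L(h_i) \cap L \subseteq L(h_j) \cap L
    \text{conservative:}     \quad h_i \neq h_{i+1} \text{ only if the data
                                   seen so far contradict } L(h_i)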
References
D. Angluin, Inductive Inference of formal languages from positive data, Information and Control 45 (1980) 117–135.
D. Angluin, C. Smith, Inductive inference: theory and methods, Computing Surveys 15(3) (1983) 237–269.
M. Blum, A machine-independent theory of the complexity of recursive functions, Journal of the ACM 14 (1967) 322–336.
J. Case, C. Smith, Comparison of identification criteria for machine inductive inference, Theoretical Computer Science 25(2) (1983) 193–220.
R. Freivalds, Finite identification of general recursive functions by probabilistic strategies, in: Proc. of the Conf. on Fundamentals of Computation Theory (Akademie-Verlag, Berlin, 1979) 138–145.
L. Fortnow, W. Gasarch, S. Jain, E.B. Kinber, M. Kummer, S. Kurtz, M. Pleszkoch, T. Slaman, R. Solovay, F. Stephan, Extremes in the degrees of inferability, Annals of Pure and Applied Logic 66 (1994) 231–276.
W. Gasarch, M. Pleszkoch, Learning via queries to an oracle, in: Proc. 2nd ACM Conf. on Comp. Learning Theory (ACM Press, Santa Cruz, July 1989) 175–188.
W. Gasarch, C. Smith, Learning via queries, Journal of the ACM 39(3) (1992) 649–675.
E.M. Gold, Language identification in the limit, Information and Control 10 (1967) 447–474.
J. Hopcroft, J. Ullman, Introduction to Automata Theory, Languages, and Computation (Addison-Wesley, 1979).
S. Jain, A. Sharma, Probability is more powerful than team for language identification, in: Proc. 6th ACM Conf. on Comp. Learning Theory (ACM Press, Santa Cruz, July 1993) 192–198.
S. Jain, A. Sharma, On monotonic strategies for learning r.e. languages, Annals of Mathematics and Artificial Intelligence (1994).
K.P. Jantke, Monotonic and non-monotonic inductive inference, New Generation Computing 8 (1991) 349–360.
K.P. Jantke, Monotonic and non-monotonic inductive inference of functions and patterns, in: J. Dix, K.P. Jantke, P.H. Schmitt, eds., Proc. 1st Int. Workshop on Nonmonotonic and Inductive Logic, Lecture Notes in AI 543 (Springer, Berlin, 1991) 161–177.
M. Kummer, F. Stephan, On the structure of degrees of inferability, Journal of Computer and System Sciences 52 (1996) 214–238.
S. Lange, T. Zeugmann, Monotonic versus non-monotonic language learning, in: G. Brewka, K.P. Jantke, P.H. Schmitt, eds., Proc. 2nd Int. Workshop on Nonmonotonic and Inductive Logic, Lecture Notes in AI 659 (Springer, Berlin, 1993) 254–269.
S. Lange, T. Zeugmann, Language learning in dependence on the space of hypotheses, in: Proc. 6th ACM Conf. on Comp. Learning Theory (ACM Press, Santa Cruz, July 1993) 127–136.
L. Meyer, Probabilistic language learning under monotonicity constraints, in: K.P. Jantke, T. Shinohara, T. Zeugmann, eds., Proc. of ALT’95, Lecture Notes in AI 997 (Springer, Berlin, 1995) 169–185.
L. Meyer, Probabilistic language learning under monotonicity constraints, Theoretical Computer Science 185 (1997) 81–128.
L. Meyer, Aspects of complexity of conservative probabilistic learning, in: Proc. COLT’98, to appear.
L. Meyer, Aspects of complexity of probabilistic language learning (Institutsbericht, IIG, Freiburg, to appear).
P. Odifreddi, Classical Recursion Theory (North Holland, 1989).
D. Osherson, M. Stob, S. Weinstein, Systems that Learn: An Introduction to Learning Theory for Cognitive and Computer Scientists (MIT Press, Cambridge MA, 1986).
L. Pitt, Probabilistic inductive inference, Journal of the ACM 36(2) (1989) 383–433.
T. Slaman, R. Solovay, When oracles do not help, in: Proc. 4th ACM Conf. on Comp. Learning Theory (ACM Press, Santa Cruz, July 1991) 379–383.
R. Soare, Recursively Enumerable Sets and Degrees (Springer, Berlin, 1987).
F. Stephan, Noisy inference and oracles, Theoretical Computer Science 185 (1997) 129–157.
R. Wiehagen, A thesis in inductive inference, in: J. Dix, K.P. Jantke, P.H. Schmitt, eds., Proc. 1st Int. Workshop on Nonmonotonic and Inductive Logic, Lecture Notes in AI 543 (Springer, Berlin, 1991) 184–207.
R. Wiehagen, R. Freivalds, E.B. Kinber, On the power of probabilistic strategies in inductive inference, Theoretical Computer Science 28 (1984) 111–133.
R. Wiehagen, R. Freivalds, E.B. Kinber, Probabilistic versus deterministic inductive inference in nonstandard numberings, Zeitschrift für mathematische Logik und Grundlagen der Mathematik 34 (1988) 531–539.
T. Zeugmann, S. Lange, A Guided Tour Across the Boundaries of Learning Recursive Languages, in: K.P. Jantke and S. Lange, eds., Algorithmic Learning for Knowledge-Based Systems, Lecture Notes in Artificial Intelligence 961 (Springer, Berlin, 1995) 193–262.
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Meyer, L. (1998). Comparing the Power of Probabilistic Learning and Oracle Identification Under Monotonicity Constraints. In: Richter, M.M., Smith, C.H., Wiehagen, R., Zeugmann, T. (eds) Algorithmic Learning Theory. ALT 1998. Lecture Notes in Computer Science, vol 1501. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49730-7_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-65013-3
Online ISBN: 978-3-540-49730-1