Abstract
Shinohara, Arimura, and Krishna Rao have shown learnability in the limit of minimal models of classes of logic programs from positive-only data. In most cases, these results involve logic programs in which the “size” of the head yields a bound on the size of the body literals. However, when local variables are present, such a bound on body literal size cannot directly be ensured. The above authors achieve such a restriction using technical notions like mode and linear inequalities. The present paper develops a conceptually clean framework in which the behavior of local variables is controlled by nonlocal ones. It is shown that for certain classes of logic programs, learnability from positive data is equivalent to limiting identification of bounds on the number of clauses and the number of local variables. This reduces the learning problem to finding two integers. This cleaner framework generalizes all the known results and establishes learnability of new classes.
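The reduction described above, learning a class of programs by limit-identifying two integer bounds, can be illustrated with a toy sketch (not taken from the paper). Assuming each positive example reveals some integer measure of the target program (say, the number of local variables it requires), a learner that conjectures the running maximum identifies the true bound in the limit, i.e., its conjecture changes only finitely often and eventually stabilizes on the correct value once the largest witness has appeared:

```python
def limit_identify_bound(stream):
    """Identification in the limit of an integer bound from positive data:
    conjecture the running maximum of the values observed so far.
    Returns the sequence of conjectures, one per example."""
    bound = 0
    conjectures = []
    for x in stream:
        bound = max(bound, x)  # mind change only when a larger witness appears
        conjectures.append(bound)
    return conjectures

# e.g., numbers of local variables witnessed by successive positive examples
print(limit_identify_bound([1, 3, 2, 3, 3]))  # [1, 3, 3, 3, 3]
```

In this sketch the learner makes at most as many mind changes as the value of the true bound, which is the sense in which finding two such integers is an easy limiting-identification problem.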
Supported by the Australian Research Council Grant A49803051.
References
Arimura, H.: Completeness of depth-bounded resolution in logic programming. In: Proceedings of the 6th Conference, Japan Soc. Software Sci. Tech. (1989) 61–64
Arimura, H.: Learning Acyclic First-Order Horn Sentences from Entailment. In: Li, M., Maruoka, A. (eds.): Algorithmic Learning Theory: Eighth International Workshop (ALT ’97). LNAI, Vol. 1316. Springer-Verlag (1997) 432–445
Arimura, H., Shinohara, T.: Inductive inference of Prolog programs with linear data dependency from positive data. In: Jaakkola, H., Kangassalo, H., Kitahashi, T., Markus, A. (eds.): Proc. Information Modelling and Knowledge Bases V. IOS Press (1994) 365–375
Cohen, W.W.: PAC-Learning non-recursive Prolog clauses. Artificial Intelligence 79 (1995) 1–38
Cohen, W.W.: PAC-Learning Recursive Logic Programs: Efficient Algorithms. Journal of Artificial Intelligence Research 2 (1995) 501–539
De Raedt, L., Dzeroski, S.: First-order jk-clausal theories are PAC-learnable. Artificial Intelligence 70 (1994) 375–392
Dzeroski, S., Muggleton, S., Russell, S.: PAC-Learnability of constrained nonrecursive logic programs. In: Proc. of the 3rd International Workshop on Computational Learning Theory and Natural Learning Systems. Wisconsin, Madison (1992)
Dzeroski, S., Muggleton, S., Russell, S.: PAC-Learnability of determinate logic programs. In: Proceedings of the Fifth Annual Workshop on Computational Learning Theory. ACM Press (1992) 128–135
Frisch, A., Page, C.D.: Learning constrained atoms. In: Proceedings of the Eighth International Workshop on Machine Learning. Morgan Kaufmann (1991)
Jain, S., Sharma, A.: Mind Change Complexity of Learning Logic Programs. In: Proceedings of the 1999 European Conference on Computational Learning Theory. Lecture Notes in Artificial Intelligence. Springer-Verlag (1999) (to appear)
Khardon, R.: Learning first-order universal Horn expressions. In: Proceedings of the Eleventh Annual Conference on Computational Learning Theory. ACM Press (1998) 154–165
Kietz, J.-U.: Some computational lower bounds for the computational complexity of inductive logic programming. In: Proceedings of the 1993 European Conference on Machine Learning. Vienna (1993)
Krishna Rao, M.: A class of Prolog programs inferable from positive data. In: Arikawa, S., Sharma, A. (eds.): Algorithmic Learning Theory: Seventh International Workshop (ALT ’96). Lecture Notes in Artificial Intelligence, Vol. 1160. Springer-Verlag (1996) 272–284
Krishna Rao, M., Sattar, A.: Learning from entailment of logic programs with local variables. In: Richter, M., Smith, C., Wiehagen, R., Zeugmann, T. (eds.): Algorithmic Learning Theory: Ninth International Workshop (ALT ’98). Lecture Notes in Artificial Intelligence. Springer-Verlag (1998) (to appear)
Maass, W., Turán, Gy.: On learnability and predicate logic. NeuroCOLT Technical Report NC-TR-96-023 (1996)
Muggleton, S., Page, C.D.: A Learnability Model for Universal Representations. Technical Report PRG-TR-3-94. Oxford University Computing Laboratory, Oxford (1994)
Shapiro, E.: Inductive Inference of Theories from Facts. Technical Report 192. Computer Science Department, Yale University (1981)
Shinohara, T.: Inductive Inference of Monotonic Formal Systems From Positive Data. New Generation Computing 8 (1991) 371–384
Yamamoto, A.: Generalized unification as background knowledge in learning logic programs. In: Jantke, K., Kobayashi, S., Tomita, E., Yokomori, T. (eds.): Algorithmic Learning Theory: Fourth International Workshop (ALT ’93). Lecture Notes in Artificial Intelligence, Vol. 744. Springer-Verlag (1993) 111–122
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Martin, E., Sharma, A. (1999). On Sufficient Conditions for Learnability of Logic Programs from Positive Data. In: Džeroski, S., Flach, P. (eds) Inductive Logic Programming. ILP 1999. Lecture Notes in Computer Science(), vol 1634. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48751-4_19
DOI: https://doi.org/10.1007/3-540-48751-4_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66109-2
Online ISBN: 978-3-540-48751-7