Abstract
We present exact learning algorithms that learn several classes of (discrete) boxes in {0, ..., ℓ−1}^n. In particular, we learn:
(1) The class of unions of O(log n) boxes, in time poly(n, log ℓ) (solving an open problem of [15, 11]).
(2) The class of unions of disjoint boxes, in time poly(n, t, log ℓ), where t is the number of boxes. (Previously this was known only when all boxes are disjoint in one of the dimensions.) In particular, our algorithm learns the class of decision trees with comparison nodes (over n variables that take values in {0, ..., ℓ−1}) in time poly(n, t, log ℓ), where t is the number of leaves (an open problem from [8], shown in [3] to be learnable in time poly(n, t, ℓ)).
(3) The class of unions of O(1)-degenerate boxes (that is, boxes that depend on only O(1) variables), in time poly(n, t, log ℓ) (generalizing the learnability of O(1)-DNF and of boxes in O(1) dimensions). The algorithm for this class uses equivalence queries only, and it can also be used to learn the class of unions of O(1) boxes from equivalence queries only.
Part of this research was done while the author was a Ph.D. student at the Technion.
This research was supported by Technion V.P.R. Fund 120-872 and by the Japan Technion Society Research Fund.
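To make the setting concrete, the following is a minimal illustrative sketch (not the paper's algorithm; all names are ours): it encodes axis-parallel boxes over the discrete domain {0, ..., ℓ−1}^n, shows how a union of boxes classifies a point, and recovers a single target box from a membership oracle and one positive example by binary-searching each of its 2n boundaries. This standard warm-up uses about 2n·log₂(ℓ) queries, exhibiting the log ℓ (rather than ℓ) dependence referred to in the abstract.

```python
# Illustrative sketch only (not the paper's algorithm for unions of boxes).
# Exact identification of a SINGLE axis-parallel box over {0, ..., ell-1}^n
# from a membership oracle and one known positive point.

from typing import Callable, List, Tuple

Point = Tuple[int, ...]
Box = List[Tuple[int, int]]  # per-coordinate closed interval [a_i, b_i]


def in_box(box: Box, x: Point) -> bool:
    """Membership of a point in one axis-parallel box."""
    return all(a <= xi <= b for (a, b), xi in zip(box, x))


def in_union(boxes: List[Box], x: Point) -> bool:
    """A union of boxes classifies x as positive iff some box contains it."""
    return any(in_box(box, x) for box in boxes)


def learn_single_box(member: Callable[[Point], bool],
                     positive: Point,
                     ell: int) -> Box:
    """Recover the unique box consistent with the membership oracle `member`,
    starting from a known positive point, using about 2*n*log2(ell) queries."""
    n = len(positive)
    box: Box = []
    for i in range(n):
        # Binary search for the smallest value a_i keeping the point inside.
        lo, hi = 0, positive[i]
        while lo < hi:
            mid = (lo + hi) // 2
            if member(positive[:i] + (mid,) + positive[i + 1:]):
                hi = mid
            else:
                lo = mid + 1
        a_i = lo
        # Binary search for the largest value b_i keeping the point inside.
        lo, hi = positive[i], ell - 1
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if member(positive[:i] + (mid,) + positive[i + 1:]):
                lo = mid
            else:
                hi = mid - 1
        box.append((a_i, lo))
    return box


if __name__ == "__main__":
    ell = 1000
    target: Box = [(3, 700), (0, 999), (10, 10), (500, 600), (42, 987)]
    oracle = lambda x: in_box(target, x)
    learned = learn_single_box(oracle, (5, 0, 10, 550, 100), ell)
    assert learned == target
```

Learning a union of boxes is substantially harder than this warm-up, since each counterexample must be attributed to an unknown box; that is the difficulty the algorithms in the paper address.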
References
1. D. Angluin. Queries and concept learning. Machine Learning, 2(4):319–342, 1988.
2. P. Auer. On-line learning of rectangles in noisy environments. In Proc. of 6th Annu. ACM Workshop on Comput. Learning Theory, pages 253–261, 1993.
3. A. Beimel, F. Bergadano, N. H. Bshouty, E. Kushilevitz, and S. Varricchio. On the applications of multiplicity automata in learning. In Proc. of 37th Annu. IEEE Symp. on Foundations of Computer Science, pages 349–358, 1996.
4. S. Ben-David, N. H. Bshouty, and E. Kushilevitz. A composition theorem for learning algorithms with applications to geometric concept classes. Manuscript, 1996.
5. F. Bergadano, D. Catalano, and S. Varricchio. Learning sat-k-DNF formulas from membership queries. In Proc. of 28th Annu. ACM Symp. on the Theory of Computing, pages 126–130, 1996.
6. A. Blum and S. Rudich. Fast learning of k-term DNF formulas with queries. In Proc. of 24th Annu. ACM Symp. on Theory of Computing, pages 382–389, 1992.
7. A. Blumer, A. Ehrenfeucht, D. Haussler, and M. K. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 36:929–965, 1989.
8. N. H. Bshouty. Exact learning via the monotone theory. In Proc. of 34th Annu. IEEE Symp. on Foundations of Computer Science, pages 302–311, 1993. Journal version: Information and Computation, 123(1):146–153, 1995.
9. N. H. Bshouty. Simple learning algorithms using divide and conquer. In Proc. of 8th Annu. ACM Workshop on Comput. Learning Theory, pages 447–453, 1995.
10. N. H. Bshouty, Z. Chen, and S. Homer. On learning discretized geometric concepts. In Proc. of 35th Annu. IEEE Symp. on Foundations of Computer Science, pages 54–63, 1994.
11. N. H. Bshouty, P. W. Goldberg, S. A. Goldman, and H. D. Mathias. Exact learning of discretized geometric concepts. Technical Report WUCS-94-19, Washington University, 1994.
12. N. H. Bshouty, S. A. Goldman, H. D. Mathias, S. Suri, and H. Tamaki. Noise-tolerant distribution-free learning of general geometric concepts. In Proc. of 28th Annu. ACM Symp. on Theory of Computing, pages 151–160, 1996.
13. Z. Chen and S. Homer. The bounded injury priority method and the learnability of unions of rectangles. Annals of Pure and Applied Logic, 77(2):143–168, 1996.
14. Z. Chen and W. Maass. On-line learning of rectangles. In Proc. of 5th Annu. ACM Workshop on Comput. Learning Theory, 1992.
15. P. W. Goldberg, S. A. Goldman, and H. D. Mathias. Learning unions of boxes with membership and equivalence queries. In Proc. of 7th Annu. ACM Workshop on Comput. Learning Theory, 1994.
16. J. C. Jackson. An efficient membership-query algorithm for learning DNF with respect to the uniform distribution. In Proc. of 35th Annu. IEEE Symp. on Foundations of Computer Science, pages 42–53, 1994.
17. J. C. Jackson. The Harmonic Sieve: A Novel Application of Fourier Analysis to Machine Learning Theory and Practice. PhD thesis, Technical Report CMU-CS-95-184, School of Computer Science, Carnegie Mellon University, 1995.
18. E. Kushilevitz. A simple algorithm for learning O(log n)-term DNF. In Proc. of 9th Annu. ACM Workshop on Comput. Learning Theory, pages 266–269, 1996.
19. N. Littlestone. Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning, 2:285–318, 1988.
20. P. M. Long and M. K. Warmuth. Composite geometric concepts and polynomial predictability. In Proc. of 3rd Annu. ACM Workshop on Comput. Learning Theory, pages 273–287, 1990.
21. W. Maass and G. Turán. On the complexity of learning from counterexamples. In Proc. of 30th Annu. IEEE Symp. on Foundations of Computer Science, pages 262–273, 1989.
22. W. Maass and G. Turán. Algorithms and lower bounds for on-line learning of geometrical concepts. Machine Learning, 14:251–269, 1994.
23. W. Maass and M. K. Warmuth. Efficient learning with virtual threshold gates. In Proc. of 12th International Conference on Machine Learning, pages 378–386. Morgan Kaufmann, 1995.
24. L. G. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134–1142, 1984.
© 1997 Springer-Verlag Berlin Heidelberg
Beimel, A., Kushilevitz, E. (1997). Learning boxes in high dimension. In: Ben-David, S. (ed.) Computational Learning Theory (EuroCOLT 1997). Lecture Notes in Computer Science, vol. 1208. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62685-9_2