Abstract
We define a new model of learning called the Always Approximately Correct (AAC) model. In this model the learner has no random bits at its disposal; instead, it learns by making the usual membership queries on a deterministic “training set.” This model extends Angluin's Query model of exact learning with membership queries alone.
We discuss crucial issues and questions that arise when this model is used. One such question is whether a single, uniform training set suffices for learning every concept in the concept class; this issue seems not to have been studied in the context of Angluin's Query model. Another question is whether the training set can be found quickly when the learner is given partial information about the target function (in addition to the answers to membership queries), for example, the identity of a subclass to which the concept belongs. We formalize the latter scenario by introducing the notion of “subclass queries.”
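As a toy illustration of learning from a fixed, deterministic training set of membership queries (this is our own sketch, not an algorithm from the paper; the function name `learn_parity` and the choice of query set are illustrative assumptions), a single parity function over {0,1}^n is identified exactly by querying the all-zeros point and the n unit vectors:

```python
# Toy illustration (not the paper's algorithm): the parity function
# chi_S(x) = XOR_{i in S} x_i is determined exactly by the fixed,
# deterministic "training set" {0, e_1, ..., e_n} of membership
# queries -- no randomness is needed.

def learn_parity(membership_query, n):
    """Recover the support S of an unknown parity over {0,1}^n."""
    zero = [0] * n
    base = membership_query(zero)          # chi_S(0) = 0 for a pure parity
    S = set()
    for i in range(n):
        e_i = [1 if j == i else 0 for j in range(n)]
        if membership_query(e_i) != base:  # bit i flips the parity iff i in S
            S.add(i)
    return S

# Example: hidden parity on coordinates {0, 2} over {0,1}^4
hidden = {0, 2}
f = lambda x: sum(x[i] for i in hidden) % 2
print(learn_parity(f, 4))  # -> {0, 2}
```

Note that this training set of n + 1 points is uniform in the sense discussed above: the same queries work for every parity in the class.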
Using this new model of learning, we prove three learnability results for classes of Boolean functions that are approximable (with respect to various norms) by linear combinations of a small set of parity functions. We compare and contrast these results with several existing results for similar classes in the PAC model of learning, with and without membership queries; these classes have not previously been emphasized under Angluin's Query model of exact learning.
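The classes in question are naturally described via the standard Fourier expansion over the Boolean cube, in which the parity functions form an orthonormal basis; a function is approximable by a linear combination of few parities when most of its Fourier weight sits on a small set of coefficients. The brute-force sketch below (our illustration of the standard expansion, not a method from the paper) computes all coefficients of a ±1-valued function and shows a parity concentrating its weight on a single coefficient:

```python
from itertools import product

# Standard Fourier expansion over {0,1}^n (illustrative, not from the
# paper): for +/-1-valued f, f(x) = sum_S fhat(S) * chi_S(x), where
# chi_S(x) = (-1)^{sum_{i in S} x_i}. "Approximable by few parities"
# means most of the weight lies on a small set of coefficients fhat(S).

def fourier_coefficients(f, n):
    """Brute-force fhat(S) = E_x[ f(x) * chi_S(x) ] for every S."""
    points = list(product([0, 1], repeat=n))
    coeffs = {}
    for S in product([0, 1], repeat=n):       # S as an indicator vector
        total = 0
        for x in points:
            chi = (-1) ** sum(x[i] for i in range(n) if S[i])
            total += f(x) * chi
        coeffs[S] = total / len(points)
    return coeffs

# The +/-1-valued parity on coordinates {0, 1} has all of its
# Fourier weight on the single coefficient indexed by S = {0, 1}.
g = lambda x: (-1) ** ((x[0] + x[1]) % 2)
c = fourier_coefficients(g, 3)
print(c[(1, 1, 0)])  # -> 1.0
```

This exhaustive computation takes 2^n queries per coefficient; the point of the learnability results above is to do far better when the approximating set of parities is small.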
Moreover, we point out the significance, in various contexts, of the classes of Boolean functions being learned, for example in probabilistic communication complexity.
References
Alon, N., Vu, V. H.: Threshold gates, coin weighing, and indecomposable hypergraphs. FOCS (1996)
Angluin, D.: Learning regular sets from queries and counterexamples. Information and Computation (Vol. 75(2), 1987) 87–106
Angluin, D., Frazier, M., Pitt, L.: Learning conjunctions of Horn clauses. Machine Learning (Vol. 9, 1992) 147–164
Angluin, D., Hellerstein, L., Karpinski, M.: Learning read-once formulas with queries. Journal of the ACM (Vol. 40, 1993) 185–210
Auer, P., Long, P., Srinivasan, A.: Pseudorandomness and learning of combinatorial rectangles. To appear in STOC (1997)
Ben-Or, M., Tiwari, P.: A deterministic algorithm for sparse multivariate polynomial interpolation. Proc. 20th Ann. ACM Symp. Theory of Comput. (May 1988) 301–309
Bshouty, N. H.: Exact learning via the monotone theory. Proceedings of the 34th IEEE Symposium on the Foundations of Computer Science (1993) 302–311
Bshouty, N. H., Tamon, C.: On the Fourier spectrum of monotone functions. Proc. 27th Ann. ACM Symp. Theory of Comput. (May 1995) 219–399
Bshouty, N., Mansour, Y.: Simple learning algorithms for decision trees and multivariate polynomials. Proc. 36th Ann. IEEE Symp. Foundations of C.S. (1995) 304–311
Buck, R. C.: Applications of duality in approximation theory. Approximation of functions, Elsevier, H.L. Garabedian ed. (1964) 27–42
Enflo, P., Sitharam, M.: Stability of basis families and complexity lower bounds. ECCC report (Sept. 1996). Preprint also available at: http://nimitz.mcs.kent.edu/~sitharam
Furst, M., Jackson, J., Smith, S.: Improved learning of AC^0 functions. 4th Conf. on Computational Learning Theory (1991) 317–325
Goldman, M., Hastad, J., Razborov, A. A.: Majority gates vs. general weighted threshold gates. 32nd Ann. IEEE Symp. Foundations of C.S. (1991)
Gotsman, C., Linial, N.: Equivalence of two problems on the cube — A note. J. Comb. Theory, Ser. A 61 (1992) 142–146
Grolmusz, V.: Harmonic analysis, real approximation and communication complexity of Boolean functions. Manuscript (1994)
Hastad, J.: Computational limitations of small depth circuits. Ph.D. thesis, MIT Press (1986)
Hastad, J.: On the size of weights for threshold gates. SIAM J. Disc. Math. (1994) 484–492
Hajnal, A., Maass, W., Pudlák, P., Szegedy, M., Turán, G.: Threshold circuits of bounded depth. 28th Ann. IEEE Symp. Foundations of C.S. (1987) 99–110
Jackson, J.: An efficient membership query algorithm for learning DNF with respect to the uniform distribution. Proc. 35th Ann. IEEE Symp. Foundations of C.S. (1994) 42–53
Kushilevitz, E., Mansour, Y.: Learning decision trees using the Fourier transform. 32nd Ann. IEEE Symp. Foundations of C.S. (1991) 455–464
Linial, N., Mansour, Y., Nisan, N.: Constant depth circuits, Fourier transforms, and learnability. 30th Ann. IEEE Symp. Foundations of C.S. (1989) 574–579
Nisan, N., Szegedy, M.: On the degree of Boolean functions as real polynomials. 24th Ann. ACM Symp. on Theory of Computing (1992) 462–467
Paturi, R.: On the degree of polynomials that approximate symmetric Boolean functions. 24th Ann. ACM Symp. on Theory of Computing (1992) 468–474
Sitharam, M.: Pseudorandom generators and learning algorithms for AC^0. 26th Ann. ACM Symp. on Theory of Computing (1994) 478–488
Schapire, R., Sellie, L.: Learning sparse multivariate polynomials over a field with queries and counterexamples. Proceedings of the 6th Workshop on Computational Learning Theory (1993) 17–26
Sitharam, M.: Approximation from spaces of functions over the cube, complexity lower bounds, derandomization, and learning algorithms. See ECCC reports. Preprint also available at: http://nimitz.mcs.kent.edu/~sitharam
© 1997 Springer-Verlag Berlin Heidelberg
Sitharam, M., Straney, T. (1997). Derandomized learning of boolean functions. In: Li, M., Maruoka, A. (eds) Algorithmic Learning Theory. ALT 1997. Lecture Notes in Computer Science, vol 1316. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-63577-7_38
Print ISBN: 978-3-540-63577-2
Online ISBN: 978-3-540-69602-5