The monotone theory for the PAC-model

https://doi.org/10.1016/S0890-5401(03)00116-0

Abstract

In this paper we extend the Monotone Theory to the PAC-learning model with membership queries. Using this extension we show that a DNF formula that has at least one “1/poly-heavy” clause in one of its CNF representations (a clause that is not satisfied with probability at least 1/poly(n,s), where n is the number of variables and s is the number of terms in f) with respect to a distribution D is weakly learnable under this distribution. Thus, DNF formulas that are not weakly learnable under the distribution D have no “1/poly-heavy” clauses in any of their CNF representations.
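The “1/poly-heavy” notion can be made concrete with a small Monte Carlo sketch: a clause is heavy under D when a random example drawn from D fails to satisfy it with non-negligible probability. The function name `clause_unsat_prob` and the sampling interface below are our own illustration, not the paper's algorithm.

```python
import random

def clause_unsat_prob(clause, sample_draw, trials=10000, seed=0):
    """Estimate Pr_{x~D}[clause is NOT satisfied] by sampling from D.

    clause: list of integer literals; positive i means x_i, negative i means NOT x_i.
    sample_draw: callable taking an RNG and returning an assignment {var: bool} drawn from D.
    """
    rng = random.Random(seed)
    unsat = 0
    for _ in range(trials):
        x = sample_draw(rng)
        # A clause (an OR of literals) is unsatisfied iff every literal evaluates to False.
        if not any(x[abs(l)] if l > 0 else not x[abs(l)] for l in clause):
            unsat += 1
    return unsat / trials
```

For example, for the clause x1 ∨ ¬x2 under the uniform distribution on three variables, the estimate comes out close to 1/4, since the clause fails only when x1 is false and x2 is true.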

A DNF formula f is called τ-CDNF if there are τ′>τ and a CNF representation of f containing poly(n,s) clauses that τ′-approximates f according to a distribution D. We show that the class of all τ-CDNF formulas is weakly (τ+ϵ)-PAC-learnable with membership queries under the distribution D.
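The τ-approximation underlying the τ-CDNF definition can likewise be sketched: a hypothesis h τ-approximates f under D when Pr_{x~D}[f(x) = h(x)] ≥ τ. The helpers below are illustrative (our own names, with D taken to be the uniform distribution for simplicity).

```python
import random

def eval_dnf(terms, x):
    # DNF: an OR of terms, each term an AND of literals (positive/negative ints).
    return any(all(x[abs(l)] == (l > 0) for l in t) for t in terms)

def eval_cnf(clauses, x):
    # CNF: an AND of clauses, each clause an OR of literals.
    return all(any(x[abs(l)] == (l > 0) for l in c) for c in clauses)

def agreement(f_terms, h_clauses, n, trials=10000, seed=0):
    """Estimate Pr_{x~D}[f(x) == h(x)] under the uniform distribution on n variables.

    h tau-approximates f under D when this probability is at least tau.
    """
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        x = {i: rng.random() < 0.5 for i in range(1, n + 1)}
        if eval_dnf(f_terms, x) == eval_cnf(h_clauses, x):
            agree += 1
    return agree / trials
```

For the DNF f = (x1∧x2) ∨ x3, the CNF (x1∨x3)∧(x2∨x3) represents f exactly (agreement 1), while dropping a clause to leave only (x1∨x3) still agrees with f on 7 of the 8 uniform assignments, i.e. it 0.875-approximates f.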

We then show how to convert our algorithm into a parallel algorithm that runs in polylogarithmic time with a polynomial number of processors. In particular, decision trees are (strongly) PAC-learnable with membership queries under any distribution, in parallel, in polylogarithmic time with a polynomial number of processors. Finally, we show that no efficient parallel exact learning algorithm exists for decision trees.


1 This research was supported by the Fund for the Promotion of Research at the Technion. Research No. 120-025.