Motivation and Background
Epsilon nets were introduced by Haussler and Welzl (1987), and their usefulness for computational learning theory was discovered by Blumer et al. (1989).
Let X ≠ ∅ be any learning domain and let \(\mathcal{C}\subseteq \wp (X)\) be any nonempty concept class. For the sake of simplicity, we also use \(\mathcal{C}\) as the hypothesis space here. In order to guarantee that all probabilities considered below exist, we restrict ourselves to well-behaved concept classes (see PAC Learning).
Furthermore, let D be any arbitrarily fixed probability distribution over the learning domain X and let \(c \in \mathcal{C}\) be any fixed concept.
Let \(\varepsilon \in (0,1)\) be given. A hypothesis \(h \in \mathcal{C}\) is said to be bad for c iff \(D(c \,\triangle\, h) > \varepsilon\), where \(c \,\triangle\, h\) denotes the symmetric difference of c and h.
Furthermore, we use
to denote the...
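Although the formal development is cut off at this point, the two notions just introduced can be made concrete on a tiny finite example: a hypothesis h is bad for c when the weight that D assigns to the symmetric difference \(c \,\triangle\, h\) exceeds ε, and a sample S is an ε-net if it intersects every such bad region. The following Python sketch illustrates this; the domain, the interval concept class, the target concept, and the value of ε are all hypothetical choices made for illustration only.

```python
# A small finite sketch of the definitions above.  All concrete values
# (domain size, interval concepts, target concept, epsilon) are
# hypothetical illustrations, not taken from the original entry.

X = list(range(10))                       # finite learning domain
D = {x: 1 / len(X) for x in X}            # uniform distribution on X

# Concept class: all "intervals" {i, ..., j-1} over X (including the empty set).
C = [set(range(i, j)) for i in range(len(X) + 1) for j in range(i, len(X) + 1)]

c = set(range(3, 6))                      # fixed target concept {3, 4, 5}
eps = 0.2

def err(h):
    # D(c triangle h): probability of the region where h disagrees with c.
    return sum(D[x] for x in c ^ h)

bad = [h for h in C if err(h) > eps]      # hypotheses that are bad for c
regions = [c ^ h for h in bad]            # the corresponding bad regions

def is_eps_net(S):
    # S is an epsilon net iff it intersects every region of weight > eps.
    return all(S & r for r in regions)

print(is_eps_net({1, 4, 7}))              # True: hits every bad region
print(is_eps_net({4}))                    # False: misses e.g. {0,1,2,6,7,8,9}
```

Under the uniform distribution on ten points, a region is bad exactly when it contains more than two points, so the single point {4} cannot hit the bad region produced by the all-of-X hypothesis, whereas the three spread-out points {1, 4, 7} hit every bad region.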
Recommended Reading
Blumer A, Ehrenfeucht A, Haussler D, Warmuth MK (1989) Learnability and the Vapnik-Chervonenkis dimension. J ACM 36(4):929–965
Haussler D, Welzl E (1987) Epsilon nets and simplex range queries. Discrete Comput Geom 2:127–151
Kearns MJ, Vazirani UV (1994) An introduction to computational learning theory. MIT Press, Cambridge, MA
© 2017 Springer Science+Business Media New York
Zeugmann, T. (2017). Epsilon Nets. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_83