Continuity of Performance Metrics for Thin Feature Maps

Conference paper
Algorithmic Learning Theory (ALT 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4754)

Abstract

We study the class of hypotheses composed of linear functionals superimposed on smooth feature maps. We show that for a “typical” smooth feature map, pointwise convergence of hypotheses implies convergence of standard performance metrics, such as the error rate or the area under the ROC curve, with probability 1 over the selection of the test sample from a (Lebesgue measurable) probability density. The proofs use transversality theory. The crux is to show that for every “typical”, sufficiently smooth feature map into a finite-dimensional vector space, the preimage of every affine hyperplane has Lebesgue measure 0.
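
To make the crux concrete (the notation here is illustrative and not taken from the paper): write Φ : Ω ⊆ ℝ^n → ℝ^k for the feature map, λ_n for Lebesgue measure on ℝ^n, and H_{w,b} = {z ∈ ℝ^k : ⟨w, z⟩ + b = 0} for an affine hyperplane with (w, b) ≠ (0, 0). The measure-zero preimage property then reads

\[
\lambda_n\!\left(\Phi^{-1}(H_{w,b})\right)
\;=\;
\lambda_n\!\left(\{\, x \in \Omega \;:\; \langle w, \Phi(x)\rangle + b = 0 \,\}\right)
\;=\; 0 .
\]

Heuristically, a test point drawn from a density with respect to λ_n then almost surely avoids the decision boundary of the limit hypothesis x ↦ ⟨w, Φ(x)⟩ + b, so hypotheses converging pointwise eventually assign it the same label; this is what links pointwise convergence to convergence of threshold-based metrics.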

The results extend to every real analytic, in particular polynomial, feature map whose domain is connected, provided the limit hypothesis is non-constant. In the process we give an elementary proof of the fundamental lemma that the zero locus of a real analytic function on a connected domain either fills the whole space or forms a subset of measure 0.
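
In symbols (again with illustrative notation): for a real analytic function f on a connected open set Ω ⊆ ℝ^n, the lemma asserts the dichotomy

\[
\{x \in \Omega : f(x) = 0\} = \Omega
\qquad \text{or} \qquad
\lambda_n\bigl(\{x \in \Omega : f(x) = 0\}\bigr) = 0 .
\]

Applied to f(x) = ⟨w, Φ(x)⟩ + b with Φ real analytic on a connected domain, and with the non-constant limit hypothesis ruling out the first alternative, this dichotomy yields the measure-zero preimage property above.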

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kowalczyk, A. (2007). Continuity of Performance Metrics for Thin Feature Maps. In: Hutter, M., Servedio, R.A., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2007. Lecture Notes in Computer Science, vol 4754. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75225-7_28

  • DOI: https://doi.org/10.1007/978-3-540-75225-7_28

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-75224-0

  • Online ISBN: 978-3-540-75225-7
