Machine Learning and Statistical MAP Methods

Conference paper in: Intelligent Information Processing and Web Mining

Part of the book series: Advances in Soft Computing (AINSC, volume 31)

Abstract

For machine learning of an input-output function f from examples, we show that it is possible to define an a priori probability density function on the hypothesis space to represent knowledge of the probability distribution of f, even when the hypothesis space H is large (i.e., nonparametric). This allows the extension of maximum a posteriori (MAP) estimation methods to nonparametric function estimation. Among other things, the resulting MAPN (MAP for nonparametric machine learning) procedure easily reproduces spline and radial basis function solutions of learning problems.
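
As a rough illustration of the connection the abstract describes between MAP estimation under a prior on the hypothesis space and radial basis function solutions (this is a generic sketch, not the paper's MAPN procedure), consider a Gaussian prior on f whose covariance is an RBF kernel, together with Gaussian observation noise: the MAP estimate of f is then an RBF expansion over the data points, the familiar kernel-ridge solution. The kernel choice, length scale, and noise level below are illustrative assumptions, not values taken from the paper.

    import numpy as np

    # Toy data: noisy samples of an unknown input-output function f
    rng = np.random.default_rng(0)
    x_train = np.linspace(-1.0, 1.0, 15)
    y_train = np.sin(3.0 * x_train) + 0.1 * rng.standard_normal(x_train.size)

    def rbf_kernel(a, b, length_scale=0.3):
        # Gaussian RBF kernel; plays the role of the prior covariance on f
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length_scale) ** 2)

    # Under a Gaussian prior with covariance K and noise variance sigma2,
    # the MAP estimate is f_hat(x) = sum_i c_i k(x, x_i), where the
    # coefficients c solve the linear system (K + sigma2 * I) c = y.
    sigma2 = 0.01
    K = rbf_kernel(x_train, x_train)
    coeffs = np.linalg.solve(K + sigma2 * np.eye(x_train.size), y_train)

    # Evaluate the MAP (RBF) estimate at a few new inputs
    x_test = np.linspace(-1.0, 1.0, 5)
    f_hat = rbf_kernel(x_test, x_train) @ coeffs
    print(np.round(f_hat, 3))

The noise variance sigma2 acts as the regularization weight: larger values pull the MAP estimate toward the prior mean (zero here), smaller values interpolate the data more closely.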




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kon, M., Plaskota, L., Przybyszewski, A. (2005). Machine Learning and Statistical MAP Methods. In: Kłopotek, M.A., Wierzchoń, S.T., Trojanowski, K. (eds) Intelligent Information Processing and Web Mining. Advances in Soft Computing, vol 31. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-32392-9_49

  • DOI: https://doi.org/10.1007/3-540-32392-9_49

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25056-2

  • Online ISBN: 978-3-540-32392-1

  • eBook Packages: Engineering (R0)
