Strong Universal Consistent Estimate of the Minimum Mean Squared Error

  • Chapter
  • First Online:
Empirical Inference

Abstract

Consider the regression problem with a response variable Y and a feature vector X. For the regression function \(m(\mathbf{x}) = \mathbf{E}\{Y \mid \mathbf{X} = \mathbf{x}\}\), we introduce new and simple estimators of the minimum mean squared error \({L}^{{\ast}} = \mathbf{E}\{{(Y - m(\mathbf{X}))}^{2}\}\), and prove their strong universal consistency. We also bound the rate of convergence.
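To make the estimation target concrete, the sketch below shows one simple nearest-neighbor-type estimate of \(L^{\ast}\): under the model Y = m(X) + noise and suitable smoothness and moment conditions, the expected squared difference between a response and the response of its first nearest feature neighbor tends to \(2L^{\ast}\), so half of its empirical average estimates the minimum mean squared error. This is a minimal illustration in that spirit, not necessarily the estimator analyzed in the chapter; the function name and the toy data are ours.

```python
import numpy as np

def nn_residual_estimate(X, Y):
    """First-nearest-neighbor estimate of L* = E{(Y - m(X))^2}.

    Under Y = m(X) + noise, E{(Y_i - Y_{NN(i)})^2} tends to 2 L*
    as the sample size grows (under suitable conditions), so half the
    average squared difference between each response and the response
    of its nearest feature neighbor estimates the minimum MSE.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    # Brute-force pairwise squared distances; mask the diagonal so a
    # point is never its own nearest neighbor.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(d2, np.inf)
    nn = d2.argmin(axis=1)                  # index of the first nearest neighbor
    return 0.5 * np.mean((Y - Y[nn]) ** 2)  # estimate of L*

# Toy usage: Y = m(X) + noise with Var(noise) = 0.25, so L* = 0.25.
rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 2))
Y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * rng.standard_normal(2000)
print(nn_residual_estimate(X, Y))  # should be close to 0.25
```

The brute-force distance matrix keeps the sketch short but costs O(n^2) memory and time; for large samples a k-d tree or similar nearest-neighbor search would be used instead.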



Acknowledgements

This work was partially supported by the European Union and the European Social Fund through project FuturICT.hu (grant no.: TAMOP-4.2.2.C-11/1/KONV-2012-0013).

Author information


Corresponding author

Correspondence to Luc Devroye.



Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Devroye, L., Ferrario, P.G., Györfi, L., Walk, H. (2013). Strong Universal Consistent Estimate of the Minimum Mean Squared Error. In: Schölkopf, B., Luo, Z., Vovk, V. (eds) Empirical Inference. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41136-6_14


  • DOI: https://doi.org/10.1007/978-3-642-41136-6_14

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41135-9

  • Online ISBN: 978-3-642-41136-6

  • eBook Packages: Computer Science, Computer Science (R0)
