
A Note on the k-NN Density Estimate

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2016

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9937)


Abstract

The k-NN (k nearest neighbour) density estimate is a nonparametric estimation method widely used in machine learning and data analysis. The convergence of the k-NN approach has been intensively investigated; in particular, equivalence results for convergence in the weak and strong senses (i.e. in probability and almost surely, respectively) have been developed. In this note, we show that convergence of the k-NN estimator in probability is equivalent to convergence in the \(L^2\) sense. Moreover, we establish some asymptotic results on the expectation of the k-NN estimator.
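For concreteness, a minimal sketch of the estimator the note studies, in its classical Loftsgaarden–Quesenberry form \(\hat{f}_n(x) = k / (n \, V_d \, R_k(x)^d)\), where \(R_k(x)\) is the distance from \(x\) to its k-th nearest sample point and \(V_d\) is the volume of the d-dimensional unit ball. The function name, the 1-D setting, and the choice of k below are illustrative assumptions, not taken from the paper:

```python
import math
import random

def knn_density(x, sample, k, d=1):
    """k-NN density estimate: f_hat(x) = k / (n * V_d * R_k(x)^d)."""
    n = len(sample)
    # Sorted distances from x to every sample point (1-D Euclidean).
    dists = sorted(abs(s - x) for s in sample)
    r_k = dists[k - 1]  # distance to the k-th nearest neighbour
    # Volume of the d-dimensional unit ball (equals 2 when d = 1).
    v_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    return k / (n * v_d * r_k ** d)

# Illustration: estimate the standard normal density at 0 from 10,000 draws.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(10_000)]
est = knn_density(0.0, data, k=100)
# The true value is 1/sqrt(2*pi) ≈ 0.3989; the estimate should be close.
```

The convergence questions the note addresses concern exactly this quantity as \(n \to \infty\) with \(k = k(n)\) satisfying \(k \to \infty\) and \(k/n \to 0\).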



Acknowledgements

The authors acknowledge financial support from the National NSF of China under Grant No. 61472343, and from the Open Fund of the State Key Laboratory of Software Development Environment, Beihang University, under Grant No. SKLSDE-2015KF-05.

Author information

Correspondence to Jie Ding.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Ding, J., Zhu, X. (2016). A Note on the k-NN Density Estimate. In: Yin, H., et al. Intelligent Data Engineering and Automated Learning – IDEAL 2016. IDEAL 2016. Lecture Notes in Computer Science, vol. 9937. Springer, Cham. https://doi.org/10.1007/978-3-319-46257-8_9

  • DOI: https://doi.org/10.1007/978-3-319-46257-8_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46256-1

  • Online ISBN: 978-3-319-46257-8

  • eBook Packages: Computer Science (R0)
