Kernel Nearest-Neighbor Algorithm

Abstract

The ‘kernel approach’ has attracted great attention with the development of support vector machines (SVMs) and has been studied in a general way. It offers an alternative means of increasing the computational power of linear learning machines by mapping data into a high-dimensional feature space. In this paper, the approach is extended to the well-known nearest-neighbor algorithm. The extension is realized by substituting a kernel-induced distance metric in Hilbert space for the original Euclidean one; the resulting algorithm is called the kernel nearest-neighbor algorithm. Three data sets were used for testing: an artificial data set, the BUPA liver disorders database, and the USPS database. The kernel nearest-neighbor algorithm was compared with the conventional nearest-neighbor algorithm and with SVM. Experiments show that the kernel nearest-neighbor algorithm is more powerful than the conventional nearest-neighbor algorithm, and that it can compete with SVM.
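The paper gives no implementation here, but the substitution it describes is compact enough to sketch. The following Python sketch (the function names, the polynomial-kernel choice, and the toy data are illustrative assumptions, not taken from the paper) classifies a point with the 1-NN rule after replacing the Euclidean metric by the kernel-induced metric, using the kernel trick ||phi(x) - phi(y)||^2 = K(x,x) - 2K(x,y) + K(y,y):

    import numpy as np

    def poly_kernel(x, y, degree=2, c=1.0):
        # Polynomial kernel (illustrative choice); any Mercer kernel works.
        return (np.dot(x, y) + c) ** degree

    def kernel_distance_sq(x, y, kernel=poly_kernel):
        # Squared feature-space distance computed via the kernel trick,
        # without ever forming the feature map phi explicitly:
        #   ||phi(x) - phi(y)||^2 = K(x,x) - 2*K(x,y) + K(y,y)
        return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

    def kernel_nn_classify(x, X_train, y_train, kernel=poly_kernel):
        # Conventional 1-NN rule with the Euclidean metric replaced
        # by the kernel-induced metric in Hilbert space.
        dists = [kernel_distance_sq(x, xi, kernel) for xi in X_train]
        return y_train[int(np.argmin(dists))]

    # Hypothetical toy usage: two training points, one query.
    X_train = np.array([[0.0, 0.0], [1.0, 1.0]])
    y_train = np.array([0, 1])
    print(kernel_nn_classify(np.array([0.9, 1.1]), X_train, y_train))  # -> 1

One detail worth noting: for a Gaussian (RBF) kernel, K(x,x) is constant, so the induced ranking coincides with the Euclidean one and the 1-NN decision is unchanged; a kernel whose diagonal K(x,x) varies, such as the polynomial kernel above, is needed for the metric substitution to alter the nearest-neighbor decision.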

Cite this article

Yu, K., Ji, L. & Zhang, X. Kernel Nearest-Neighbor Algorithm. Neural Processing Letters 15, 147–156 (2002). https://doi.org/10.1023/A:1015244902967
