Abstract
The k-nearest-neighbors classifier is simple and often performs well in practice. However, it does not work well on noisy, high-dimensional data, because the structure formed by the selected nearest neighbors is easily deformed and perceptually unstable. This paper presents an approach that locally centralizes samples, using kernel techniques, to preprocess the data. The approach creates a new sample for each original sample from its neighborhood and then substitutes the new sample for the original as a candidate nearest neighbor. It can be justified by gestalt psychology and provides better-quality data for classifiers even when the original data are noisy and high dimensional. Experiments on challenging benchmark data sets validate the proposed approach.
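The core idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes "centralizing" a sample means replacing it with the local mean of its neighborhood (as in local mean-based classifiers), omits the paper's kernel extension, and uses illustrative function names.

```python
import numpy as np

def centralize_samples(X, k=3):
    """Replace each sample with the mean of its k nearest neighbors
    (including itself). One plausible reading of local centralization;
    the paper's actual formulation also uses kernel techniques."""
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    centralized = np.empty_like(X, dtype=float)
    for i in range(X.shape[0]):
        nn = np.argsort(d2[i])[:k]          # k nearest, self included
        centralized[i] = X[nn].mean(axis=0) # local mean smooths noise
    return centralized

def knn_predict(x, candidates, labels, k=3):
    """Plain k-NN vote over the centralized candidate samples."""
    d2 = ((candidates - x) ** 2).sum(axis=-1)
    nn = np.argsort(d2)[:k]
    vals, counts = np.unique(labels[nn], return_counts=True)
    return vals[np.argmax(counts)]

# toy data: two well-separated classes with slight jitter
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1])
Xc = centralize_samples(X, k=3)
pred = knn_predict(np.array([0.05, 0.05]), Xc, y, k=3)
```

The centralized samples replace the originals as nearest-neighbor candidates, so an outlying noisy point is pulled toward its neighborhood before classification.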
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Wen, G., Wen, S., Wen, J., Jiang, L. (2010). Locally Centralizing Samples for Nearest Neighbors. In: Zhang, BT., Orgun, M.A. (eds) PRICAI 2010: Trends in Artificial Intelligence. PRICAI 2010. Lecture Notes in Computer Science(), vol 6230. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15246-7_70
DOI: https://doi.org/10.1007/978-3-642-15246-7_70
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-15245-0
Online ISBN: 978-3-642-15246-7