Abstract
In this paper, we propose a novel supervised dimension reduction algorithm based on the K-nearest neighbor (KNN) classifier. The algorithm reduces the dimensionality of the data in order to improve the accuracy of KNN classification. This heuristic finds independent dimensions that decrease the Euclidean distance between a sample and its K nearest within-class neighbors while increasing the Euclidean distance between that sample and its M nearest between-class neighbors. It is a linear dimension reduction algorithm that produces a mapping matrix for projecting data into a low-dimensional space; the dimension reduction step is followed by a KNN classifier, which makes the approach applicable to high-dimensional multiclass classification. Experiments with artificial data sets such as Helix and Twin-peaks demonstrate the algorithm's ability for data visualization. The algorithm is also compared with state-of-the-art methods on the classification of eight multiclass data sets from the UCI collection, and simulation results show that it outperforms the existing algorithms. Visual place classification is an important problem for intelligent mobile robots: it not only involves high-dimensional data but also poses a multiclass classification problem, and a suitable dimension reduction method is usually needed to reduce the computational and memory cost of algorithms in large environments. Our method is therefore well suited to this task. We extract color histograms of omnidirectional camera images as primary features, reduce the features to a low-dimensional space, and apply a KNN classifier. Experiments on five real data sets show the superiority of the proposed algorithm over the others.
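The abstract's objective — a linear mapping that shrinks the Euclidean distance to each sample's K nearest within-class neighbors and grows the distance to its M nearest between-class neighbors — can be sketched as follows. This is an illustrative assumption, not the paper's exact procedure: here the neighbor-based pull/push terms are collected into two scatter-like matrices and the mapping is taken from an eigen-decomposition, whereas the paper describes a heuristic search; the function name `heuristic_projection` and all parameter choices are hypothetical.

```python
import numpy as np

def heuristic_projection(X, y, d, k=3, m=3):
    """Illustrative sketch of a neighbor-based linear dimension reduction.

    Builds a within-class scatter S_w from each sample's k nearest
    same-class neighbors and a between-class scatter S_b from its m
    nearest other-class neighbors, then returns the top-d eigenvectors
    of (S_b - S_w) as a D x d mapping matrix W, so that projected
    within-class neighbor distances tend to be small and between-class
    neighbor distances large.
    """
    n, D = X.shape
    # Pairwise Euclidean distances; diagonal set to inf to exclude self.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)
    same = y[:, None] == y[None, :]
    S_w = np.zeros((D, D))
    S_b = np.zeros((D, D))
    for i in range(n):
        d_same = np.where(same[i], dist[i], np.inf)
        d_other = np.where(~same[i], dist[i], np.inf)
        for j in np.argsort(d_same)[:k]:   # k nearest within-class neighbors
            diff = X[i] - X[j]
            S_w += np.outer(diff, diff)
        for j in np.argsort(d_other)[:m]:  # m nearest between-class neighbors
            diff = X[i] - X[j]
            S_b += np.outer(diff, diff)
    # (S_b - S_w) is symmetric; eigh returns ascending eigenvalues,
    # so keep the eigenvectors of the d largest ones.
    vals, vecs = np.linalg.eigh(S_b - S_w)
    W = vecs[:, np.argsort(vals)[::-1][:d]]
    return W  # project with X @ W, then run a KNN classifier
```

After projecting with `X @ W`, any standard KNN classifier can be applied in the reduced space, matching the pipeline the abstract describes.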











Cite this article
Omranpour, H., Shiry Ghidary, S. A heuristic supervised Euclidean data difference dimension reduction for KNN classifier and its application to visual place classification. Neural Comput & Applic 27, 1867–1881 (2016). https://doi.org/10.1007/s00521-015-1979-8