Abstract
We present an extension of the Nearest Neighbour classifier that adapts to sample imbalances in local regions of the dataset. Our approach uses the hubness statistic as a measure of the relation between a new sample and the existing training set, which allows us to estimate an upper limit on the number of neighbours that vote for the label of the new instance. This estimation improves classifier performance in situations where some classes are locally under-represented. The main focus of our method is the problem of local undersampling that arises in hyperspectral data classification. Using several well-known Machine Learning and hyperspectral datasets, we show that our approach outperforms standard and distance-weighted kNN, especially for high values of k.
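The abstract describes the method only at a high level. As an illustration of the general idea (not the authors' actual rule, which is given in the full text), the following minimal Python sketch adapts the number of voting neighbours per test point using k-occurrence (hubness) counts computed on the training set; the truncation rule, the median threshold, and all function names are assumptions introduced here for illustration.

    import numpy as np
    from collections import Counter
    from sklearn.neighbors import NearestNeighbors

    def k_occurrences(X_train, k):
        # N_k(x): how often each training point appears among the k nearest
        # neighbours of the other training points (the hubness statistic).
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X_train)
        _, idx = nn.kneighbors(X_train)          # column 0 is the point itself
        return np.bincount(idx[:, 1:].ravel(), minlength=len(X_train))

    def adaptive_knn_predict(X_train, y_train, X_test, k=10):
        # Hypothetical adaptive rule: walk each query's neighbour list and
        # stop collecting votes once a neighbour's k-occurrence drops below
        # the training median, i.e. shrink k near anti-hubs.
        occ = k_occurrences(X_train, k)
        threshold = np.median(occ)
        nn = NearestNeighbors(n_neighbors=k).fit(X_train)
        _, idx = nn.kneighbors(X_test)
        preds = []
        for row in idx:
            votes = [y_train[row[0]]]            # the 1-NN always votes
            for j in row[1:]:
                if occ[j] < threshold:           # assumed under-representation signal
                    break                        # cap the number of voters here
                votes.append(y_train[j])
            preds.append(Counter(votes).most_common(1)[0][0])
        return np.array(preds)

In this sketch the voter cap shrinks wherever the query's neighbour list runs into anti-hubs (training points with low k-occurrence), which serves as one crude proxy for local class under-representation.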
Acknowledgments
This work has been supported by the project ‘Representation of dynamic 3D scenes using the Atomic Shapes Network model’, financed by the National Science Centre, decision DEC-2011/03/D/ST6/03753. The authors would like to thank Marcin Blachnik for an extended discussion on the first version of the paper, and Krisztian Buza for his insightful comments and for making the PyHubs library (http://www.biointelligence.hu/pyhubs) available.