Hypersphere anchor loss for K-Nearest neighbors


Abstract

Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in \(\mathbb {R}^n\) feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which requires substantial computational resources in both GPU time and memory. This paper aims to develop lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to, or better than, existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) function. The proposed anchor loss function largely removes the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in \(\mathbb {R}^n\) feature spaces, this paper proposes to optimize them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over \(1\%\) while the computational cost drops to less than \(10\%\).
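
For illustration, the following is a minimal sketch of the idea described in the abstract, not the paper's exact formulation: each class owns a learnable anchor vector, features and anchors are L2-normalized onto the unit hypersphere, and an NCA-style softmax over negative squared sample-to-anchor distances replaces the pairwise within-batch distances. The class name, the scale (temperature) factor, and all hyperparameters below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HypersphereAnchorLoss(nn.Module):
    """Sketch of an anchor-based, NCA-style loss on the unit hypersphere.

    Each class c owns a learnable anchor a_c; features and anchors are
    L2-normalized, and a softmax over negative squared sample-to-anchor
    distances replaces the O(batch^2) pairwise distances of batch-wise
    KNN losses with O(batch x classes) distances.
    """

    def __init__(self, num_classes: int, feat_dim: int, scale: float = 10.0):
        super().__init__()
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale  # temperature: an illustrative choice, not from the paper

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        f = F.normalize(features, dim=1)        # project CNN features onto the hypersphere
        a = F.normalize(self.anchors, dim=1)    # anchors live on the same hypersphere
        # On the unit sphere, ||f - a||^2 = 2 - 2 * <f, a>
        sq_dist = 2.0 - 2.0 * f @ a.t()         # shape: (batch, num_classes)
        logits = -self.scale * sq_dist          # NCA-style softmax over negative distances
        return F.cross_entropy(logits, labels)  # -log p(correct anchor | sample)

# Example usage with random embeddings standing in for CNN outputs
if __name__ == "__main__":
    loss_fn = HypersphereAnchorLoss(num_classes=10, feat_dim=64)
    feats = torch.randn(32, 64)
    labels = torch.randint(0, 10, (32,))
    print(loss_fn(feats, labels))
```

Because each sample is compared only with the class anchors rather than with every other sample in the batch, the distance computation scales linearly with batch size, which is consistent with the reduced computational cost described in the abstract.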

Data availability and access

The manuscript has no associated data.

Notes

  1. The number 10 in the count of distances corresponds to the number of classes in the MNIST dataset.

  2. The feature space used for KNN classifiers is also called KNN space in this paper.

  3. Training accuracy predicts the category of a sample within a training batch by finding its 10 closest neighbors in the batch and taking the label that occurs most frequently among those neighbors as the predicted category (a code sketch follows these notes).

  4. Test accuracy is the usual accuracy, which predicts the category of a test sample from its nearest neighbors in the training dataset. The terms test accuracy and accuracy are used interchangeably in this paper.
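
As referenced in Note 3, the following is a minimal, hypothetical sketch of the majority-vote k-NN prediction behind both accuracy measures; the Euclidean distance metric, k = 10, and all names are illustrative assumptions rather than the paper's exact protocol.

```python
import torch
import torch.nn.functional as F

def knn_predict(query_feats, ref_feats, ref_labels, k=10, num_classes=10, exclude_self=False):
    """Majority-vote k-NN prediction in the spirit of Notes 3 and 4.

    For "training accuracy", query and reference are the same batch and
    exclude_self=True masks each sample's zero distance to itself; for
    "test accuracy", queries are test embeddings and references come from
    the training set.
    """
    d = torch.cdist(query_feats, ref_feats)      # pairwise distances, shape (Q, R)
    if exclude_self:
        d.fill_diagonal_(float("inf"))           # ignore self-matches in-batch
    idx = d.topk(k, largest=False).indices       # k nearest neighbors per query
    votes = ref_labels[idx]                      # neighbor labels, shape (Q, k)
    counts = F.one_hot(votes, num_classes).sum(dim=1)
    return counts.argmax(dim=1)                  # most frequent neighbor label

if __name__ == "__main__":
    feats = F.normalize(torch.randn(128, 64), dim=1)
    labels = torch.randint(0, 10, (128,))
    pred = knn_predict(feats, feats, labels, k=10, exclude_self=True)  # training-accuracy style
    print((pred == labels).float().mean().item())
```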


Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 62071060) and the Beijing Key Laboratory of Work Safety and Intelligent Monitoring Foundation.

Author information

Contributions

Xiang Ye contributed to the design of the paper, data analysis and interpretation, and drafting and revision of the manuscript. Zihang He contributed to the revision of the manuscript. Heng Wang contributed to the revision of the manuscript. Yong Li contributed to data analysis and interpretation, and drafting and revision of the manuscript.

Corresponding author

Correspondence to Yong Li.

Ethics declarations

Ethical and informed consent for data used

The data used in this paper are all from publicly available databases. The datasets were used in compliance with their terms and conditions.

Conflicts of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Ye, X., He, Z., Wang, H. et al. Hypersphere anchor loss for K-Nearest neighbors. Appl Intell 53, 30319–30328 (2023). https://doi.org/10.1007/s10489-023-05148-5
