
Maximum-Margin Nearest Prototype Classifiers with the Sum-over-Others Loss Function and a Performance Evaluation

  • Conference paper
  • First Online:
Integrated Uncertainty in Knowledge Modelling and Decision Making (IUKM 2023)

Abstract

This paper investigates margin-maximization models for nearest prototype classifiers. These models are formulated as a minimization problem whose objective is a weighted sum of the inverted margin and a loss function. The problem is reduced to a difference-of-convex optimization problem and solved with the convex-concave procedure. In our latest study, to overcome limitations of the previous model, we revised both the optimization problem and the training algorithm. In this paper, we propose a further revised margin-maximization model by replacing the max-over-others loss function used in the latest study with the sum-over-others loss function. We derive the training algorithm for the proposed model. Moreover, we evaluate the classification performance of the revised margin-maximization models in a numerical experiment on benchmark data sets from the UCI Machine Learning Repository. We compare our models not only with the previous model but also with three baseline methods: generalized learning vector quantization, class-wise k-means, and the support vector machine.
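
The paper's full optimization model is not reproduced on this page, but the two ingredients named in the abstract can be illustrated concretely. The sketch below, in Python with NumPy, shows (a) a nearest-prototype decision rule and (b) one plausible reading of a sum-over-others loss, in which a hinge-style penalty is summed over every class other than the true one, rather than keeping only the worst offending class as a max-over-others loss does. The squared Euclidean distance, the per-class prototype layout, and the margin parameter are illustrative assumptions, not the paper's formulation; the inverted-margin objective, the difference-of-convex reduction, and the convex-concave procedure are likewise omitted.

import numpy as np

def nearest_prototype_predict(X, prototypes, proto_labels):
    # Squared Euclidean distances from every sample to every prototype,
    # shape (n_samples, n_prototypes).
    d = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    # Each sample takes the label of its closest prototype.
    return proto_labels[np.argmin(d, axis=1)]

def sum_over_others_loss(x, y, prototypes, proto_labels, classes, margin=1.0):
    # Score of class c: negative squared distance to its nearest prototype
    # (higher is better). This distance-based score is an assumption for
    # illustration, not the paper's definition.
    d = ((x[None, :] - prototypes) ** 2).sum(axis=1)
    score = {c: -d[proto_labels == c].min() for c in classes}
    # Sum hinge penalties over all classes other than the true label y;
    # a max-over-others loss would instead keep only the largest term.
    return sum(max(0.0, margin - (score[y] - score[c]))
               for c in classes if c != y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prototypes = rng.normal(size=(6, 2))          # six 2-D prototypes
    proto_labels = np.array([0, 0, 1, 1, 2, 2])   # two prototypes per class
    X = rng.normal(size=(5, 2))
    print(nearest_prototype_predict(X, prototypes, proto_labels))
    print(sum_over_others_loss(X[0], 1, prototypes, proto_labels, classes=[0, 1, 2]))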


References

  1. Arthur, D., Vassilvitskii, S.: k-means++: the advantages of careful seeding. In: Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 1027–1035 (2007)

  2. Doğan, Ü., Glasmachers, T., Igel, C.: A unified view on multi-class support vector classification. J. Mach. Learn. Res. 17(45), 1–32 (2016)

  3. Dua, D., Graff, C.: UCI machine learning repository (2017). http://archive.ics.uci.edu/ml

  4. Hammer, B., Villmann, T.: Generalized relevance learning vector quantization. Neural Netw. 15(8), 1059–1068 (2002)

  5. Kohonen, T.: The self-organizing map. Proc. IEEE 78(9), 1464–1480 (1990)

  6. Kusunoki, Y., Nakashima, T.: Revised optimization algorithm for maximum-margin nearest prototype classifier. In: Proceedings of IFSA 2023, pp. 276–280 (2023)

  7. Kusunoki, Y., Wakou, C., Tatsumi, K.: Maximum-margin model for nearest prototype classifiers. J. Adv. Comput. Intell. Intell. Inform. 22(4), 565–577 (2018)

  8. Lipp, T., Boyd, S.: Variations and extension of the convex-concave procedure. Optim. Eng. 17(2), 263–287 (2016)

  9. MOSEK ApS: MOSEK Optimization Toolbox for MATLAB 10.0.33 (2022). https://docs.mosek.com/10.0/toolbox/index.html

  10. Sato, A., Yamada, K.: Generalized learning vector quantization. In: Touretzky, D., Mozer, M., Hasselmo, M. (eds.) Advances in Neural Information Processing Systems, vol. 8. MIT Press (1995)

  11. Schneider, P., Biehl, M., Hammer, B.: Adaptive relevance matrices in learning vector quantization. Neural Comput. 21(12), 3532–3561 (2009)


Acknowledgements

This work was supported by JSPS KAKENHI Grant Number JP21K12062.

Author information

Corresponding author

Correspondence to Yoshifumi Kusunoki.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Kusunoki, Y. (2023). Maximum-Margin Nearest Prototype Classifiers with the Sum-over-Others Loss Function and a Performance Evaluation. In: Honda, K., Le, B., Huynh, V.N., Inuiguchi, M., Kohda, Y. (eds) Integrated Uncertainty in Knowledge Modelling and Decision Making. IUKM 2023. Lecture Notes in Computer Science, vol 14376. Springer, Cham. https://doi.org/10.1007/978-3-031-46781-3_4

  • DOI: https://doi.org/10.1007/978-3-031-46781-3_4

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46780-6

  • Online ISBN: 978-3-031-46781-3

  • eBook Packages: Computer Science, Computer Science (R0)
