Deep Twin Support Vector Networks

  • Conference paper
  • First Online:
Artificial Intelligence (CICAI 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13606)

Abstract

The twin support vector machine (TSVM) is a successful improvement over the traditional support vector machine (SVM) for binary classification. However, it remains a shallow model and is limited in both prediction performance and computational efficiency. In this paper, we propose deep twin support vector networks (DTSVN), which enhance TSVM in both respects. Specifically, we put forward two versions of the model, for binary and multiclass classification, respectively. DTSVN improves feature extraction and classification performance by replacing a manually selected kernel function with neural networks. Furthermore, to remove the bottleneck that the original model cannot directly handle multiclass tasks, we propose multiclass deep twin support vector networks (MDTSVN), which avoid the inefficient one-vs-rest and one-vs-one strategies. In numerical experiments, DTSVN and MDTSVN are compared with four other methods on the MNIST, Fashion-MNIST and CIFAR-10 datasets. The results demonstrate that DTSVN achieves the best prediction accuracy on binary problems, and MDTSVN significantly outperforms existing shallow and deep methods on multiclass problems.
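
To make the general idea concrete, the following is a minimal PyTorch-style sketch of a deep twin-SVM-style binary classifier: a small neural feature extractor stands in for the hand-picked kernel, and two linear "twin" hyperplane heads are trained with a TSVM-like objective. The names (DeepTwinSVN, twin_hinge_loss), the layer sizes and the exact loss form are illustrative assumptions, not the authors' DTSVN formulation from the paper.

    import torch
    import torch.nn as nn

    class DeepTwinSVN(nn.Module):
        """Illustrative sketch: shared neural feature extractor followed by
        two 'twin' linear hyperplane heads, one per class."""
        def __init__(self, in_dim, feat_dim=64):
            super().__init__()
            self.features = nn.Sequential(
                nn.Linear(in_dim, 128), nn.ReLU(),
                nn.Linear(128, feat_dim), nn.ReLU(),
            )
            # One hyperplane per class: f_k(x) = w_k^T phi(x) + b_k
            self.plane_pos = nn.Linear(feat_dim, 1)
            self.plane_neg = nn.Linear(feat_dim, 1)

        def forward(self, x):
            z = self.features(x)
            return self.plane_pos(z), self.plane_neg(z)

    def twin_hinge_loss(f_pos, f_neg, y, c=1.0):
        # TSVM-style objective: each plane stays close to its own class
        # (squared term) and keeps the other class at distance >= 1
        # (hinge term). Assumes labels y in {+1, -1} and both classes
        # present in the batch; not the paper's exact formulation.
        pos, neg = (y == 1), (y == -1)
        loss_pos = (f_pos[pos] ** 2).mean() + c * torch.clamp(1.0 + f_pos[neg], min=0).mean()
        loss_neg = (f_neg[neg] ** 2).mean() + c * torch.clamp(1.0 + f_neg[pos], min=0).mean()
        return loss_pos + loss_neg

    @torch.no_grad()
    def predict(model, x):
        # Assign each sample to the class whose hyperplane is nearer
        # (norm scaling of each plane omitted for brevity).
        f_pos, f_neg = model(x)
        return 2 * (f_pos.abs() <= f_neg.abs()).long().squeeze(1) - 1

Training would follow the usual mini-batch pattern (forward pass, twin_hinge_loss, back-propagation with any standard optimizer); the multiclass MDTSVN variant described in the abstract would presumably use one hyperplane head per class rather than two, avoiding the one-vs-rest decomposition.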

Acknowledgments

This work was supported in part by the Graduate Innovation Fund Project of Yunnan University of Finance and Economics (No. 2021YUFEYC081), the Scientific Research Fund Project of the Yunnan Provincial Department of Education (No. 2022Y546), the Scientific Research Fund Project of the Yunnan Provincial Department of Science and Technology (No. 202001AU070064), and the National Natural Science Foundation of China (No. 62006206).

Author information

Corresponding author

Correspondence to Zhiji Yang.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, M., Yang, Z. (2022). Deep Twin Support Vector Networks. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science (LNAI), vol. 13606. Springer, Cham. https://doi.org/10.1007/978-3-031-20503-3_8

  • DOI: https://doi.org/10.1007/978-3-031-20503-3_8

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20502-6

  • Online ISBN: 978-3-031-20503-3

  • eBook Packages: Computer Science, Computer Science (R0)
