
Weighted Least Squares Support Vector Machine for Semi-supervised Classification

Published in Wireless Personal Communications.

Abstract

The recently proposed semi-supervised least squares support vector machine (SLS-SVM) extends the support vector machine (SVM) to the semi-supervised learning setting. However, the support values in SLS-SVM are generally nonzero, so the solution lacks sparseness. To overcome this drawback, a weighted SLS-SVM (WSLS-SVM) is proposed in this paper, in which the impact of labeled and unlabeled samples is controlled by weighting the corresponding errors. It is essentially a pruning method that operates on the sorted weights of the estimation errors. To solve the proposed classifier, an efficient progressive learning algorithm is presented that reduces the number of iterations. Experimental results on several benchmark data sets confirm the sparseness and effectiveness of the proposed method.
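The weighting-and-pruning idea described above can be sketched in a few lines of NumPy. This is a minimal supervised illustration, not the authors' WSLS-SVM: the RBF kernel, the hyperparameters (C, gamma), the uniform initial weights, and the 50% pruning fraction are all illustrative assumptions, and both the unlabeled samples and the progressive learning algorithm are omitted. It shows why pruning is needed at all: in an LS-SVM, every training sample receives a nonzero support value, so sparseness must be imposed by discarding samples with small |alpha| and retraining.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_lssvm(X, y, weights, C=10.0, gamma=0.5):
    """Solve the weighted LS-SVM linear system.

    Sample i's squared error is weighted by weights[i], so the
    regularisation term on the diagonal becomes 1 / (C * weights[i]).
    Returns the bias b and the support values alpha.
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y                      # top row: label constraint
    A[1:, 0] = y                      # first column: label constraint
    A[1:, 1:] = Omega + np.diag(1.0 / (C * weights))
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]

def predict(X_train, y, alpha, b, X_test, gamma=0.5):
    """Sign of the LS-SVM decision function."""
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ (alpha * y) + b)

# Toy data: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])

w = np.ones(len(y))                   # uniform weights for illustration
b, alpha = train_lssvm(X, y, w)

# Prune: drop the half of the samples with the smallest |alpha| and
# retrain on the rest, yielding a sparser model.
keep = np.argsort(np.abs(alpha))[len(y) // 2:]
b2, alpha2 = train_lssvm(X[keep], y[keep], w[keep])
acc = (predict(X[keep], y[keep], alpha2, b2, X) == y).mean()
```

After pruning, the model is expressed in terms of only half of the original samples, while accuracy on this easily separable toy problem remains high.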

Fig. 1
Fig. 2



Acknowledgements

This work is supported by the Natural Science Foundation of Jiangsu (20140216) and by the Key Projects of Science and Technology Research of the Henan Education Department (14B520024).

Author information

Corresponding author

Correspondence to Zhanwei Liu.

About this article


Cite this article

Liu, Z., Liu, H. & Zhao, Z. Weighted Least Squares Support Vector Machine for Semi-supervised Classification. Wireless Pers Commun 103, 797–808 (2018). https://doi.org/10.1007/s11277-018-5478-y

