Abstract
The recently proposed semi-supervised least squares support vector machine (SLS-SVM) extends the support vector machine (SVM) to the semi-supervised learning setting. However, the support values in SLS-SVM are all nonzero, so its solution lacks sparseness. To overcome this drawback, a weighted SLS-SVM (WSLS-SVM) is proposed in this paper, in which the influence of labeled and unlabeled samples is controlled by weighting the corresponding errors. It is essentially a pruning method based on the sorted weights of the estimation errors. To solve the proposed classifier, an efficient progressive learning algorithm is presented that reduces the number of iterations. Experimental results on several benchmark data sets confirm the sparseness and effectiveness of the proposed method.
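The core idea sketched in the abstract, weighting each sample's error and pruning by the sorted error magnitudes, can be illustrated roughly as follows. This is a minimal sketch, not the paper's method: the RBF kernel, the parameters `gamma`, `sigma`, and `keep_ratio`, and the single prune-then-retrain step are illustrative assumptions; the paper's progressive learning algorithm and its specific weighting of labeled versus unlabeled samples are not reproduced here.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def weighted_lssvm_train(X, y, weights, gamma=1.0, sigma=1.0):
    """Solve the weighted LS-SVM linear system
        [[0, 1^T], [1, K + diag(1/(gamma*v_i))]] [b; alpha] = [0; y],
    where v_i is the weight assigned to sample i."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.diag(1.0 / (gamma * weights))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def prune_and_retrain(X, y, weights, keep_ratio=0.8, gamma=1.0, sigma=1.0):
    """Train once, rank samples by the weighted estimation error
    |e_i| = |alpha_i| / (gamma * v_i), drop the smallest, and retrain
    on the kept subset to obtain a sparser model."""
    _, alpha = weighted_lssvm_train(X, y, weights, gamma, sigma)
    err = np.abs(alpha) / (gamma * weights)
    keep = np.argsort(err)[::-1][: int(keep_ratio * len(y))]
    b, alpha_k = weighted_lssvm_train(X[keep], y[keep], weights[keep],
                                      gamma, sigma)
    return b, alpha_k, X[keep]

def predict(X_test, X_sv, alpha, b, sigma=1.0):
    """Sign of the LS-SVM decision function over the kept samples."""
    return np.sign(rbf_kernel(X_test, X_sv, sigma) @ alpha + b)
```

In standard LS-SVM every training sample enters the decision function because every alpha_i is nonzero; discarding the samples whose (weighted) error contribution is smallest is the usual route to sparseness, and the weights additionally let labeled and unlabeled samples be emphasized differently.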
Acknowledgements
This work was supported by the Natural Science Foundation of Jiangsu Province (20140216) and by the Key Projects of Science and Technology Research of the Henan Education Department (14B520024).
Cite this article
Liu, Z., Liu, H. & Zhao, Z. Weighted Least Squares Support Vector Machine for Semi-supervised Classification. Wireless Pers Commun 103, 797–808 (2018). https://doi.org/10.1007/s11277-018-5478-y