Abstract
Semi-supervised learning has attracted great attention in machine learning because it makes full use of both labeled and unlabeled data for training. Most semi-supervised learning methods are unsuitable for regression because the labels in regression are real-valued and smooth. In this paper, a hypergraph, rather than an ordinary graph, is used to represent the geometric structure of the data. A manifold regularization term is constructed from the hypergraph Laplacian and introduced into the regularization framework of kernel learning, yielding a hypergraph regularized support vector regression (HGSVR). Moreover, we propose a two-layer maximum density minimum redundancy (MDMR) method to pre-select the initial labeled data, which fully considers both the density and the redundancy of the data. Incorporating this pre-selection into HGSVR gives a second semi-supervised regression method, MDMR-HGSVR. Experimental results on 9 UCI datasets show that HGSVR and MDMR-HGSVR outperform the other compared semi-supervised regression methods.
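The manifold regularization term described above rests on a hypergraph Laplacian. As a rough illustration (not the authors' code), the normalized hypergraph Laplacian of Zhou et al. (2006), Δ = I − D_v^{−1/2} H W D_e^{−1} Hᵀ D_v^{−1/2}, can be sketched as follows; the function name and the toy incidence matrix are assumptions for the example:

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian in the style of Zhou et al. (2006).

    H : (n_vertices, n_edges) binary incidence matrix (H[v, e] = 1 iff
        vertex v belongs to hyperedge e).
    w : optional hyperedge weights; defaults to uniform weights.
    """
    n, m = H.shape
    w = np.ones(m) if w is None else np.asarray(w, dtype=float)
    d_v = H @ w                 # vertex degrees: weighted count of incident edges
    d_e = H.sum(axis=0)         # edge degrees: number of vertices per hyperedge
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    De_inv = np.diag(1.0 / d_e)
    Theta = Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt
    return np.eye(n) - Theta    # Delta = I - Theta

# Toy hypergraph: 4 vertices, hyperedges {0, 1, 2} and {2, 3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
L = hypergraph_laplacian(H)
```

In a HGSVR-style method this matrix would enter the objective through a regularizer of the form fᵀ L f on the predicted values, penalizing predictions that vary sharply within a hyperedge.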














Data availability
Some or all data, models, or code generated or used during the study are available from the corresponding author by request (Yuting Sun).
Acknowledgements
This work is supported by the National Natural Science Foundation of China (nos. 61976216, 62276265).
About this article
Cite this article
Ding, S., Sun, Y., Zhang, J. et al. Maximum density minimum redundancy based hypergraph regularized support vector regression. Int. J. Mach. Learn. & Cyber. 14, 1933–1950 (2023). https://doi.org/10.1007/s13042-022-01738-w