Abstract
Unlike the classical Lasso, the adaptive Lasso enjoys the oracle property: it performs as well as if the true underlying model were known in advance. To let the feature subset selected by the adaptive Lasso retain more local information, which is discriminative and beneficial for classification, we propose the Manifold-regularized Adaptive Lasso (MrALasso) for feature selection. The reconstruction of the response as a linear combination of features is considered on a manifold embedded in the high-dimensional space. A similarity graph over the data points is built, and connected points are constrained to stay as close together as possible, so that the intrinsic geometry and local structure of the data are preserved. An effective iterative algorithm, with a detailed proof of convergence, is given to solve the resulting optimization problem. Experimental results on several classical gene-expression datasets demonstrate the effectiveness and superiority of the proposed method for feature selection.
B. Luo—This work was supported in part by the National Natural Science Foundation of China under Grants 61472002, 61572030, and 61671018, and by the Collegiate Natural Science Fund of Anhui Province under Grant KJ2017A014.
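The method described in the abstract can be sketched in code. The exact objective and solver used in the paper are not reproduced here; the sketch below assumes a common formulation: a squared-error data term, an adaptive L1 penalty with weights taken from a ridge pilot estimate, and a graph-Laplacian term on the reconstructed response built from a k-nearest-neighbour similarity graph. The function name `mralasso`, the 0/1 graph weights, and the proximal-gradient (ISTA) solver are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def mralasso(X, y, lam=0.1, mu=0.1, gamma=1.0, k=5, n_iter=500):
    """Sketch of a Manifold-regularized Adaptive Lasso (assumed objective):

        min_w  0.5*||y - Xw||^2 + 0.5*mu*(Xw)^T L (Xw) + lam * sum_j a_j |w_j|

    where L is the graph Laplacian of a k-NN similarity graph over the rows
    of X and a_j = 1/|w_pilot_j|^gamma are adaptive weights.
    """
    n, p = X.shape

    # k-NN similarity graph with 0/1 weights, symmetrized
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]   # skip self (distance 0)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = 1.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(1)) - W                  # unnormalized graph Laplacian

    # adaptive weights from a ridge pilot estimate
    w0 = np.linalg.solve(X.T @ X + 1e-3 * np.eye(p), X.T @ y)
    a = 1.0 / (np.abs(w0) ** gamma + 1e-8)

    # proximal gradient (ISTA): smooth part is 0.5*w^T A w - b^T w + const
    A = X.T @ X + mu * X.T @ L @ X
    b = X.T @ y
    step = 1.0 / np.linalg.norm(A, 2)          # 1 / Lipschitz constant
    w = np.zeros(p)
    for _ in range(n_iter):
        z = w - step * (A @ w - b)             # gradient step on smooth part
        # soft-thresholding = prox of the weighted L1 penalty
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam * a, 0.0)
    return w
```

The manifold term enters only through the smooth quadratic `A`, so any Lasso solver that handles a weighted L1 penalty (e.g. coordinate descent) could replace the ISTA loop.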
© 2018 Springer Nature Switzerland AG
Chen, S.-B., Zhang, Y.-M., Luo, B. (2018). Manifold-Regularized Adaptive Lasso. In: Ren, J., et al. (eds.) Advances in Brain Inspired Cognitive Systems. BICS 2018. Lecture Notes in Computer Science, vol. 10989. Springer, Cham. https://doi.org/10.1007/978-3-030-00563-4_53
Print ISBN: 978-3-030-00562-7
Online ISBN: 978-3-030-00563-4