Abstract
In this paper, we propose adaptive \(L_p\) (\(0<p<1\)) estimators for sparse, high-dimensional linear regression models in which the number of covariates depends on the sample size. Unlike the case where the number of covariates is smaller than the sample size, we prove that, under appropriate conditions, these adaptive \(L_p\) estimators possess the oracle property even when the number of covariates is much larger than the sample size. We present a series of experiments demonstrating the strong performance of the adaptive \(L_p\) regularized estimator in comparison with \(L_1\) regularization, adaptive \(L_1\) regularization, and non-adaptive \(L_p\) regularization with \(0<p<1\), as well as its broad applicability to variable selection, signal recovery, and shape reconstruction.
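For concreteness, a common formulation of such an adaptive \(L_p\) estimator (a sketch in our own notation; the exact weighting scheme and tuning used in the paper may differ) is the minimizer of a weighted \(L_p\) penalized least-squares criterion:

\[
\hat{\beta} = \mathop{\arg\min}_{\beta \in \mathbb{R}^{d_n}} \; \|y - X\beta\|_2^2 \;+\; \lambda_n \sum_{j=1}^{d_n} w_j |\beta_j|^p, \qquad 0 < p < 1,
\]

where \(d_n\) denotes the number of covariates, \(\lambda_n > 0\) is a tuning parameter, and the data-dependent weights \(w_j\) are typically constructed from an initial estimate \(\tilde{\beta}\), for example \(w_j = |\tilde{\beta}_j|^{-\gamma}\) with \(\gamma > 0\), so that coefficients with large preliminary estimates receive a lighter penalty.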
Acknowledgments
This work is jointly supported by the Natural Science Foundation of China (NSFC) under Grant No. 61673119 and the Shanghai Committee of Science and Technology, China, under Grant No. 14DZ1118700.
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Shi, Y., He, X., Wu, H., Jin, ZX., Lu, W. (2017). Adaptive \(L_p\) \((0<p<1)\) Regularization: Oracle Property and Applications. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol. 10634. Springer, Cham. https://doi.org/10.1007/978-3-319-70087-8_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70086-1
Online ISBN: 978-3-319-70087-8