Abstract
Variable selection is an important technique for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme because of its flexibility in obtaining explicit solutions, least absolute deviation (LAD) regression combined with the lasso penalty, known as LAD-LASSO, has become popular for its resistance to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method for solving the resulting nonsmooth optimization problem. Numerical experiments confirm that the proposed method is more efficient than existing methods.
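The LAD-LASSO estimator described in the abstract minimizes the sum of absolute residuals plus an l1 penalty, min over beta of ||y - X*beta||_1 + lambda*||beta||_1. This objective can be cast as a linear program by introducing slack variables bounding each absolute value. The sketch below illustrates that baseline LP reformulation with SciPy; it is not the paper's descent method, and the function name, variable layout, and choice of the HiGHS solver are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_beta ||y - X beta||_1 + lam * ||beta||_1 as a linear program.

    Decision variables z = [beta, u, v], where u >= |y - X beta| and
    v >= |beta| componentwise; the objective becomes sum(u) + lam * sum(v).
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(n), lam * np.ones(p)])
    I_n, I_p = np.eye(n), np.eye(p)
    Zn, Zp = np.zeros((n, p)), np.zeros((p, n))
    A_ub = np.block([
        [ X,   -I_n,  Zn],    #  X beta - u <=  y
        [-X,   -I_n,  Zn],    # -X beta - u <= -y
        [ I_p,  Zp,  -I_p],   #  beta - v <= 0
        [-I_p,  Zp,  -I_p],   # -beta - v <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    # beta is free; the slack variables u and v are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]
```

With lambda near zero this reduces to plain LAD regression (an intercept-only fit recovers the sample median), while a large lambda shrinks the coefficients to exactly zero, which is the sparsity behavior the paper exploits for variable selection.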
Acknowledgements
This paper is partially supported by RGC Grants PolyU 152200/14E and PolyU 4-ZZGS. The first author is also supported by the National Natural Science Foundation of China (No. 61673078) and a grant of Chongqing Normal University (No. 17XLB010).
Cite this article
Shi, Y., Feng, Z. & Yiu, K.F.C. A descent method for least absolute deviation lasso problems. Optim Lett 13, 543–559 (2019). https://doi.org/10.1007/s11590-017-1157-2