
A descent method for least absolute deviation lasso problems

  • Original Paper
  • Published in Optimization Letters

Abstract

Variable selection is an important technique for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme because its solutions can be obtained explicitly, least absolute deviation (LAD) regression combined with a lasso penalty, denoted LAD-LASSO, has become popular for its resistance to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method to solve the resulting nonsmooth optimization problem. Numerical experiments confirm that the proposed method is more efficient than existing methods.
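The paper's own descent method is not reproduced on this page, so the sketch below is only an illustration of the problem class the abstract describes: the LAD-LASSO objective (an L1 residual loss plus an L1 penalty) minimized by a generic subgradient descent baseline. All function names, step sizes, and the toy data here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lad_lasso_objective(beta, X, y, lam):
    """LAD-LASSO objective: sum |y - X beta| + lam * sum |beta|."""
    return np.sum(np.abs(y - X @ beta)) + lam * np.sum(np.abs(beta))

def subgradient_descent(X, y, lam, n_iter=5000, step0=0.1):
    """Generic subgradient descent for the nonsmooth LAD-LASSO objective.

    This is a baseline solver, NOT the paper's descent method; it uses a
    diminishing step size and tracks the best iterate seen so far.
    """
    n, p = X.shape
    beta = np.zeros(p)
    best_beta = beta.copy()
    best_val = lad_lasso_objective(beta, X, y, lam)
    for k in range(1, n_iter + 1):
        r = y - X @ beta
        # A subgradient of the L1 loss w.r.t. beta is -X^T sign(r);
        # sign(beta) is a valid subgradient choice for the L1 penalty.
        g = -X.T @ np.sign(r) + lam * np.sign(beta)
        beta = beta - (step0 / np.sqrt(k)) * g
        val = lad_lasso_objective(beta, X, y, lam)
        if val < best_val:
            best_val, best_beta = val, beta.copy()
    return best_beta

# Toy data: sparse true coefficients with heavy-tailed (Laplace) noise,
# the setting in which LAD regression is more robust than least squares.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + rng.laplace(scale=0.3, size=200)
beta_hat = subgradient_descent(X, y, lam=5.0)
```

Subgradient descent converges slowly on nonsmooth problems, which is precisely the motivation for faster specialized methods such as the one proposed in the paper.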



Acknowledgements

This work was partially supported by RGC Grants PolyU 152200/14E and PolyU 4-ZZGS. The first author was also supported by the National Natural Science Foundation of China (No. 61673078) and a grant from Chongqing Normal University (No. 17XLB010).

Author information


Corresponding author

Correspondence to Ka Fai Cedric Yiu.


About this article


Cite this article

Shi, Y., Feng, Z. & Yiu, K.F.C. A descent method for least absolute deviation lasso problems. Optim Lett 13, 543–559 (2019). https://doi.org/10.1007/s11590-017-1157-2

