A greedy screening test strategy to accelerate solving LASSO problems with small regularization parameters

Methodologies and Application · Published in Soft Computing

Abstract

In the era of big data, marked by high dimensionality and large sample sizes, least absolute shrinkage and selection operator (LASSO) problems demand efficient algorithms. Both static and dynamic strategies based on the screening test principle have recently been proposed to safely filter irrelevant atoms out of the dictionary. However, such strategies work well only for LASSO problems with large regularization parameters and lose their efficiency when the regularization parameter is small. This paper presents a novel greedy screening test strategy that accelerates solving LASSO problems with small regularization parameters: it adopts a relatively larger regularization parameter to filter out irrelevant atoms in every iteration. Furthermore, a convergence proof of the greedy strategy is given, and the computational complexity of LASSO solvers integrated with this strategy is investigated. Numerical experiments on both synthetic and real data sets support the effectiveness of the greedy strategy, and the results show that it outperforms both the static and dynamic strategies for LASSO problems with small regularization parameters.
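The screening-test principle the abstract builds on can be illustrated with a minimal sketch: a basic static SAFE-style test discards atoms whose correlation with the response is provably too small to enter the LASSO solution, and the solver then runs on the reduced dictionary. This is an illustrative sketch, not the paper's greedy strategy; the helper names (`safe_screen`, `ista_lasso`, `screened_lasso`) and the choice of ISTA as the base solver are our assumptions.

```python
import numpy as np

def safe_screen(X, y, lam):
    """Basic static SAFE test: discard atom j if
    |x_j^T y| < lam - ||x_j|| * ||y|| * (lam_max - lam) / lam_max,
    where lam_max = max_j |x_j^T y|. Returns a boolean keep-mask."""
    c = X.T @ y
    lam_max = np.max(np.abs(c))
    norms = np.linalg.norm(X, axis=0)
    thresh = lam - norms * np.linalg.norm(y) * (lam_max - lam) / lam_max
    return np.abs(c) >= thresh

def ista_lasso(X, y, lam, n_iter=500):
    """Plain ISTA for (1/2)||y - X b||^2 + lam * ||b||_1."""
    L = np.linalg.norm(X, 2) ** 2       # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = X.T @ (X @ b - y)           # gradient of the quadratic term
        z = b - g / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return b

def screened_lasso(X, y, lam, n_iter=500):
    """Screen the dictionary once, solve on the kept atoms, re-embed."""
    keep = safe_screen(X, y, lam)
    b = np.zeros(X.shape[1])
    b[keep] = ista_lasso(X[:, keep], y, lam, n_iter)
    return b
```

Because the test is safe, the screened solve returns the same solution as the full solve while working on fewer columns; as the abstract notes, the test becomes uninformative when `lam` is small relative to `lam_max`, which is the regime the paper's greedy strategy targets by screening with a larger surrogate parameter at each iteration.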



Acknowledgements

This work is supported by the Macau Science and Technology Development Funds (Grant No. 003/2016/AFJ) from the Macau Special Administrative Region of the People’s Republic of China.

Author information

Corresponding author

Correspondence to Yong Liang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest regarding this work, and no commercial or associative interest that represents a conflict of interest in connection with the work submitted.

Additional information

Communicated by V. Loia.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Shen, HW., Chai, H., Xia, LY. et al. A greedy screening test strategy to accelerate solving LASSO problems with small regularization parameters. Soft Comput 24, 5245–5253 (2020). https://doi.org/10.1007/s00500-019-04275-x
