
A new double-regularized regression using Liu and lasso regularization

  • Original paper
  • Published in Computational Statistics

Abstract

This paper discusses a new estimator that performs simultaneous parameter estimation and variable selection within the scope of penalized regression methods. The estimator is an extension of the Liu estimator with \(\ell _{1}\)-norm penalization. We give a coordinate descent algorithm to estimate the coefficient vector of the proposed estimator efficiently. We also examine the consistency properties of the estimator. We conduct simulation studies and two real data analyses to compare the proposed estimator with several estimators, including the ridge, Liu, lasso and elastic net. The simulation studies and real data analyses show that, besides performing automatic variable selection, the new estimator delivers strong prediction performance with a small mean squared error under both sparse and non-sparse data structures.
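The abstract does not reproduce the estimator's objective function, so the following is only a minimal sketch of how a coordinate descent solver for a combined Liu and lasso penalty might look. It assumes the hypothetical objective \(\tfrac{1}{2}\Vert y - X\beta \Vert ^2 + \tfrac{1}{2}\Vert \beta - d\,\hat{\beta }_{ref}\Vert ^2 + \lambda \Vert \beta \Vert _1\), where \(\hat{\beta }_{ref}\) is a preliminary estimate (for example the least squares fit), \(0 \le d \le 1\) is the Liu parameter and \(\lambda \) controls the \(\ell _{1}\) penalty; the objective, function names and parameter choices below are illustrative assumptions, not the paper's definitions.

import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def liu_lasso_cd(X, y, beta_ref, d=0.5, lam=0.1, n_iter=200, tol=1e-8):
    # Coordinate descent for the assumed objective
    #   0.5*||y - X b||^2 + 0.5*||b - d*beta_ref||^2 + lam*||b||_1
    # Illustrative sketch only; not the estimator as defined in the paper.
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)        # ||x_j||^2 for each column
    resid = y - X @ beta                 # current full residual
    for _ in range(n_iter):
        beta_old = beta.copy()
        for j in range(p):
            # partial residual: add back the contribution of coordinate j
            r_j = resid + X[:, j] * beta[j]
            # univariate problem: 0.5*(||x_j||^2 + 1)*b^2 - (x_j'r_j + d*beta_ref_j)*b + lam*|b|
            z = X[:, j] @ r_j + d * beta_ref[j]
            beta[j] = soft_threshold(z, lam) / (col_sq[j] + 1.0)
            resid = r_j - X[:, j] * beta[j]
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta

# Toy usage with a sparse true coefficient vector
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
beta_ref = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS as the reference estimate
print(liu_lasso_cd(X, y, beta_ref, d=0.7, lam=5.0))

Under this assumed objective, each coordinate update reduces to a soft-thresholding step, so the \(\ell _{1}\) part yields exact zeros (automatic variable selection) while the Liu-type quadratic term shrinks the remaining coefficients toward \(d\,\hat{\beta }_{ref}\).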




Author information

Correspondence to Murat Genç.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Genç, M. A new double-regularized regression using Liu and lasso regularization. Comput Stat 37, 159–227 (2022). https://doi.org/10.1007/s00180-021-01120-4
