
On the proximal Landweber Newton method for a class of nonsmooth convex problems

Published in: Computational Optimization and Applications

Abstract

We consider a class of nonsmooth convex optimization problems in which the objective is a convex differentiable function regularized by the sum of the group reproducing kernel norm and the \(\ell _1\)-norm of the problem variables. This class of problems has many applications in variable selection, such as the group LASSO and the sparse group LASSO. In this paper, we propose a proximal Landweber Newton method for this class of convex optimization problems and carry out a convergence and computational-complexity analysis for the method. Theoretical analysis and numerical results show that the proposed algorithm is promising.
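For concreteness, the sparse group LASSO mentioned above is one instance of this problem class. The following is a minimal sketch, assuming a least-squares loss for the differentiable term and non-overlapping groups \(\mathcal{G}\); the data matrix \(A\), response vector \(b\), group weights \(w_g\), and regularization parameters \(\lambda _1, \lambda _2 \ge 0\) are illustrative notation, not taken from the paper:

\[
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 \;+\; \lambda _1 \sum_{g \in \mathcal{G}} w_g \|x_g\|_2 \;+\; \lambda _2 \|x\|_1 ,
\]

where \(x_g\) denotes the subvector of \(x\) indexed by group \(g\). Setting \(\lambda _2 = 0\) recovers the ordinary group LASSO, while a single group per variable reduces the model to the standard LASSO.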




Acknowledgments

Hai-Bin Zhang was supported by the National Science Foundation of China (Grant No. 61179033). Yun-Bin Zhao was supported by the Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/K00946X/1.

Author information

Correspondence to Hai-Bin Zhang.


About this article


Cite this article

Zhang, HB., Jiang, JJ. & Zhao, YB. On the proximal Landweber Newton method for a class of nonsmooth convex problems. Comput Optim Appl 61, 79–99 (2015). https://doi.org/10.1007/s10589-014-9703-7
