Abstract
We consider a class of nonsmooth convex optimization problems in which the objective is a convex differentiable function regularized by the sum of the group reproducing kernel norm and the \(\ell _1\)-norm of the problem variables. This class of problems has many applications in variable selection, such as the group LASSO and the sparse group LASSO. In this paper, we propose a proximal Landweber Newton method for this class of convex optimization problems and analyze its convergence and computational complexity. Theoretical analysis and numerical results show that the proposed algorithm is promising.
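As a concrete illustration of the problem class above: for non-overlapping groups, the sparse group LASSO regularizer takes the form \(\lambda _1\|x\|_1 + \lambda _2\sum _g \|x_g\|_2\), and its proximal operator has a known closed form obtained by componentwise soft-thresholding followed by blockwise group shrinkage. The sketch below implements that standard closed form (it is not the paper's proximal Landweber Newton algorithm, only the prox subproblem such methods rely on); the function name and NumPy usage are illustrative assumptions.

```python
import numpy as np

def prox_sparse_group(x, groups, lam1, lam2):
    """Proximal operator of lam1*||x||_1 + lam2*sum_g ||x_g||_2.

    Standard closed form for non-overlapping groups:
    componentwise soft-thresholding, then group shrinkage.
    """
    # componentwise soft-threshold for the l1 term
    z = np.sign(x) * np.maximum(np.abs(x) - lam1, 0.0)
    out = np.zeros_like(z)
    for g in groups:                    # g: index array of one group
        norm = np.linalg.norm(z[g])
        if norm > lam2:                 # shrink the whole block toward zero
            out[g] = (1.0 - lam2 / norm) * z[g]
        # else: the block is zeroed out entirely (group sparsity)
    return out
```

Note how the operator produces sparsity at two levels, matching the motivation for combining the two norms: the \(\ell _1\) term zeros individual coefficients, while the group term can zero out an entire block at once.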
Acknowledgments
Haibin Zhang was supported by the National Science Foundation of China (Grant No. 61179033). Yun-Bin Zhao was supported by the Engineering and Physical Sciences Research Council (EPSRC) under grant EP/K00946X/1.
Zhang, HB., Jiang, JJ. & Zhao, YB. On the proximal Landweber Newton method for a class of nonsmooth convex problems. Comput Optim Appl 61, 79–99 (2015). https://doi.org/10.1007/s10589-014-9703-7