Abstract
Singly linearly and box constrained least squares regression arises in diverse applications across many fields. Building on earlier work, this paper develops an efficient and robust semismooth Newton based augmented Lagrangian (Ssnal) algorithm for this problem, in which a semismooth Newton (Ssn) method with superlinear or even quadratic convergence solves the inner subproblems. Theoretically, global convergence and asymptotic superlinear local convergence of the Ssnal algorithm hold under standard conditions. Computationally, a generalized Jacobian of the projector onto the feasible set is shown to be either diagonal or diagonal-minus-rank-1, which is a key ingredient in the efficiency of the Ssnal algorithm. Numerical experiments on both synthetic and real data sets demonstrate that the Ssnal algorithm is considerably more efficient and robust than several state-of-the-art first-order algorithms.
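The feasible set in question has the form {x : aᵀx = b, l ≤ x ≤ u}, so evaluating the projector amounts to solving a continuous quadratic knapsack problem: the projection of y is clip(y + λa, l, u), where the scalar multiplier λ is chosen so that the single linear constraint holds. As a minimal illustration (not the authors' implementation — they use a semismooth Newton scheme exploiting the diagonal-minus-rank-1 Jacobian structure), the sketch below computes this projection by bisection on λ, using the fact that λ ↦ aᵀ clip(y + λa, l, u) is nondecreasing; the function name and tolerances are hypothetical:

```python
import numpy as np

def proj_box_hyperplane(y, a, b, l, u, tol=1e-10, max_iter=200):
    """Project y onto {x : a @ x = b, l <= x <= u} (assumed nonempty)
    via bisection on the multiplier lam of the linear constraint."""
    def phi(lam):
        # Nondecreasing in lam: each term a_i * clip(y_i + lam*a_i, ...)
        # has slope a_i**2 >= 0 wherever the clip is inactive.
        return a @ np.clip(y + lam * a, l, u) - b

    # Bracket the root of phi.
    lo, hi = -1.0, 1.0
    while phi(lo) > 0:
        lo *= 2.0
    while phi(hi) < 0:
        hi *= 2.0

    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if phi(mid) < 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return np.clip(y + 0.5 * (lo + hi) * a, l, u)
```

With a = (1, …, 1), b = 1, l = 0, u = 1 this reduces to projection onto the standard simplex. Semismooth Newton applied to φ, as in the cited knapsack literature, converges far faster than bisection; the bisection version is only meant to make the structure of the projector concrete.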
Acknowledgements
The work of Yong-Jin Liu was in part supported by the National Natural Science Foundation of China (Grant No. 11871153) and the Natural Science Foundation of Fujian Province of China (Grant No. 2019J01644).
Cite this article
Lin, L., Liu, YJ. An Efficient Hessian Based Algorithm for Singly Linearly and Box Constrained Least Squares Regression. J Sci Comput 88, 26 (2021). https://doi.org/10.1007/s10915-021-01541-9