
An Efficient Hessian Based Algorithm for Singly Linearly and Box Constrained Least Squares Regression

Published in: Journal of Scientific Computing

Abstract

Singly linearly and box constrained least squares regression has diverse applications in various fields. Building on previous work, this paper develops an efficient and robust semismooth Newton based augmented Lagrangian (Ssnal) algorithm for solving this problem, in which a semismooth Newton (Ssn) algorithm with superlinear or even quadratic convergence is applied to the subproblems. Theoretically, the global convergence and the asymptotically superlinear local convergence of the Ssnal algorithm hold automatically under standard conditions. Computationally, a generalized Jacobian of the projector onto the feasible set is shown to be either diagonal or diagonal-minus-rank-1, which is a key ingredient in the efficiency of the Ssnal algorithm. Numerical experiments on both synthetic and real data sets demonstrate that the Ssnal algorithm is much more efficient and robust than several state-of-the-art first-order algorithms.
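To illustrate the projector discussed above, the following sketch computes the projection onto a feasible set of the form {x : eᵀx = c, l ≤ x ≤ u} (a single linear equality with box constraints, taking the constraint vector to be the all-ones vector e for simplicity) by bisection on the Lagrange multiplier of the linear constraint. This is a minimal, standard continuous-quadratic-knapsack style sketch, not the paper's Ssn solver; the function name and the choice of bisection are assumptions for illustration only.

```python
import numpy as np

def project_box_simplex(y, l, u, c, tol=1e-12, max_iter=200):
    """Project y onto {x : sum(x) = c, l <= x <= u}.

    The KKT conditions give x(lam) = clip(y - lam, l, u), where the scalar
    multiplier lam is chosen so that sum(x(lam)) = c.  The map
    phi(lam) = sum(clip(y - lam, l, u)) is nonincreasing in lam, so a
    bracketing bisection locates the root.  Assumes sum(l) <= c <= sum(u).
    """
    lo = np.min(y - u)  # lam <= y_i - u_i for all i  =>  phi(lo) = sum(u) >= c
    hi = np.max(y - l)  # lam >= y_i - l_i for all i  =>  phi(hi) = sum(l) <= c
    for _ in range(max_iter):
        lam = 0.5 * (lo + hi)
        phi = np.clip(y - lam, l, u).sum()
        if phi > c:
            lo = lam  # multiplier too small; sum still above c
        else:
            hi = lam
        if hi - lo < tol:
            break
    return np.clip(y - 0.5 * (lo + hi), l, u)
```

Writing D for the 0-1 diagonal matrix marking the components of y - lam strictly between their bounds, formally differentiating x = clip(y - lam(y), l, u) with respect to y yields (under suitable nondegeneracy) a matrix of the form D - D e eᵀ D / (eᵀ D e), i.e. a diagonal matrix minus a rank-1 correction. This matches the diagonal-minus-rank-1 structure of the generalized Jacobian that the abstract identifies as the key to the efficiency of the Ssn solver.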


Notes

  1. http://leo.ugr.es/elvira/DBCRepository/index.html

  2. https://archive.ics.uci.edu/ml/index.php

  3. https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/


Acknowledgements

The work of Yong-Jin Liu was in part supported by the National Natural Science Foundation of China (Grant No. 11871153) and the Natural Science Foundation of Fujian Province of China (Grant No. 2019J01644).

Author information

Corresponding author: Yong-Jin Liu.



Cite this article

Lin, L., Liu, YJ. An Efficient Hessian Based Algorithm for Singly Linearly and Box Constrained Least Squares Regression. J Sci Comput 88, 26 (2021). https://doi.org/10.1007/s10915-021-01541-9
