Abstract
In this paper, we consider a class of nonlinear equations derived from first-order type methods for solving composite optimization problems. Traditional approaches to establishing superlinear convergence rates of semismooth Newton-type methods for such nonlinear equations typically postulate either nonsingularity of the B-Jacobian or smoothness of the equation. We investigate when each of these conditions can be expected to hold. For the nonsingularity condition, we present equivalent characterizations in broad generality and show that they yield easy-to-check criteria for several examples. For the smoothness condition, we show that it holds locally for a class of residual mappings derived from composite optimization problems. Furthermore, we investigate a relaxed version of the smoothness condition, namely smoothness restricted to certain active manifolds. We present a conceptual algorithm that exploits this structure and prove that it achieves a superlinear convergence rate.
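As an illustrative sketch (our notation, not necessarily the paper's), a prototypical member of this class is the natural residual of the proximal gradient map for \(\min_x f(x)+g(x)\), with \(f\) smooth and \(g\) convex, together with one semismooth Newton step applied to the resulting equation:

% Sketch under assumed notation: residual of the proximal gradient
% (forward-backward) map with step size \(t>0\), and a semismooth
% Newton iteration on \(F_t(x)=0\).
\[
F_t(x) \;=\; x - \operatorname{prox}_{t g}\bigl(x - t\nabla f(x)\bigr),
\qquad
F_t(x^\star)=0 \;\Longleftrightarrow\; 0 \in \nabla f(x^\star) + \partial g(x^\star),
\]
\[
\text{select } J_k \in \partial_B F_t(x^k), \qquad
J_k d^k = -F_t(x^k), \qquad
x^{k+1} = x^k + d^k.
\]

In this sketch, local superlinear convergence of the iteration hinges on nonsingularity of the elements of \(\partial_B F_t(x^\star)\) and on (semi)smoothness of \(F_t\) near \(x^\star\), the two conditions examined above.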



Availability of data and materials
The datasets analyzed during the current study are available in [59].
References
Absil, P.-A., Malick, J.: Projection-like retractions on matrix manifolds. SIAM J. Optim. 22(1), 135–158 (2012)
Attouch, H., Bolte, J., Redont, P., Soubeyran, A.: Proximal alternating minimization and projection methods for nonconvex problems: an approach based on the Kurdyka-Łojasiewicz inequality. Math. Oper. Res. 35(2), 438–457 (2010)
Bareilles, G., Iutzeler, F., Malick, J.: Newton acceleration on manifolds identified by proximal gradient methods. Math. Program., 1–34 (2022)
Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imag. Sci. 2(1), 183–202 (2009)
Bolte, J., Daniilidis, A., Lewis, A., Shiota, M.: Clarke subgradients of stratifiable functions. SIAM J. Optim. 18(2), 556–572 (2007)
Chan, Z.X., Sun, D.: Constraint nondegeneracy, strong regularity, and nonsingularity in semidefinite programming. SIAM J. Optim. 19(1), 370–396 (2008)
Clarke, F.H., Stern, R.J., Wolenski, P.R.: Proximal smoothness and the lower-\(C^2\) property. J. Convex Anal. 2(1–2), 117–144 (1995)
Clarke, F.H.: Nonsmooth analysis and optimization. In: Proceedings of the International Congress of Mathematicians, vol. 5, pp. 847–853 (1983)
Combettes, P.L., Pesquet, J.-C.: Proximal splitting methods in signal processing. In: Fixed-Point Algorithms for Inverse Problems in Science and Engineering, pp. 185–212. Springer (2011)
Daniilidis, A., Hare, W., Malick, J.: Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems. Optimization 55(5–6), 481–503 (2006)
Davis, D., Drusvyatskiy, D., Shi, Z.: Stochastic optimization over proximally smooth sets. arXiv preprint arXiv:2002.06309 (2020)
Davis, D., Yin, W.: Convergence rate analysis of several splitting schemes. In: Splitting Methods in Communication, Imaging, Science, and Engineering, pp. 115–163. Springer (2016)
Drusvyatskiy, D., Ioffe, A.D., Lewis, A.S.: Generic minimizing behavior in semialgebraic optimization. SIAM J. Optim. 26(1), 513–534 (2016)
Drusvyatskiy, D., Lewis, A.S.: Optimality, identifiability, and sensitivity. arXiv preprint arXiv:1207.6628 (2012)
Eckstein, J., Bertsekas, D.P.: On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math. Program. 55(1), 293–318 (1992)
Facchinei, F., Pang, J.-S.: Finite-Dimensional Variational Inequalities and Complementarity Problems. Springer, New York (2003)
Facchinei, F., Fischer, A., Herrich, M.: An LP-Newton method: nonsmooth equations, KKT systems, and nonisolated solutions. Math. Program. 146, 1–36 (2014)
Fan, J., Pan, J.: Inexact Levenberg-Marquardt method for nonlinear equations. Dis. Contin. Dyn. Syst-B 4(4), 1223 (2004)
Fischer, A.: Local behavior of an iterative framework for generalized equations with nonisolated solutions. Math. Program. 94(1), 91–124 (2002)
Fischer, A., Herrich, M., Izmailov, A.F., Solodov, M.V.: Convergence conditions for Newton-type methods applied to complementarity systems with nonisolated solutions. Comput. Optim. Appl. 63, 425–459 (2016)
Fischer, A., Shukla, P., Wang, M.: On the inexactness level of robust Levenberg-Marquardt methods. Optimization 59(2), 273–287 (2010)
Fukushima, M., Mine, H.: A generalized proximal point algorithm for certain non-convex minimization problems. Int. J. Syst. Sci. 12(8), 989–1000 (1981)
Griesse, R., Lorenz, D.A.: A semismooth Newton method for Tikhonov functionals with sparsity constraints. Inverse Prob. 24(3), 035007 (2008)
Hale, E.T., Yin, W., Zhang, Y.: Fixed-point continuation for \(\ell _1\)-minimization: Methodology and convergence. SIAM J. Optim. 19(3), 1107–1130 (2008)
Karimi, H., Nutini, J., Schmidt, M.: Linear convergence of gradient and proximal-gradient methods under the Polyak-Łojasiewicz condition. In: Joint European Conference on Machine Learning and Knowledge Discovery in Databases, pp. 795–811. Springer (2016)
Kummer, B.: Newton’s method for non-differentiable functions. Adv. Math. Optim. 45, 114–125 (1988)
Lewis, A.S.: Active sets, nonsmoothness, and sensitivity. SIAM J. Optim. 13(3), 702–725 (2002)
Lewis, A.S., Malick, J.: Alternating projections on manifolds. Math. Oper. Res. 33(1), 216–234 (2008)
Li, D.-H., Fukushima, M., Qi, L., Yamashita, N.: Regularized Newton methods for convex minimization problems with singular solutions. Comput. Optim. Appl. 28(2), 131–147 (2004)
Li, G., Pong, T.K.: Douglas-Rachford splitting for nonconvex optimization with application to nonconvex feasibility problems. Math. Program. 159(1), 371–401 (2016)
Li, X., Sun, D., Toh, K.-C.: A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems. SIAM J. Optim. 28(1), 433–458 (2018)
Li, X., Sun, D., Toh, K.-C.: An asymptotically superlinearly convergent semismooth Newton augmented Lagrangian method for linear programming. SIAM J. Optim. 30(3), 2410–2440 (2020)
Li, Y., Wen, Z., Yang, C., Yuan, Y.: A semismooth Newton method for semidefinite programs and its applications in electronic structure calculations. SIAM J. Sci. Comput. 40(6), A4131–A4157 (2018)
Liang, J., Fadili, J., Peyré, G.: Local linear convergence of forward-backward under partial smoothness. In: Advances in Neural Information Processing Systems, vol. 27 (2014)
Liang, J., Fadili, J., Peyré, G.: Activity identification and local linear convergence of forward-backward-type methods. SIAM J. Optim. 27(1), 408–437 (2017)
Lin, M., Liu, Y.-J., Sun, D., Toh, K.-C.: Efficient sparse semismooth Newton methods for the clustered Lasso problem. SIAM J. Optim. 29(3), 2026–2052 (2019)
Lions, P.-L., Mercier, B.: Splitting algorithms for the sum of two nonlinear operators. SIAM J. Numer. Anal. 16(6), 964–979 (1979)
Liu, Y., Wen, Z., Yin, W.: A multiscale semi-smooth Newton method for optimal transport. J. Sci. Comput. 91(2), 1–29 (2022)
Luc, D.T., Schaible, S.: Generalized monotone nonsmooth maps. J. Convex Anal. 3, 195–206 (1996)
Luo, Z.-Q., Tseng, P.: On the linear convergence of descent methods for convex essentially smooth minimization. SIAM J. Control. Optim. 30(2), 408–425 (1992)
Luo, Z.-Q., Tseng, P.: Error bounds and convergence analysis of feasible descent methods: a general approach. Ann. Oper. Res. 46(1), 157–178 (1993)
Mifflin, R.: Semismooth and semiconvex functions in constrained optimization. SIAM J. Control. Optim. 15(6), 959–972 (1977)
Milzarek, A.: Numerical methods and second order theory for nonsmooth problems. PhD thesis, Technische Universität München (2016)
Milzarek, A., Ulbrich, M.: A semismooth Newton method with multidimensional filter globalization for \(\ell _1\)-optimization. SIAM J. Optim. 24(1), 298–333 (2014)
Milzarek, A., Xiao, X., Cen, S., Wen, Z., Ulbrich, M.: A stochastic semismooth Newton method for nonsmooth nonconvex optimization. SIAM J. Optim. 29(4), 2916–2948 (2019)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
Pang, J.-S.: A posteriori error bounds for the linearly-constrained variational inequality problem. Math. Oper. Res. 12(3), 474–484 (1987)
Pang, J.-S., Qi, L.: Nonsmooth equations: motivation and algorithms. SIAM J. Optim. 3(3), 443–465 (1993)
Poliquin, R.A., Rockafellar, R.T.: Generalized Hessian properties of regularized nonsmooth functions. SIAM J. Optim. 6(4), 1121–1137 (1996)
Qi, L.: Convergence analysis of some algorithms for solving nonsmooth equations. Math. Oper. Res. 18(1), 227–244 (1993)
Qi, L., Sun, J.: A nonsmooth version of Newton’s method. Math. Program. 58(1), 353–367 (1993)
Rockafellar, R.T.: First- and second-order epi-differentiability in nonlinear programming. Trans. Am. Math. Soc. 307(1), 75–108 (1988)
Rockafellar, R.T., Wets, R.J.-B.: Variational Analysis, vol. 317. Springer Science & Business Media (2009)
Shapiro, A.: On a class of nonsmooth composite functions. Math. Oper. Res. 28(4), 677–692 (2003)
Stella, L., Themelis, A., Patrinos, P.: Forward-backward quasi-Newton methods for nonsmooth optimization problems. Comput. Optim. Appl. 67(3), 443–487 (2017)
Themelis, A., Ahookhosh, M., Patrinos, P.: On the acceleration of forward-backward splitting via an inexact Newton method. In: Splitting Algorithms, Modern Operator Theory, and Applications, pp. 363–412 (2019)
Themelis, A., Stella, L., Patrinos, P.: Forward-backward envelope for the sum of two nonconvex functions: Further properties and nonmonotone linesearch algorithms. SIAM J. Optim. 28(3), 2274–2303 (2018)
Tseng, P.: Approximation accuracy, gradient methods, and error bound for structured convex optimization. Math. Program. 125(2), 263–295 (2010)
Xiao, X., Li, Y., Wen, Z., Zhang, L.: A regularized semi-smooth Newton method with projection steps for composite convex programs. J. Sci. Comput. 76(1), 364–389 (2018)
Yamashita, N., Fukushima, M.: On the rate of convergence of the Levenberg-Marquardt method. In: Topics in Numerical Analysis: With Special Emphasis on Nonlinear Problems, pp. 239–249. Springer (2001)
Yan, M., Yin, W.: Self equivalence of the alternating direction method of multipliers. In: Splitting Methods in Communication, Imaging, Science, and Engineering, pp. 165–194 (2016)
Yang, L., Sun, D., Toh, K.-C.: SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints. Math. Program. Comput. 7(3), 331–366 (2015)
Yang, M., Milzarek, A., Wen, Z., Zhang, T.: A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization. Math. Program., 1–47 (2021)
Yue, M.-C., Zhou, Z., So, A.M.-C.: A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property. Math. Program. 174(1), 327–358 (2019)
Zhao, X.-Y., Sun, D., Toh, K.-C.: A Newton-CG augmented Lagrangian method for semidefinite programming. SIAM J. Optim. 20(4), 1737–1765 (2010)
Zhou, G., Qi, L.: On the convergence of an inexact Newton-type method. Oper. Res. Lett. 34(6), 647–652 (2006)
Zhou, G., Toh, K.-C.: Superlinear convergence of a Newton-type algorithm for monotone equations. J. Optim. Theory Appl. 125(1), 205–221 (2005)
Zhou, Z., So, A.M.-C.: A unified approach to error bounds for structured convex optimization problems. Math. Program. 165(2), 689–728 (2017)
Acknowledgements
The authors are grateful to the (associate) editor and anonymous referees for their detailed and valuable comments and suggestions.
Funding
This work was supported in part by the National Natural Science Foundation of China under the grant numbers 12331010 and 12288101, and National Key Research and Development Program of China under the grant number 2024YFA1012903.
Ethics declarations
Conflict of interest
The authors have no relevant financial interest to disclose.
About this article
Cite this article
Hu, J., Tian, T., Pan, S. et al. On the Analysis of Semismooth Newton-Type Methods for Composite Optimization. J Sci Comput 103, 59 (2025). https://doi.org/10.1007/s10915-025-02867-4