Newton’s Method for Computing the Nearest Correlation Matrix with a Simple Upper Bound

Journal of Optimization Theory and Applications

Abstract

The standard nearest correlation matrix can be computed efficiently by exploiting a recent development of Newton’s method (Qi and Sun in SIAM J. Matrix Anal. Appl. 28:360–385, 2006). Two key mathematical properties that ensure the efficiency of the method are the strong semismoothness of the projection operator onto the positive semidefinite cone and constraint nondegeneracy at every feasible point. In the case where a simple upper bound is enforced on the nearest correlation matrix in order to improve its condition number, it is shown, among other things, that constraint nondegeneracy does not always hold, meaning that Newton’s method may lose its quadratic convergence. Despite this, the numerical results show that Newton’s method remains extremely efficient even for large-scale problems. Through regularization, the developed method is applied to semidefinite programming problems with simple bounds.
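To make the setting concrete: the standard nearest correlation matrix problem is to minimize (1/2)‖X − G‖_F² over symmetric X subject to X_ii = 1 for all i and X positive semidefinite, where G is the given symmetric data matrix. The Python sketch below illustrates the unconstrained dual formulation behind the method of Qi and Sun [6] (see also Malick [2]). It is an illustration only, not the paper’s algorithm: an off-the-shelf quasi-Newton solver stands in for the semismooth Newton method analysed here, and the simple upper bound studied in this paper is not modelled; all function names are ad hoc.

# Minimal sketch of the dual approach to the standard nearest correlation
# matrix problem:  min_y  0.5*||P_+(G + Diag(y))||_F^2 - e'y,
# where P_+ is the projection onto the positive semidefinite cone.
# SciPy's L-BFGS-B is used in place of a semismooth Newton method.
import numpy as np
from scipy.optimize import minimize

def proj_psd(A):
    """Projection onto the positive semidefinite cone (strongly semismooth)."""
    w, V = np.linalg.eigh((A + A.T) / 2.0)
    return (V * np.maximum(w, 0.0)) @ V.T

def dual_obj_grad(y, G):
    """Dual objective theta(y) and its gradient for the standard problem."""
    Xplus = proj_psd(G + np.diag(y))
    theta = 0.5 * np.sum(Xplus * Xplus) - np.sum(y)
    grad = np.diag(Xplus) - 1.0          # diag(P_+(G + Diag(y))) - e
    return theta, grad

def nearest_correlation(G):
    """Nearest correlation matrix to the symmetric matrix G (no upper bound)."""
    n = G.shape[0]
    res = minimize(dual_obj_grad, np.zeros(n), args=(G,), jac=True,
                   method="L-BFGS-B")
    return proj_psd(G + np.diag(res.x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    G = (A + A.T) / 2.0                              # symmetric, indefinite data
    X = nearest_correlation(G)
    print(np.round(np.diag(X), 6))                   # unit diagonal
    print(np.linalg.eigvalsh(X).min() >= -1e-8)      # positive semidefinite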

References

1. Higham, N.J.: Computing the nearest correlation matrix—a problem from finance. IMA J. Numer. Anal. 22, 329–343 (2002)
2. Malick, J.: A dual approach to semidefinite least-squares problems. SIAM J. Matrix Anal. Appl. 26, 272–284 (2004)
3. Boyd, S., Xiao, L.: Least-squares covariance matrix adjustment. SIAM J. Matrix Anal. Appl. 27, 532–546 (2005)
4. Toh, K.C., Tütüncü, R.H., Todd, M.J.: Inexact primal-dual path-following algorithms for a special class of convex quadratic SDP and related problems. Pac. J. Optim. 3, 135–164 (2007)
5. Toh, K.C.: An inexact primal-dual path-following algorithm for convex quadratic SDP. Math. Program. 112, 221–254 (2008)
6. Qi, H.-D., Sun, D.F.: A quadratically convergent Newton method for computing the nearest correlation matrix. SIAM J. Matrix Anal. Appl. 28, 360–385 (2006)
7. Borsdorf, R., Higham, N.J.: A preconditioned Newton algorithm for the nearest correlation matrix. IMA J. Numer. Anal. 30, 94–107 (2010)
8. Gao, Y., Sun, D.F.: Calibrating least squares covariance matrix problems with equality and inequality constraints. SIAM J. Matrix Anal. Appl. 31, 1423–1457 (2010)
9. Qi, H.-D., Sun, D.F.: Correlation stress testing for value-at-risk: an unconstrained convex optimization approach. Comput. Optim. Appl. 45, 427–462 (2010)
10. Rockafellar, R.T.: Conjugate Duality and Optimization. SIAM, Philadelphia (1974)
11. Deutsch, F.: Best Approximation in Inner Product Spaces. CMS Books in Mathematics, vol. 7. Springer, New York (2001)
12. Borwein, J., Lewis, A.S.: Partially finite convex programming I: Quasi relative interiors and duality theory. Math. Program. 57, 15–48 (1992)
13. Sun, D.F., Sun, J.: Semismooth matrix-valued functions. Math. Oper. Res. 27, 150–169 (2002)
14. Qi, L., Sun, J.: A nonsmooth version of Newton’s method. Math. Program. 58, 353–367 (1993)
15. Kummer, B.: Newton’s method for nondifferentiable functions. In: Guddat, J., Bank, B., Hollatz, H., Kall, P., Klatte, D., Kummer, B., Lommatzsch, K., Tammer, L., Vlach, M., Zimmerman, K. (eds.) Advances in Mathematical Optimization, pp. 114–125. Akademie Verlag, Berlin (1988)
16. Werner, R., Schöttle, K.: Calibration of correlation matrices—SDP or not SDP. Technical Report, Munich University of Technology (2007)
17. Alizadeh, F.: Interior point methods in semidefinite programming with applications to combinatorial optimization. SIAM J. Optim. 5, 13–51 (1995)
18. Dattorro, J.: Convex Optimization and Euclidean Distance Geometry. Meboo Publishing USA, California (2005)
19. Bonnans, J.F., Shapiro, A.: Perturbation Analysis of Optimization Problems. Springer, New York (2000)
20. Bonnans, J.F., Shapiro, A.: Nondegeneracy and quantitative stability of parameterized optimization problems with multiple solutions. SIAM J. Optim. 8, 940–946 (1998)
21. Shapiro, A., Fan, M.K.H.: On eigenvalue optimization. SIAM J. Optim. 5, 552–569 (1995)
22. Alizadeh, F., Haeberly, J.-P.A., Overton, M.L.: Complementarity and nondegeneracy in semidefinite programming. Math. Program. 77, 111–128 (1997)
23. Chan, Z.X., Sun, D.F.: Constraint nondegeneracy, strong regularity and nonsingularity in semidefinite programming. SIAM J. Optim. 19, 370–396 (2008)
24. Sun, D.F.: The strong second-order sufficient condition and constraint nondegeneracy in nonlinear semidefinite programming and their implications. Math. Oper. Res. 31, 761–776 (2006)
25. Zhao, X.Y., Sun, D.F., Toh, K.C.: A Newton-CG augmented Lagrangian method for semidefinite programming. SIAM J. Optim. 20, 1735–1765 (2010)
26. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
27. Kočvara, M., Stingl, M.: PENNON: a generalized augmented Lagrangian method for semidefinite programming. Optim. Methods Softw. 18, 317–333 (2003)
28. Zhao, X.Y.: A semismooth Newton-CG augmented Lagrangian method for large scale linear and convex quadratic SDPs. Ph.D. thesis, National University of Singapore (2009)
29. Malick, J., Povh, J., Rendl, F., Wiegele, A.: Regularization methods for semidefinite programming. SIAM J. Optim. 20, 336–356 (2009)
30. Chen, X., Qi, L., Sun, D.F.: Global and superlinear convergence of the smoothing Newton method and its application to general box constrained variational inequalities. Math. Comput. 67, 519–540 (1998)

Author information

Correspondence to Donghui Li.

Additional information

Communicated by X.-Q. Yang.

D. Li’s research was supported by the Major Project of the Ministry of Education of China (Grant 309023) and the NSF of China (Grant 10771057).

H. Qi’s research was partially supported by EPSRC Grant EP/D502535/1.

Cite this article

Li, Q., Li, D. & Qi, H. Newton’s Method for Computing the Nearest Correlation Matrix with a Simple Upper Bound. J Optim Theory Appl 147, 546–568 (2010). https://doi.org/10.1007/s10957-010-9738-6
