
Two adaptive scaled gradient projection methods for Stiefel manifold constrained optimization

  • Original Paper
  • Published:
Numerical Algorithms

Abstract

This article is concerned with the problem of minimizing a smooth function over the Stiefel manifold. To address this problem, we introduce two adaptive scaled gradient projection methods whose scaling matrices depend on the step size and on a parameter that controls the search direction. These iterative algorithms use a projection operator based on the QR factorization to preserve feasibility at each iteration, although in some particular cases the proposals do not require any projection operator at all. In addition, we combine a Barzilai–Borwein-like step size with the Zhang–Hager nonmonotone line-search technique to accelerate the convergence of the proposed procedures. We prove global convergence for both schemes and evaluate their effectiveness and efficiency through an extensive computational study, comparing our approaches with other state-of-the-art gradient-type algorithms.
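To make the ingredients named above concrete, the following Python/NumPy sketch combines a QR-factorization-based projection onto the Stiefel manifold with a Barzilai–Borwein-like step size and the Zhang–Hager nonmonotone acceptance test. It is a minimal illustration under stated assumptions, not the adaptive scaled methods proposed in the paper: the scaling matrices and the direction-controlling parameter are omitted, and all function and parameter names (`qr_projection`, `stiefel_gradient_projection`, `eta`, `rho`, `delta`) are hypothetical.

```python
import numpy as np


def qr_projection(Y):
    """Map a full-rank n x p matrix to the Stiefel manifold via the QR factorization.
    The sign correction makes the Q factor unique (diagonal of R nonnegative)."""
    Q, R = np.linalg.qr(Y)
    return Q * np.where(np.diag(R) < 0, -1.0, 1.0)


def riemannian_grad(X, G):
    """Project the Euclidean gradient G onto the tangent space of the Stiefel
    manifold at the feasible point X."""
    return G - X @ (X.T @ G + G.T @ X) / 2


def stiefel_gradient_projection(f, grad_f, X0, max_iter=500, tol=1e-5,
                                eta=0.85, rho=1e-4, delta=0.5):
    """Hypothetical sketch: projected gradient iteration on the Stiefel manifold
    with a Barzilai-Borwein-like step size and the Zhang-Hager nonmonotone test.
    The adaptive scaling matrices of the paper's methods are intentionally omitted."""
    X, tau = X0, 1e-3
    C, Qk = f(X), 1.0                       # Zhang-Hager reference value and weight
    grad = riemannian_grad(X, grad_f(X))
    for _ in range(max_iter):
        if np.linalg.norm(grad) <= tol:
            break
        # Backtrack until the Zhang-Hager nonmonotone Armijo condition holds.
        t = tau
        X_new = qr_projection(X - t * grad)
        while f(X_new) > C - rho * t * np.linalg.norm(grad) ** 2 and t > 1e-14:
            t *= delta
            X_new = qr_projection(X - t * grad)
        grad_new = riemannian_grad(X_new, grad_f(X_new))
        # Barzilai-Borwein-like step size from successive iterate/gradient differences.
        S, Yd = X_new - X, grad_new - grad
        denom = abs(np.sum(S * Yd))
        tau = np.clip(np.sum(S * S) / denom, 1e-10, 1e10) if denom > 0 else 1e-3
        # Zhang-Hager update of the reference value C_{k+1}.
        Qk_next = eta * Qk + 1.0
        C = (eta * Qk * C + f(X_new)) / Qk_next
        Qk, X, grad = Qk_next, X_new, grad_new
    return X
```

For instance, with f(X) = -trace(X^T A X) for a symmetric matrix A, the loop drives X toward an orthonormal basis of a dominant invariant subspace of A; this is only meant to illustrate the mechanics of the QR-based projection and the nonmonotone step acceptance, not to reproduce the numerical results reported in the paper.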


Notes

  1. The OptStiefel code is available at https://www.math.ucla.edu/~wotaoyin/papers/feasible_method_matrix_manifold.html.

  2. The Riemannian conjugate gradient methods Algor.1a, Algor.1b, and Algor.1b+ZH can be downloaded from http://www.optimization-online.org/DB_HTML/2016/09/5617.html.

References

  1. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization algorithms on matrix manifolds. Princeton University Press, Princeton (2009)

  2. Yang, B.: Projection approximation subspace tracking. IEEE Trans. Signal Process. 43(1), 95–107 (1995)

  3. Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Math. Program. 142(1-2), 397–434 (2013)

  4. Boufounos, P.T., Baraniuk, R.G.: 1-bit compressive sensing. In: 42nd Annual Conference on Information Sciences and Systems (CISS 2008), IEEE, pp 16–21 (2008)

  5. Laska, J.N., Wen, Z., Yin, W., Baraniuk, R.G.: Trust, but verify: fast and accurate signal recovery from 1-bit compressive measurements. IEEE Trans. Signal Process. 59(11), 5289–5301 (2011)

  6. Lu, Z., Zhang, Y.: An augmented Lagrangian approach for sparse principal component analysis. Math. Program. 135(1-2), 149–193 (2012)

  7. Yang, C., Meza, J.C., Lee, B., Wang, L.-W.: KSSOLV—a MATLAB toolbox for solving the Kohn-Sham equations. ACM Trans. Math. Softw. (TOMS) 36(2), 10 (2009)

  8. Kokiopoulou, E., Chen, J., Saad, Y.: Trace optimization and eigenproblems in dimension reduction methods. Numer. Linear Algebra Appl. 18(3), 565–602 (2011)

  9. Saad, Y.: Numerical Methods for Large Eigenvalue Problems. Manchester University Press, Manchester (1992)

  10. Joho, M., Mathis, H.: Joint diagonalization of correlation matrices by using gradient methods with application to blind signal separation. In: Sensor Array and Multichannel Signal Processing Workshop Proceedings, 2002, IEEE, pp 273–277 (2002)

  11. Theis, F.J., Cason, T.P., Absil, P.-A.: Soft dimension reduction for ICA by joint diagonalization on the Stiefel manifold. In: International Conference on Independent Component Analysis and Signal Separation, pp 354–361. Springer, New York (2009)

  12. Eldén, L., Park, H.: A Procrustes problem on the Stiefel manifold. Numer. Math. 82(4), 599–619 (1999)

  13. Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. J. Sci. Comput. 58(2), 431–449 (2014)

  14. Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix. Anal. Appl. 20(2), 303–353 (1998)

  15. Abrudan, T.E., Eriksson, J., Koivunen, V.: Steepest descent algorithms for optimization under unitary matrix constraint. IEEE Trans. Signal Process. 56(3), 1134–1147 (2008)

  16. Nishimori, Y., Akaho, S.: Learning algorithms utilizing quasi-geodesic flows on the Stiefel manifold. Neurocomputing 67, 106–135 (2005)

  17. Manton, J.H.: Optimization algorithms exploiting unitary constraints. IEEE Trans. Signal Process. 50(3), 635–650 (2002)

  18. Oviedo, H., Lara, H., Dalmau, O.: A non-monotone linear search algorithm with mixed direction on Stiefel manifold. Optim. Methods Softw., 1–21 (2018)

  19. Dalmau-Cedeño, O., Oviedo, H.: A projection method for optimization problems on the Stiefel manifold. In: Mexican Conference on Pattern Recognition, pp 84–93. Springer, New York (2017)

  20. Dalmau Cedeño, O.S., Oviedo Leon, H.F.: Projected nonmonotone search methods for optimization with orthogonality constraints. Comput. Appl. Math., 1–27 (2017)

  21. Bonettini, S., Prato, M.: New convergence results for the scaled gradient projection method. Inverse Probl. 31(9), 095008 (2015)

  22. Bonettini, S., Zanella, R., Zanni, L.: A scaled gradient projection method for constrained image deblurring. Inverse Probl. 25(1), 015002 (2008)

  23. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26–33 (1997)

  24. Oviedo, H.F., Dalmau, O.S., Herrera, R.: Two novel gradient methods with optimal step sizes. Preprint, Optimization Online, http://www.optimization-online.org/DB_HTML/2020/05/7797.html (2020)

  25. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)

  26. Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)

  27. Francisco, J.B., Viloche Bazán, F.S., Weber Mendonça, M.: Non-monotone algorithm for minimization on arbitrary domains with applications to large-scale orthogonal Procrustes problem. Appl. Numer. Math. 112, 51–64 (2017)

  28. Hu, J., Liu, X., Wen, Z., Yuan, Y.: A brief introduction to manifold optimization. arXiv:1906.05450 (2019)

  29. Zhu, X.: A Riemannian conjugate gradient method for optimization on the Stiefel manifold. Comput. Optim. Appl. 67(1), 73–110 (2017)

  30. Davis, T.A., Hu, Y.: The University of Florida Sparse Matrix Collection. ACM Trans. Math. Softw. (TOMS) 38(1), 1 (2011)

Acknowledgements

The second author thanks the Federal University of Santa Catarina, Brazil; his contribution to the present article was carried out predominantly at that institution.

Funding

This research was supported in part by Conacyt, Mexico (research grant 258033 and a PhD scholarship for H.O.L.).

Author information

Corresponding author

Correspondence to Oscar Dalmau.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Oviedo, H., Dalmau, O. & Lara, H. Two adaptive scaled gradient projection methods for Stiefel manifold constrained optimization. Numer Algor 87, 1107–1127 (2021). https://doi.org/10.1007/s11075-020-01001-9
