
Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds

Published in: Advances in Computational Mathematics

Abstract

In this paper, we study Cayley-transform-based gradient and conjugate gradient algorithms for optimization on Grassmann manifolds. We revisit the Cayley transform on Grassmann manifolds as a retraction in the framework of quotient manifolds constructed by Lie group actions and obtain an efficient formula for this retraction in low-rank cases. We also prove that this retraction is the restriction of the Cayley transform on Stiefel manifolds to horizontal spaces. To develop vector transports on Grassmann manifolds, we introduce a concept called induced vector transports on quotient manifolds. Based on this concept, three vector transports associated with the Cayley transform are obtained. The first vector transport is the traditional orthogonal projection onto horizontal spaces, whereas the other two vector transports are newly proposed herein. We show that one of the new vector transports satisfies the Ring–Wirth non-expansion condition and that the other is isometric. We also simplify the formulae of the new vector transports in low-rank cases. Riemannian gradient and conjugate gradient algorithms are established via the Cayley transform and the three abovementioned vector transports. Numerical experiments on two mean-of-subspaces problems demonstrate the effectiveness of the proposed algorithms.
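The paper's efficient low-rank formulae are not reproduced in the abstract. As an illustrative sketch only (function names and the test problem are my own, not the authors'), the following NumPy snippet shows the two ingredients the abstract describes: the Riemannian gradient on the Grassmann manifold obtained by orthogonal projection onto the horizontal space, and a full-size Cayley-transform retraction in the style of Wen and Yin (ref. 35), which the abstract states restricts to the Grassmann retraction on horizontal vectors.

```python
import numpy as np

def grassmann_gradient(X, G):
    """Riemannian gradient on Gr(n, p): project the Euclidean gradient G
    onto the horizontal space at X, i.e. (I - X X^T) G."""
    return G - X @ (X.T @ G)

def cayley_retraction(X, Z, t=1.0):
    """Cayley-transform retraction (Wen-Yin style, full-size version).
    W = Z X^T - X Z^T is skew-symmetric, and
    R_X(tZ) = (I - (t/2) W)^{-1} (I + (t/2) W) X,
    whose derivative at t = 0 is Z for horizontal Z (X^T Z = 0)."""
    n = X.shape[0]
    W = Z @ X.T - X @ Z.T               # skew-symmetric by construction
    I = np.eye(n)
    # The Cayley factor is orthogonal, so Y^T Y = X^T X = I_p exactly.
    return np.linalg.solve(I - 0.5 * t * W, (I + 0.5 * t * W) @ X)

# Toy usage: one descent step for f(X) = -trace(X^T A X), A symmetric.
rng = np.random.default_rng(0)
n, p = 8, 2
A = rng.standard_normal((n, n))
A = A + A.T                              # symmetrize
X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # point on St(n, p)
G = -2.0 * A @ X                         # Euclidean gradient of f
Z = grassmann_gradient(X, G)             # horizontal (Riemannian) gradient
Y = cayley_retraction(X, -Z, t=0.1)      # step stays on the manifold
```

Because the Cayley factor is orthogonal for any skew-symmetric `W`, the iterate `Y` remains orthonormal up to rounding error; this feasibility-by-construction property is what makes Cayley-based retractions attractive compared with re-orthogonalization after each step.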


References

  1. Absil, P.-A., Mahony, R., Sepulchre, R.: Riemannian geometry of Grassmann manifolds with a view on algorithmic computation. Acta Appl. Math. 80, 199–220 (2004)

  2. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

  3. Absil, P.-A., Mahony, R., Sepulchre, R., Van Dooren, P.: A Grassmann–Rayleigh quotient iteration for computing invariant subspaces. SIAM Rev. 44, 57–73 (2002)

  4. Absil, P.-A., Malick, J.: Projection-like retractions on matrix manifolds. SIAM J. Optim. 22, 135–158 (2012)

  5. Absil, P.-A., Oseledets, I.V.: Low-rank retractions: a survey and new results. Comput. Optim. Appl. 62, 5–29 (2015)

  6. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  7. Boumal, N.: An Introduction to Optimization on Smooth Manifolds. http://sma.epfl.ch/~nboumal/#book

  8. Boumal, N., Absil, P.-A.: Low-rank matrix completion via preconditioned optimization on the Grassmann manifold. Linear Algebra Appl. 475, 200–239 (2015)

  9. Boumal, N., Mishra, B., Absil, P.-A., Sepulchre, R.: Manopt, a Matlab toolbox for optimization on manifolds. J. Mach. Learn. Res. 15, 1455–1459 (2014)

  10. Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl. 20, 303–353 (1998)

  11. Fiori, S., Kaneko, T., Tanaka, T.: Tangent-bundle maps on the Grassmann manifold: application to empirical arithmetic averaging. IEEE Trans. Signal Process. 63, 155–168 (2015)

  12. Gallot, S., Hulin, D., Lafontaine, J.: Riemannian Geometry, 3rd edn. Springer, Berlin (2004)

  13. Gawlik, E.S., Leok, M.: High-order retractions on matrix manifolds using projected polynomials. SIAM J. Matrix Anal. Appl. 39, 801–828 (2018)

  14. Harandi, M., Hartley, R., Salzmann, M., Trumpf, J.: Dictionary learning on Grassmann manifolds. In: Minh, H., Murino, V. (eds.) Algorithmic Advances in Riemannian Geometry and Applications. Advances in Computer Vision and Pattern Recognition. Springer, Cham (2016)

  15. Harandi, M., Hartley, R., Shen, C., Lovell, B., Sanderson, C.: Extrinsic methods for coding and dictionary learning on Grassmann manifolds. Int. J. Comput. Vis. 114, 113–136 (2015)

  16. Hauberg, S., Feragen, A., Black, M.J.: Grassmann averages for scalable robust PCA. In: IEEE Conference on Computer Vision and Pattern Recognition (2014)

  17. Hauberg, S., Feragen, A., Enficiaud, R., Black, M.J.: Scalable robust principal component analysis using Grassmann averages. IEEE Trans. Pattern Anal. Mach. Intell. 38, 2298–2311 (2016)

  18. Hu, J., Liu, X., Wen, Z., Yuan, Y.: A brief introduction to manifold optimization. J. Oper. Res. Soc. China 8, 199–248 (2020)

  19. Huang, W.: Optimization algorithms on Riemannian manifolds with applications. Ph.D. thesis, Department of Mathematics, Florida State University (2013)

  20. Huang, W., Absil, P.-A., Gallivan, K.A.: Intrinsic representation of tangent vectors and vector transports on matrix manifolds. Numer. Math. 136, 523–543 (2017)

  21. Huang, W., Gallivan, K.A., Absil, P.-A.: A Broyden class of quasi-Newton methods for Riemannian optimization. SIAM J. Optim. 25, 1660–1685 (2015)

  22. Karcher, H.: Riemannian center of mass and mollifier smoothing. Comm. Pure Appl. Math. 30, 509–541 (1977)

  23. Lee, J.M.: Introduction to Smooth Manifolds, 2nd edn. Springer, New York (2012)

  24. Moler, C., Van Loan, C.: Nineteen dubious ways to compute the exponential of a matrix, twenty-five years later. SIAM Rev. 45, 3–49 (2003)

  25. Petersen, P.: Riemannian Geometry, 3rd edn. Springer, Cham (2016)

  26. Qiu, L., Zhang, Y., Li, C.: Unitarily invariant metrics on the Grassmann space. SIAM J. Matrix Anal. Appl. 27, 507–531 (2005)

  27. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22, 596–627 (2012)

  28. Sato, H.: A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput. Optim. Appl. 64, 101–118 (2016)

  29. Sato, H., Iwai, T.: Optimization algorithms on the Grassmann manifold with application to matrix eigenvalue problems. Jpn. J. Indust. Appl. Math. 31, 355–400 (2014)

  30. Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64, 1011–1031 (2015)

  31. Sato, H., Kasai, H., Mishra, B.: Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport. SIAM J. Optim. 29, 1444–1472 (2019)

  32. Sato, K., Sato, H., Damm, T.: Riemannian optimal identification method for linear systems with symmetric positive-definite matrix. IEEE Trans. Autom. Control 65, 4493–4508 (2020)

  33. Smith, S.T.: Optimization techniques on Riemannian manifolds. In: Fields Institute Communications, vol. 3, pp. 113–146. AMS, Providence (1994)

  34. Sterck, H.D., Howse, A.: Nonlinearly preconditioned optimization on Grassmann manifolds for computing approximate Tucker tensor decompositions. SIAM J. Sci. Comput. 38, A997–A1018 (2016)

  35. Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Math. Program. 142, 397–434 (2013)

  36. Zhu, X.: A Riemannian conjugate gradient method for optimization on the Stiefel manifold. Comput. Optim. Appl. 67, 73–110 (2017)

  37. Zhu, X., Duan, C.: On matrix exponentials and their approximations related to optimization on the Stiefel manifold. Optim. Lett. 13, 1069–1083 (2019)

Acknowledgements

The authors are grateful to the two anonymous referees for their valuable comments and suggestions.

Funding

X. Zhu is supported by National Natural Science Foundation of China Grant Number 11601317 and Research Project of Ideological and Political Education in Graduate Courses of Shanghai University of Electric Power Grant Number YKJ-2021009. H. Sato is supported by JSPS KAKENHI Grant Number JP20K14359.

Author information

Corresponding author

Correspondence to Xiaojing Zhu.

Additional information

Communicated by: Ivan Oseledets



About this article

Cite this article

Zhu, X., Sato, H. Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds. Adv Comput Math 47, 56 (2021). https://doi.org/10.1007/s10444-021-09880-9

