
Optimality conditions for Tucker low-rank tensor optimization


Abstract

Optimization problems with tensor variables arise widely in statistics, machine learning, pattern recognition, signal processing, computer vision, and related fields. In these applications, low-rankness is an intrinsic property of tensors that helps uncover latent but important structures and features in the corresponding high-dimensional multi-way datasets, motivating the study of low-rank tensor optimization (LRTO for short). For the general LRTO framework, little has been established in the way of optimization theory. This motivates us to study optimality conditions, with special emphasis on Tucker low-rank constrained problems and Tucker low-rank decomposition-based reformulations. Notably, all the optimization problems involved are nonconvex, and even discontinuous, owing to the complexity of the tensor Tucker rank function or of the multilinear decomposition with orthogonality or even group-sparsity constraints imposed on the factor matrices. By employing tools from variational analysis, especially normal cones to low-rank matrices and properties of matrix manifolds, we propose necessary and/or sufficient optimality conditions for Tucker low-rank tensor optimization problems, thereby enriching the theory of nonconvex and nonsmooth optimization.
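To fix notation, the following is a standard statement of the problem class (a sketch reconstructed from the usual Tucker conventions, not quoted from the paper). Writing \(X_{(n)}\) for the mode-\(n\) unfolding of \(\mathcal{X}\in\mathbb{R}^{n_1\times\cdots\times n_N}\), the Tucker (multilinear) rank is the tuple \(\bigl(\operatorname{rank}(X_{(1)}),\dots,\operatorname{rank}(X_{(N)})\bigr)\), so the Tucker low-rank constrained problem reads

\[ \min_{\mathcal{X}} \; f(\mathcal{X}) \quad \text{s.t.} \quad \operatorname{rank}\bigl(X_{(n)}\bigr)\le r_n, \quad n=1,\dots,N, \]

while the decomposition-based reformulation optimizes over a core tensor \(\mathcal{G}\in\mathbb{R}^{r_1\times\cdots\times r_N}\) and factor matrices with orthonormal columns,

\[ \min_{\mathcal{G},\,U_1,\dots,U_N} \; f\bigl(\mathcal{G}\times_1 U_1\times_2 U_2\cdots\times_N U_N\bigr) \quad \text{s.t.} \quad U_n^{\top}U_n=I_{r_n}, \quad n=1,\dots,N. \]

The feasible set of the first problem is closed but nonconvex, and the Tucker rank function is discontinuous, so classical KKT theory does not apply directly; this is why normal cones to the low-rank matrix sets \(\{X:\operatorname{rank}(X)\le r_n\}\) and the manifold structure of fixed-rank matrices and orthonormal frames enter the optimality analysis.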


Data availability

No datasets are used in the paper.


Funding

The first author’s research was supported by the Beijing Natural Science Foundation (Grant No. Z190002) and the National Natural Science Foundation of China (Grant No. 12271022).

Author information


Corresponding author

Correspondence to Liqun Qi.

Ethics declarations

Conflict of interest

No potential conflicts of interest were reported by the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Luo, Z., Qi, L. Optimality conditions for Tucker low-rank tensor optimization. Comput Optim Appl 86, 1275–1298 (2023). https://doi.org/10.1007/s10589-023-00465-4


