Abstract
Optimization problems with tensor variables arise widely in statistics, machine learning, pattern recognition, signal processing, and computer vision. In these applications, low-rankness is an intrinsic property of tensors that helps uncover latent but important structure in the corresponding high-dimensional multi-way datasets, motivating the study of low-rank tensor optimization (LRTO for short). For the general LRTO framework, however, little optimization theory has been developed. This motivates us to study optimality conditions, with special emphasis on Tucker low-rank constrained problems and Tucker low-rank decomposition-based reformulations. Notably, all of the optimization problems involved are nonconvex, and even discontinuous, owing to the complexity of the tensor Tucker rank function or of the multi-linear decomposition with orthogonality or even group sparsity constraints imposed on the factor matrices. By employing tools from variational analysis, especially normal cones to sets of low-rank matrices and properties of matrix manifolds, we propose necessary and/or sufficient optimality conditions for Tucker low-rank tensor optimization problems, thereby enriching the theory of nonconvex and nonsmooth optimization.
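To make the Tucker rank function concrete: the Tucker (multilinear) rank of a tensor is the tuple of matrix ranks of its mode-wise unfoldings. The following is a minimal NumPy sketch, not taken from the paper; the helper names `unfold` and `tucker_rank` are illustrative choices.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_rank(T):
    # Tucker (multilinear) rank: tuple of ranks of all mode unfoldings.
    return tuple(np.linalg.matrix_rank(unfold(T, k)) for k in range(T.ndim))

# A rank-(1,1,1) tensor: the outer product of three vectors.
a, b, c = np.ones(3), np.ones(4), np.ones(5)
T = np.einsum('i,j,k->ijk', a, b, c)
print(tucker_rank(T))  # (1, 1, 1)
```

Each entry of the tuple can change independently, which is one source of the combinatorial difficulty of Tucker low-rank constraints discussed in the abstract.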
Data availability
No datasets were used in this paper.
Funding
The first author's research was supported by the Beijing Natural Science Foundation (Grant No. Z190002) and the National Natural Science Foundation of China (Grant No. 12271022).
Ethics declarations
Conflict of interest
No potential conflicts of interest were reported by the authors.
Cite this article
Luo, Z., Qi, L. Optimality conditions for Tucker low-rank tensor optimization. Comput Optim Appl 86, 1275–1298 (2023). https://doi.org/10.1007/s10589-023-00465-4