Abstract
The recently proposed fully-connected tensor network (FCTN) decomposition has a powerful ability to capture the low-rankness of tensors and has achieved great success in tensor completion. However, FCTN decomposition-based methods are highly sensitive to the choice of the FCTN-rank and cannot satisfactorily recover local details. In this paper, we propose a novel tensor completion model by introducing a factor-based regularization into the framework of the FCTN decomposition. The regularization makes the method robust to the choice of the FCTN-rank and simultaneously enforces the global low-rankness and the local continuity of the target tensor. More specifically, by showing that the unfolding matrices of the FCTN factors can reasonably be assumed to be low-rank in the gradient domain and imposing a low-rank matrix factorization (LRMF) on them, the proposed model enhances robustness to the choice of the FCTN-rank. By applying a Tikhonov regularization to the LRMF factors, the proposed model promotes local continuity and preserves local details of the target tensor. To solve the optimization problem associated with the proposed model, we develop an efficient proximal alternating minimization (PAM)-based algorithm and theoretically establish its convergence. To reduce the running time of the developed algorithm, we design an automatic rank-increasing strategy. Numerical experiments demonstrate that the proposed method outperforms its competitors.
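For concreteness, one schematic instance of the model sketched above is displayed below; the weights \(\alpha_k,\beta>0\), the gradient operator \(\nabla\), the mode-\(k\) unfolding \(\mathbf{G}_{k(k)}\) of the factor \(\mathcal{G}_k\), and the sampling operator \(\mathcal{P}_{\Omega}\) on the observed entries \(\Omega\) of the data \(\mathcal{T}\) are illustrative placeholders, and the precise formulation is given in the paper body:

\[
\min_{\mathcal{X},\,\{\mathcal{G}_k\},\,\{\mathbf{U}_k,\mathbf{V}_k\}}\;
\frac{1}{2}\bigl\|\mathcal{X}-\operatorname{FCTN}(\mathcal{G}_1,\ldots,\mathcal{G}_N)\bigr\|_F^2
+\sum_{k=1}^{N}\Bigl[\frac{\alpha_k}{2}\bigl\|\nabla\mathbf{G}_{k(k)}-\mathbf{U}_k\mathbf{V}_k\bigr\|_F^2
+\frac{\beta}{2}\bigl(\|\mathbf{U}_k\|_F^2+\|\mathbf{V}_k\|_F^2\bigr)\Bigr]
\quad\text{s.t.}\;\; \mathcal{P}_{\Omega}(\mathcal{X})=\mathcal{P}_{\Omega}(\mathcal{T}).
\]

Here the LRMF term \(\mathbf{U}_k\mathbf{V}_k\) encodes the gradient-domain low-rankness of the factor unfoldings, which relaxes the sensitivity to the FCTN-rank, while the Tikhonov term (shown here in its simplest \(\|\cdot\|_F^2\) form) promotes local continuity; a PAM sweep then updates \(\mathcal{X}\), each \(\mathcal{G}_k\), and each pair \((\mathbf{U}_k,\mathbf{V}_k)\) in turn via proximal least-squares subproblems.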
Data availability
All datasets are publicly available.
Notes
Unless otherwise stated, \(\mathbf{n}\) denotes an arbitrary reordering of \((1,2,\ldots,N)\) throughout the paper.
The code is available at https://yubangzheng.github.io.
Acknowledgements
This work was supported by the National Natural Science Foundation of China (Grant Nos. 12171072, 61876203, 62071132), the Key Project of Applied Basic Research in Sichuan Province (Grant No. 2020YJ0216), the National Key Research and Development Program of China (Grant No. 2020YFA0714001), the Project of Applied Basic Research in Sichuan Province (Grant No. 2021YJ0107), and JSPS KAKENHI (Grant No. 20H04249).
Funding
Ting-Zhu Huang, 12171072, National Natural Science Foundation of China; Xi-Le Zhao, 61876203, National Natural Science Foundation of China; Qibin Zhao, 62071132, National Natural Science Foundation of China; Ting-Zhu Huang, 2020YJ0216, Key Project of Applied Basic Research in Sichuan Province; Ting-Zhu Huang, 2020YFA0714001, National Key Research and Development Program of China; Xi-Le Zhao, 2021YJ0107, Project of Applied Basic Research in Sichuan Province; Qibin Zhao, 20H04249, Japan Society for the Promotion of Science.
Ethics declarations
Conflict of interest
The authors have not disclosed any competing interests.
Cite this article
Zheng, YB., Huang, TZ., Zhao, XL. et al. Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors. J Sci Comput 92, 8 (2022). https://doi.org/10.1007/s10915-022-01841-8