
Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors

  • Published in: Journal of Scientific Computing

Abstract

The recently proposed fully-connected tensor network (FCTN) decomposition has a powerful ability to capture the low-rankness of tensors and has achieved great success in tensor completion. However, FCTN decomposition-based methods are highly sensitive to the choice of the FCTN-rank and cannot satisfactorily recover local details. In this paper, we propose a novel tensor completion model by introducing a factor-based regularization into the FCTN decomposition framework. The regularization makes the method robust to the choice of the FCTN-rank and simultaneously enforces the global low-rankness and the local continuity of the target tensor. More specifically, by showing that the unfolding matrices of the FCTN factors can reasonably be assumed to be low-rank in the gradient domain and then imposing a low-rank matrix factorization (LRMF) on them, the proposed model enhances robustness to the choice of the FCTN-rank. By applying Tikhonov regularization to the LRMF factors, the proposed model promotes the local continuity and preserves the local details of the target tensor. To solve the resulting optimization problem, we develop an efficient proximal alternating minimization (PAM)-based algorithm and theoretically establish its convergence. To reduce the running time of the algorithm, we further design an automatic rank-increasing strategy. Numerical experiments demonstrate that the proposed method outperforms its competitors.
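
The abstract names the model's ingredients without stating the formulation, so the following display is only a schematic sketch of how such terms typically combine, not the paper's exact model; the factors \(\mathcal{G}_k\), unfoldings \(\mathbf{G}_k\), LRMF factors \(\mathbf{U}_k,\mathbf{V}_k\), Tikhonov operator \(\mathbf{L}\), gradient operator \(\nabla\), and weights \(\lambda,\tau\) are placeholder notation of ours:

\[
\min_{\mathcal{X},\,\{\mathcal{G}_k\},\,\{\mathbf{U}_k,\mathbf{V}_k\}}\;
\frac{1}{2}\bigl\|\mathcal{X}-\mathrm{FCTN}(\mathcal{G}_1,\ldots,\mathcal{G}_N)\bigr\|_F^2
+\lambda\sum_{k=1}^{N}\bigl\|\nabla\mathbf{G}_k-\mathbf{U}_k\mathbf{V}_k\bigr\|_F^2
+\tau\sum_{k=1}^{N}\bigl(\|\mathbf{L}\mathbf{U}_k\|_F^2+\|\mathbf{L}\mathbf{V}_k\|_F^2\bigr)
\quad\text{s.t.}\quad
\mathcal{P}_{\Omega}(\mathcal{X})=\mathcal{P}_{\Omega}(\mathcal{T}),
\]

where \(\mathbf{G}_k\) denotes an unfolding matrix of the factor \(\mathcal{G}_k\), \(\mathbf{U}_k\mathbf{V}_k\) is the LRMF of its gradient-domain version, \(\mathbf{L}\) is a Tikhonov operator (for example, a discrete difference matrix promoting continuity), and \(\mathcal{P}_{\Omega}\) projects onto the observed entries of the data \(\mathcal{T}\).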
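Likewise, as a minimal illustration of the proximal alternating minimization pattern the solver is built on, the toy NumPy snippet below applies PAM to plain low-rank matrix factorization; the function pam_lrmf and all of its parameters are our own construction, and the paper's actual algorithm applies this pattern to the full FCTN model rather than to this toy problem.

    import numpy as np

    def pam_lrmf(M, rank, rho=1.0, iters=200, seed=0):
        # PAM on the toy problem min_{U,V} ||M - U V||_F^2: each block is
        # updated with an added proximal term (rho/2)||X - X_prev||_F^2,
        # the device underlying PAM's convergence theory for nonconvex
        # problems. Both subproblems here have closed-form solutions.
        rng = np.random.default_rng(seed)
        m, n = M.shape
        U = rng.standard_normal((m, rank))
        V = rng.standard_normal((rank, n))
        I = np.eye(rank)
        for _ in range(iters):
            # U-step: argmin_U ||M - U V||_F^2 + (rho/2)||U - U_prev||_F^2
            U = (2 * M @ V.T + rho * U) @ np.linalg.inv(2 * V @ V.T + rho * I)
            # V-step: argmin_V ||M - U V||_F^2 + (rho/2)||V - V_prev||_F^2
            V = np.linalg.inv(2 * U.T @ U + rho * I) @ (2 * U.T @ M + rho * V)
        return U, V

    # Usage: factor an exactly rank-5 matrix; the relative residual
    # should shrink toward zero (up to numerical precision).
    rng = np.random.default_rng(1)
    M = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 80))
    U, V = pam_lrmf(M, rank=5)
    print(np.linalg.norm(M - U @ V) / np.linalg.norm(M))

The proximal terms keep each iterate close to its predecessor, which is what allows PAM-type schemes to guarantee convergence for nonconvex objectives such as the one above.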

Data availability

All datasets are publicly available.

Notes

  1. Unless otherwise stated, \(\mathbf{n}\) denotes an arbitrary reordering of \((1,2,\ldots,N)\) throughout the paper.

  2. The code is available at https://yubangzheng.github.io.


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 12171072, 61876203, 62071132), the Key Project of Applied Basic Research in Sichuan Province (Grant No. 2020YJ0216), the National Key Research and Development Program of China (Grant No. 2020YFA0714001), the Project of Applied Basic Research in Sichuan Province (Grant No. 2021YJ0107), and JSPS KAKENHI (Grant No. 20H04249).

Funding

  • Ting-Zhu Huang, 12171072, National Natural Science Foundation of China
  • Xi-Le Zhao, 61876203, National Natural Science Foundation of China
  • Qibin Zhao, 62071132, National Natural Science Foundation of China
  • Ting-Zhu Huang, 2020YJ0216, Key Project of Applied Basic Research in Sichuan Province
  • Ting-Zhu Huang, 2020YFA0714001, National Key Research and Development Program of China
  • Xi-Le Zhao, 2021YJ0107, Project of Applied Basic Research in Sichuan Province
  • Qibin Zhao, 20H04249, Japan Society for the Promotion of Science

Author information

Corresponding authors

Correspondence to Ting-Zhu Huang or Xi-Le Zhao.

Ethics declarations

Conflict of interest

The authors have not disclosed any competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zheng, Y.-B., Huang, T.-Z., Zhao, X.-L., et al.: Tensor completion via fully-connected tensor network decomposition with regularized factors. J. Sci. Comput. 92, 8 (2022). https://doi.org/10.1007/s10915-022-01841-8
