Variable Projection Applied to Block Term Decomposition of Higher-Order Tensors

  • Conference paper
  • First Online:
Latent Variable Analysis and Signal Separation (LVA/ICA 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10891)

Abstract

Higher-order tensors have become popular in many areas of applied mathematics such as statistics, scientific computing, signal processing or machine learning, notably thanks to the many possible ways of decomposing a tensor. In this paper, we focus on the best approximation in the least-squares sense of a higher-order tensor by a block term decomposition. Using variable projection, we express the tensor approximation problem as a minimization of a cost function on a Cartesian product of Stiefel manifolds. The effect of variable projection on the Riemannian gradient algorithm is studied through numerical experiments.
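The variable projection idea can be sketched numerically. In the sketch below (illustrative only: the dimensions, variable names, and the helper `orth` are assumptions, not the paper's code), the core tensors of the block term decomposition are eliminated by solving the inner linear least-squares problem in closed form, so the cost becomes a function of the factor matrices alone, each with orthonormal columns, i.e. a point on a Stiefel manifold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: an I x J x K tensor approximated by R block terms,
# each of multilinear rank (L, M, N).
I, J, K = 8, 8, 8
L, M, N = 2, 2, 2
R = 2

T = rng.standard_normal((I, J, K))

def orth(n, p):
    """Random n x p matrix with orthonormal columns (a point on the Stiefel manifold)."""
    q, _ = np.linalg.qr(rng.standard_normal((n, p)))
    return q

# Factor matrices for each of the R terms.
U = [orth(I, L) for _ in range(R)]
V = [orth(J, M) for _ in range(R)]
W = [orth(K, N) for _ in range(R)]

def vp_cost(T, U, V, W):
    """Variable-projection cost: the core tensors solve the inner linear
    least-squares problem in closed form, leaving a cost in (U, V, W) only."""
    # vec(S_r x1 U_r x2 V_r x3 W_r) = (W_r kron V_r kron U_r) vec(S_r),
    # so P(U, V, W) stacks one Kronecker block per term.
    P = np.hstack([np.kron(W[r], np.kron(V[r], U[r])) for r in range(R)])
    t = T.reshape(-1, order='F')  # column-major vec(T), matching the kron identity
    c, *_ = np.linalg.lstsq(P, t, rcond=None)
    return np.linalg.norm(t - P @ c)

print(vp_cost(T, U, V, W))
```

A Riemannian gradient method would then minimize `vp_cost` over the Cartesian product of Stiefel manifolds; a tensor that exactly follows the block term model yields a cost of (numerically) zero.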

This work was supported by (1) “Communauté française de Belgique - Actions de Recherche Concertées” (contract ARC 14/19-060), (2) Research Council KU Leuven: C1 project C16/15/059-nD, (3) F.W.O.: project G.0830.14N, G.0881.14N, (4) Fonds de la Recherche Scientifique – FNRS and the Fonds Wetenschappelijk Onderzoek – Vlaanderen under EOS Project no. 30468160 (SeLMA), (5) EU: The research leading to these results has received funding from the European Research Council under the European Union’s Seventh Framework Programme (FP7/2007-2013)/ERC Advanced Grant: BIOTENSORS (no. 339804). This paper reflects only the authors’ views and the Union is not liable for any use that may be made of the contained information.

Notes

  1. The minimizer is unique if and only if the matrix \(\mathbf {P}(\mathbf {U},\mathbf {V},\mathbf {W})\) has full column rank, which is the case almost everywhere (with respect to the Lebesgue measure) since \(m \ge n\).

  2. The Matlab code that produced the results is available at https://sites.uclouvain.be/absil/2018.01.

  3. With these parameters, the BTD \(\mathcal {A}\) in (8) is essentially unique by [6, Theorem 5.3].
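The full-column-rank condition of Note 1 can be checked numerically. The sketch below (illustrative dimensions and helper names, not the paper's code) builds \(\mathbf {P}(\mathbf {U},\mathbf {V},\mathbf {W})\) as a horizontal stack of Kronecker products, one block per term, and verifies that it has full column rank for randomly drawn Stiefel factors, as expected almost everywhere when \(m \ge n\).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: P has m = I*J*K rows and n = R*L*M*N columns, with m >= n.
I, J, K, L, M, N, R = 6, 6, 6, 2, 2, 2, 2

def stiefel(n, p):
    """Random n x p matrix with orthonormal columns."""
    q, _ = np.linalg.qr(rng.standard_normal((n, p)))
    return q

# One Kronecker block per term of the block term decomposition.
P = np.hstack([np.kron(stiefel(K, N), np.kron(stiefel(J, M), stiefel(I, L)))
               for _ in range(R)])

# For generic factors, rank(P) = n = R*L*M*N, so the inner least-squares
# problem has a unique minimizer.
print(P.shape, np.linalg.matrix_rank(P))
```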

References

  1. Cichocki, A., Mandic, D., Phan, A.H., Caiafa, C., Zhou, G., Zhao, Q., De Lathauwer, L.: Tensor decompositions for signal processing applications: from two-way to multiway component analysis. IEEE Signal Process. Mag. 32(2), 145–163 (2015)

  2. Sidiropoulos, N.D., De Lathauwer, L., Fu, X., Huang, K., Papalexakis, E.E., Faloutsos, C.: Tensor decomposition for signal processing and machine learning. IEEE Trans. Signal Process. 65(13), 3551–3582 (2017)

  3. Cichocki, A., Lee, N., Oseledets, I., Phan, A.H., Zhao, Q., Mandic, D., et al.: Tensor networks for dimensionality reduction and large-scale optimization: Part 1 low-rank tensor decompositions. Found. Trends Mach. Learn. 9(4–5), 249–429 (2016)

  4. Cichocki, A., Phan, A.H., Zhao, Q., Lee, N., Oseledets, I., Sugiyama, M., Mandic, D., et al.: Tensor networks for dimensionality reduction and large-scale optimization: Part 2 applications and future perspectives. Found. Trends Mach. Learn. 9(6), 431–673 (2017)

  5. De Lathauwer, L.: Decompositions of a higher-order tensor in block terms-Part I: Lemmas for partitioned matrices. SIAM J. Matrix Anal. Appl. 30(3), 1022–1032 (2008)

  6. De Lathauwer, L.: Decompositions of a higher-order tensor in block terms-Part II: Definitions and uniqueness. SIAM J. Matrix Anal. Appl. 30(3), 1033–1066 (2008)

  7. De Lathauwer, L., Nion, D.: Decompositions of a higher-order tensor in block terms-Part III: alternating least squares algorithms. SIAM J. Matrix Anal. Appl. 30(3), 1067–1083 (2008)

  8. De Lathauwer, L.: Block component analysis, a new concept for blind source separation. In: Theis, F., Cichocki, A., Yeredor, A., Zibulevsky, M. (eds.) LVA/ICA 2012. LNCS, vol. 7191, pp. 1–8. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-28551-6_1

  9. Yang, M., Kang, Z., Peng, C., Liu, W., Cheng, W.: On block term tensor decompositions and its applications in blind signal separation. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20160105102343471889031

  10. De Lathauwer, L.: Blind separation of exponential polynomials and the decomposition of a tensor in rank-\((L_r, L_r,1)\) terms. SIAM J. Matrix Anal. Appl. 32(4), 1451–1474 (2011)

  11. Debals, O., Van Barel, M., De Lathauwer, L.: Löwner-based blind signal separation of rational functions with applications. IEEE Trans. Signal Process. 64(8), 1909–1918 (2016)

  12. Hunyadi, B., Camps, D., Sorber, L., Van Paesschen, W., De Vos, M., Van Huffel, S., De Lathauwer, L.: Block term decomposition for modelling epileptic seizures. EURASIP J. Adv. Signal Process. 2014(1), 139 (2014)

  13. Chatzichristos, C., Kofidis, E., Kopsinis, Y., Moreno, M.M., Theodoridis, S.: Higher-order block term decomposition for spatially folded fMRI data. In: Tichavský, P., Babaie-Zadeh, M., Michel, O.J.J., Thirion-Moreau, N. (eds.) LVA/ICA 2017. LNCS, vol. 10169, pp. 3–15. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-53547-0_1

  14. Chatzichristos, C., Kofidis, E., Theodoridis, S.: PARAFAC2 and its block term decomposition analog for blind fMRI source unmixing. In: 2017 25th European Signal Processing Conference (EUSIPCO), pp. 2081–2085, August 2017

  15. Vervliet, N., Debals, O., Sorber, L., Van Barel, M., De Lathauwer, L.: Tensorlab 3.0, March 2016. https://www.tensorlab.net

  16. De Lathauwer, L., De Moor, B., Vandewalle, J.: On the best rank-\(1\) and rank-\(({R_1},{R_2},\dots,{R_N})\) approximation of higher-order tensors. SIAM J. Matrix Anal. Appl. 21(4), 1324–1342 (2000)

  17. Ishteva, M., Absil, P.-A., Van Huffel, S., De Lathauwer, L.: Best low multilinear rank approximation of higher-order tensors, based on the Riemannian trust-region scheme. SIAM J. Matrix Anal. Appl. 32(1), 115–135 (2011)

  18. Savas, B., Lim, L.-H.: Quasi-Newton methods on grassmannians and multilinear approximations of tensors. SIAM J. Sci. Comput. 32(6), 3352–3393 (2010)

  19. Olikier, G., Absil, P.-A., De Lathauwer, L.: A variable projection method for block term decomposition of higher-order tensors. Accepted for ESANN 2018

  20. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

Author information

Corresponding author

Correspondence to Guillaume Olikier.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Olikier, G., Absil, P.-A., De Lathauwer, L. (2018). Variable Projection Applied to Block Term Decomposition of Higher-Order Tensors. In: Deville, Y., Gannot, S., Mason, R., Plumbley, M., Ward, D. (eds.) Latent Variable Analysis and Signal Separation. LVA/ICA 2018. Lecture Notes in Computer Science, vol. 10891. Springer, Cham. https://doi.org/10.1007/978-3-319-93764-9_14

  • DOI: https://doi.org/10.1007/978-3-319-93764-9_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-93763-2

  • Online ISBN: 978-3-319-93764-9
