Abstract
Higher-order tensors have become popular in many areas of applied mathematics such as statistics, scientific computing, signal processing, and machine learning, notably thanks to the many ways in which a tensor can be decomposed. In this paper, we focus on the best approximation, in the least-squares sense, of a higher-order tensor by a block term decomposition. Using variable projection, we express the tensor approximation problem as the minimization of a cost function on a Cartesian product of Stiefel manifolds. The effect of variable projection on the Riemannian gradient algorithm is studied through numerical experiments.
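The core of the variable projection idea can be sketched in a few lines. This is a hypothetical illustration with made-up shapes and names (it is not the authors' implementation): for fixed factor matrices, the remaining coefficients enter the residual linearly, so they are eliminated by a least-squares solve, leaving a reduced cost that depends on the factors alone.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 60, 8                          # m >= n, as in the full-rank discussion
t = rng.standard_normal(m)            # stands in for the vectorized tensor


def reduced_cost(P, t):
    """Cost after projecting out the linear variables:
    f(P) = min_c 0.5 * ||t - P c||^2, attained at c = pinv(P) t."""
    c, *_ = np.linalg.lstsq(P, t, rcond=None)
    r = t - P @ c
    return 0.5 * np.dot(r, r), c


P = rng.standard_normal((m, n))       # stands in for P(U, V, W)
f, c = reduced_cost(P, t)

# The residual of the inner solve is orthogonal to the column space of P.
assert np.abs(P.T @ (t - P @ c)).max() < 1e-10
```

In the actual method, the outer minimization over the factor matrices is then carried out on a Cartesian product of Stiefel manifolds with a Riemannian gradient algorithm; the sketch above only shows the inner elimination step.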
This work was supported by (1) “Communauté française de Belgique - Actions de Recherche Concertées” (contract ARC 14/19-060), (2) Research Council KU Leuven: C1 project C16/15/059-nD, (3) F.W.O.: project G.0830.14N, G.0881.14N, (4) Fonds de la Recherche Scientifique – FNRS and the Fonds Wetenschappelijk Onderzoek – Vlaanderen under EOS Project no. 30468160 (SeLMA), (5) EU: The research leading to these results has received funding from the European Research Council under the European Union’s Seventh Framework Programme (FP7/2007-2013)/ERC Advanced Grant: BIOTENSORS (no. 339804). This paper reflects only the authors’ views and the Union is not liable for any use that may be made of the contained information.
Notes
- 1.
The minimizer is unique if and only if the matrix \(\mathbf {P}(\mathbf {U},\mathbf {V},\mathbf {W})\) has full column rank, which is the case almost everywhere (with respect to the Lebesgue measure) since \(m \ge n\).
- 2.
The Matlab code that produced the results is available at https://sites.uclouvain.be/absil/2018.01.
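The uniqueness condition in footnote 1 can be checked numerically. A small hypothetical example (random data, illustrative names only): when \(m \ge n\), a random \(m \times n\) matrix has full column rank almost surely, so the normal equations have a unique solution, which coincides with the least-squares solve.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 20, 5                          # m >= n
P = rng.standard_normal((m, n))       # full column rank almost surely
t = rng.standard_normal(m)

# Normal equations: P^T P is nonsingular when rank(P) = n,
# so the least-squares minimizer is unique.
c_normal = np.linalg.solve(P.T @ P, P.T @ t)
c_lstsq, *_ = np.linalg.lstsq(P, t, rcond=None)

assert np.linalg.matrix_rank(P) == n
assert np.allclose(c_normal, c_lstsq)
```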
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Olikier, G., Absil, PA., De Lathauwer, L. (2018). Variable Projection Applied to Block Term Decomposition of Higher-Order Tensors. In: Deville, Y., Gannot, S., Mason, R., Plumbley, M., Ward, D. (eds) Latent Variable Analysis and Signal Separation. LVA/ICA 2018. Lecture Notes in Computer Science(), vol 10891. Springer, Cham. https://doi.org/10.1007/978-3-319-93764-9_14
Print ISBN: 978-3-319-93763-2
Online ISBN: 978-3-319-93764-9