Tensor decomposition
In multilinear algebra, a tensor decomposition[1][2] is any scheme for expressing a tensor as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some matrix decompositions.[3]
Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional fields.[4] The main tensor decompositions are:
- tensor rank decomposition;[5]
- higher-order singular value decomposition;
- Tucker decomposition;
- matrix product states (tensor trains) and matrix product operators;
- online tensor decompositions;[6][7][8]
- hierarchical Tucker decomposition; and
- block term decomposition.[9][10]
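As a concrete illustration of the first item, a rank-R tensor rank (CP) decomposition expresses a three-mode tensor as a sum of R outer products of columns drawn from three factor matrices. The following is a minimal NumPy sketch; the dimensions, rank, and random factor matrices are illustrative assumptions, not taken from any particular dataset.

```python
import numpy as np

I, J, K, R = 4, 5, 6, 3          # tensor dimensions and chosen rank
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))  # factor matrix for mode 1
B = rng.standard_normal((J, R))  # factor matrix for mode 2
C = rng.standard_normal((K, R))  # factor matrix for mode 3

# A rank-R tensor is a sum of R outer products of the factor columns:
#   X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(X.shape)  # (4, 5, 6)
```

Recovering the factor matrices from a given tensor (rather than building a tensor from them, as above) is the actual decomposition problem, typically solved by alternating least squares.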
Preliminary Definitions and Notation
This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used throughout this article can be found in the table below.[11]
- ^ Sidiropoulos, Nicholas D.; De Lathauwer, Lieven; Fu, Xiao; Huang, Kejun; Papalexakis, Evangelos E.; Faloutsos, Christos (2017-07-01). "Tensor Decomposition for Signal Processing and Machine Learning". IEEE Transactions on Signal Processing. 65 (13): 3551–3582. doi:10.1109/TSP.2017.2690524. ISSN 1053-587X.
- ^ Kolda, Tamara G.; Bader, Brett W. (2009-08-06). "Tensor Decompositions and Applications". SIAM Review. 51 (3): 455–500. doi:10.1137/07070111X. ISSN 0036-1445.
- ^ Bernardi, A.; Brachat, J.; Comon, P.; Mourrain, B. (2013-05-01). "General tensor decomposition, moment matrices and applications". Journal of Symbolic Computation. 52: 51–71. arXiv:1105.1229. doi:10.1016/j.jsc.2012.05.012. ISSN 0747-7171. S2CID 14181289.
- ^ Rabanser, Stephan; Shchur, Oleksandr; Günnemann, Stephan (2017). "Introduction to Tensor Decompositions and their Applications in Machine Learning". arXiv:1711.10781. doi:10.48550/ARXIV.1711.10781.
- ^ Papalexakis, Evangelos E. (2016-06-30). "Automatic Unsupervised Tensor Mining with Quality Assessment". Proceedings of the 2016 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics: 711–719. doi:10.1137/1.9781611974348.80. ISBN 978-1-61197-434-8.
- ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos E. (7 May 2018). "SamBaTen: Sampling-based Batch Incremental Tensor Decomposition". Proceedings of the 2018 SIAM International Conference on Data Mining. doi:10.1137/1.9781611975321.
- ^ Gujral, Ekta; Papalexakis, Evangelos E. (9 October 2020). "OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors". 2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA). doi:10.1109/DSAA49011.2020.00029.
- ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". arXiv:2210.04404. doi:10.48550/ARXIV.2210.04404.
- ^ Lathauwer, Lieven De. "Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness". doi:10.1137/070690729.
- ^ Gujral, Ekta. "Beyond rank-1: Discovering rich community structure in multi-aspect graphs". doi:10.1145/3366423.3380129.
- ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". arXiv:2210.04404. doi:10.48550/ARXIV.2210.04404.
Symbols | Definition |
---|---|
𝐗, 𝐱, x | Matrix, column vector, scalar |
ℝ | Set of real numbers |
vec(·) | Vectorization operator |
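The vectorization operator vec(·) in the table stacks the columns of a matrix into a single long vector. A minimal NumPy sketch (note that NumPy flattens in row-major order by default, so column-wise stacking needs Fortran order):

```python
import numpy as np

M = np.array([[1, 2],
              [3, 4]])

# vec(M) stacks the columns [1, 3] and [2, 4] into one vector.
v = M.flatten(order='F')  # order='F' selects column-major traversal
print(v)  # [1 3 2 4]
```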
Introduction
A multi-view graph with K views is a collection of K matrices with dimensions I × J (where I, J are the number of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term “dimension”, we call an I × J × K tensor a three-“mode” tensor, where the modes are the directions along which the tensor is indexed; the number of modes equals the number of indices needed to address an entry.
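The construction above can be sketched in a few lines of NumPy: stack the K view matrices along a third axis to obtain a three-mode tensor. The adjacency matrices here are random placeholders, not real graph data.

```python
import numpy as np

I = J = 5   # number of nodes
K = 3       # number of views
rng = np.random.default_rng(1)

# One I x J adjacency matrix per view (random 0/1 placeholders).
views = [rng.integers(0, 2, size=(I, J)) for _ in range(K)]

# Stack the K matrices along a third mode to form the tensor X.
X = np.stack(views, axis=2)
print(X.ndim)   # 3 -- index entries as X[i, j, k]
print(X.shape)  # (5, 5, 3)
```

Slicing the tensor along the third mode, X[:, :, k], recovers the adjacency matrix of view k.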