Tensor decomposition
In multilinear algebra, a tensor decomposition [1][2] is any scheme for expressing a tensor as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some matrix decompositions.[3]
Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional arrays.[4] The main tensor decompositions are:
- tensor rank decomposition;[5]
- higher-order singular value decomposition;
- Tucker decomposition;
- matrix product states and operators, also known as tensor trains;
- online tensor decompositions;[6][7][8][9]
- hierarchical Tucker decomposition; and
- block term decomposition.[10][11]
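For a concrete picture of the simplest of these, the tensor rank (CP) decomposition writes a three-mode tensor as a sum of R rank-1 tensors, each the outer product of one column from a factor matrix per mode. The NumPy sketch below (with made-up sizes and random factor matrices) builds such a tensor from its factors; it illustrates the model only, not an algorithm for computing the decomposition.

```python
import numpy as np

# Made-up sizes: an I x J x K tensor of CP rank R.
I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(0)

# One factor matrix per mode; column r of each contributes to the r-th rank-1 term.
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# CP model: X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalently, X is the sum of R outer products of the factor columns.
X_sum = sum(np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r]) for r in range(R))
print(np.allclose(X, X_sum))  # True
```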
Preliminary Definitions and Notation
This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used throughout this article is given in the table below.[12]
Symbols | Definition |
---|---|
**X**, **x**, *x* | Matrix, column vector, scalar |
ℝ | Set of real numbers |
vec(·) | Vectorization operator |
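As a small illustration, the vectorization operator vec(·) stacks the columns of a matrix into a single column vector. A minimal NumPy sketch, assuming the usual column-stacking (column-major) convention:

```python
import numpy as np

# vec(.) stacks the columns of a matrix on top of one another.
M = np.array([[1, 2],
              [3, 4]])

vec_M = M.flatten(order='F')  # column-major (Fortran) order
print(vec_M)                  # [1 3 2 4]
```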
Introduction
A multi-view graph with K views is a collection of K matrices, each of dimensions I × J (where I and J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term "dimension", we call an I × J × K tensor a three-"mode" tensor, where the number of "modes" is the number of indices needed to address an entry of the tensor.
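As a minimal sketch of this representation (with hypothetical sizes and random binary adjacency matrices), the K views can be stacked along a third axis to form the three-mode tensor X:

```python
import numpy as np

# Hypothetical multi-view graph: K views over the same I = J nodes,
# each view given by an I x J adjacency matrix.
I, J, K = 5, 5, 3
rng = np.random.default_rng(1)
views = [rng.integers(0, 2, size=(I, J)) for _ in range(K)]

# Stack the K adjacency matrices along a third mode to obtain
# a three-mode tensor X of size I x J x K.
X = np.stack(views, axis=2)
print(X.shape)  # (5, 5, 3)
```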
References
- ^ Sidiropoulos, Nicholas D. "Tensor Decomposition for Signal Processing and Machine Learning". IEEE Transactions on Signal Processing.
- ^ Kolda, T. G. "Tensor Decompositions and Applications". SIAM Review. doi:10.1137/07070111X.
- ^ Bernardi, A.; Brachat, J.; Comon, P.; Mourrain, B. (2013-05-01). "General tensor decomposition, moment matrices and applications". Journal of Symbolic Computation. 52: 51–71. arXiv:1105.1229. doi:10.1016/j.jsc.2012.05.012. ISSN 0747-7171. S2CID 14181289.
- ^ Rabanser, Stephan. "Introduction to Tensor Decompositions and their Applications in Machine Learning" (PDF).
- ^ Papalexakis, Evangelos E. "Automatic unsupervised tensor mining with quality assessment". doi:10.1137/1.9781611974348.80.
- ^ Zhou, Shuo; Vinh, Nguyen Xuan; Bailey, James; Jia, Yunzhe; Davidson, Ian (13 August 2016). "Accelerating Online CP Decompositions for Higher Order Tensors". Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining: 1375–1384. doi:10.1145/2939672.2939763.
- ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos E. (7 May 2018). "SamBaTen: Sampling-based Batch Incremental Tensor Decomposition". Proceedings of the 2018 SIAM International Conference on Data Mining. doi:10.1137/1.9781611975321.
- ^ Gujral, Ekta; Papalexakis, Evangelos E. (9 October 2020). "OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors". 2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA). doi:10.1109/DSAA49011.2020.00029.
- ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". doi:10.48550/arXiv.2210.04404.
- ^ Lathauwer, Lieven De. "Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness".
- ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos (20 April 2020). "Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs". Proceedings of The Web Conference 2020: 452–462. doi:10.1145/3366423.3380129.
- ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". doi:10.48550/arXiv.2210.04404.