Tensor decomposition

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Datasciencephobia (talk | contribs) at 19:39, 12 March 2023. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In multilinear algebra, a tensor decomposition[1][2] is any scheme for expressing a tensor as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some matrix decompositions.[3]

Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional arrays.[4] The main tensor decompositions are:

Tensor rank decomposition (CANDECOMP/PARAFAC)
Higher-order singular value decomposition
Tucker decomposition
Matrix product states
Hierarchical Tucker decomposition
Block term decomposition
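As an illustrative sketch (not taken from this article), the rank-R CP (CANDECOMP/PARAFAC) decomposition expresses a three-mode tensor as a sum of R rank-1 outer products of columns of three factor matrices. The names A, B, C and the dimensions below are assumptions for the example:

```python
import numpy as np

# Sketch: reconstruct a 3-mode tensor from a rank-R CP decomposition.
# A, B, C are illustrative factor matrices, not notation from the article.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# X[i, j, k] = sum over r of A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
assert X.shape == (I, J, K)
```

Fitting such factors to a given tensor (e.g. by alternating least squares) is what CP decomposition algorithms do; the snippet only shows the reconstruction direction.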

Preliminary Definitions and Notation

This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used throughout this article can be found in the table below.

Table of symbols and their description.
Symbol   Definition
X, x, x  Matrix (bold uppercase), column vector (bold lowercase), scalar (lowercase)
ℝ        Set of real numbers
vec(·)   Vectorization operator
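The vectorization operator vec(·) stacks the columns of a matrix into a single vector. A minimal sketch with numpy, using Fortran (column-major) order to match the column-stacking convention:

```python
import numpy as np

# vec(M): stack the columns of M into one vector (column-major order).
M = np.array([[1, 2],
              [3, 4]])
v = M.flatten(order='F')  # columns [1, 3] then [2, 4] -> [1, 3, 2, 4]
```

Note that numpy's default `order='C'` would instead stack rows, so the `order='F'` argument is what realizes the usual definition of vec.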

Introduction

A multi-view graph with K views is a collection of K matrices with dimensions I × J (where I, J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term "dimension", we call an I × J × K tensor a three-"mode" tensor, where the number of modes is the number of indices needed to index the tensor.
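The construction above can be sketched in a few lines of numpy: K adjacency matrices of size I × J are stacked along a third axis, giving a three-mode tensor whose last mode indexes the view. The sizes and the random adjacency matrices are assumptions for the example:

```python
import numpy as np

# Sketch: K views (adjacency matrices) of an I x J graph, stacked into
# a three-mode tensor X of size I x J x K. Values here are synthetic.
I, J, K = 4, 4, 3
rng = np.random.default_rng(1)
views = [(rng.random((I, J)) < 0.3).astype(float) for _ in range(K)]

X = np.stack(views, axis=-1)  # mode 3 indexes the view
assert X.shape == (I, J, K)
```

Indexing X with three indices, e.g. `X[i, j, k]`, retrieves the (i, j) entry of the k-th view, which is why the tensor is said to have three modes.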


References

  1. ^ Sidiropoulos, Nicholas D.; De Lathauwer, Lieven; Fu, Xiao; Huang, Kejun; Papalexakis, Evangelos E.; Faloutsos, Christos (2017-07-01). "Tensor Decomposition for Signal Processing and Machine Learning". IEEE Transactions on Signal Processing. 65 (13): 3551–3582. doi:10.1109/TSP.2017.2690524. ISSN 1053-587X.
  2. ^ Kolda, Tamara G.; Bader, Brett W. (2009-08-06). "Tensor Decompositions and Applications". SIAM Review. 51 (3): 455–500. doi:10.1137/07070111X. ISSN 0036-1445.
  3. ^ Bernardi, A.; Brachat, J.; Comon, P.; Mourrain, B. (2013-05-01). "General tensor decomposition, moment matrices and applications". Journal of Symbolic Computation. 52: 51–71. arXiv:1105.1229. doi:10.1016/j.jsc.2012.05.012. ISSN 0747-7171. S2CID 14181289.
  4. ^ Rabanser, Stephan; Shchur, Oleksandr; Günnemann, Stephan (2017). "Introduction to Tensor Decompositions and their Applications in Machine Learning". arXiv:1711.10781. doi:10.48550/ARXIV.1711.10781.
  5. ^ Papalexakis, Evangelos E. (2016-06-30). "Automatic Unsupervised Tensor Mining with Quality Assessment". Proceedings of the 2016 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics: 711–719. doi:10.1137/1.9781611974348.80. ISBN 978-1-61197-434-8.
  6. ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos E. (7 May 2018). "SamBaTen: Sampling-based Batch Incremental Tensor Decomposition". Proceedings of the 2018 SIAM International Conference on Data Mining. doi:10.1137/1.9781611975321.
  7. ^ Gujral, Ekta; Papalexakis, Evangelos E. (9 October 2020). "OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors". 2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA). doi:10.1109/DSAA49011.2020.00029.
  8. ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". arXiv:2210.04404. doi:10.48550/ARXIV.2210.04404.
  9. ^ De Lathauwer, Lieven (2008). "Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness". SIAM Journal on Matrix Analysis and Applications. 30 (3): 1033–1066. doi:10.1137/070690729. ISSN 0895-4798.
  10. ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos (2020-04-20). "Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs". Proceedings of The Web Conference 2020. Taipei Taiwan: ACM: 452–462. doi:10.1145/3366423.3380129. ISBN 978-1-4503-7023-3.