
Tensor decomposition


In multilinear algebra, a tensor decomposition is any scheme for expressing a tensor as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some matrix decompositions.[1]
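
For example, the tensor rank decomposition (CANDECOMP/PARAFAC) expresses an order-3 tensor as a sum of rank-1 terms, generalizing the way the singular value decomposition expresses a matrix as a sum of rank-1 matrices (the notation below is one common convention, introduced here for illustration):

    \mathcal{T} \approx \sum_{r=1}^{R} \lambda_r \, a_r \otimes b_r \otimes c_r,
    \qquad
    M = \sum_{r=1}^{R} \sigma_r \, u_r v_r^{\top},

where the a_r, b_r, c_r (respectively u_r, v_r) are vectors, the \lambda_r and \sigma_r are scalar weights, and the smallest R for which equality holds is the tensor rank (respectively the matrix rank).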

Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional arrays.[2] The main tensor decompositions are:

- tensor rank decomposition (CANDECOMP/PARAFAC);
- higher-order singular value decomposition (HOSVD);
- Tucker decomposition;
- matrix product states (tensor trains);
- online/streaming tensor decompositions;[3][4][5]
- hierarchical Tucker decomposition;
- block term decomposition.[5][6][7]

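A minimal numerical sketch of the idea in Python, assuming only NumPy is available (the dimensions I, J, K and the rank R below are arbitrary illustrative choices, not taken from the cited sources):

    import numpy as np

    # Build an order-3 tensor of CP (tensor) rank at most R from factor
    # matrices A, B, C, then check that it equals the sum of R rank-1 terms.
    rng = np.random.default_rng(0)
    I, J, K, R = 4, 5, 6, 3
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))

    # T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
    T = np.einsum('ir,jr,kr->ijk', A, B, C)

    # The same tensor written explicitly as a sum of R rank-1 outer products
    T_hat = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
                for r in range(R))

    print(np.allclose(T, T_hat))  # True

Decomposition algorithms such as alternating least squares work in the opposite direction: given a tensor, they estimate factor matrices of a prescribed rank that approximately reconstruct it.
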
References

  1. ^ Bernardi, A.; Brachat, J.; Comon, P.; Mourrain, B. (2013-05-01). "General tensor decomposition, moment matrices and applications". Journal of Symbolic Computation. 52: 51–71. arXiv:1105.1229. doi:10.1016/j.jsc.2012.05.012. ISSN 0747-7171. S2CID 14181289.
  2. ^ Rabanser, Stephan. "Introduction to Tensor Decompositions and their Applications in Machine Learning" (PDF).
  3. ^ Papalexakis, Evangelos E. "Automatic unsupervised tensor mining with quality assessment".
  4. ^ Gujral, Ekta. "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition".
  5. ^ Gujral, Ekta. "OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors". IEEE. WWW '20: Proceedings of The Web Conference 2020.
  6. ^ Lathauwer, Lieven De. "Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness".
  7. ^ Gujral, Ekta. "Beyond rank-1: Discovering rich community structure in multi-aspect graphs".