Two-dimensional singular-value decomposition
Two-dimensional singular value decomposition (2DSVD) computes the low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to SVD (singular value decomposition), which computes the low-rank approximation of a single matrix (or a set of 1D vectors).
SVD
Let the matrix $X = [\mathbf{x}_1, \ldots, \mathbf{x}_n]$ contain the set of 1D vectors, which have been centered. In PCA/SVD, we construct the covariance matrix $F$ and the Gram matrix $G$:

- $F = X X^\mathsf{T}$, $G = X^\mathsf{T} X$,

and compute their eigenvectors $U = [\mathbf{u}_1, \ldots, \mathbf{u}_n]$ and $V = [\mathbf{v}_1, \ldots, \mathbf{v}_n]$. Since $U U^\mathsf{T} = I$ and $V V^\mathsf{T} = I$, we have

- $X = U U^\mathsf{T} X V V^\mathsf{T} = U (U^\mathsf{T} X V) V^\mathsf{T} = U \Sigma V^\mathsf{T}$, where $\Sigma = U^\mathsf{T} X V$.

If we retain only the $K$ principal eigenvectors in $U$ and $V$, this gives a rank-$K$ approximation of $X$.
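The eigenvector construction above can be sketched in NumPy; the variable names ($X$, $F$, $U$, $K$) follow the article, and the data is synthetic. The rank-$K$ projection obtained from the eigenvectors of the covariance matrix coincides with the truncated SVD:

```python
import numpy as np

# Sketch of the PCA/SVD low-rank approximation described above.
# Synthetic data; X holds n centered 1D column vectors.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 20))
X -= X.mean(axis=1, keepdims=True)   # center the vectors

F = X @ X.T   # covariance matrix (eigenvectors give U)
G = X.T @ X   # Gram matrix (eigenvectors give V analogously)

_, U = np.linalg.eigh(F)
U = U[:, ::-1]                        # sort by decreasing eigenvalue

K = 3                                 # retain K principal eigenvectors
U_K = U[:, :K]
X_approx = U_K @ (U_K.T @ X)          # rank-K approximation of X

# Agrees with the truncated SVD of X.
U_svd, s, Vt = np.linalg.svd(X, full_matrices=False)
X_svd = U_svd[:, :K] @ np.diag(s[:K]) @ Vt[:K]
print(np.allclose(X_approx, X_svd))   # True
```

Column signs of the eigenvectors are arbitrary, but the projection $U_K U_K^\mathsf{T} X$ is sign-invariant, so the comparison with the truncated SVD is well defined.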
2DSVD
Here we deal with a set of 2D matrices $(X_1, \ldots, X_n)$. Suppose they are centered, $\sum_i X_i = 0$. We construct the row–row and column–column covariance matrices

- $F = \sum_i X_i X_i^\mathsf{T}$, $G = \sum_i X_i^\mathsf{T} X_i$,

in exactly the same manner as in SVD, and compute their eigenvectors $U$ and $V$. We approximate $X_i$ as

- $X_i \approx U M_i V^\mathsf{T}$, where $M_i = U^\mathsf{T} X_i V$,

in identical fashion as in SVD. This gives a near-optimal low-rank approximation of $X_1, \ldots, X_n$ with the objective function

- $J = \sum_i \lVert X_i - U M_i V^\mathsf{T} \rVert^2.$
Error bounds similar to those of the Eckart–Young theorem also exist.
2DSVD is mostly used in image compression and representation.
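The 2DSVD steps above can be sketched as follows; the choice of retained dimensions ($k$, $l$) and the synthetic data are illustrative assumptions, while $F$, $G$, $U$, $V$, $M_i$, and $J$ follow the article:

```python
import numpy as np

# Sketch of 2DSVD: a shared row basis U and column basis V across a
# set of 2D matrices, with per-matrix cores M_i = U^T X_i V.
rng = np.random.default_rng(1)
Xs = [rng.standard_normal((10, 12)) for _ in range(5)]
mean = sum(Xs) / len(Xs)
Xs = [X - mean for X in Xs]          # center the set: sum_i X_i = 0

# Row-row and column-column covariance matrices
F = sum(X @ X.T for X in Xs)
G = sum(X.T @ X for X in Xs)

_, U = np.linalg.eigh(F)
_, V = np.linalg.eigh(G)
U = U[:, ::-1]                        # decreasing eigenvalue order
V = V[:, ::-1]

k, l = 4, 4                           # retained row/column dimensions
U_k, V_l = U[:, :k], V[:, :l]

# Per-matrix cores and the reconstruction objective J
Ms = [U_k.T @ X @ V_l for X in Xs]
J = sum(np.linalg.norm(X - U_k @ M @ V_l.T) ** 2
        for X, M in zip(Xs, Ms))
print(f"residual J = {J:.3f}")
```

With $k$ and $l$ equal to the full matrix dimensions the reconstruction is exact ($J = 0$ up to rounding); compressing to $k \times l$ cores plus the two shared bases is what makes 2DSVD useful for image compression.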
References
- Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005. http://ranger.uta.edu/~chqding/papers/2dsvdSDM05.pdf
- Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning Journal, Vol. 61, pp. 167–191, 2005.