Kernel-independent component analysis

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Tale.Spin at 13:44, 21 October 2015 (Main idea: Fixing reference section).

Kernel independent component analysis (kernel ICA) is an efficient algorithm for independent component analysis that estimates the source components by optimizing a generalized-variance contrast function defined over a reproducing kernel Hilbert space.[1][2] These contrast functions use the notion of mutual information as a measure of statistical dependence, which vanishes exactly when the components are independent.
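The estimation problem kernel ICA addresses can be illustrated with a small sketch (the two-source setup, mixing matrix, and sample size below are hypothetical, not taken from the paper): observed signals are an unknown linear mixture of independent sources, and after whitening the unknown unmixing matrix reduces to an orthogonal rotation, which kernel ICA then selects by minimizing its RKHS-based dependence contrast.

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.uniform(-1.0, 1.0, size=(2, 1000))   # independent, non-Gaussian sources
A = np.array([[1.0, 0.5], [0.5, 1.0]])       # unknown mixing matrix (illustrative)
x = A @ s                                     # observed mixed signals

# Whitening: transform the observations so their sample covariance is the
# identity.  After this step the remaining unknown in the unmixing matrix
# is an orthogonal rotation, which shrinks the search space considerably.
evals, evecs = np.linalg.eigh(np.cov(x))
whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T
z = whiten @ x                                # whitened signals
```

The rotation itself would then be chosen to minimize a dependence measure between the rows of the rotated signals; kernel ICA's contribution is the RKHS-based measure described below rather than this standard preprocessing.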

Main idea

Kernel ICA is based on the idea that correlations between two random variables can be represented in a reproducing kernel Hilbert space (RKHS), denoted by $\mathcal{F}$, associated with a feature map $L_x : \mathcal{F} \to \mathbb{R}$ defined for a fixed $x \in \mathbb{R}$. The $\mathcal{F}$-correlation between two random variables $X$ and $Y$ is defined as

    $\rho_{\mathcal{F}}(X,Y) = \sup_{f,g \in \mathcal{F}} \operatorname{corr}(\langle L_X, f \rangle, \langle L_Y, g \rangle)$

where the functions $f, g : \mathbb{R} \to \mathbb{R}$ range over $\mathcal{F}$ and

    $\operatorname{corr}(\langle L_X, f \rangle, \langle L_Y, g \rangle) := \frac{\operatorname{cov}(f(X), g(Y))}{\operatorname{var}(f(X))^{1/2} \operatorname{var}(g(Y))^{1/2}}$

for fixed $f, g \in \mathcal{F}$.[1] Note that the reproducing property implies $f(x) = \langle L_x, f \rangle$ for fixed $x \in \mathbb{R}$ and $f \in \mathcal{F}$.[3]
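In practice the supremum over $\mathcal{F}$ is not evaluated directly; Bach and Jordan estimate the $\mathcal{F}$-correlation as the largest regularized kernel canonical correlation of the Gram matrices of the two samples. A minimal sketch of that estimate follows (the Gaussian kernel width `sigma`, the regularization constant `kappa`, and the simplified ridge term `kappa * n * I` are assumptions for illustration, not the paper's exact scaling):

```python
import numpy as np

def _gram(v, sigma):
    # Gaussian RBF Gram matrix for a 1-D sample vector v
    d = v[:, None] - v[None, :]
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def f_correlation(x, y, sigma=1.0, kappa=0.05):
    """Empirical F-correlation of two 1-D samples, estimated as the first
    regularized kernel canonical correlation of the centered Gram matrices."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kx = H @ _gram(x, sigma) @ H               # centered Gram matrices
    Ky = H @ _gram(y, sigma) @ H
    Rx = Kx + kappa * n * np.eye(n)            # regularized variance operators
    Ry = Ky + kappa * n * np.eye(n)
    # First canonical correlation: rho^2 is the top eigenvalue of
    # (Kx + kappa n I)^-2 Kx Ky (Ky + kappa n I)^-2 Ky Kx.
    Mx = np.linalg.solve(Rx @ Rx, Kx @ Ky)
    My = np.linalg.solve(Ry @ Ry, Ky @ Kx)
    rho2 = np.max(np.linalg.eigvals(Mx @ My).real)
    return float(np.sqrt(np.clip(rho2, 0.0, 1.0)))
```

The estimate should be noticeably larger for dependent pairs (e.g. `f_correlation(x, np.tanh(x))`) than for independent samples, which is what lets kernel ICA use it, suitably regularized, as the contrast function to minimize over unmixing rotations.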

  1. Bach, Francis R.; Jordan, Michael I. (2003). "Kernel independent component analysis" (PDF). The Journal of Machine Learning Research. 3: 1–48. doi:10.1162/153244303768966085.
  2. Bach, Francis R.; Jordan, Michael I. (2003). "Kernel independent component analysis" (PDF). IEEE International Conference on Acoustics, Speech, and Signal Processing. doi:10.1109/icassp.2003.1202783.
  3. Saitoh, Saburou (1988). Theory of Reproducing Kernels and Its Applications. Longman. ISBN 0582035643.