Contrastive Hebbian learning

From Wikipedia, the free encyclopedia

Contrastive Hebbian learning is a biologically plausible form of Hebbian learning.

It is based on the contrastive divergence algorithm, which has been used to train a variety of energy-based latent variable models.[1]
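As an illustration of the idea, the sketch below (written in Python with NumPy, and not code taken from the cited sources) shows the two-phase weight update usually associated with contrastive Hebbian learning: a Hebbian term computed with both the inputs and the target outputs clamped, minus an anti-Hebbian term computed with only the inputs clamped while the network settles freely. The function name chl_update, the single symmetric weight matrix W, and the learning rate are illustrative assumptions rather than notation from the references.

    # Minimal illustrative sketch of a contrastive Hebbian weight update
    # (assumed formulation, not code from the cited sources).
    import numpy as np

    def chl_update(W, x_clamped, x_free, lr=0.01):
        """One contrastive Hebbian step on a symmetric weight matrix W.

        x_clamped, x_free: 1-D arrays of unit activities at equilibrium in
        the clamped phase (inputs and targets fixed) and the free phase
        (only inputs fixed). Names are hypothetical.
        """
        # Hebbian correlations in each phase (outer products of activities).
        clamped_corr = np.outer(x_clamped, x_clamped)
        free_corr = np.outer(x_free, x_free)
        # Strengthen correlations seen when the targets are clamped,
        # weaken those the network produces on its own.
        W = W + lr * (clamped_corr - free_corr)
        np.fill_diagonal(W, 0.0)      # no self-connections
        return (W + W.T) / 2          # keep the weights symmetric

    # Example use: after settling the network in each phase,
    # W = chl_update(W, x_clamped, x_free)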

In 2003, contrastive Hebbian learning in a layered network was shown to be equivalent to the backpropagation algorithm commonly used in machine learning.[2]

References

  1. ^ Qiu, Yixuan; Zhang, Lingsong; Wang, Xiao (2019-09-25). "Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models". Presented at the International Conference on Learning Representations, 2019.
  2. ^ Xie, Xiaohui; Seung, H. Sebastian (February 2003). "Equivalence of backpropagation and contrastive Hebbian learning in a layered network". Neural Computation. 15 (2): 441–454. doi:10.1162/089976603762552988. ISSN 0899-7667. PMID 12590814. S2CID 11201868.