
Contrastive Hebbian learning

From Wikipedia, the free encyclopedia

Contrastive Hebbian learning is a biologically plausible form of Hebbian learning. The rule adjusts each weight in proportion to the difference between Hebbian correlations measured in a "clamped" phase, in which the output units are fixed to their target values, and a "free" phase, in which the network settles on its own. It has been shown to be equivalent in power to the backpropagation algorithm used in machine learning.[1]

It is based on the contrastive divergence algorithm, which has been used to train a variety of energy-based latent variable models.[2]

References

  1. ^ Xie, Xiaohui; Seung, H. Sebastian (February 2003). "Equivalence of backpropagation and contrastive Hebbian learning in a layered network". Neural Computation. 15 (2): 441–454. doi:10.1162/089976603762552988. ISSN 0899-7667. PMID 12590814.
  2. ^ Qiu, Yixuan; Zhang, Lingsong; Wang, Xiao (2019-09-25). "Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models". Presented at the International Conference on Learning Representations, 2019.
