Contrastive Hebbian learning

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by The Anome (talk | contribs) at 23:02, 6 April 2021 (fix typo).

Contrastive Hebbian learning is a biologically plausible form of Hebbian learning.

It is based on the contrastive divergence algorithm, which has been used to train a variety of energy-based latent variable models.[1]

In 2003, contrastive Hebbian learning was shown to be equivalent in power to the backpropagation algorithms used in machine learning.[2]
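The contrastive Hebbian update compares unit correlations across two phases: a "clamped" phase, in which both inputs and target outputs are fixed, and a "free" phase, in which only the inputs are fixed. The sketch below is a minimal illustration of that update rule, not the notation of the cited papers; the names `s_plus`, `s_minus`, and `eta`, and the random activities standing in for settled network states, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.1  # learning rate (illustrative value)
n = 4      # number of units

# Symmetric weight matrix, small random initialization.
W = rng.normal(scale=0.1, size=(n, n))

# Clamped ("plus") phase: unit activities with inputs and targets fixed.
s_plus = rng.random(n)
# Free ("minus") phase: unit activities with only the inputs fixed.
s_minus = rng.random(n)

# Contrastive Hebbian update: strengthen the correlations seen in the
# clamped phase and weaken those seen in the free phase.
delta_W = eta * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
W += delta_W
```

When the two phases agree (the free-running network already produces the target), the update vanishes, which is the sense in which the rule is a contrastive form of Hebbian learning.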

References

  1. ^ Qiu, Yixuan; Zhang, Lingsong; Wang, Xiao (2019-09-25). "Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models". Presented at the International Conference on Learning Representations, 2019.
  2. ^ Xie, Xiaohui; Seung, H. Sebastian (February 2003). "Equivalence of backpropagation and contrastive Hebbian learning in a layered network". Neural Computation. 15 (2): 441–454. doi:10.1162/089976603762552988. ISSN 0899-7667. PMID 12590814.