
Contrastive Hebbian learning

From Wikipedia, the free encyclopedia

Contrastive Hebbian learning is a biologically plausible form of Hebbian learning.

It is based on the contrastive divergence algorithm, which has been used to train a variety of energy-based latent variable models.[1]

In 2003, contrastive Hebbian learning was shown to be equivalent in power to the backpropagation algorithm commonly used in machine learning.[2]
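The update rule behind contrastive Hebbian learning can be illustrated with a two-phase sketch: the network is first relaxed with the visible units clamped to the data, then relaxed freely, and the weights move toward the clamped correlations and away from the free ones. The minimal example below is an illustration only, not the exact procedure of either cited paper; the network size, `tanh` dynamics, training pattern, and learning rate are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric network: 3 visible units, 2 hidden units (sizes are assumed).
n_vis, n_hid = 3, 2
W = rng.normal(scale=0.1, size=(n_vis, n_hid))

def settle(v, h, clamp_visible, steps=20):
    """Relax activities toward a fixed point of the tanh dynamics."""
    for _ in range(steps):
        h = np.tanh(v @ W)            # hidden units driven by visible units
        if not clamp_visible:
            v = np.tanh(W @ h)        # visible units driven back by hidden units
    return v, h

x = np.array([1.0, -1.0, 1.0])        # hypothetical training pattern
eta = 0.1                             # learning rate (assumed)

# Clamped (positive) phase: visible units held fixed at the data.
v_plus, h_plus = settle(x.copy(), np.zeros(n_hid), clamp_visible=True)
# Free (negative) phase: the network runs without the data clamp.
v_minus, h_minus = settle(x.copy(), h_plus.copy(), clamp_visible=False)

# Contrastive Hebbian update: clamped correlations minus free correlations.
W += eta * (np.outer(v_plus, h_plus) - np.outer(v_minus, h_minus))
```

The difference of outer products is what makes the rule "contrastive": a plain Hebbian rule would use only the clamped term, and the free-phase term acts as the subtracted baseline.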

References

  1. ^ Qiu, Yixuan; Zhang, Lingsong; Wang, Xiao (2019). "Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models". Presented at the International Conference on Learning Representations, 2019.
  2. ^ Xie, Xiaohui; Seung, H. Sebastian (February 2003). "Equivalence of backpropagation and contrastive Hebbian learning in a layered network". Neural Computation. 15 (2): 441–454. doi:10.1162/089976603762552988. ISSN 0899-7667. PMID 12590814.
