Contrastive Hebbian learning

{{Short description|Biologically plausible form of Hebbian learning with power equivalent to backpropagation}}

'''Contrastive Hebbian learning''' (CHL) is a biologically plausible form of Hebbian learning. Rather than propagating an explicit error signal, it adjusts each synapse using only the locally available activities of the two neurons it connects, compared across two phases of network activity.

It is based on the contrastive divergence algorithm, which has been used to train a variety of energy-based latent variable models.{{Cite conference|last1=Qiu|first1=Yixuan|last2=Zhang|first2=Lingsong|last3=Wang|first3=Xiao|date=2019-09-25|title=Unbiased Contrastive Divergence Algorithm for Training Energy-Based Latent Variable Models|url=https://openreview.net/forum?id=r1eyceSYPr|conference=International Conference on Learning Representations|language=en}}

In 2003, contrastive Hebbian learning was shown to be equivalent in power to the backpropagation algorithms commonly used in machine learning.{{Cite journal|last1=Xie|first1=Xiaohui|last2=Seung|first2=H. Sebastian|date=February 2003|title=Equivalence of backpropagation and contrastive Hebbian learning in a layered network|url=https://pubmed.ncbi.nlm.nih.gov/12590814/|journal=Neural Computation|volume=15|issue=2|pages=441–454|doi=10.1162/089976603762552988|issn=0899-7667|pmid=12590814|s2cid=11201868 }}
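The two-phase structure of the rule can be illustrated with a minimal sketch (not drawn from the cited sources): in the clamped ("plus") phase the output is held at the teaching signal, in the free ("minus") phase it settles to the network's own prediction, and each weight change is the Hebbian product from the clamped phase minus the anti-Hebbian product from the free phase. The single linear output unit, the learning rate <code>eta</code>, and the function names here are illustrative assumptions, not part of the algorithm's standard presentation.

```python
# Minimal contrastive Hebbian learning sketch: one linear output unit.
# All names (free_phase, chl_update, eta) are illustrative assumptions.
import random

random.seed(0)

def free_phase(w, x):
    """Free (minus) phase: output settles to the network's own prediction."""
    return sum(wi * xi for wi, xi in zip(w, x))

def chl_update(w, x, target, eta=0.1):
    """One contrastive Hebbian step: Hebbian term from the clamped (plus)
    phase minus anti-Hebbian term from the free (minus) phase, using only
    locally available pre- and post-synaptic activities."""
    y_free = free_phase(w, x)
    y_clamped = target  # plus phase: output clamped to the teaching signal
    return [wi + eta * (xi * y_clamped - xi * y_free)
            for wi, xi in zip(w, x)]

# Toy task: recover the weights of a linear teacher from input/target pairs.
true_w = [2.0, -1.0]
w = [0.0, 0.0]
for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    t = sum(twi * xi for twi, xi in zip(true_w, x))
    w = chl_update(w, x, t)

print(w)  # approaches the teacher weights [2.0, -1.0]
```

For this single-unit case the contrastive update reduces to the delta rule, which is the simplest illustration of the equivalence with error-driven (backpropagation-style) learning discussed above; the full algorithm applies the same clamped-minus-free correlation difference in recurrent, multi-layer networks.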


== References ==

{{reflist}}

{{Hebbian learning}}

[[Category:Hebbian theory]]

[[Category:Artificial neural networks]]

{{neuroscience-stub}}

{{computing-stub}}