Backpropagation through structure

Backpropagation through structure (BPTS) is a gradient-based technique for training recursive neural networks, proposed in 1996 by Christoph Goller and Andreas Küchler.[1]
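
The following is a minimal sketch of the core idea: a recursive neural network applies the same composition function at every node of a tree in a bottom-up forward pass, and gradients are then propagated back down the tree and summed across nodes because the parameters are shared. The binary-tree representation, tanh composition function, squared-error loss at the root, and all hyperparameters here are illustrative assumptions, not details taken from the cited paper.

    # Sketch of backpropagation through structure for a recursive neural
    # network over binary trees, using NumPy. Leaves are fixed embedding
    # vectors; every internal node shares the weights W and bias b.
    import numpy as np

    rng = np.random.default_rng(0)
    D = 4                                          # dimensionality of node representations
    W = rng.normal(scale=0.1, size=(D, 2 * D))     # shared composition weights (assumed form)
    b = np.zeros(D)                                # shared bias

    # A tree is either a leaf (an ndarray embedding) or a (left, right) pair.
    def forward(tree):
        """Compute node representations bottom-up; return (root_vector, cache)."""
        if isinstance(tree, np.ndarray):           # leaf: embedding given directly
            return tree, ("leaf", tree)
        left, right = tree
        h_l, cache_l = forward(left)
        h_r, cache_r = forward(right)
        children = np.concatenate([h_l, h_r])
        h = np.tanh(W @ children + b)              # composition function (assumed: tanh)
        return h, ("node", h, children, cache_l, cache_r)

    def backward(cache, dL_dh, grads):
        """Propagate dL/dh top-down through the structure, accumulating
        gradients for the shared parameters W and b at every node."""
        if cache[0] == "leaf":
            return                                  # fixed leaf embeddings: nothing to update
        _, h, children, cache_l, cache_r = cache
        delta = dL_dh * (1.0 - h ** 2)              # backprop through tanh
        grads["W"] += np.outer(delta, children)     # shared weights: gradients sum over nodes
        grads["b"] += delta
        d_children = W.T @ delta                    # gradient w.r.t. the child vectors
        backward(cache_l, d_children[:D], grads)
        backward(cache_r, d_children[D:], grads)

    # Toy usage: train the shared weights so the root vector matches a target.
    leaf = lambda: rng.normal(size=D)
    tree = ((leaf(), leaf()), leaf())
    target = np.full(D, 0.5)

    for step in range(100):
        root, cache = forward(tree)
        grads = {"W": np.zeros_like(W), "b": np.zeros_like(b)}
        backward(cache, root - target, grads)       # dL/dh_root for squared-error loss
        W -= 0.1 * grads["W"]
        b -= 0.1 * grads["b"]

    root, _ = forward(tree)
    print("final loss:", 0.5 * np.sum((root - target) ** 2))

In this sketch the backward pass mirrors the forward recursion over the tree, which is what distinguishes backpropagation through structure from backpropagation through time, where the recursion runs over a linear sequence of time steps instead.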

References

  1. ^ Goller, Christoph; Küchler, Andreas (1996). "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". Proceedings of International Conference on Neural Networks (ICNN'96). Vol. 1. pp. 347–352. CiteSeerX 10.1.1.49.1968. doi:10.1109/ICNN.1996.548916. ISBN 0-7803-3210-5. S2CID 6536466.