
Backpropagation through structure

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Fchaubard (talk | contribs) at 22:25, 20 April 2015.

Backpropagation through structure (BPTS) is a gradient-based technique for training recursive neural networks, a generalization of recurrent neural networks: where backpropagation through time unrolls a network along a sequence, BPTS unrolls it along an arbitrary tree or graph structure, propagating error signals from each parent node to its children while accumulating gradients into a single set of shared weights. The technique was described in a 1996 paper by Christoph Goller and Andreas Küchler.[1]
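The idea can be illustrated with a minimal sketch (not code from the paper): a recursive network that composes pairs of child vectors with one shared weight matrix, a forward pass that walks the tree bottom-up, and a backward pass that redistributes each parent's gradient to its children and accumulates into the shared parameters. All names, dimensions, and the toy loss below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3                                          # node-vector dimension (illustrative)
W = rng.normal(scale=0.1, size=(d, 2 * d))     # shared composition weights
b = np.zeros(d)                                # shared bias

def forward(node):
    """Bottom-up pass. A node is either a leaf vector (ndarray)
    or a (left, right) pair of subtrees."""
    if isinstance(node, np.ndarray):
        return node, None                      # leaf: activation, no cache
    (hl, cl) = forward(node[0])
    (hr, cr) = forward(node[1])
    h = np.tanh(W @ np.concatenate([hl, hr]) + b)
    return h, (hl, cl, hr, cr, h)

def backward(cache, grad_h, grads):
    """Top-down pass: split the parent's gradient between both children
    and accumulate into the shared weights -- the core of BPTS."""
    if cache is None:                          # leaf: nothing below
        return
    hl, cl, hr, cr, h = cache
    dz = grad_h * (1 - h ** 2)                 # derivative of tanh
    grads['W'] += np.outer(dz, np.concatenate([hl, hr]))
    grads['b'] += dz
    dchildren = W.T @ dz                       # gradient w.r.t. [hl; hr]
    backward(cl, dchildren[:d], grads)
    backward(cr, dchildren[d:], grads)

# Tree ((a x) c): compose a and x first, then the result with c.
a, x, c = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
tree = ((a, x), c)
root, cache = forward(tree)
loss = 0.5 * np.sum(root ** 2)                 # toy loss: squared norm of root

grads = {'W': np.zeros_like(W), 'b': np.zeros_like(b)}
backward(cache, root, grads)                   # d(loss)/d(root) = root
```

Because the same `W` and `b` are reused at every internal node, the backward pass visits each node once and sums the per-node contributions, exactly as backpropagation through time sums contributions across time steps.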

  1. ^ Goller, Christoph; Küchler, Andreas. "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". psu.edu. Retrieved 20 April 2015.