Batch normalization

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Wikisanchez (talk | contribs) at 16:52, 24 April 2018 (Created page with 'Batch normalisation is a technique for improving the performance and stability of neural networks. It was introduced in a 2015 paper.<ref>{{cite journal|last1=Io...'). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Batch normalization is a technique for improving the performance and stability of artificial neural networks. It was introduced in a 2015 paper by Sergey Ioffe and Christian Szegedy.[1][2]
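The core transform normalizes each feature over a mini-batch to zero mean and unit variance, then applies a learned scale and shift. A minimal NumPy sketch (the function name, argument shapes, and the `eps` value are illustrative, not taken from the paper):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalize x of shape (batch, features).

    gamma and beta are learned per-feature scale and shift
    parameters; eps guards against division by zero.
    """
    mean = x.mean(axis=0)           # per-feature mean over the batch
    var = x.var(axis=0)             # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta     # scale and shift
```

With `gamma = 1` and `beta = 0` the output of each feature has approximately zero mean and unit variance over the batch; during training, gamma and beta are updated by gradient descent like any other parameters.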

References

  1. ^ Ioffe, Sergey; Szegedy, Christian (2015). "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" (PDF).
  2. ^ "Glossary of Deep Learning: Batch Normalisation". medium.com. Retrieved 24 April 2018.