Generalized iterative scaling

In statistics, generalized iterative scaling (GIS) and improved iterative scaling (IIS) are two early algorithms used to fit log-linear models,[1] notably multinomial logistic regression (MaxEnt) classifiers and extensions of that model such as MaxEnt Markov models[2] and conditional random fields. These algorithms have largely been surpassed by gradient-based methods such as L-BFGS[3] and by coordinate descent algorithms.[4]
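
For a conditional model p(y | x) ∝ exp(Σ_j w_j f_j(x, y)) with non-negative features whose values sum to a constant C for every (x, y), GIS repeatedly adjusts each weight by w_j ← w_j + (1/C) · log(Ẽ[f_j] / E[f_j]), the log of the ratio between the empirical and the current model expectation of that feature.[1] The sketch below illustrates this update for a multinomial logistic regression classifier. It assumes non-negative inputs, appends the usual slack feature so that every example's feature values total C, and uses invented names (gis_fit, n_iter and so on) rather than any established library interface.

    import numpy as np

    def gis_fit(X, y, n_classes, n_iter=100):
        """Fit multinomial logistic regression weights with generalized
        iterative scaling (illustrative sketch, not a library routine).

        X : (N, D) array of non-negative input features.
        y : (N,) array of integer class labels in range(n_classes).
        Returns an (n_classes, D + 1) weight matrix; the last column
        belongs to the slack feature GIS needs so that every example
        has the same total feature count C.
        """
        N, D = X.shape
        C = X.sum(axis=1).max()
        # Slack feature: pad every row so its feature values sum to C.
        X = np.hstack([X, C - X.sum(axis=1, keepdims=True)])
        w = np.zeros((n_classes, D + 1))

        # Empirical expectations: average feature vector per observed class.
        emp = np.vstack([X[y == k].sum(axis=0) for k in range(n_classes)]) / N

        for _ in range(n_iter):
            # Model expectations under the current weights.
            scores = X @ w.T
            scores -= scores.max(axis=1, keepdims=True)  # numerical stability
            p = np.exp(scores)
            p /= p.sum(axis=1, keepdims=True)            # p(y = k | x_i)
            model = (p.T @ X) / N

            # GIS update: shift each weight by (1/C) * log(empirical / model).
            w += np.log(np.maximum(emp, 1e-12) / np.maximum(model, 1e-12)) / C
        return w

Each iteration costs little more than the two expectation computations, but, as the comparison in reference [3] documents, such iterative scaling updates typically need far more passes over the data than quasi-Newton methods, which is why GIS and IIS have largely been replaced.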

References

  1. Darroch, J. N.; Ratcliff, D. (1972). "Generalized iterative scaling for log-linear models". The Annals of Mathematical Statistics. 43 (5): 1470–1480. doi:10.1214/aoms/1177692379.
  2. McCallum, Andrew; Freitag, Dayne; Pereira, Fernando (2000). "Maximum Entropy Markov Models for Information Extraction and Segmentation" (PDF). Proc. ICML 2000. pp. 591–598.
  3. Malouf, Robert (2002). "A comparison of algorithms for maximum entropy parameter estimation" (PDF). Sixth Conf. on Natural Language Learning (CoNLL). pp. 49–55.
  4. Yu, Hsiang-Fu; Huang, Fang-Lan; Lin, Chih-Jen (2011). "Dual coordinate descent methods for logistic regression and maximum entropy models" (PDF). Machine Learning. 85: 41–75. doi:10.1007/s10994-010-5221-8.