Logistic model tree

From Wikipedia, the free encyclopedia
In computer science, a logistic model tree (LMT) is a classification model with an associated supervised training algorithm that combines logistic regression (LR) and decision tree learning.[1][2]

Logistic model trees are based on the earlier idea of a model tree: a decision tree that has linear regression models at its leaves to provide a piecewise linear regression model (where ordinary decision trees with constants at their leaves would produce a piecewise constant model).[1] In the logistic variant, the LogitBoost algorithm is used to produce an LR model at every node in the tree; the node is then split using the C4.5 criterion. Each LogitBoost invocation is warm-started from its results in the parent node. Finally, the tree is pruned.[3]
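The LogitBoost step described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' code) of binary LogitBoost whose base learner is a weighted simple linear regression on one attribute, in the spirit of the SimpleLogistic models LMT builds at each node; the data, function names, and the response cap of 4 are illustrative assumptions.

```python
import math

def wls_fit(xs, zs, ws):
    """Weighted least-squares fit of z ~ a + b*x (the base learner)."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    mz = sum(w * z for w, z in zip(ws, zs)) / sw
    num = sum(w * (x - mx) * (z - mz) for w, x, z in zip(ws, xs, zs))
    den = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)) or 1e-12
    b = num / den
    return mz - b * mx, b

def logitboost(xs, ys, iterations=10):
    """ys in {0, 1}; returns the simple fits of an additive model F(x)."""
    fits, F = [], [0.0] * len(xs)
    for _ in range(iterations):
        # Current class probabilities from the additive model F.
        ps = [1.0 / (1.0 + math.exp(-2.0 * f)) for f in F]
        ws = [max(p * (1 - p), 1e-12) for p in ps]
        # Working responses, capped to keep the regression stable.
        zs = [max(min((y - p) / w, 4.0), -4.0)
              for y, p, w in zip(ys, ps, ws)]
        a, b = wls_fit(xs, zs, ws)
        fits.append((a, b))
        F = [f + 0.5 * (a + b * x) for f, x in zip(F, xs)]
    return fits

def predict_proba(fits, x):
    F = sum(0.5 * (a + b * x) for a, b in fits)
    return 1.0 / (1.0 + math.exp(-2.0 * F))

# Toy one-dimensional training data.
xs = [0.0, 0.5, 1.0, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
model = logitboost(xs, ys)
```

In LMT proper, the fits accumulated at a node are passed down to its children, so each child's LogitBoost run starts from the parent's additive model rather than from zero.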

The basic LMT induction algorithm uses cross-validation to find a number of LogitBoost iterations that does not overfit the training data. A faster version has been proposed that uses the Akaike information criterion to control LogitBoost stopping.[3]
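Either way, the question is when to stop adding LogitBoost iterations. The AIC-based variant can be illustrated with a self-contained sketch: the log-likelihood values below are made-up numbers, and counting two parameters (intercept and slope) per iteration is an illustrative assumption, not necessarily the penalty used in the cited paper.

```python
# Hypothetical per-iteration training log-likelihoods of a LogitBoost run
# (made-up numbers: fit improves quickly, then flattens out).
loglik = [-40.0, -28.0, -22.0, -19.5, -18.9, -18.7, -18.65]

def aic(ll, n_params):
    """Akaike information criterion: penalize fit by model size."""
    return 2 * n_params - 2 * ll

# Assume each iteration adds one simple linear regression, i.e. two
# parameters, so the parameter count grows by 2 per iteration.
scores = [aic(ll, 2 * (m + 1)) for m, ll in enumerate(loglik)]

# Stop at the iteration count with the lowest AIC: later iterations
# improve the likelihood too little to justify the extra parameters.
best_m = scores.index(min(scores)) + 1
```

Cross-validated selection works the same way, except the score minimized is held-out error rather than a penalized training likelihood, which is why it is slower.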

References

  1. ^ a b Landwehr, Niels; Hall, Mark; Frank, Eibe (2003). Logistic model trees (PDF). ECML PKDD. Springer.
  2. ^ Landwehr, Niels; Hall, Mark; Frank, Eibe (2005). "Logistic model trees". Machine Learning. 59 (1–2): 161–205. doi:10.1007/s10994-005-0466-3.
  3. ^ a b Sumner, Marc; Frank, Eibe; Hall, Mark (2005). Speeding up logistic model tree induction (PDF). PKDD. Springer. pp. 675–683.