
Additive model


In statistics, an additive model (AM) is a nonparametric regression method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981) and is an essential part of the ACE algorithm. The AM uses one-dimensional smoothers to build a restricted class of nonparametric regression models. Because of this, it is less affected by the curse of dimensionality than, for example, a p-dimensional smoother. Furthermore, the AM is more flexible than a standard linear model, while being more interpretable than a general regression surface, at the cost of approximation error. Problems with the AM include model selection, overfitting, and multicollinearity.

Description

Given a data set $\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^n$ of n statistical units, where $x_{i1}, \ldots, x_{ip}$ represent predictors and $y_i$ is the outcome, the additive model takes the form

$$\mathrm{E}[y_i \mid x_{i1}, \ldots, x_{ip}] = \beta_0 + \sum_{j=1}^p f_j(x_{ij})$$

or

$$Y = \beta_0 + \sum_{j=1}^p f_j(X_j) + \varepsilon,$$

where $\mathrm{E}[\varepsilon] = 0$, $\mathrm{Var}(\varepsilon) = \sigma^2$ and $\mathrm{E}[f_j(X_j)] = 0$. The functions $f_j$ are unknown smooth functions fit from the data. Fitting the AM (i.e. the functions $f_j$) can be done using the backfitting algorithm proposed by Andreas Buja, Trevor Hastie and Robert Tibshirani (1989).
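The following is a minimal sketch of fitting an additive model by backfitting: each component function is repeatedly re-estimated by smoothing the partial residuals against its own predictor and then centered. It assumes a simple Gaussian-kernel (Nadaraya–Watson) smoother as the one-dimensional smoother; the function names, bandwidth, and synthetic data are illustrative, not taken from the article or from Buja, Hastie and Tibshirani (1989).

```python
import numpy as np

def kernel_smooth(x, r, bandwidth=0.3):
    """Nadaraya-Watson smoother: estimate E[r | x] at the observed x values."""
    diffs = (x[:, None] - x[None, :]) / bandwidth      # pairwise scaled differences
    weights = np.exp(-0.5 * diffs ** 2)                # Gaussian kernel weights
    return weights @ r / weights.sum(axis=1)           # weighted local averages

def backfit_additive_model(X, y, n_iter=20, bandwidth=0.3):
    """Fit y ~ beta0 + sum_j f_j(X[:, j]) with each f_j centered (mean zero)."""
    n, p = X.shape
    beta0 = y.mean()
    f = np.zeros((n, p))                               # f[:, j] holds f_j at the data points
    for _ in range(n_iter):
        for j in range(p):
            # partial residuals: remove the intercept and all other components
            resid = y - beta0 - f.sum(axis=1) + f[:, j]
            f[:, j] = kernel_smooth(X[:, j], resid, bandwidth)
            f[:, j] -= f[:, j].mean()                  # center to keep f_j identifiable
    return beta0, f

# Illustrative usage: recover two additive signals from noisy data.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = 1.0 + np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.2, size=300)
beta0, f = backfit_additive_model(X, y)
print(beta0, np.corrcoef(f[:, 0], np.sin(X[:, 0]))[0, 1])
```

In this sketch the inner loop cycles over the predictors until the component estimates stabilize; centering each $f_j$ after smoothing keeps the intercept and the components identifiable, matching the constraint $\mathrm{E}[f_j(X_j)] = 0$ above.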

References

  • Buja, A., Hastie, T., and Tibshirani, R. (1989). "Linear Smoothers and Additive Models", The Annals of Statistics 17(2):453–555.
  • Breiman, L. and Friedman, J.H. (1985). "Estimating Optimal Transformations for Multiple Regression and Correlation", Journal of the American Statistical Association 80:580–598.
  • Friedman, J.H. and Stuetzle, W. (1981). "Projection Pursuit Regression", Journal of the American Statistical Association 76:817–823.