Autoregressive moving-average model

In statistics, autoregressive moving average (ARMA) models are typically applied to time series data.

Suppose we have at hand two time series, x_0, ..., x_t and y_0, ..., y_t. The series x is conventionally assumed to consist of unpredictable "shocks" which affect or modify y. We wish to predict y_{t+1}. We may do so in various ways.

If the prediction model contains only x terms, the model is called a moving average (MA) model. The notation MA(q) means a moving average model with q terms.
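
In this convention, an MA(q) prediction is a linear combination of the q most recent shocks; written with illustrative coefficients b_1, ..., b_q (these names are not fixed by the discussion above), it takes the form

    y_{t+1} = b_1 x_t + b_2 x_{t-1} + ... + b_q x_{t-q+1}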

If the prediction model contains only y terms, the model is called an autoregressive (AR) model. The notation AR(p) means an autoregressive model with p terms.
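
Similarly, an AR(p) prediction uses only the p most recent values of y itself; with illustrative coefficients a_1, ..., a_p, it takes the form

    y_{t+1} = a_1 y_t + a_2 y_{t-1} + ... + a_p y_{t-p+1}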

If the prediction model contains both x and y terms, the model is called an autoregressive moving average (ARMA) model. The notation ARMA(p,q) means a model with p autoregressive terms and q moving average terms.
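
Combining the two, an ARMA(p,q) prediction (using the same illustrative coefficient names as above) takes the form

    y_{t+1} = a_1 y_t + ... + a_p y_{t-p+1} + b_1 x_t + ... + b_q x_{t-q+1}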

The dependence of y_{t+1} on past values of x or y is assumed to be linear unless specified otherwise. If the dependence is nonlinear, the model is specifically called a nonlinear moving average (NMA), nonlinear autoregressive (NAR), or nonlinear autoregressive moving average (NARMA) model.
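
In the nonlinear case the prediction is given by some general function f of the same past values, for example

    y_{t+1} = f(y_t, ..., y_{t-p+1}, x_t, ..., x_{t-q+1})

where f is an unspecified nonlinear function; this is the NARMA form, and the NMA and NAR forms are obtained by dropping the y or x arguments respectively.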

Autoregressive moving average models can be generalized in other ways. See also generalized autoregressive conditional heteroskedasticity (GARCH) models and autoregressive integrated moving average (ARIMA) models.