Autoregressive moving-average model
In statistics, autoregressive moving average (ARMA) models are typically applied to time series data.
Suppose we have at hand two time series, x_0, ..., x_t and y_0, ..., y_t. The series x is conventionally assumed to consist of unpredictable "shocks" that affect or modify y. We wish to predict y_{t+1}. We may do so in various ways.
If the prediction model contains only x terms, the model is called a moving average (MA) model. The notation MA(q) means a moving average model with q terms.
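As a concrete illustration, here is a minimal NumPy sketch of a one-step MA(q) forecast. It assumes, purely for this example, that the shocks x are directly observed and that the coefficients theta are already known; both names and values are hypothetical.

```python
import numpy as np

def ma_predict(x, theta, mu=0.0):
    """One-step MA(q) forecast: y_{t+1} = mu + theta_1*x_t + ... + theta_q*x_{t-q+1}."""
    q = len(theta)
    recent_shocks = x[-q:][::-1]          # x_t, x_{t-1}, ..., x_{t-q+1}
    return mu + float(np.dot(theta, recent_shocks))

# Illustrative MA(2) forecast with made-up shock values and coefficients
x = np.array([0.3, -1.2, 0.5, 0.9])       # observed shocks x_0, ..., x_3
theta = np.array([0.6, 0.4])              # theta_1, theta_2
print(ma_predict(x, theta))               # one-step forecast of y_4
```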
If the prediction model contains only y terms, the model is called an autoregressive (AR) model. The notation AR(p) means an autoregressive model with p terms.
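Similarly, a minimal sketch of a one-step AR(p) forecast, again assuming the coefficients phi are known in advance (illustrative values only):

```python
import numpy as np

def ar_predict(y, phi, c=0.0):
    """One-step AR(p) forecast: y_{t+1} = c + phi_1*y_t + ... + phi_p*y_{t-p+1}."""
    p = len(phi)
    recent_values = y[-p:][::-1]          # y_t, y_{t-1}, ..., y_{t-p+1}
    return c + float(np.dot(phi, recent_values))

# Illustrative AR(2) forecast with made-up values and coefficients
y = np.array([1.0, 1.4, 1.1, 1.3])        # observed series y_0, ..., y_3
phi = np.array([0.7, -0.2])               # phi_1, phi_2
print(ar_predict(y, phi))                 # one-step forecast of y_4
```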
If the prediction model contains both x and y terms, the model is called an autoregressive moving average (ARMA) model. The notation ARMA(p,q) means a model with p autoregressive terms and q moving average terms.
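Combining the two, a sketch of a one-step ARMA(p,q) forecast under the same assumptions (known coefficients, observed shocks):

```python
import numpy as np

def arma_predict(y, x, phi, theta, c=0.0):
    """One-step ARMA(p,q) forecast combining an AR(p) part in y
    with an MA(q) part in the shocks x."""
    ar_part = np.dot(phi, y[-len(phi):][::-1])      # phi_1*y_t + ... + phi_p*y_{t-p+1}
    ma_part = np.dot(theta, x[-len(theta):][::-1])  # theta_1*x_t + ... + theta_q*x_{t-q+1}
    return c + float(ar_part + ma_part)

# Illustrative ARMA(2,1) forecast with made-up values and coefficients
y = np.array([1.0, 1.4, 1.1, 1.3])
x = np.array([0.3, -1.2, 0.5, 0.9])
print(arma_predict(y, x, phi=np.array([0.7, -0.2]), theta=np.array([0.6])))
```

In practice the coefficients are not known in advance but are estimated from the observed series; for example, the statsmodels library fits such a model with its ARIMA class, where ARIMA(y, order=(p, 0, q)) corresponds to an ARMA(p, q) model.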
The dependence of y_{t+1} on past values of x or y is assumed to be linear unless specified otherwise. If the dependence is nonlinear, the model is specifically called a nonlinear moving average (NMA), nonlinear autoregressive (NAR), or nonlinear autoregressive moving average (NARMA) model.
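For instance, a nonlinear autoregressive forecast replaces the linear combination with some other function of the past values. The sketch below uses a quadratic term as one hypothetical choice of nonlinearity; it is only a toy illustration, not a standard NAR specification.

```python
import numpy as np

def nar_predict(y, phi, gamma, c=0.0):
    """Toy nonlinear AR forecast: a linear AR(p) part plus a quadratic term in y_t."""
    linear = np.dot(phi, y[-len(phi):][::-1])
    return c + float(linear) + gamma * y[-1] ** 2   # nonlinear dependence on the latest value

# Illustrative values; coefficients and the quadratic form are purely hypothetical
y = np.array([1.0, 1.4, 1.1, 1.3])
print(nar_predict(y, phi=np.array([0.5, 0.1]), gamma=-0.05))
```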
Autoregressive moving average models can be generalized in other ways. See also generalized autoregressive conditional heteroskedasticity (GARCH) models and autoregressive integrated moving average (ARIMA) models.