Moving-average model

{{Short description|Time series model}}

{{distinguish|Moving average}}

In time series analysis, the moving-average model (MA model), also known as moving-average process, is a common approach for modeling univariate time series.{{Cite book |author=Shumway, Robert H. |url=http://worldcat.org/oclc/966563984 |title=Time series analysis and its applications : with R examples |last2=Stoffer |first2=David S. |date=19 April 2017 |publisher=Springer |isbn=978-3-319-52451-1 |oclc=966563984 |author-link2=David S. Stoffer}}{{Cite web |title=2.1 Moving Average Models (MA models) {{!}} STAT 510 |url=https://online.stat.psu.edu/stat510/lesson/2/2.1 |access-date=2023-02-27 |website=PennState: Statistics Online Courses |language=en}} The moving-average model specifies that the output variable depends linearly on the current and past values of a stochastic (white-noise) error term.

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series,{{Citation |last1=Shumway |first1=Robert H. |title=ARIMA Models |date=2019-05-17 |url=http://dx.doi.org/10.1201/9780429273285-5 |work=Time Series: A Data Analysis Approach Using R |pages=99–128 |access-date=2023-02-27 |place=Boca Raton : CRC Press, Taylor & Francis Group, 2019. |publisher=Chapman and Hall/CRC |isbn=978-0-429-27328-5 |last2=Stoffer |first2=David S. |doi=10.1201/9780429273285-5 |author-link2=David S. Stoffer|url-access=subscription }} which have a more complicated stochastic structure. In contrast to the AR model, the finite MA model is always stationary.

The moving-average model should not be confused with the moving average, a distinct concept despite some similarities.

Definition

The notation MA(q) refers to the moving average model of order q:

: X_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q} = \mu + \varepsilon_t + \sum_{i=1}^q \theta_i \varepsilon_{t-i},

where \mu is the mean of the series, \theta_1, \ldots, \theta_q are the coefficients of the model, and \varepsilon_t, \varepsilon_{t-1}, \ldots, \varepsilon_{t-q} are white noise error terms. The value of q is called the order of the MA model. This can be equivalently written in terms of the backshift operator B as{{Cite book |last1=Box |first1=George E. P. |url=https://www.worldcat.org/oclc/908107438 |title=Time series analysis : forecasting and control |last2=Jenkins |first2=Gwilym M. |last3=Reinsel |first3=Gregory C. |last4=Ljung |first4=Greta M. |date=2016 |publisher=John Wiley & Sons, Incorporated |isbn=978-1-118-67492-5 |edition=5th |location=Hoboken, New Jersey |pages=53 |language=en |oclc=908107438}}

:X_t = \mu + (1 + \theta_1 B + \cdots + \theta_q B^q)\varepsilon_t.

Thus, a moving-average model is conceptually a linear regression of the current value of the series against current and previous (unobserved) white noise error terms or random shocks. The random shocks at each point are assumed to be mutually independent and to come from the same distribution, typically a normal distribution, with location at zero and constant scale.
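
As an illustration, the following Python sketch generates a realization of an MA(2) process directly from this definition; the mean, the coefficient values and the series length are arbitrary choices made for the example.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter choices for an MA(2) model (not taken from the article)
mu = 10.0                      # mean of the series
theta = np.array([0.6, 0.3])   # theta_1, theta_2
q = len(theta)
n = 500                        # length of the simulated series

# i.i.d. normal white-noise shocks, with q extra presample values
eps = rng.normal(loc=0.0, scale=1.0, size=n + q)

# X_t = mu + eps_t + theta_1 * eps_{t-1} + ... + theta_q * eps_{t-q}
x = np.empty(n)
for t in range(n):
    lagged = eps[t + q - np.arange(1, q + 1)]   # eps_{t-1}, ..., eps_{t-q}
    x[t] = mu + eps[t + q] + theta @ lagged
</syntaxhighlight>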

Interpretation

The moving-average model is essentially a finite impulse response filter applied to white noise, with the coefficients 1, \theta_1, \ldots, \theta_q playing the role of the filter's impulse response. The role of the random shocks in the MA model differs from their role in the autoregressive (AR) model in two ways. First, they are propagated to future values of the time series directly: for example, \varepsilon _{t-1} appears directly on the right side of the equation for X_t. In contrast, in an AR model \varepsilon _{t-1} does not appear on the right side of the X_t equation, but it does appear on the right side of the X_{t-1} equation, and X_{t-1} appears on the right side of the X_t equation, giving only an indirect effect of \varepsilon_{t-1} on X_t. Second, in the MA model a shock affects X values only for the current period and q periods into the future; in contrast, in the AR model a shock affects X values infinitely far into the future, because \varepsilon _t affects X_t, which affects X_{t+1}, which affects X_{t+2}, and so on forever (see Impulse response).
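
This difference in how shocks propagate can be seen numerically. The sketch below (with arbitrary illustrative coefficients) feeds a single unit shock into an MA(2) model and into an AR(1) model: the MA response is exactly zero after lag q = 2, while the AR response decays geometrically but never reaches zero.

<syntaxhighlight lang="python">
import numpy as np

theta = np.array([0.6, 0.3])   # MA(2) coefficients (illustrative values)
phi = 0.6                      # AR(1) coefficient (illustrative value)

# One unit shock at time 0, no shocks afterwards
eps = np.zeros(10)
eps[0] = 1.0

# MA(2): x_t = eps_t + theta_1 * eps_{t-1} + theta_2 * eps_{t-2}
ma_response = np.convolve(eps, np.r_[1.0, theta])[:10]

# AR(1): x_t = phi * x_{t-1} + eps_t
ar_response = np.zeros(10)
for t in range(10):
    ar_response[t] = (phi * ar_response[t - 1] if t > 0 else 0.0) + eps[t]

print(ma_response)   # nonzero only at lags 0, 1 and 2, exactly zero afterwards
print(ar_response)   # geometric decay, never exactly zero
</syntaxhighlight>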

Fitting the model

Fitting a moving-average model is generally more complicated than fitting an autoregressive model.{{Cite web |title=Autoregressive Moving Average ARMA(p, q) Models for Time Series Analysis - Part 1 {{!}} QuantStart |url=https://www.quantstart.com/articles/Autoregressive-Moving-Average-ARMA-p-q-Models-for-Time-Series-Analysis-Part-1/ |access-date=2023-02-27 |website=www.quantstart.com}} This is because the lagged error terms are not observable, so iterative non-linear fitting procedures must be used in place of linear least squares. Moving-average models are linear combinations of past white noise terms, whereas autoregressive models are linear combinations of past values of the series.{{Cite web |title=Autoregressive Moving Average ARMA(p, q) Models for Time Series Analysis - Part 2 {{!}} QuantStart |url=https://www.quantstart.com/articles/Autoregressive-Moving-Average-ARMA-p-q-Models-for-Time-Series-Analysis-Part-2/ |access-date=2023-02-27 |website=www.quantstart.com}} ARMA models, which combine both autoregressive and moving-average components, are more complicated still.
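
As a concrete illustration, the following Python sketch uses the statsmodels package to fit an MA(q) model as the special case ARIMA(0, 0, q) by iterative maximum likelihood; the simulated series and the choice q = 2 are stand-ins for real data and a previously identified order.

<syntaxhighlight lang="python">
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an MA(2) series with mean 10 as a stand-in for observed data
rng = np.random.default_rng(0)
eps = rng.normal(size=502)
x = 10.0 + np.convolve(eps, [1.0, 0.6, 0.3])[2:502]

# An MA(q) model is ARIMA with AR order 0, no differencing and MA order q;
# fitting uses iterative maximum-likelihood estimation rather than linear least squares.
q = 2
result = ARIMA(x, order=(0, 0, q)).fit()
print(result.params)   # estimated mean ("const"), ma.L1, ma.L2 and noise variance sigma2
</syntaxhighlight>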

The theoretical autocorrelation function (ACF) of an MA(q) process is zero at lag q + 1 and greater. Therefore, the appropriate maximum lag for estimation is determined by examining the sample autocorrelation function to find the lag beyond which all autocorrelations are not significantly different from zero; that lag is taken as the order q.
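
The following sketch illustrates this identification step on a simulated MA(2) series, using the statsmodels acf function and the rough ±1.96/√n significance band for autocorrelations of white noise; in practice the sample ACF is usually inspected graphically (for example with plot_acf) rather than printed.

<syntaxhighlight lang="python">
import numpy as np
from statsmodels.tsa.stattools import acf

# Simulated MA(2) series standing in for observed data
rng = np.random.default_rng(0)
eps = rng.normal(size=502)
x = np.convolve(eps, [1.0, 0.6, 0.3])[2:502]

nlags = 10
r = acf(x, nlags=nlags)            # sample autocorrelations r_0, ..., r_10
bound = 1.96 / np.sqrt(len(x))     # rough 95% band under a white-noise null

for k in range(1, nlags + 1):
    status = "significant" if abs(r[k]) > bound else "not significant"
    print(f"lag {k:2d}: r = {r[k]: .3f} ({status})")
# For an MA(q) process the autocorrelations beyond lag q should fall inside the band,
# so the last clearly significant lag is taken as the order q.
</syntaxhighlight>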

Sometimes the ACF and partial autocorrelation function (PACF) will suggest that an MA model is the better choice, and sometimes both AR and MA terms should be used in the same model (see Box–Jenkins method).

Autoregressive Integrated Moving Average (ARIMA) models are an alternative to segmented regression that can also be used for fitting a moving-average model.{{Cite journal |last1=Schaffer |first1=Andrea L. |last2=Dobbins |first2=Timothy A. |last3=Pearson |first3=Sallie-Anne |date=2021-03-22 |title=Interrupted time series analysis using autoregressive integrated moving average (ARIMA) models: a guide for evaluating large-scale health interventions |journal=BMC Medical Research Methodology |volume=21 |issue=1 |pages=58 |doi=10.1186/s12874-021-01235-8 |issn=1471-2288 |pmc=7986567 |pmid=33752604 |doi-access=free}}

References

Further reading

* {{cite book |last=Enders |first=Walter |chapter=Stationary Time-Series Models |title=Applied Econometric Time Series |location=New York |publisher=Wiley |edition=Second |year=2004 |isbn=0-471-45173-8 |pages=48–107 }}