Time Series Analysis - Autoregressive Moving Average (ARMA) Models

    1. Time series and ARMA model

    The autoregressive moving average model (ARMA model) is an important method for studying time series. It is a "mixture" of the autoregressive model (AR model) and the moving average model (MA model), and is characterized by a wide range of applications and small prediction errors.

    The general p-order autoregressive process AR(p) is

        X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + \varepsilon_t        (1-1)

where \{\varepsilon_t\} is white noise and \phi_1, \phi_2, \ldots, \phi_p are the parameters of the autoregressive model. Written with the lag operator L, equation (1-1) becomes a p-order polynomial in the lag operator:

        \Phi(L) X_t = \varepsilon_t, \qquad \Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p        (1-2)

where \Phi(L) is called the characteristic polynomial or autoregressive operator.

    If the series \{X_t\} satisfies the equation

        X_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}        (1-3)

then \{X_t\} is called a q-order moving average process, abbreviated MA(q), where \{\varepsilon_t\} is white noise and \theta_1, \theta_2, \ldots, \theta_q are the parameters of the moving average model.

    The autoregressive moving average process is a random process that combines two different parts, an autoregressive part and a moving average part. If p denotes the (maximum) order of the former and q the (maximum) order of the latter, the process is written ARMA(p,q) and takes the form

        X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \cdots - \theta_q \varepsilon_{t-q}        (1-4)

where \{\varepsilon_t\} is white noise, \phi_1, \ldots, \phi_p are the parameters of the autoregressive part, and \theta_1, \ldots, \theta_q are the parameters of the moving average part.
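    As an illustration of equation (1-4), the short Python sketch below simulates an ARMA(1,1) process with made-up parameters (\phi_1 = 0.6, \theta_1 = -0.4) using statsmodels. Note that ArmaProcess expects the coefficients of the lag polynomials \Phi(L) and \Theta(L), so the signs differ from the notation used above.

        import numpy as np
        from statsmodels.tsa.arima_process import ArmaProcess

        # ARMA(1, 1) from eq. (1-4) with phi_1 = 0.6 and theta_1 = -0.4:
        #     X_t = 0.6 X_{t-1} + eps_t + 0.4 eps_{t-1}
        ar_poly = np.array([1.0, -0.6])   # Phi(L)   = 1 - 0.6 L
        ma_poly = np.array([1.0, 0.4])    # Theta(L) = 1 + 0.4 L

        process = ArmaProcess(ar_poly, ma_poly)
        print(process.isstationary, process.isinvertible)    # both True for these values

        np.random.seed(0)
        x = process.generate_sample(nsample=500)             # 500 simulated observations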

    2. The establishment of the ARMA model

    ARMA modeling steps

    (1) Test the input data to determine whether it is a stationary, non-purely-random sequence; if it is stationary, go directly to step (2); otherwise, transform it into a stationary series first (for example by differencing, as described below);

    (2) Identify the model and determine its order using the autocorrelation and partial autocorrelation functions combined with the AIC or BIC criteria.

    (3) After completing the model identification and order determination, enter the parameter estimation stage of the model.

    (4) After the parameter estimation is completed, carry out a goodness-of-fit test on the fitted model. If the fitted model passes the test, the prediction phase begins; if it fails, re-identify and re-test the model, that is, return to step (2) and select a model again.

    (5) Finally, use the well-fitted model to predict the future trend of the sequence. These steps are sketched in code below.
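    The following is a minimal sketch of steps (1)-(5), assuming a one-dimensional series y and a recent version of statsmodels; the 5% significance level, the lag-10 Ljung-Box check, and the small (p, q) search grid are illustrative choices rather than values prescribed by the text.

        import pandas as pd
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.stats.diagnostic import acorr_ljungbox

        def arma_forecast(y, max_p=3, max_q=3, steps=10):
            """Rough sketch of steps (1)-(5) for a one-dimensional series y."""
            y = pd.Series(y).astype(float)

            # (1) Stationarity check with the ADF test; difference once if needed.
            if adfuller(y)[1] > 0.05:          # large p-value: cannot reject a unit root
                y = y.diff().dropna()

            # (2) Identification / order determination: minimize AIC over a small grid.
            p, q = min(
                ((i, j) for i in range(max_p + 1) for j in range(max_q + 1)),
                key=lambda ij: ARIMA(y, order=(ij[0], 0, ij[1])).fit().aic,
            )

            # (3) Parameter estimation for the selected order.
            res = ARIMA(y, order=(p, 0, q)).fit()

            # (4) Goodness-of-fit check: residuals should behave like white noise.
            lb_pvalue = acorr_ljungbox(res.resid, lags=[10])["lb_pvalue"].iloc[0]
            if lb_pvalue < 0.05:
                raise ValueError("residuals are not white noise; go back to step (2)")

            # (5) Forecast (on the differenced scale if differencing was applied).
            return res.forecast(steps=steps)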


    Data Stationarity Test and Processing

    A time series is stationary if it meets the following requirements: (1) for any time t, its mean is constant; (2) for any times t and s, the autocorrelation of the series depends only on the length of the interval between the two time points, not on where that interval starts.

    If an AR process is stationary, the roots of its characteristic equation must all lie outside the unit circle (their absolute values must exceed one). An MA process, being a finite linear combination of white noise terms, is "naturally" stationary. Since the ARMA model can be regarded as a combination of an AR model and an MA model, and the MA part is always stationary, checking the stationarity of an ARMA model reduces to checking the stationarity of its AR part.
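    A quick way to check this condition on the AR part is to compute the roots of \Phi(L) numerically, as in the sketch below; the coefficients are made-up values for illustration.

        import numpy as np

        # Hypothetical AR coefficients phi_1 = 0.5, phi_2 = 0.3, i.e.
        # Phi(L) = 1 - 0.5 L - 0.3 L^2.
        phi = np.array([0.5, 0.3])
        char_poly = np.r_[1.0, -phi]         # coefficients of Phi(L) by increasing power of L
        roots = np.roots(char_poly[::-1])    # np.roots expects decreasing powers

        # The AR part is stationary iff every root lies outside the unit circle.
        print(roots, np.all(np.abs(roots) > 1))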

    Methods for testing stationarity include inspection of the data plot, the reverse-order test, the runs test, and unit-root tests such as the DF and ADF tests.

    In practice, the input time series often turns out to be non-stationary, so an ARMA model cannot be applied directly. The usual remedy is differencing: if the differenced series tests as stationary, the model is built on the differenced series instead. When a non-stationary time series becomes stationary after d rounds of differencing and a stationary ARMA(p,q) model is fitted to the result, the original series is said to follow an autoregressive integrated moving average model, written ARIMA(p,d,q).
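    Under the same assumptions as before (a one-dimensional series y, recent statsmodels), the non-stationary case can be handled either by differencing explicitly and fitting ARMA(p,q) to the result, or by letting ARIMA(p,d,q) perform the differencing internally; the orders (2, 1) used below are placeholders.

        import pandas as pd
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.tsa.arima.model import ARIMA

        y = pd.Series(y).astype(float)            # y: the raw, possibly non-stationary series
        d = 1 if adfuller(y)[1] > 0.05 else 0     # difference once if a unit root cannot be rejected

        # Option A: difference explicitly, then fit a plain ARMA(p, q) model.
        arma_fit = ARIMA(y.diff().dropna() if d else y, order=(2, 0, 1)).fit()

        # Option B: essentially equivalent -- let ARIMA(p, d, q) difference internally.
        arima_fit = ARIMA(y, order=(2, d, 1)).fit()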

    Model Identification and Order Determination

    There are generally two tools for identifying the model: the autocorrelation function (ACF) and the partial autocorrelation function (PACF). They are the most effective means of identifying an ARMA model, because their truncation (cut-off) behavior indicates the model type: if the ACF cuts off after lag q while the PACF tails off, an MA(q) model is suggested; if the PACF cuts off after lag p while the ACF tails off, an AR(p) model is suggested; and if both tail off, an ARMA model is suggested.
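    In code, the two functions are usually inspected graphically; the sketch below assumes x is an already-stationary series (for example the simulated ARMA sample from Section 1) and uses statsmodels' plotting helpers.

        import matplotlib.pyplot as plt
        from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

        fig, (ax_acf, ax_pacf) = plt.subplots(2, 1, figsize=(8, 6))
        plot_acf(x, lags=30, ax=ax_acf)      # MA(q): ACF cuts off after lag q
        plot_pacf(x, lags=30, ax=ax_pacf)    # AR(p): PACF cuts off after lag p
        plt.tight_layout()
        plt.show()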


    When the truncation behavior of the autocorrelation and partial autocorrelation functions indicates an ARMA model, however, it does not by itself determine the orders p and q, so information criteria are applied together with it. The most widely used criteria today are the AIC (Akaike Information Criterion) and the BIC (Bayesian Information Criterion).

    The AIC criterion is a weighted function of the fitting accuracy and the number of parameters; the model that minimizes the AIC function is considered optimal. Let \{x_1, x_2, \ldots, x_N\} be a sample of the time series, described by an AR(n) model, and let \hat{\sigma}_n^2 be the residual variance of the fit. The AIC criterion function is defined as

        AIC(n) = N \ln \hat{\sigma}_n^2 + 2n        (2-1)

and the selected order \hat{p} satisfies

        AIC(\hat{p}) = \min_{0 \le n \le M(N)} AIC(n)        (2-2)

where M(N) is a preset upper bound on the candidate orders (commonly taken as \sqrt{N} or N/10); this \hat{p} is taken as the optimal autoregressive model order.

    The BIC criterion is defined as

        BIC(n) = N \ln \hat{\sigma}_n^2 + n \ln N        (2-3)

where n is the number of parameters. If some order \hat{p} satisfies

        BIC(\hat{p}) = \min_{0 \le n \le M(N)} BIC(n)        (2-4)

with M(N) as above, then \hat{p} is the optimal order.
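    Equations (2-1)-(2-4) can be evaluated directly by fitting AR(n) models for each candidate order; the sketch below uses statsmodels' AutoReg for the fits and treats max_order as the caller-supplied bound M(N).

        import numpy as np
        from statsmodels.tsa.ar_model import AutoReg

        def select_ar_order(x, max_order):
            """Pick the AR order by minimizing eq. (2-1) (AIC) and eq. (2-3) (BIC)."""
            x = np.asarray(x, dtype=float)
            N = len(x)
            aic, bic = {}, {}
            for n in range(1, max_order + 1):
                resid = AutoReg(x, lags=n, trend="n").fit().resid
                sigma2 = np.mean(resid ** 2)                   # residual variance of the AR(n) fit
                aic[n] = N * np.log(sigma2) + 2 * n            # eq. (2-1)
                bic[n] = N * np.log(sigma2) + n * np.log(N)    # eq. (2-3)
            # eq. (2-2) and (2-4): return the minimizing orders.
            return min(aic, key=aic.get), min(bic, key=bic.get)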

    Model Parameter Estimation and Goodness-of-Fit Testing

    Any ARMA or MA process can be represented by an AR process of infinite order, so even if an inappropriate model is chosen, it can still approximate the modeled random process fairly well as long as the model order is high enough. Among the three parametric models, the AR model is the most widely used, because its parameters are obtained by solving a system of linear equations, which is relatively simple. The MA model generally requires a large number of parameters; the ARMA model requires the fewest parameters, but its parameter estimation involves a system of nonlinear equations and is far more complicated to solve than that of the AR model. Since any ARMA or MA signal model can be represented by an AR model of infinite (or sufficiently high) order, we convert the ARMA model into an AR model and use the Burg recursive algorithm to solve for its parameters.
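    A sketch of this AR-approximation approach is given below, assuming a recent statsmodels build that exposes a Burg estimator as statsmodels.regression.linear_model.burg and a (near) zero-mean stationary series x; the AR order 10 is an arbitrary "sufficiently large" choice, not a value from the text.

        import numpy as np
        from statsmodels.regression.linear_model import burg

        order = 10                                # "sufficiently large" AR order (arbitrary)
        rho, sigma2 = burg(x, order=order)        # Burg estimates of the AR(10) coefficients

        # rho[k] multiplies x_{t-1-k}, so a one-step-ahead prediction is:
        x_next = float(np.dot(rho, x[-1:-(order + 1):-1]))
        print(rho, sigma2, x_next)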
