MATLAB Forecasting Models: A Summary of Time Series Models

Time Series Forecasting Models

1. Autoregressive (AR)

In an AR model, we use a linear combination of past values of the variable of interest to predict that variable. The term autoregression indicates that it is a regression of the variable on itself.
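As a concrete illustration, here is a minimal pure-Python sketch (not a toolbox implementation) of estimating the coefficient of an AR(1) model, x[t] = phi * x[t-1] + e[t], by least squares, and iterating the fitted recursion forward to forecast; the function names and the noise-free test series are illustrative choices, not from the original article.

```python
def fit_ar1(x):
    """Least-squares estimate of phi in the AR(1) model x[t] = phi * x[t-1] + e[t]."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def forecast_ar1(x, phi, steps=1):
    """Iterate the fitted AR(1) recursion forward from the last observation."""
    preds, last = [], x[-1]
    for _ in range(steps):
        last = phi * last
        preds.append(last)
    return preds

# On a noise-free AR(1) series with phi = 0.7, the estimate recovers phi exactly.
series = [0.7 ** t for t in range(10)]
phi_hat = fit_ar1(series)
```

With noisy data the estimate is approximate rather than exact, and higher orders AR(p) would require solving the full normal equations.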

2. Moving average model (MA)

Unlike AR models, which regress on past values of the variable itself, MA models use past forecast errors (residuals) in a regression-like model.
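To make the role of the error terms concrete, here is an illustrative pure-Python sketch of an MA(1) process, x[t] = e[t] + theta * e[t-1]; the value of theta and the error sequence are assumed for the example.

```python
def ma1_series(errors, theta):
    """Generate an MA(1) process x[t] = e[t] + theta * e[t-1] (e[-1] taken as 0)."""
    out, prev_e = [], 0.0
    for e in errors:
        out.append(e + theta * prev_e)
        prev_e = e
    return out

def ma1_forecast(errors, theta):
    """One-step-ahead forecast: future errors have mean zero, so only the
    last known residual contributes."""
    return theta * errors[-1]

x = ma1_series([1.0, 2.0, -1.0], theta=0.5)  # -> [1.0, 2.5, 0.0]
```

Note that forecasts beyond one step ahead revert to the process mean, since all future error terms are unknown and expected to be zero.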

3. Autoregressive Moving Average (ARMA)

In an ARMA model, we predict the variable of interest using a linear combination of its past values and past forecast errors (residuals). It combines the autoregressive (AR) and moving average (MA) models.
The AR part regresses the variable on its own lagged (i.e. past) values. The MA part models the error term as a linear combination of error terms that occurred at various times in the past. The notation specifies the orders of the AR(p) and MA(q) parts as parameters of the ARMA function, e.g. ARMA(p,q).
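Combining the two previous ideas, the following is an illustrative pure-Python sketch of simulating an ARMA(1,1) process; the coefficients and the shock sequence are assumed values chosen so the response to a single shock is easy to trace.

```python
def arma11(errors, phi, theta, x0=0.0):
    """Simulate ARMA(1,1): x[t] = phi * x[t-1] + e[t] + theta * e[t-1]."""
    xs, prev_x, prev_e = [], x0, 0.0
    for e in errors:
        x = phi * prev_x + e + theta * prev_e
        xs.append(x)
        prev_x, prev_e = x, e
    return xs

# A single unit shock at t = 0 decays through both the AR and MA channels:
# x = [1.0, 0.7, 0.35, ...] for phi = 0.5, theta = 0.2.
path = arma11([1.0, 0.0, 0.0], phi=0.5, theta=0.2)
```

In practice, estimating phi and theta jointly requires maximum likelihood rather than the closed-form least squares used for pure AR models.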

4. ARIMA model

The ARIMA model is a widely used, classic time series forecasting model. It can handle stationary series directly and non-stationary series through differencing, and performs well in most forecasting scenarios. The ARIMA model assumes that the trend of a time series is shaped both by its internal dynamics and by outside influences. Its core idea is to use an autoregressive formulation to relate historical values, the current value, and the error terms through a single model, and then to estimate appropriate model parameters by statistical methods so that the relationships among the three are captured together.
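The differencing step (the "I" in ARIMA) can be sketched in a few lines of illustrative pure Python; this hypothetical ARIMA(1,1,0) forecast differences once, fits an AR(1) to the differences, and then undoes the differencing.

```python
def difference(x, d=1):
    """Apply first differencing d times to remove trend (the 'I' in ARIMA)."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

def fit_ar1(x):
    """Least-squares AR(1) coefficient on the (differenced) series."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def arima_110_forecast(x):
    """One-step forecast of an ARIMA(1,1,0) model: difference once, forecast
    the difference with AR(1), then invert the differencing."""
    d = difference(x)
    phi = fit_ar1(d)
    return x[-1] + phi * d[-1]

# A linear trend defeats a plain AR model but is trivial after differencing.
print(arima_110_forecast([1, 2, 3, 4, 5, 6]))  # -> 7.0
```

A full ARIMA(p,d,q) implementation would also estimate the MA component by maximum likelihood; library implementations handle this automatically.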

5. SARIMA model

The SARIMA model is also called the seasonal autoregressive integrated moving average model. It extends the ARMA model by adding ordinary differencing and seasonal differencing.
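The seasonal differencing step SARIMA adds can be illustrated directly; in this hedged sketch the quarterly seasonal pattern is an assumed example, and subtracting the observation one full period back removes both the seasonal offsets and (here) turns the trend into a constant.

```python
def seasonal_difference(x, s):
    """Seasonal differencing: subtract the observation one full period (s) back."""
    return [x[t] - x[t - s] for t in range(s, len(x))]

# Quarterly data (period 4): a linear trend t plus a repeating seasonal offset.
season = [3, -1, 2, -4]
x = [t + season[t % 4] for t in range(12)]
print(seasonal_difference(x, 4))  # -> [4, 4, 4, 4, 4, 4, 4, 4]
```

A full SARIMA(p,d,q)(P,D,Q)s model then fits ARMA-type terms at both the ordinary and the seasonal lags.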

6. SARIMA (SARIMAX) with exogenous variables

The SARIMAX model is an extension of the traditional SARIMA model that adds modeling of exogenous variables; the name abbreviates Seasonal Autoregressive Integrated Moving Average with Exogenous Regressors. An exogenous variable is one whose value is determined outside the model and imposed on it; such variables are also called covariates. Observations of exogenous variables are included directly in the model at each time step and are modeled differently from the main endogenous series.
The SARIMAX framework can also be used to fit other variants with exogenous variables, such as ARX, MAX, ARMAX, and ARIMAX.

7. Vector autoregression (VAR)

VAR models generalize the univariate autoregressive model to forecast a vector of time series, i.e. multiple parallel (multivariate) time series. The model contains one equation for each variable in the system.
If the series are stationary, they can be forecast by fitting a VAR directly to the data (called a "VAR in levels"). If the series are non-stationary, we difference the data to make them stationary and then fit a VAR model (called a "VAR in differences").
A model of order p is written VAR(p), a vector autoregressive model of order p.
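The "one equation per variable" structure can be seen in a small illustrative sketch: a VAR(1) system x[t+1] = A x[t], where the 2x2 coefficient matrix below is assumed for the example rather than estimated from data.

```python
def var1_forecast(A, x, steps=1):
    """Iterate a VAR(1) system x[t+1] = A @ x[t] forward (pure-Python matvec).
    Each row of A is one equation: the next value of variable i is a linear
    combination of the current values of all variables."""
    for _ in range(steps):
        x = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]
    return x

# Illustrative 2-variable system; coefficients are assumed, not estimated.
A = [[0.5, 0.25],
     [0.25, 0.5]]
print(var1_forecast(A, [1.0, 2.0]))  # -> [1.0, 1.25]
```

Estimating A in practice is an equation-by-equation least-squares problem, which statistical libraries solve directly.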

8. Vector autoregressive moving average (VARMA)

The VARMA method generalizes ARMA to multiple parallel time series, i.e. multivariate time series. A finite-order VAR process with a finite-order MA error term is called a VARMA process.
The formulation specifies the orders of the AR(p) and MA(q) parts as parameters of the VARMA function, e.g. VARMA(p,q). The VARMA framework can also be used to fit pure VAR or VMA models.

9. Vector autoregressive moving average model (VARMAX) with exogenous variables

Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX) is an extension of the VARMA model that also includes modeling using exogenous variables. It is an extension of the ARMAX method to multiple parallel time series, that is, a multivariate version of the ARMAX method.
The VARMAX framework can also be used to fit other variants that include exogenous variables, such as VARX and VMAX.

10. Exponential smoothing model

The exponential smoothing model is a traditional time series forecasting method developed from the moving average model. Its principle is to combine the current observation with the previous smoothed value in a fixed ratio, with the weight on the current value controlling how quickly the smoothed value adapts; forecasts are then derived from the smoothed values. Exponential smoothing models fit all historical data, but with exponentially decaying weights.
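The recursion described above is short enough to write out directly; this is an illustrative pure-Python sketch of simple exponential smoothing, where the smoothing parameter alpha and the data are assumed example values.

```python
def simple_exp_smoothing(x, alpha):
    """Simple exponential smoothing: s[t] = alpha * x[t] + (1 - alpha) * s[t-1].
    The flat (constant) forecast for all future periods is the final s."""
    s = x[0]  # initialize the smoothed value at the first observation
    for obs in x[1:]:
        s = alpha * obs + (1 - alpha) * s
    return s

# Weight on the latest observation is alpha; older data decays geometrically.
print(simple_exp_smoothing([0.0, 8.0, 4.0], alpha=0.5))  # -> 4.0
```

Larger alpha reacts faster to recent changes; smaller alpha smooths more aggressively. In practice alpha is usually chosen by minimizing the in-sample forecast error.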

11. Holt-Winters method

In 1957, Holt extended simple exponential smoothing to make it possible to forecast trended data. This method, known as Holt's linear trend method, consists of a forecast equation and two smoothing equations (one for the level and one for the trend) with corresponding smoothing parameters α and β. Later, to avoid the trend being extrapolated indefinitely, the damped trend method was introduced; it has proved very successful and is among the most popular individual methods when many series must be forecast. In addition to the two smoothing parameters, it includes an additional damping parameter φ.
The Holt-Winters method extends Holt's trend method to also capture seasonality. The seasonal Holt-Winters method consists of a forecast equation and three smoothing equations, one each for the level, the trend, and the seasonal component, with corresponding smoothing parameters α, β, and γ.
There are two variants of this method, which differ in the nature of the seasonal component. The additive method is preferred when the seasonal variation is roughly constant across the series; the multiplicative method is preferred when the seasonal variation changes in proportion to the level of the series.
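The level-and-trend part of the method (Holt's linear trend, before the seasonal component is added) can be sketched as follows; this is an illustrative pure-Python version with assumed parameter values, not a full Holt-Winters implementation.

```python
def holt_forecast(x, alpha, beta, h):
    """Holt's linear trend method: the level l and trend b are smoothed
    separately, and the h-step-ahead forecast is l + h * b."""
    level, trend = x[0], x[1] - x[0]  # common initialization choice
    for obs in x[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + h * trend

# On an exactly linear series the method reproduces the line.
print(holt_forecast([0.0, 2.0, 4.0, 6.0, 8.0], alpha=0.5, beta=0.5, h=3))  # -> 14.0
```

The full Holt-Winters method adds a third smoothing equation for the seasonal component (additive or multiplicative), and the damped variant multiplies the trend contribution by powers of φ.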

12. LSTM model

LSTM is a recurrent neural network for sequential data, designed to address a fatal flaw of the original RNN: the vanishing gradient problem. In a plain RNN, the influence of early inputs fades away at later time steps, so the network effectively "forgets" long-range dependencies, and once the network is deep it becomes very hard to train. Researchers later began using recurrent neural networks to model temporal relationships, and for time series data, LSTM networks have been shown to be more effective than traditional RNNs.
Neural network models that handle multiple input variables have long been a headache for developers, but LSTM-based recurrent neural networks handle multi-input problems well. This makes them well suited to time series forecasting, because many classical linear methods struggle to adapt to multivariate or multi-input forecasting problems.

Origin blog.csdn.net/weixin_43599390/article/details/131358392