Time Series

1. Basic rule-based methods for time series: the cycle factor method

 

2. Linear regression: use time features in a linear regression

  • Extract the periodic characteristics of time as features; each training sample then becomes "time features -> target value". The sequential dependence is removed, so training samples need not be taken by a strict sliding window. A common practice is to express time as 0-1 dummy variables, for example:
    • Convert the day of the week into 0-1 variables: one-hot encode Monday through Sunday, seven variables in total
    • Convert holidays into 0-1 variables: depending on the number of holidays, they can simply be split into two classes, "holiday" vs. "no holiday", two one-hot variables in total; or different coding values can be assigned to distinguish holidays, e.g. 1, 2, 3 for National Day, New Year's Day, and Labor Day
    • Convert the start of the month into a 0-1 variable: simply two classes, "month start" vs. "not month start", two variables in total
    • Similarly, the month itself can be converted into 0-1 dummy variables
    • Control the time granularity, e.g. distinguish weekdays from weekends
  • If observation shows that the sequence is periodic, linear regression can be used as a baseline
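To make the dummy-variable framing concrete, here is a minimal pure-Python sketch on a synthetic weekly-patterned series. All data and names here are illustrative; in practice pandas.get_dummies and sklearn's LinearRegression would do this. Note that for a design matrix made only of the seven weekday dummies (no intercept), the OLS solution is exactly the per-weekday mean, so the "regression" below reduces to category means:

```python
from datetime import date, timedelta

# Hypothetical daily series with a weekday-dependent level (illustrative numbers)
start = date(2024, 1, 1)  # a Monday
y = [100, 98, 97, 99, 105, 140, 150] * 8  # 8 weeks of data

rows = []
for i, target in enumerate(y):
    d = start + timedelta(days=i)
    onehot = [1 if d.weekday() == k else 0 for k in range(7)]  # Mon..Sun dummies
    is_weekend = 1 if d.weekday() >= 5 else 0                  # coarser granularity feature
    rows.append((onehot + [is_weekend], target))

# OLS on pure one-hot weekday dummies (no intercept) == per-weekday mean of the target
coef = []
for k in range(7):
    vals = [t for feats, t in rows if feats[k] == 1]
    coef.append(sum(vals) / len(vals))

pred_next_monday = coef[0]
print(pred_next_monday)  # 100.0 for this synthetic series
```

Because each sample is "time features -> target", any future Monday gets the same prediction regardless of position in the sequence, which is exactly the removal of sequential dependence described above.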

3. Classical linear sequence models: ARMA / ARIMA and the like. References:

  • Financial Time Series Analysis Written for You: Basics
  • On determining the autoregressive / moving-average orders, Identifying the orders of AR and MA terms in an ARIMA model includes 11 general principles, which roughly say:
  • Differencing eliminates positive autocorrelation, but at the same time introduces negative autocorrelation
  • AR terms can eliminate positive autocorrelation; MA terms eliminate negative autocorrelation
  • AR and MA terms tend to cancel each other out, so when a model contains both, try reducing the number of terms to avoid overfitting
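The first two rules can be checked numerically. Below is a sketch on a synthetic trend-plus-noise series (standard library only): the raw series is strongly positively autocorrelated, while its first difference is an MA(1) process whose theoretical lag-1 autocorrelation is -0.5, i.e. differencing has introduced negative correlation:

```python
import random

def lag1_autocorr(x):
    """Sample autocorrelation at lag 1."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(500)]
y = [t + noise[t] for t in range(500)]          # trend + white noise: strongly positively autocorrelated
dy = [y[t] - y[t - 1] for t in range(1, 500)]   # first difference: 1 + e_t - e_{t-1}, an MA(1)

print(lag1_autocorr(y) > 0.9)   # True: the trend dominates
print(lag1_autocorr(dy))        # near -0.5: differencing introduced negative correlation
```

This is why over-differencing shows up as a large negative spike at lag 1, and why an MA term (rule two) is the usual remedy.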

4. Time series decomposition: use an additive or multiplicative model to split the original series into four components (trend, seasonal, cyclic, irregular).
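A bare-bones sketch of the additive idea on a synthetic series (trend via a centered moving average, seasonal component via per-position means, residual as what is left; the cyclic component is omitted for brevity, and in practice statsmodels' seasonal_decompose or STL does this):

```python
# Synthetic series: weekly seasonality (period 7) plus a linear trend
period = 7
y = [(t % period) * 2 + 0.5 * t for t in range(28)]

# Trend: centered moving average over one full period (7 is odd, so the window centers cleanly)
half = period // 2
trend = [None] * len(y)
for t in range(half, len(y) - half):
    trend[t] = sum(y[t - half : t + half + 1]) / period

# Seasonal: detrend, then average by position within the cycle
detrended = [(t, y[t] - trend[t]) for t in range(len(y)) if trend[t] is not None]
seasonal = []
for k in range(period):
    vals = [d for t, d in detrended if t % period == k]
    seasonal.append(sum(vals) / len(vals))

# Residual: whatever trend and seasonality do not explain (zero here, as the series is noise-free)
resid = [y[t] - trend[t] - seasonal[t % period] for t in range(len(y)) if trend[t] is not None]
print([round(s, 6) for s in seasonal])  # [-6.0, -4.0, -2.0, 0.0, 2.0, 4.0, 6.0]
```

A multiplicative model follows the same steps with division in place of subtraction (or by decomposing log(y) additively).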

5. Feature engineering: reorganize the data with sliding windows over time, then model with xgboost / LSTM / temporal convolutional networks. References:
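The sliding-window reorganization can be sketched as follows (function name is my own). The resulting X, y pairs can feed xgboost directly, or be reshaped to (samples, timesteps, features) for an LSTM / temporal convolutional network:

```python
def make_windows(series, window, horizon=1):
    """Slide a fixed-length window over the series; each window's values become
    the features, and the value `horizon` steps past the window is the target."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i : i + window])
        y.append(series[i + window + horizon - 1])
    return X, y

series = [10, 11, 12, 13, 14, 15]
X, y = make_windows(series, window=3)
print(X)  # [[10, 11, 12], [11, 12, 13], [12, 13, 14]]
print(y)  # [13, 14, 15]
```

Unlike the dummy-variable approach in point 2, samples built this way do depend on the sequence order, so train/validation splits must respect time.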

6. Convert the dataset into a supervised learning problem, then model with xgboost / LSTM / temporal convolutional networks / seq2seq (attention-based models). References:
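For seq2seq-style models the target becomes a whole future window rather than a single value. A minimal framing helper (name is my own) that turns a series into multi-step input/output pairs:

```python
def to_supervised(series, n_in, n_out):
    """Frame a series as supervised learning: the past n_in values are the
    input sequence and the next n_out values are the multi-step output,
    as a seq2seq-style encoder/decoder would consume them."""
    X, Y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i : i + n_in])
        Y.append(series[i + n_in : i + n_in + n_out])
    return X, Y

X, Y = to_supervised([1, 2, 3, 4, 5, 6, 7], n_in=3, n_out=2)
print(X[0], Y[0])  # [1, 2, 3] [4, 5]
```

With n_out=1 this reduces to the single-step windowing of point 5; larger n_out gives the direct multi-horizon setup that attention-based decoders are built for.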

7. Facebook Prophet, similar in idea to STL decomposition. I think it has an advantage over traditional time series models in controllability and interpretability, so it gets a separate writeup. References:

  • Official website documentation (English)
  • Official website notebooks (English)
  • A recommended Chinese article that covers everything from principles to usage, very thorough: CHANG: A Study of Facebook's Time Series Forecasting Algorithm Prophet
  • To understand it further and use it well, read the paper and the official website, and take time to go through the Python source code
  • Understand in the code how prior_scale controls the trend, seasonality, and holiday terms
  • For the trend term, how the parameters changepoint_range and changepoint_prior_scale affect the model's fit and generalization
  • How uncertainty intervals (the interval_width parameter) use the trend results for prediction
  • The "Simulated Historical Forecasts" in the paper correspond to Prophet's Diagnostics tool; you can use it to do time-series cross-validation to evaluate model accuracy, and to tune the model
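As an orientation aid, a configuration sketch showing where the parameters discussed above sit in the Prophet API (the values shown are the library defaults as documented; newer installs import from `prophet`, older ones from `fbprophet`):

```python
from prophet import Prophet
from prophet.diagnostics import cross_validation, performance_metrics

# The prior_scale family: larger values make the corresponding component more flexible
m = Prophet(
    changepoint_range=0.8,         # place trend changepoints only in the first 80% of history
    changepoint_prior_scale=0.05,  # trend flexibility: raise if the trend underfits
    seasonality_prior_scale=10.0,  # seasonality flexibility
    holidays_prior_scale=10.0,     # holiday-effect flexibility
    interval_width=0.8,            # width of the uncertainty intervals
)
# m.fit(df)  # df needs columns: ds (timestamp) and y (value)

# "Simulated Historical Forecasts" from the paper = rolling-origin cross-validation:
# df_cv = cross_validation(m, initial='730 days', period='180 days', horizon='365 days')
# performance_metrics(df_cv)  # reports metrics such as MSE / MAE / MAPE per horizon
```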

8. Deep learning networks combining CNN + RNN + Attention, each playing a different, complementary role. So far I have only read the papers; where code is available I give the link, but I have not gone through the code.

The main design philosophy:

  • CNN captures short-term local dependence
  • RNN captures long-term macro dependence
  • Attention weights significant time periods or variables
  • The AR component captures scale changes in the data (I have not quite figured out what this means ~)

Methods:

  • LSTNet : suitable for time series whose autocorrelation plots show a significant period; otherwise not much better than traditional methods. Pytorch-LSTNet , LSTNet-Keras , LSTNet-gluon (Mxnet) .
  • TPA-LSTM : an improved attention mechanism that focuses on selecting key variables rather than time steps; experiments suggest it can work well even on time series without an obvious period. TPA-LSTM-Tensorflow

Code



Author: BINGO Hong
Link: https://zhuanlan.zhihu.com/p/67832773

Origin: www.cnblogs.com/Allen-rg/p/11752765.html