Python program for time series forecasting based on BiLSTM


Features:

1. Univariate and multivariate input, freely switchable
2. Single-step and multi-step forecasting, switched automatically
3. Built on the PyTorch framework
4. Multiple evaluation metrics (MAE, MSE, R2, MAPE, etc.)
5. Data is read from an Excel file, so it is easy to swap in your own data
6. Standard workflow: the data is split into training, validation, and test sets (a rough sketch of items 4-6 follows this list)
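
As a rough illustration of items 4-6, the sketch below loads an Excel file, splits it chronologically, and computes the usual metrics. The file name data.xlsx and the 70/15/15 split ratios are assumptions for this example, not the settings of the full program linked below.

import numpy as np
import pandas as pd

# Assumed file name and split ratios -- replace with your own Excel file and settings.
df = pd.read_excel("data.xlsx")              # rows are time steps, columns are variables
values = df.values.astype("float32")

n = len(values)
train = values[: int(0.7 * n)]               # 70% training set
val = values[int(0.7 * n): int(0.85 * n)]    # 15% validation set
test = values[int(0.85 * n):]                # 15% test set

def evaluate(y_true, y_pred):
    """Common regression metrics: MAE, MSE, RMSE, R2, MAPE."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    mae = np.mean(np.abs(y_true - y_pred))
    mse = np.mean((y_true - y_pred) ** 2)
    rmse = np.sqrt(mse)
    r2 = 1 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - np.mean(y_true)) ** 2)
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100   # assumes y_true has no zeros
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2, "MAPE": mape}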

The complete, runnable code can be found here:

http://t.csdn.cn/obJlC


The BiLSTM (Bidirectional Long Short-Term Memory) model is an extension of the LSTM (Long Short-Term Memory) model and is a bidirectional recurrent neural network. The LSTM effectively addresses the long-term dependency problem by introducing input, forget, and output gates, which makes it well suited to time series data. Unlike the LSTM, the BiLSTM computes the output at each time step from the hidden states of both a forward and a backward LSTM, and can therefore better represent the context of the time series.

In the BiLSTM model, the hidden states of the forward and backward LSTMs are concatenated to form the final output, giving a more comprehensive representation of the time series. Compared with a unidirectional LSTM, the BiLSTM can make fuller use of the information in the input sequence, which improves its modeling capacity and predictive performance on sequence data.
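
This concatenation is visible directly in the output shape of PyTorch's nn.LSTM when bidirectional=True; the sizes below are purely illustrative.

import torch
from torch import nn

lstm = nn.LSTM(input_size=3, hidden_size=8, num_layers=1, batch_first=True, bidirectional=True)
x = torch.randn(4, 10, 3)     # (batch, seq_len, features), illustrative sizes
output, (h, c) = lstm(x)
print(output.shape)           # torch.Size([4, 10, 16]): forward and backward hidden states concatenated
print(h.shape)                # torch.Size([2, 4, 8]): one final hidden state per direction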

The advantages of the BiLSTM model include:

  1. By introducing input, forget, and output gates, the underlying LSTM effectively addresses the long-term dependency problem, improving modeling capacity and predictive performance on sequence data.
  2. Compared with a unidirectional LSTM, the BiLSTM makes fuller use of the information in the sequence and obtains a more comprehensive representation of the time series, which improves forecasting performance.

The BiLSTM is therefore an effective model for time series forecasting and is widely used in text analysis, speech recognition, sentiment analysis, and other fields. The core BiLSTM module used in the program is defined in PyTorch as follows:
import torch
from torch import nn


class BiLSTM(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, batch_size, device="cpu"):
        super().__init__()
        self.device = device
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.batch_size = batch_size
        # bidirectional=True doubles the feature dimension of the output to 2 * hidden_size
        self.lstm = nn.LSTM(self.input_size, self.hidden_size, self.num_layers,
                            batch_first=True, bidirectional=True)

    def forward(self, input_seq):
        batch_size = input_seq.shape[0]
        # Initial states have 2 * num_layers layers (one set per direction).
        # Zeros are used instead of random values so the forward pass is deterministic.
        h_0 = torch.zeros(self.num_layers * 2, batch_size, self.hidden_size).to(self.device)
        c_0 = torch.zeros(self.num_layers * 2, batch_size, self.hidden_size).to(self.device)
        # output: (batch_size, seq_len, 2 * hidden_size); h: (2 * num_layers, batch_size, hidden_size)
        output, (h, c) = self.lstm(input_seq, (h_0, c_0))
        return output, h
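
A quick way to exercise the module, continuing from the imports and class above, is shown below. The linear output head and all dimensions are illustrative assumptions for a single-step forecast; they are not the full forecasting pipeline of the program linked above.

# Illustrative sizes: 3 input variables, 64 hidden units, 2 layers, batch of 16, window of 24 steps.
model = BiLSTM(input_size=3, hidden_size=64, num_layers=2, batch_size=16)
head = nn.Linear(64 * 2, 1)        # maps the concatenated 2 * hidden_size features to one forecast value

x = torch.randn(16, 24, 3)         # (batch_size, seq_len, input_size)
output, h = model(x)
pred = head(output[:, -1, :])      # use the last time step for a single-step forecast
print(pred.shape)                  # torch.Size([16, 1])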

 

Origin: blog.csdn.net/qq_41728700/article/details/129845034