Notes on PyTorch LSTM

In PyTorch, the sequence length is not fixed when an LSTM is initialized; it can change dynamically from one forward pass to the next. Only when training in batches do the sequences within a single batch need to have the same length.
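A minimal sketch of this point: the `nn.LSTM` constructor takes no sequence-length argument, so the same module can process inputs with different numbers of time steps (the sizes 10, 20, 3, 5, and 8 below are arbitrary choices for illustration).

```python
import torch
import torch.nn as nn

# No sequence length is specified at construction time:
# only input_size (features per step) and hidden_size.
lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)

# The same LSTM handles a batch of 5-step sequences...
out_a, _ = lstm(torch.randn(3, 5, 10))
print(out_a.shape)  # torch.Size([3, 5, 20])

# ...and a batch of 8-step sequences, with no reconfiguration.
out_b, _ = lstm(torch.randn(3, 8, 10))
print(out_b.shape)  # torch.Size([3, 8, 20])
```

Within each call, all sequences in the batch share one length; mixing lengths inside a batch requires padding (e.g. with `nn.utils.rnn.pack_padded_sequence`).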

A Keras model, by contrast, must be given the sequence length when the LSTM is initialized, i.e. the number of time steps, since it is part of the model's configuration.
Experiments show that the total number of LSTM network parameters is the same for different input lengths.

The LSTM's parameters live inside the recurrent cell itself; the sentence length is not a parameter of the LSTM network.
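This can be checked directly in PyTorch by counting parameters: for the sizes chosen below, the count follows from the four gates, each with input weights, recurrent weights, and two bias vectors, and it never involves the sequence length.

```python
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)
n_params = sum(p.numel() for p in lstm.parameters())

# Four gates (input, forget, cell, output), each with:
#   input weights  20x10, recurrent weights 20x20,
#   and two bias vectors of size 20 (PyTorch keeps b_ih and b_hh separately).
# Total: 4 * (20*10 + 20*20 + 20 + 20) = 2560
print(n_params)  # 2560
```

Because the same cell weights are reused at every time step, feeding the model longer or shorter sequences changes nothing about this count.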


Origin www.cnblogs.com/rise0111/p/11527323.html