Multi-layer LSTM problem: the MultiRNNCell "Attempt to reuse RNNCell" error and its solution

Copyright notice: this is the author's original article and may not be reproduced without permission. https://blog.csdn.net/yjw19901214/article/details/77570749

Error: ValueError: Attempt to reuse RNNCell <tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl.GRUCell object at 0x11d32cbd0> with a different variable scope than its first use. First use of cell was with scope 'rnn/multi_rnn_cell/cell_0/gru_cell', this attempt is with scope 'rnn/multi_rnn_cell/cell_1/gru_cell'. Please create a new instance of the cell if you would like it to use a different set of weights. If before you were using: MultiRNNCell([GRUCell(...)] * num_layers), change to: MultiRNNCell([GRUCell(...) for _ in range(num_layers)]). If before you were using the same cell instance as both the forward and reverse cell of a bidirectional RNN, simply create two instances (one for forward, one for reverse). In May 2017, we will start transitioning this cell's behavior to use existing stored weights, if any, when it is called with scope=None (which can lead to silent model degradation, so this error will remain until then.)

Original code (this triggers the error because the same cell instance is reused for every layer):
import tensorflow as tf
from tensorflow.contrib import rnn

inputs = tf.placeholder(dtype=tf.int32, shape=[None, None], name="inputs")
keep_prob = tf.placeholder(dtype=tf.float32, name="keep_prob")

# A single GRU cell wrapped with dropout...
cell = rnn.GRUCell(10)
cell = rnn.DropoutWrapper(cell=cell, input_keep_prob=keep_prob)
# ...is reused for all 5 layers: every layer refers to the same cell object,
# which is what raises the "Attempt to reuse RNNCell" ValueError.
cell = rnn.MultiRNNCell([cell for _ in range(5)], state_is_tuple=True)

# look_up is the embedding lookup of `inputs` (its definition is not shown in the original post)
outs, states = tf.nn.dynamic_rnn(cell=cell, inputs=look_up, dtype=tf.float32)


Solution: build a fresh GRUCell (and DropoutWrapper) for every layer inside the list comprehension, so each layer gets its own set of weights:
inputs = tf.placeholder(dtype=tf.int32, shape=[None, None], name="inputs")
keep_prob = tf.placeholder(dtype=tf.float32, name="keep_prob")

# Create a new GRUCell + DropoutWrapper for each of the 5 layers,
# so every layer owns its own variables.
cell = rnn.MultiRNNCell(
    [rnn.DropoutWrapper(rnn.GRUCell(10), input_keep_prob=keep_prob) for _ in range(5)],
    state_is_tuple=True)

outs, states = tf.nn.dynamic_rnn(cell=cell, inputs=look_up, dtype=tf.float32)
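The error message also covers the bidirectional case: if the same cell instance was passed as both the forward and reverse cell, simply create two separate instances. A minimal sketch of that variant under the same setup (the names fw_cell/bw_cell and the 10-unit size are illustrative, not from the original post):

# Two independent cell instances, one per direction, avoid the same reuse error.
fw_cell = rnn.GRUCell(10)   # forward cell
bw_cell = rnn.GRUCell(10)   # backward cell -- a second, separate instance
bi_outs, bi_states = tf.nn.bidirectional_dynamic_rnn(
    cell_fw=fw_cell, cell_bw=bw_cell, inputs=look_up, dtype=tf.float32)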
