【MindSpore】Lambda function error: TypeError: Parse Lambda Function Fail. Node type must be Lambda, but got Call.

1. Brief description of the error

  • MindSpore version: 2.0 alpha
  • Part of the code is as follows:
# Encoder code
class EncoderLayer(nn.Cell):
    def __init__(self, size, self_attn, feed_forward, dropout):
        """ size            词嵌入维度大小
            self_attn       多头自注意力子层的实例化对象
            feed_forward    前馈全连接层的实例化对象
            dropout         丢弃率 """
        super(EncoderLayer, self).__init__()
        
        self.self_attn = self_attn
        self.feed_forward = feed_forward
        self.sublayer = clones(SublayerConnection(size, dropout), 2)
        self.size = size

    def construct(self, x, mask):
        """ x       上一层输入
            mask    掩码张量 """
        x = self.sublayer[0](x, lambda x: self.self_attn(x, x, x, mask))
        return self.sublayer[1](x, self.feed_forward)


# Model code
model = EncoderDecoder(
        Encoder(EncoderLayer(d_model, c(attn), c(ff), dropout), N),
        Decoder(DecoderLayer(d_model, c(attn), c(attn), c(ff), dropout), N),
        nn.SequentialCell([Embeddings(d_model, source_vocab), c(position)]),
        nn.SequentialCell([Embeddings(d_model, target_vocab), c(position)]),
        Generator(d_model, target_vocab)
    )
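  • Note: the snippet relies on two helpers, clones and SublayerConnection, that are not shown above. A minimal sketch of what they typically look like in annotated-Transformer-style MindSpore code (these definitions are assumed from the usage, not taken from the original post):

# Assumed helper definitions (hypothetical; reconstructed from the usage above)
import copy
import mindspore.nn as nn

def clones(module, n):
    """Return a CellList holding n deep copies of the given Cell."""
    return nn.CellList([copy.deepcopy(module) for _ in range(n)])

class SublayerConnection(nn.Cell):
    """Residual connection around a sublayer, with pre-norm and dropout."""
    def __init__(self, size, dropout):
        super(SublayerConnection, self).__init__()
        self.norm = nn.LayerNorm((size,))
        self.dropout = nn.Dropout(p=dropout)  # p= keyword per MindSpore 2.0; older versions use keep_prob

    def construct(self, x, sublayer):
        # Normalize, apply the sublayer callable, drop out, add the residual.
        return x + self.dropout(sublayer(self.norm(x)))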
  • The error message is as follows:
File "E:\Codes\TeaGroupCodes\transformer_ms\model\encoder.py", line 24, in construct
    x = self.sublayer[0](x, lambda x: self.self_attn(x, x, x, mask))

TypeError: Parse Lambda Function Fail. Node type must be Lambda, but got Call.

2. Problem location

  • This code defines a model whose encoder (Encoder) consists of N encoder layers (EncoderLayer). The error occurs at this line in EncoderLayer: x = self.sublayer[0](x, lambda x: self.self_attn(x, x, x, mask)).
  • The call self.sublayer[0](x, sublayer) takes two arguments, and the problem lies with the second one.
  • Passing a lambda expression directly as a function argument works fine for forward inference, but an error is raised once the network is compiled for training (parameter optimization). The error message suggests that when MindSpore parses the lambda's source in graph mode, the inline lambda is read together with its enclosing call, so the parser finds a Call AST node where it expects a Lambda. A minimal sketch that reproduces the symptom follows this list.
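
A minimal repro sketch (my addition, assuming graph mode is what triggers source parsing; in PyNative mode the same call goes through):

# Hypothetical minimal repro (the names Wrapper/apply are illustrative only)
import mindspore as ms
import mindspore.nn as nn
import mindspore.ops as ops

ms.set_context(mode=ms.GRAPH_MODE)

class Wrapper(nn.Cell):
    def __init__(self):
        super(Wrapper, self).__init__()
        self.dense = nn.Dense(4, 4)

    def apply(self, x, fn):
        return fn(x)

    def construct(self, x):
        # The lambda is written inline inside the call, so the parser sees
        # the enclosing Call node instead of a Lambda node.
        return self.apply(x, lambda x: self.dense(x))

net = Wrapper()
x = ops.ones((2, 4), ms.float32)
out = net(x)  # expected to raise: Parse Lambda Function Fail ... got Call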

3. Solutions

Assign the lambda expression to a variable first, then pass that variable to the function. For example, change x = self.sublayer[0](x, lambda x: self.self_attn(x, x, x, mask)) to:

func = lambda x: self.self_attn(x, x, x, mask)
x = self.sublayer[0](x, func)

The program runs successfully!
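
An equivalent variant (my addition, not from the original post) is a named inner function, which avoids assigning a lambda to a variable (PEP 8's recommendation) and is likewise parsed as a standalone definition:

    def construct(self, x, mask):
        # Assumed alternative: graph mode generally supports inner functions,
        # and a named definition avoids the inline-lambda Call node.
        def self_attn_fn(y):
            return self.self_attn(y, y, y, mask)
        x = self.sublayer[0](x, self_attn_fn)
        return self.sublayer[1](x, self.feed_forward)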

Source: blog.csdn.net/weixin_45800258/article/details/130309886