softmax(logits, dim=-1, name=None):
"""Computes softmax activations.For each batch `i` and class `j` we have
softmax = exp(logits) / reduce_sum(exp(logits), dim)
Args:
logits: A non-empty `Tensor`. Must be one of the following types: `half`, `float32`, `float64`.
dim: The dimension softmax would be performed on. The default is -1 which indicates the last dimension.
name: A name for the operation (optional).
Returns:
A `Tensor`. Has the same type as `logits`. Same shape as `logits`.
Raises:
InvalidArgumentError: if `logits` is empty or `dim` is beyond the last dimension of `logits`.
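The formula above can be sketched in plain NumPy. This is a minimal illustration of the math, not TensorFlow's implementation; the max-subtraction step is a standard numerical-stability trick (softmax is shift-invariant, so it does not change the result):

```python
import numpy as np

def softmax(logits, dim=-1):
    # Subtract the per-row max for numerical stability before exponentiating.
    shifted = logits - np.max(logits, axis=dim, keepdims=True)
    exps = np.exp(shifted)
    # exp(logits) / reduce_sum(exp(logits), dim)
    return exps / np.sum(exps, axis=dim, keepdims=True)

probs = softmax(np.array([[1.0, 2.0, 3.0],
                          [1.0, 1.0, 1.0]]))
# Each row sums to 1; equal logits give a uniform distribution.
```

The output has the same shape as the input, and reducing along `dim` yields 1 everywhere, matching the docstring's contract.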
The softmax function: `tf.nn.softmax(logits, dim=-1, name=None)`
Reprinted from blog.csdn.net/zz2230633069/article/details/81545496