[tensorflow and keras] A custom weighted logloss

Weighted logloss: TensorFlow

import tensorflow as tf

def weighted_loss(labels, logits):
    '''
    Weighted log loss (TensorFlow 1.x).
    Args:
        labels: integer class labels, not one-hot encoded
        logits: predicted probabilities, i.e. the softmax output
    Returns:
        scalar loss
    '''
    # negative log of the predicted probabilities;
    # the small epsilon guards against log(0)
    sf_logits_log = (-1) * tf.log(logits + 1e-7) # [N, c]
    num_class = logits.shape[-1]
    oh_labels = tf.one_hot(labels, num_class, dtype = tf.float32) # [N, c]
    # weight 1.2 for the tumor class and 0.8 for the normal class
    y_true = 1.2 * oh_labels[:, 1:]
    y_false = 0.8 * oh_labels[:, 0:1]
    weight_labels = tf.concat([y_false, y_true], axis = 1)
    loss = tf.reduce_sum(sf_logits_log * weight_labels, axis = 1)
    loss = tf.reduce_mean(loss)
    return loss
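To see what the loss computes, the same math can be checked with a small NumPy sketch (this is a stand-in for the TensorFlow ops above; the function name `weighted_loss_np` and the sample batch values are made up for illustration):

```python
import numpy as np

def weighted_loss_np(labels, probs, w_pos=1.2, w_neg=0.8):
    """NumPy version of the weighted log loss above (binary case)."""
    n, c = probs.shape
    neg_log = -np.log(probs)          # [N, c]
    oh = np.eye(c)[labels]            # one-hot labels, [N, c]
    # per-class weights: w_neg for class 0 (normal), w_pos for class 1 (tumor)
    weights = np.concatenate([w_neg * oh[:, 0:1], w_pos * oh[:, 1:]], axis=1)
    return (neg_log * weights).sum(axis=1).mean()

# two samples: one normal (class 0), one tumor (class 1)
labels = np.array([0, 1])
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
loss = weighted_loss_np(labels, probs)
# each sample contributes weight * -log(p_correct):
# 0.8 * -log(0.9) for the normal sample, 1.2 * -log(0.8) for the tumor sample
```

Because the one-hot mask zeroes out every class except the true one, each sample contributes only the negative log probability of its correct class, scaled by that class's weight.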

An improved version, weighted logloss: Keras

from keras import backend as K

def weighted_loss(labels, logits):
    '''
    Weighted log loss (Keras backend).
    Args:
        labels: integer class labels, not one-hot encoded
        logits: predicted probabilities, i.e. the softmax output
    Returns:
        scalar loss
    '''
    # negative log of the predicted probabilities;
    # K.epsilon() guards against log(0)
    sf_logits_log = (-1) * K.log(logits + K.epsilon()) # [N, c]
    num_class = K.int_shape(logits)[-1]
    # K.one_hot takes no dtype argument and expects integer indices
    oh_labels = K.one_hot(K.cast(labels, 'int32'), num_class) # [N, c]
    # weight 1.2 for the tumor class and 0.8 for the normal class
    y_true = 1.2 * oh_labels[:, 1:]
    y_false = 0.8 * oh_labels[:, 0:1]
    weight_labels = K.concatenate([y_false, y_true], axis = 1)
    loss = K.sum(sf_logits_log * weight_labels, axis = 1)
    loss = K.mean(loss)
    return loss

A weighted logloss lets us train well on imbalanced data, i.e. datasets with very few positive samples and a large number of negative samples.
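The rebalancing effect can be illustrated on a toy imbalanced batch (a NumPy sketch with made-up numbers; the 1.2/0.8 weights are the ones used in the losses above):

```python
import numpy as np

# imbalanced batch: 9 negatives (class 0), 1 positive (class 1)
labels = np.array([0] * 9 + [1])
probs = np.full((10, 2), 0.5)  # an uninformative classifier

neg_log = -np.log(probs[np.arange(10), labels])  # per-sample log loss
weights = np.where(labels == 1, 1.2, 0.8)        # 1.2 tumor / 0.8 normal

unweighted = neg_log.mean()
weighted = (weights * neg_log).mean()
# the positive sample's share of the total loss rises from
# 1/10 unweighted to 1.2 / (9*0.8 + 1.2) = 1/7 weighted
```

Up-weighting the rare positive class makes its gradient contribution larger relative to the many negatives, which is the point of the weighting scheme.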


Reposted from blog.csdn.net/m0_37477175/article/details/83277818