TensorFlow 2.x Study Notes 8: Cross-Entropy Loss Functions in TensorFlow (Keras)

Here is a brief summary of my own; if anything is wrong, please point it out.

1. The BinaryCrossentropy class and the binary_crossentropy function

Using the BinaryCrossentropy class:

tf.keras.losses.BinaryCrossentropy(
from_logits=False, label_smoothing=0,
reduction=losses_utils.ReductionV2.AUTO,
name='binary_crossentropy'
)
Parameter explanation:

from_logits: if True, y_pred is interpreted as raw logits (unnormalized scores) rather than a probability distribution; if False (the default), y_pred is interpreted as a probability distribution.
reduction: when the loss is computed over multiple groups of data (for example, multiple batches), this controls how the per-group losses are aggregated. Its possible values are attributes of tf.keras.losses.Reduction, mainly the following (a small from_logits sketch follows the table):

Value: Role
AUTO: the reduction is chosen automatically based on the usage context
NONE: no reduction is applied; the per-group losses are returned as-is
SUM: the per-group losses are summed
SUM_OVER_BATCH_SIZE: the per-group losses are averaged over the batch size
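To make from_logits concrete, here is a minimal sketch (the label and logit values are made up for illustration): passing raw scores with from_logits=True should give approximately the same loss as applying a sigmoid first and using the default from_logits=False.

import tensorflow as tf

y_true = [1., 0., 1., 0.]
logits = [5.0, -2.0, 3.0, -4.0]  # raw, unnormalized scores

# Interpret the predictions as logits:
bce_logits = tf.keras.losses.BinaryCrossentropy(from_logits=True)
print(bce_logits(y_true, logits).numpy())

# Equivalent: squash to probabilities first, then use the default:
probs = tf.sigmoid(logits)
bce_probs = tf.keras.losses.BinaryCrossentropy(from_logits=False)
print(bce_probs(y_true, probs).numpy())  # approximately the same value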
First, let's look at a single group of data to see how the formula works:

y_true=[1., 0., 1., 0.]
y_pred=[1., 1., 1., 0.]
bce = tf.keras.losses.BinaryCrossentropy()
loss=bce(y_true, y_pred)
print('Loss: ', loss.numpy())

# Output: Loss: 3.833
'''
# EPSILON = 1e-7, y = y_true, y` = y_pred, Y_MAX = 0.9999999
# y` = clip_ops.clip_by_value(output, EPSILON, 1. - EPSILON)
# y` = [Y_MAX, Y_MAX, Y_MAX, EPSILON]

# Metric = -(y * log(y` + EPSILON) + (1 - y) * log(1 - y` + EPSILON))
#        = [-log(Y_MAX + EPSILON), -log(1 - Y_MAX + EPSILON),
#           -log(Y_MAX + EPSILON), -log(1 - EPSILON + EPSILON)]
#        ≈ [0, 15.33, 0, 0]
# Reduced Metric = (0 + 15.33 + 0 + 0) / 4 ≈ 3.833
'''
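As a sanity check, here is a hand-rolled version of the same computation in NumPy (a sketch; the exact value differs slightly from Keras, which works in float32 internally):

import numpy as np

EPSILON = 1e-7
y = np.array([1., 0., 1., 0.])
y_pred = np.clip(np.array([1., 1., 1., 0.]), EPSILON, 1. - EPSILON)

# Element-wise binary cross-entropy, then the mean over the group:
per_element = -(y * np.log(y_pred + EPSILON)
                + (1. - y) * np.log(1. - y_pred + EPSILON))
print(per_element.mean())  # ~3.86 in float64; Keras prints 3.833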
To demonstrate the specific role of the reduction parameter, we provide two groups of data; the calculation for each group is the same as in the example above:

y_true = ([1., 0., 1., 0.], [1., 1., 1., 0.])
y_pred = ([1., 1., 1., 0.], [1., 0., 1., 0.])

bce = tf.keras.losses.BinaryCrossentropy()
loss = bce(y_true, y_pred)
print('Loss: ', loss.numpy())
# Default output: Loss: 3.8447733
As you can see, the default takes the average over all the data. Now let's change the value of reduction and see the effect, as in the examples below:

bce = tf.keras.losses.BinaryCrossentropy(
reduction=tf.keras.losses.Reduction.NONE)
loss1=bce(y_true, y_pred)
print('Loss: ', loss1.numpy())

# Output: Loss: [3.8333097 3.8562372]
bce = tf.keras.losses.BinaryCrossentropy(
reduction=tf.keras.losses.Reduction.SUM)

loss2=bce(y_true, y_pred)
print('Loss: ', loss2.numpy())

# Summed output: Loss: 7.6895466
bce = tf.keras.losses.BinaryCrossentropy(
reduction=tf.keras.losses.Reduction.SUM_OVER_BATCH_SIZE)

loss3=bce(y_true, y_pred)
print('Loss: ', loss3.numpy())

# Averaged output: Loss: 3.8447733
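The three results are consistent with one another; a quick sanity check, reusing loss1 and loss2 from above:

# SUM divided by the number of groups gives SUM_OVER_BATCH_SIZE:
print(loss2.numpy() / 2)       # 3.8447733
# The mean of the NONE result gives the default (AUTO) result:
print(loss1.numpy().mean())    # 3.8447733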
Using the binary_crossentropy function:

tf.keras.losses.binary_crossentropy(
y_true, y_pred, from_logits=False, label_smoothing=0
)
y_true=[1., 0., 1., 0.]
y_pred=[1., 1., 1., 0.]
loss=tf.keras.losses.binary_crossentropy(
y_true, y_pred)
print('Loss: ', loss.numpy())
# Output: Loss: 3.833
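Note that the function form reduces only over the last axis: with two groups of data it returns one loss per group and applies no further reduction (a quick sketch, reusing the two-group data from earlier):

y_true = ([1., 0., 1., 0.], [1., 1., 1., 0.])
y_pred = ([1., 1., 1., 0.], [1., 0., 1., 0.])
loss = tf.keras.losses.binary_crossentropy(y_true, y_pred)
print('Loss: ', loss.numpy())
# One value per group, e.g.: Loss: [3.8333097 3.8562372]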
2. The CategoricalCrossentropy class and the categorical_crossentropy function

Using the CategoricalCrossentropy class:

tf.keras.losses.CategoricalCrossentropy(
from_logits=False, label_smoothing=0,
reduction=losses_utils.ReductionV2.AUTO,
name='categorical_crossentropy'
)
y_true=([0, 1, 0], [0, 0, 1])
y_pred=([0.05, 0.95, 0], [0.1, 0.8, 0.1])
cce = tf.keras.losses.CategoricalCrossentropy()
loss=cce(y_true, y_pred)
print('Loss: ', loss.numpy())

# Output: Loss: 1.176
'''
# EPSILON = 1e-7, y = y_true, y` = y_pred
# y` = clip_ops.clip_by_value(output, EPSILON, 1. - EPSILON)
# y` = [[0.05, 0.95, EPSILON], [0.1, 0.8, 0.1]]

# xent = -sum(y * log(y`), axis=-1)
#      = [-log(0.95), -log(0.1)]
#      = [0.051, 2.302]
# Reduced xent = (0.051 + 2.302) / 2 = 1.176
'''
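As before, a hand-rolled NumPy version of the same computation serves as a sanity check (a sketch; small differences from Keras' float32 arithmetic are expected):

import numpy as np

EPSILON = 1e-7
y_true = np.array([[0., 1., 0.], [0., 0., 1.]])
y_pred = np.clip(np.array([[0.05, 0.95, 0.], [0.1, 0.8, 0.1]]),
                 EPSILON, 1. - EPSILON)

xent = -np.sum(y_true * np.log(y_pred), axis=-1)
print(xent)         # ~[0.051 2.303]
print(xent.mean())  # ~1.177, matching the 1.176 above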
Again, the default is to take the average. For multiple groups of data, the result can be changed by setting the value of reduction, in exactly the same way as described for BinaryCrossentropy; I won't repeat all of it here, and you can try it for yourself.
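For instance, a minimal sketch with reduction set to NONE, reusing y_true and y_pred from above:

cce = tf.keras.losses.CategoricalCrossentropy(
    reduction=tf.keras.losses.Reduction.NONE)
loss = cce(y_true, y_pred)
print('Loss: ', loss.numpy())
# One value per group, e.g.: Loss: [0.051 2.303]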
Using the categorical_crossentropy function:

tf.keras.losses.categorical_crossentropy(
y_true, y_pred, from_logits=False, label_smoothing=0
)
y_true=([0, 1, 0], [0, 0, 1])
y_pred=([0.05, 0.95, 0], [0.1, 0.8, 0.1])
loss=tf.keras.losses.categorical_crossentropy(
y_true, y_pred)
print('Loss: ', loss.numpy())

# Output: Loss: 1.176
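One more parameter worth mentioning: label_smoothing, which appears in both function signatures, softens the hard 0/1 targets before the loss is computed; the targets effectively become y_true * (1 - label_smoothing) + label_smoothing / num_classes. A quick sketch, reusing the data above:

loss = tf.keras.losses.categorical_crossentropy(
    y_true, y_pred, label_smoothing=0.1)
print('Loss: ', loss.numpy())
# Slightly different from the unsmoothed values above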
In addition to the usage shown above, all of these can also be passed as the loss argument to model.compile(), as follows:
import tensorflow as tf

inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(inputs)
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.compile('sgd',
    loss=tf.keras.losses.BinaryCrossentropy())
model.compile('sgd',
    loss=tf.keras.losses.binary_crossentropy)

model.compile('sgd',
    loss=tf.keras.losses.CategoricalCrossentropy())
model.compile('sgd',
    loss=tf.keras.losses.categorical_crossentropy)

One thing to note: when using a class, don't forget the trailing parentheses, which instantiate the object; when using a function, the function name alone is enough.
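For completeness, Keras also accepts the registered string name of a built-in loss, so the following three forms are equivalent in effect:

model.compile('sgd', loss=tf.keras.losses.BinaryCrossentropy())  # class instance
model.compile('sgd', loss=tf.keras.losses.binary_crossentropy)   # function object
model.compile('sgd', loss='binary_crossentropy')                 # registered string name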

3. Finally, note that neither BinaryCrossentropy nor CategoricalCrossentropy is actually restricted to one setting: both can be used for binary or multi-class classification, and the boundary between them is not that sharp, since the latter is an extension of the former. Under normal circumstances, however, they are used in the default ways shown in the examples above.
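As one illustration of this flexibility, BinaryCrossentropy also fits multi-label classification, where each class is an independent yes/no decision (a sketch with made-up values):

y_true = [[1., 0., 1.], [0., 1., 1.]]        # two samples, three independent labels
y_pred = [[0.9, 0.2, 0.8], [0.1, 0.7, 0.6]]  # one probability per label
loss = tf.keras.losses.BinaryCrossentropy()(y_true, y_pred)
print(loss.numpy())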
----------------
Disclaimer: This is an original article by the CSDN blogger "universal black Alex", following the CC 4.0 BY-SA copyright agreement. When reproducing it, please attach the original source link and this statement.
Original link: https://blog.csdn.net/qq_39507748/article/details/105005427
