In statistics, the Huber loss is a loss function used in robust regression that is less sensitive to outliers in the data than the squared error loss. A variant for classification is also sometimes used.
import tensorflow as tf

def huber_fn(y_true, y_pred):
    error = y_true - y_pred
    is_small_error = tf.abs(error) < 1
    squared_loss = tf.square(error) / 2  # quadratic for small errors
    linear_loss = tf.abs(error) - 0.5    # linear for large errors
    return tf.where(is_small_error, squared_loss, linear_loss)
Note that this custom loss function returns a vector of losses, one per instance, rather than the average loss. The advantage is that Keras can then apply class_weight or sample_weight to adjust each instance's contribution before averaging.
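To actually train with this loss, you pass the function to compile(); a minimal sketch, assuming a small regression model (the layer sizes here are placeholders, not from the original text):

```python
import tensorflow as tf

# Huber loss as defined above (threshold fixed at 1)
def huber_fn(y_true, y_pred):
    error = y_true - y_pred
    is_small_error = tf.abs(error) < 1
    squared_loss = tf.square(error) / 2
    linear_loss = tf.abs(error) - 0.5
    return tf.where(is_small_error, squared_loss, linear_loss)

# Hypothetical small regression model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(30, activation="relu"),
    tf.keras.layers.Dense(1),
])
# Keras averages the per-instance losses returned by huber_fn,
# after applying any sample_weight / class_weight
model.compile(loss=huber_fn, optimizer="nadam")
# model.fit(X_train, y_train, ...)
```

During fit(), Keras calls huber_fn once per batch, weights the per-instance losses if weights were provided, and averages them into the scalar training loss.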
For example, calling it on the validation set:
huber_fn(y_valid, y_pred)
<tf.Tensor: id=4894, shape=(3870, 1), dtype=float64, numpy=
array([[0.10571115],
[0.03953311],
[0.02417886],
...,
[0.00039475],
[0.00245003],
[0.12238744]])>
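For comparison, TensorFlow also ships a built-in Huber loss, tf.keras.losses.Huber, which implements the same piecewise formula; with delta=1.0 and reduction="none" it should produce the same per-instance values as huber_fn (a sanity check added here, not part of the original text):

```python
import tensorflow as tf

# Custom Huber loss from above
def huber_fn(y_true, y_pred):
    error = y_true - y_pred
    is_small_error = tf.abs(error) < 1
    squared_loss = tf.square(error) / 2
    linear_loss = tf.abs(error) - 0.5
    return tf.where(is_small_error, squared_loss, linear_loss)

y_true = tf.constant([[0.0], [2.0], [-3.0]])
y_pred = tf.constant([[0.5], [0.0], [0.0]])

# Built-in Huber with the same threshold; reduction="none" keeps
# per-instance losses instead of averaging them
builtin = tf.keras.losses.Huber(delta=1.0, reduction="none")

print(huber_fn(y_true, y_pred).numpy().ravel())  # custom per-instance losses
print(builtin(y_true, y_pred).numpy())           # built-in, same values
```

The errors here are -0.5 (small, so 0.5²/2 = 0.125), 2 and -3 (large, so |e| - 0.5 = 1.5 and 2.5).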