Custom objectives (obj) in XGBoost

LightGBM natively supports both mean squared error and mean absolute error as objectives.

XGBoost, however, does not support mean absolute error out of the box. You can still train with an MAE-like loss through a custom objective: all you need to supply are the first- and second-order derivatives (gradient and hessian) of the loss.

Here is one way to approximate the abs function. (Other approximating functions exist; if interested, see https://stackoverflow.com/questions/45006341/xgboost-how-to-use-mae-as-objective-function.)

Fair loss

The gradient and hessian follow strictly from the definition of the derivative: handle the x > 0 and x < 0 cases separately, and the two results merge into a single formula. The proof is simple and omitted in the original. The fair-loss code is as follows:
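For completeness, here is the derivation the original skips, with x = preds − labels as the residual:

```latex
f(x) = c\,|x| - c^2 \ln\!\left(\frac{|x|}{c} + 1\right)

f'(x) = c\,\operatorname{sign}(x) - \frac{c^2\,\operatorname{sign}(x)}{|x| + c}
      = \frac{c\,\operatorname{sign}(x)\,|x|}{|x| + c}
      = \frac{c\,x}{|x| + c}

f''(x) = \frac{c}{|x| + c} - \frac{c\,x\,\operatorname{sign}(x)}{(|x| + c)^2}
       = \frac{c\,(|x| + c) - c\,|x|}{(|x| + c)^2}
       = \frac{c^2}{(|x| + c)^2}
```

The sign(x) factors from the x > 0 and x < 0 branches cancel into the single expression c·x/(|x| + c), which is exactly what the code computes.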

import numpy as np

def fair_obj(preds, dtrain):
    """Fair loss: y = c * abs(x) - c**2 * np.log(abs(x)/c + 1)"""
    x = preds - dtrain.get_label()   # residuals
    c = 1
    den = np.abs(x) + c
    grad = c * x / den               # first derivative of the fair loss
    hess = c * c / den ** 2          # second derivative
    return grad, hess

The figure below shows the function for several values of c:

The plotting code is attached as well:

import numpy as np
import matplotlib.pyplot as plt

# Plot abs alongside the fair function for several values of c
x = np.linspace(-3, 3, 1000)
plt.plot(x, np.abs(x), label="abs function")

for c in (1, 2, 3):
    y = c * np.abs(x) - c * c * np.log(np.abs(x) / c + 1)
    plt.plot(x, y, label="fair function (c = {})".format(c))

plt.legend()
plt.show()
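Beyond eyeballing the plot, the closed-form gradient and hessian can be sanity-checked against finite differences of the fair loss itself. This check is an addition, not part of the original post; the function names are chosen here for illustration.

```python
import numpy as np

def fair_loss(x, c=1.0):
    # y = c * |x| - c**2 * log(|x|/c + 1)
    return c * np.abs(x) - c * c * np.log(np.abs(x) / c + 1.0)

def fair_grad_hess(x, c=1.0):
    # Closed-form first and second derivatives used in fair_obj
    den = np.abs(x) + c
    return c * x / den, c * c / den ** 2

x = np.linspace(-3.0, 3.0, 101)
eps = 1e-5

# Central finite differences for the first and second derivatives
num_grad = (fair_loss(x + eps) - fair_loss(x - eps)) / (2 * eps)
num_hess = (fair_loss(x + eps) - 2 * fair_loss(x) + fair_loss(x - eps)) / eps ** 2

grad, hess = fair_grad_hess(x)
print(np.max(np.abs(grad - num_grad)), np.max(np.abs(hess - num_hess)))
```

Both maxima come out tiny, confirming that the grad/hess formulas match the loss. (Unlike abs, the fair loss is twice differentiable everywhere, including x = 0.)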


Reposted from blog.csdn.net/qq_39638957/article/details/88624981