DeepLearning: reading the XGBoost source code


For classification problems, the labels must of course first be converted to one-hot encoding (OHE); the model then fits one tree ensemble per output dimension (`out_dims`).
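A minimal sketch of that preprocessing step (the helper name `to_one_hot` is illustrative, not from the source): each integer label becomes a row with a single 1, and the booster fits one column at a time.

```python
import numpy as np

def to_one_hot(y, n_classes=None):
    """Convert integer class labels to a one-hot matrix."""
    if n_classes is None:
        n_classes = int(np.max(y)) + 1
    one_hot = np.zeros((y.shape[0], n_classes))
    one_hot[np.arange(y.shape[0]), y] = 1
    return one_hot

y = np.array([0, 2, 1, 2])
Y = to_one_hot(y)   # shape (4, 3): one column per class
# out_dims = Y.shape[1]; one tree ensemble is fitted per column
```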

The gain used to score a candidate split is

$$\text{Gain} = \frac{1}{2}\left[\frac{G_{L}^{2}}{H_{L}+\lambda} + \frac{G_{R}^{2}}{H_{R}+\lambda} - \frac{\left(G_{L}+G_{R}\right)^{2}}{H_{L}+H_{R}+\lambda}\right] - \gamma$$

where $G$ and $H$ are the sums of first and second derivatives of the loss over the samples falling into each child, $\lambda$ is the L2 regularization weight, and $\gamma$ is the per-leaf complexity penalty.
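The formula above can be computed directly; this sketch (function name `split_gain` is mine, not from the source) shows that a split separating gradients of opposite sign earns a positive gain.

```python
def split_gain(G_L, H_L, G_R, H_R, lmbda=1.0, gamma=0.0):
    """Structure-score gain of splitting a node into left/right children."""
    def score(G, H):
        return G ** 2 / (H + lmbda)
    return 0.5 * (score(G_L, H_L) + score(G_R, H_R)
                  - score(G_L + G_R, H_L + H_R)) - gamma

# Children with opposite-sign gradient sums: the parent score is 0,
# so the whole regularized structure score of the children is gained.
gain = split_gain(G_L=-4.0, H_L=2.0, G_R=4.0, H_R=2.0, lmbda=1.0)
```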

XGBoostRegressionTree inherits from DecisionTree and, rather than overriding methods with explicit bodies, it simply reassigns the parent's callback attributes. Written this obscurely, it took me a few minutes of staring to understand; perhaps the author wants us to infer the behavior from the function names alone.
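A hypothetical sketch of that pattern (all names here are illustrative, not the actual source): the parent owns the generic build loop and calls whatever hook the child installs, so the subclass contains no visible logic, only assignments.

```python
class DecisionTree:
    def __init__(self):
        # pluggable hook; the generic build loop calls it blindly
        self._leaf_value_calculation = None

    def fit(self, values):
        # stand-in for the real recursive builder
        return self._leaf_value_calculation(values)

class XGBoostRegressionTree(DecisionTree):
    def _approximate_update(self, values):
        # Newton-step leaf weight -G / (H + lambda), with lambda = 1
        # and, purely for this toy, H taken as the sample count
        G = sum(values)
        H = len(values)
        return -G / (H + 1)

    def fit(self, values):
        # the "override" is just swapping the hook before delegating
        self._leaf_value_calculation = self._approximate_update
        return super().fit(values)

tree = XGBoostRegressionTree()
leaf = tree.fit([1.0, 1.0])
```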


import numpy as np

class SquareLoss(Loss):

    def loss(self, y, y_pred):
        return 0.5 * np.power(y - y_pred, 2)

    def grad(self, y, y_pred):
        # derivative of 0.5 * (y - y_pred)^2 w.r.t. y_pred
        return -(y - y_pred)

    def hess(self, y, y_pred):
        # second derivative is constant
        return np.ones_like(y)

class CrossEntropyLoss(Loss):

    def loss(self, y, y_pred):
        # clip probabilities to avoid log(0)
        p = np.clip(y_pred, 1e-15, 1 - 1e-15)
        return -(y * np.log(p) + (1 - y) * np.log(1 - p))

    def grad(self, y, y_pred):
        # gradient w.r.t. the raw score when y_pred = sigmoid(score)
        return -(y - y_pred)

    def hess(self, y, y_pred):
        return y_pred * (1 - y_pred)
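These grad/hess pairs are exactly what the booster aggregates: at each leaf it sums the per-sample gradients into $G$ and Hessians into $H$, then takes the Newton step $w^* = -G/(H+\lambda)$. A self-contained sketch with a minimal square loss (redefined here so the snippet runs on its own):

```python
import numpy as np

class SquareLoss:
    def grad(self, y, y_pred):
        return -(y - y_pred)
    def hess(self, y, y_pred):
        return np.ones_like(y)

loss = SquareLoss()
y      = np.array([3.0, 5.0, 4.0])
y_pred = np.array([2.0, 2.0, 2.0])   # current ensemble prediction

G = loss.grad(y, y_pred).sum()       # sum of first derivatives
H = loss.hess(y, y_pred).sum()       # sum of second derivatives
w = -G / (H + 1.0)                   # optimal leaf weight, lambda = 1
```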


Origin blog.csdn.net/TQCAI666/article/details/113358066