Statistical Learning Methods 6: Logistic Regression and Maximum Entropy

  • Logistic regression: a discriminative model, used for classification

  • Binomial Logistic Regression idea:
    Linear regression predicts arbitrary real values. To turn that output into the 0/1 labels needed for classification, a mapping is applied that squashes the linear output into (0, 1). That mapping is the sigmoid function:
    sigmoid(z) = 1 / (1 + e^(-z))
    First absorb the bias b of the linear function y = w·x + b into the weight vector by appending a constant 1 to x, so the linear score can be written simply as w·x; substituting it into the sigmoid function gives the model
    P(y=1 | x) = exp(w·x) / (1 + exp(w·x)),   P(y=0 | x) = 1 / (1 + exp(w·x))
    At this point the output is only constrained to (0, 1); to finish the 0/1 classification a threshold (typically 0.5) must be applied to this probability as the decision rule between 0 and 1. Where the resulting decision boundary falls is what is learned, and it is reflected in the parameters w and b.
    The parameters are estimated by maximum likelihood. Writing π(x) = P(y=1 | x), the likelihood of the training set is
    L(w) = ∏_i [π(x_i)]^(y_i) [1 − π(x_i)]^(1 − y_i),
    and taking the logarithm gives the log-likelihood
    log L(w) = Σ_i [ y_i (w·x_i) − log(1 + exp(w·x_i)) ]
    The goal is then to maximize this (log-)likelihood of P(y | x), i.e. to output the w and b that maximize it; a minimal sketch of this procedure follows.
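
As a concrete illustration, here is a minimal Python sketch of binomial logistic regression trained by gradient ascent on the log-likelihood above. The toy data set, learning rate, and iteration count are assumed purely for illustration and are not from the original post.

```python
import numpy as np

def sigmoid(z):
    # Map any real-valued score into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=1000):
    # Append a constant 1 to each x so the bias b is absorbed into w.
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ w)          # current P(y=1 | x)
        grad = X.T @ (y - p)        # gradient of the log-likelihood
        w += lr * grad / len(y)     # gradient-ascent step
    return w

def predict(X, w, threshold=0.5):
    # Threshold the predicted probability to obtain the 0/1 label.
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return (sigmoid(X @ w) >= threshold).astype(int)

# Tiny linearly separable data set (assumed values).
X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w = fit_logistic(X, y)
print(predict(X, w))  # should recover the training labels [0 0 0 1 1 1]
```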

  • Binomial logistic regression extends to multinomial logistic regression. The principle is the same, except the labels are no longer just 0 and 1 but one of K classes (see the softmax sketch below).
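
A small sketch of the multi-class extension: the sigmoid is replaced by the softmax function, which turns one score per class into class probabilities P(y = k | x) ∝ exp(w_k·x). The weight values and the sample below are assumed for illustration only.

```python
import numpy as np

def softmax(scores):
    # Generalize the sigmoid to K classes: exponentiate each score
    # and normalize so the probabilities sum to 1.
    scores = scores - scores.max()   # subtract the max for numerical stability
    e = np.exp(scores)
    return e / e.sum()

# One augmented feature vector x and one weight vector per class (assumed values).
x = np.array([1.0, 2.0, 1.0])
W = np.array([[ 0.2, -0.1,  0.0],   # class-0 weights
              [ 0.5,  0.3, -0.2],   # class-1 weights
              [-0.3,  0.1,  0.4]])  # class-2 weights

probs = softmax(W @ x)
print(probs, probs.argmax())  # class probabilities and the predicted class
```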

  • Maximum entropy model: a discriminative model

  • Core: solving a constrained optimization problem

  • Idea: a given training set supplies constraints (empirical knowledge), which define a set of conditional probability models {P(y|x)} that satisfy them. The constraints are tied to the conditional probability through feature functions, and among the admissible models the conditional probability is selected by entropy.

  • Maximum entropy: many models may satisfy the constraints, so which one should be chosen? Choose the one with the largest entropy, i.e. the most uncertain one beyond what the constraints require, which is the choice most consistent with the evidence; the resulting optimization problem is written out below.
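
Written out, the constrained optimization the last two bullets describe takes the standard form below, where P~(x) and P~(x, y) denote the empirical distributions from the training set and f_i the feature functions; this is a sketch of the usual formulation rather than anything specific to the original post.

```latex
% Maximum entropy model: among all conditional distributions P(y|x)
% that satisfy the feature-expectation constraints, choose the one
% with the largest conditional entropy.
\begin{aligned}
\max_{P}\quad & H(P) = -\sum_{x,y} \tilde{P}(x)\, P(y \mid x) \log P(y \mid x) \\
\text{s.t.}\quad
  & \sum_{x,y} \tilde{P}(x)\, P(y \mid x)\, f_i(x, y)
    = \sum_{x,y} \tilde{P}(x, y)\, f_i(x, y), \quad i = 1, \dots, n \\
  & \sum_{y} P(y \mid x) = 1 \quad \text{for every } x .
\end{aligned}
```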

Reference: https://blog.csdn.net/Smile_mingm/article/details/108388124?spm=1001.2014.3001.5501

Origin: blog.csdn.net/weixin_48760912/article/details/114700596