The task of machine learning is to predict the label Y from the features X, that is, to estimate the probability P(Y|X).
supervised learning
The training data comes with the correct answers (labels); the task is to build a model from it and then classify or predict on data outside the training set.
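As a concrete illustration, here is a minimal sketch of supervised learning using a 1-nearest-neighbor classifier: it "trains" on labeled pairs and predicts the label of an unseen point. The data values are invented toy examples.

```python
def nearest_neighbor_predict(train, x):
    """Return the label of the training point closest to x (1-NN)."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # train: list of (features, label) pairs with known answers
    features, label = min(train, key=lambda pair: sq_dist(pair[0], x))
    return label

# Labeled training set: (feature_1, feature_2) -> class
train = [((1.0, 2.0), "A"), ((1.1, 1.9), "A"),
         ((5.0, 6.0), "B"), ((5.2, 5.8), "B")]

# Predict for a point outside the training set
print(nearest_neighbor_predict(train, (5.1, 6.1)))  # -> B
```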
generative model
Generative models learn the joint probability distribution P(x, y).
Common generative methods include the Gaussian mixture model, the naive Bayes method, and the hidden Markov model.
discriminative model
Discriminative models learn the conditional probability distribution P(y|x).
Common discriminative methods include the k-nearest neighbor method, the perceptron, decision trees, logistic regression, linear regression, the maximum entropy model, the support vector machine (SVM), boosting methods, and the conditional random field (CRF).
Example of a discriminative model: to determine whether an animal is a goat or a sheep, the discriminative approach learns a model from historical data and then, given the animal's extracted features, directly predicts the probability that it is a goat.
Example of a generative model: the generative approach first learns a goat model from goat features and a sheep model from sheep features; given a new animal, it extracts the animal's features, evaluates the probability under each model, and picks the class whose model assigns the higher probability.
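The goat/sheep generative approach can be sketched with a naive-Bayes-style model: fit per-feature Gaussians for each class, then compare log P(x, y) = log P(y) + log P(x|y) and take the larger. All feature values here are invented for illustration.

```python
import math

def fit_gaussians(samples):
    """Per-feature mean and std for one class model."""
    n, dims = len(samples), len(samples[0])
    means = [sum(s[d] for s in samples) / n for d in range(dims)]
    stds = [max(1e-6, math.sqrt(sum((s[d] - means[d]) ** 2 for s in samples) / n))
            for d in range(dims)]
    return means, stds

def log_joint(model, prior, x):
    """log P(x, y) = log P(y) + sum over features of log N(x_d; mu_d, sigma_d)."""
    means, stds = model
    lp = math.log(prior)
    for xd, mu, sd in zip(x, means, stds):
        lp += -math.log(sd * math.sqrt(2 * math.pi)) - (xd - mu) ** 2 / (2 * sd ** 2)
    return lp

goats = [(60.0, 30.0), (62.0, 33.0), (58.0, 29.0)]  # (weight, horn length), toy data
sheep = [(70.0, 5.0), (75.0, 6.0), (72.0, 4.0)]

goat_model, sheep_model = fit_gaussians(goats), fit_gaussians(sheep)
x = (71.0, 5.5)  # new animal: evaluate it under BOTH class models
label = "goat" if log_joint(goat_model, 0.5, x) > log_joint(sheep_model, 0.5, x) else "sheep"
print(label)  # -> sheep
```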
In short, a discriminative model directly outputs the probability of a class from the features (for example, logistic regression predicts a positive example when the probability exceeds 0.5 and a negative example otherwise), while a generative model tries every class model and takes the one with the highest probability as the final result.
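The discriminative side of this comparison can be sketched with logistic regression trained by plain gradient descent: it models P(y|x) directly and thresholds at 0.5. The toy data and hyperparameters (learning rate, epochs) are invented for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(data, lr=0.5, epochs=500):
    """Stochastic gradient descent on the log-loss for 2-D inputs."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y               # gradient of log-loss w.r.t. the logit
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

data = [((0.0, 0.2), 0), ((0.3, 0.1), 0),   # negative examples
        ((0.9, 1.0), 1), ((1.0, 0.8), 1)]   # positive examples
w, b = train_logreg(data)

# P(y=1 | x) for a new point; predict positive iff it exceeds 0.5
p = sigmoid(w[0] * 0.95 + w[1] * 0.9 + b)
print("positive" if p > 0.5 else "negative")
```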
unsupervised learning
Training and learning on unlabeled samples, mostly in order to discover structural knowledge hidden in those samples. (KMeans, DL)
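The KMeans example mentioned above can be sketched in a few lines: with no labels at all, the algorithm alternates between assigning points to the nearest center and recomputing the centers, uncovering the two groups in the data. The points are invented, and the initialization (first and last point) is a simplification of real k-means initialization schemes.

```python
def kmeans(points, k=2, iters=20):
    """Plain k-means: assign to nearest center, then recompute centers."""
    centers = [points[0], points[-1]]       # simplistic init: two far-apart points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((pi - ci) ** 2
                                      for pi, ci in zip(p, centers[c])))
            clusters[i].append(p)
        # new center = mean of assigned points (keep old center if cluster empty)
        centers = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

points = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),    # unlabeled group near (1, 1)
          (8.0, 8.0), (8.2, 7.9), (7.8, 8.1)]    # unlabeled group near (8, 8)
centers, clusters = kmeans(points)
print(sorted(len(c) for c in clusters))  # -> [3, 3]
```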