supervised learning
In supervised learning, every training example comes with a label or ground-truth value, and the loss is computed directly between the model's outputs and those labels during training. Common applications include image classification, speech recognition, and natural language processing. For example, in an image classification task, a supervised learning algorithm learns to assign images to categories by training on a large set of labeled images.
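As a minimal sketch of the idea, the toy nearest-centroid classifier below is fit on labeled 1-D points and predicts by comparing a new point against the per-class means learned from those labels; the data values and function names are illustrative, not a standard API.

```python
# Minimal sketch of supervised learning: a nearest-centroid classifier
# trained on labeled 1-D points. All values below are illustrative.

def fit_centroids(xs, ys):
    """Compute one centroid (mean) per class from labeled data."""
    centroids = {}
    for label in set(ys):
        pts = [x for x, y in zip(xs, ys) if y == label]
        centroids[label] = sum(pts) / len(pts)
    return centroids

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

# Toy labeled training set: class 0 clusters near 1.0, class 1 near 9.0.
xs = [0.5, 1.0, 1.5, 8.5, 9.0, 9.5]
ys = [0, 0, 0, 1, 1, 1]
model = fit_centroids(xs, ys)
print(predict(model, 2.0))  # a point near the class-0 cluster
```

The key supervised ingredient is that `ys` is given: the model's quality can be measured directly against these labels.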
unsupervised learning
In unsupervised learning, the data carries no labels, and the algorithm learns patterns from the data itself. The main applications of unsupervised learning include clustering, dimensionality reduction, latent variable modeling, and more. For example, in a clustering task, an unsupervised learning algorithm groups unlabeled data points according to their similarity.
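To make the clustering example concrete, the sketch below implements a tiny 1-D k-means: it never sees labels and recovers group structure purely from the data. The initialization strategy and data values are illustrative assumptions.

```python
# Minimal sketch of unsupervised learning: k-means clustering of
# unlabeled 1-D data. No labels are used anywhere.

def kmeans_1d(xs, k, iters=20):
    """Cluster 1-D points into k groups by iteratively refining means."""
    centers = sorted(xs)[:k]  # crude initialization: first k sorted points
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        groups = [[] for _ in range(k)]
        for x in xs:
            nearest = min(range(k), key=lambda i: abs(x - centers[i]))
            groups[nearest].append(x)
        # Update step: move each center to the mean of its group.
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

data = [0.9, 1.1, 1.0, 7.9, 8.1, 8.0]  # two obvious clusters
print(kmeans_1d(data, k=2))
```

The two recovered centers land near 1.0 and 8.0, the "similarities between data" the paragraph describes, found without any labels.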
semi-supervised learning
In semi-supervised learning, a small amount of labeled data is combined with a large amount of unlabeled data during training to improve the model's performance. The main applications of semi-supervised learning include anomaly detection, generative modeling, and more. For example, in an anomaly detection task, a semi-supervised learning algorithm can use a small set of labeled normal examples together with a large pool of unlabeled data, which may contain anomalies, to learn a model that identifies abnormal inputs.
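One common semi-supervised recipe (not the only one) is self-training, sketched below under illustrative assumptions: a nearest-mean classifier is fit on two labeled points, pseudo-labels the unlabeled pool, and is then refit on labeled plus pseudo-labeled data.

```python
# Minimal sketch of semi-supervised learning via self-training.
# A few labeled points seed a nearest-mean classifier, which then
# pseudo-labels the unlabeled pool and is refit on everything.
# All data values are illustrative.

def class_means(xs, ys):
    """Mean of the points belonging to each class."""
    return {c: sum(x for x, y in zip(xs, ys) if y == c) /
               sum(1 for y in ys if y == c) for c in set(ys)}

def nearest_mean(means, x):
    """Classify x by the closest class mean."""
    return min(means, key=lambda label: abs(x - means[label]))

# One labeled example per class, plus a larger unlabeled pool.
labeled_x, labeled_y = [1.0, 9.0], [0, 1]
unlabeled_x = [0.8, 1.2, 1.1, 8.8, 9.2, 9.1]

means = class_means(labeled_x, labeled_y)                 # fit on labeled data
pseudo_y = [nearest_mean(means, x) for x in unlabeled_x]  # pseudo-label pool
means = class_means(labeled_x + unlabeled_x,              # refit on all data
                    labeled_y + pseudo_y)
print(means)
```

The refit means are estimated from eight points instead of two, which is the payoff of folding in the unlabeled data.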
self-supervised learning
In self-supervised learning, the supervision signal comes from the structure of the data itself rather than from manually labeled data. Major applications of self-supervised learning include language modeling, video prediction, and more. For example, in a language modeling task, a self-supervised learning algorithm learns a model that can generate text by predicting each token from its surrounding context.
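The language-model case can be sketched with a toy bigram model: the "labels" are simply the next words in the raw text, so no manual annotation is needed. The tiny corpus below is an illustrative assumption.

```python
# Minimal sketch of self-supervised learning for language modeling:
# a bigram model whose targets (next words) come from the raw text
# itself, with no manual annotation.
from collections import Counter, defaultdict

def fit_bigrams(tokens):
    """Count, for each word, which word follows it in the text."""
    follows = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        follows[prev][nxt] += 1  # the next token IS the supervision signal
    return follows

def predict_next(follows, word):
    """Predict the most frequent continuation of `word`."""
    return follows[word].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran".split()
model = fit_bigrams(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" most often here
```

Modern self-supervised language models replace the bigram counts with neural networks, but the training signal is constructed the same way: from the text itself.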