[Pattern recognition] Experiment report - Bayesian classifier, Fisher linear discriminant, K-nearest neighbor, LeNet, and PCA transformation experiments

This resource integrates five experiment reports from the pattern recognition course: a Bayesian classifier, the Fisher linear discriminant method, the K-nearest neighbor algorithm, face recognition on the ORL face dataset using the PCA transformation method, and recognition of MNIST handwritten digits using the LeNet neural network. The download method is at the end of the article, and a brief introduction to each experiment follows:

1. Bayesian method for gender classification

This experiment uses the Bayesian method for gender classification. The prior probabilities (that is, the ratio of males to females) must be given, and the provided FEMALE.TXT and MALE.TXT are used as the training sample set. Depending on whether the question requires a single feature, two uncorrelated features, or two correlated features, the corresponding method is used to estimate the probability density function, and the Bayes classifier is then built according to the minimum error rate or the minimum risk criterion.

When a single feature is used, the probability density function can be obtained simply with the maximum likelihood method and the corresponding sklearn library functions. When multiple features are used, the correlation coefficient must be specified, the multivariate normal distribution function must be reproduced in code, and the probability density function is then obtained from it.
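As a minimal sketch of the single-feature case, assuming each line of MALE.TXT and FEMALE.TXT holds one feature value such as height (the file layout and the equal priors are assumptions), the class-conditional Gaussians can be fitted by maximum likelihood and combined into a minimum-error-rate decision:

```python
import numpy as np
from scipy.stats import norm

# Assumed layout: one feature value (e.g. height) per line in each file
male = np.loadtxt("MALE.TXT")
female = np.loadtxt("FEMALE.TXT")

# Maximum likelihood estimates of the class-conditional 1-D Gaussians
mu_m, sigma_m = male.mean(), male.std()
mu_f, sigma_f = female.mean(), female.std()

# Priors; an equal male/female ratio is assumed here
p_male, p_female = 0.5, 0.5

def classify(x):
    """Minimum-error-rate decision: choose the class with the larger posterior."""
    post_m = norm.pdf(x, mu_m, sigma_m) * p_male
    post_f = norm.pdf(x, mu_f, sigma_f) * p_female
    return "male" if post_m > post_f else "female"

print(classify(170.0))
```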

When the minimum-risk criterion is used, it is only necessary to weight the posteriors obtained under the minimum-error-rate rule by the corresponding loss values and choose the decision with the smaller conditional risk.
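Continuing the sketch above (reusing `norm`, the fitted parameters, and the priors), a minimal minimum-risk decision with a hypothetical loss table looks like this; the loss values are placeholders:

```python
# Hypothetical loss table: loss[decision][true class]; the numbers are placeholders
loss = {"male":   {"male": 0.0, "female": 2.0},
        "female": {"male": 1.0, "female": 0.0}}

def classify_min_risk(x):
    """Choose the decision whose conditional risk (loss-weighted posterior) is smallest."""
    post = {"male":   norm.pdf(x, mu_m, sigma_m) * p_male,
            "female": norm.pdf(x, mu_f, sigma_f) * p_female}
    risk = {d: sum(loss[d][c] * post[c] for c in post) for d in loss}
    return min(risk, key=risk.get)

print(classify_min_risk(170.0))
```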

2. Use the linear discriminant method to determine gender

This experiment uses the linear discriminant method for gender classification. The first question is to construct a classifier with the Fisher linear discriminant method, which requires the class mean vectors and the within-class scatter matrices. From the within-class scatter matrix and the optimal projection direction given by Fisher's criterion, the optimal Fisher decision threshold can be computed by combining the corresponding formulas.
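A minimal sketch of this computation, assuming the two classes are given as matrices `X_m` and `X_f` with one feature vector per row (the data loading is omitted):

```python
import numpy as np

def fisher_fit(X_m, X_f):
    """Fisher linear discriminant: w = Sw^{-1} (m1 - m2), plus a threshold w0."""
    m1, m2 = X_m.mean(axis=0), X_f.mean(axis=0)
    # Within-class scatter matrix Sw = S1 + S2
    S1 = (X_m - m1).T @ (X_m - m1)
    S2 = (X_f - m2).T @ (X_f - m2)
    w = np.linalg.solve(S1 + S2, m1 - m2)   # optimal projection direction
    w0 = -0.5 * (w @ m1 + w @ m2)           # threshold at the midpoint, assuming equal priors
    return w, w0

def fisher_classify(x, w, w0):
    # Project x onto w; the sign of the discriminant decides the class
    return "male" if w @ x + w0 > 0 else "female"
```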

3. K-nearest neighbor method for gender classification

This experiment uses the nearest neighbor method for gender classification. The K-nearest neighbor (KNN) algorithm is a basic classification and regression method and one of the simplest techniques in data mining. The idea is to first choose a value K, find the K training points closest to the test sample, and let them vote; the category with the most votes is the category of the test sample. That is the classification problem. The regression problem works the same way: take the mean value of the K points closest to the test sample, and that mean is the predicted value of the sample point.
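A minimal sketch using sklearn's KNeighborsClassifier; the toy (height, weight) samples below are placeholders for the gender training data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Placeholder training data: one (height, weight) row per sample
X_train = np.array([[160.0, 50.0], [170.0, 60.0], [175.0, 70.0], [180.0, 75.0]])
y_train = np.array(["female", "female", "male", "male"])

knn = KNeighborsClassifier(n_neighbors=3)   # the K = 3 nearest neighbors vote
knn.fit(X_train, y_train)
print(knn.predict([[172.0, 65.0]]))         # category with the most votes wins
```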

4. Use the method of PCA transformation to perform face recognition on the ORL face dataset

This experiment uses the method of PCA transformation to perform face recognition on the ORL face dataset. The method is as follows (a code sketch follows the steps):

① Use 10-fold cross-validation on the 10 pictures of each of the 40 volunteers in the ORL face database: each person's pictures are held out one at a time as validation data, 10 rounds of validation are run, and the mean accuracy is reported at the end. In each round, the training set contains 360 images and the validation set contains 40 images. Each image is flattened into a one-dimensional vector X_i.

② Compute the mean vector u formed by the mean of each row of X, and subtract it from X to zero-mean the data, obtaining C = (X_1 - u, X_2 - u, ..., X_360 - u).

③ Construct the covariance matrix CC^T, solve for its eigenvalues, select the K largest ones, find the corresponding K eigenvectors, and arrange them into a transformation matrix P of dimension 10304 × K.

④ Compute the projection of each training image onto these eigenvectors, that is, Y_i = P^T (X_i - u), to form the search set, and apply the same projection to the image to be recognized to obtain Z.

⑤ Traverse the search set for the Y_i satisfying min ||Y_i - Z||; the image to be recognized belongs to the same category as the image corresponding to that Y_i, which identifies the person in the image to be recognized.
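A minimal sketch of steps ②–⑤, assuming the flattened training images are stacked as the columns of a 10304 × 360 matrix X with labels y, and z is the flattened image to be recognized (data loading and the cross-validation loop are omitted; the eigenproblem is solved on the small 360 × 360 matrix C^T C, a standard shortcut for obtaining the CC^T eigenvectors of step ③):

```python
import numpy as np

def pca_recognize(X, y, z, K=50):
    """X: 10304 x 360 training matrix (one image per column), y: labels, z: query image."""
    u = X.mean(axis=1, keepdims=True)             # per-pixel mean vector
    C = X - u                                     # zero-meaned training data
    vals, vecs = np.linalg.eigh(C.T @ C)          # eigenvectors of the small 360 x 360 matrix
    order = np.argsort(vals)[::-1][:K]            # indices of the K largest eigenvalues
    P = C @ vecs[:, order]                        # 10304 x K transformation matrix (eigenfaces)
    P /= np.linalg.norm(P, axis=0)                # normalize each eigenface to unit length
    Y = P.T @ C                                   # projections of training images: the search set
    Z = P.T @ (z.reshape(-1, 1) - u)              # projection of the image to be recognized
    i = np.argmin(np.linalg.norm(Y - Z, axis=0))  # nearest neighbor: min ||Y_i - Z||
    return y[i]
```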

5. Recognition of MNIST handwritten digits using the LeNet neural network

  • Dataset overview

The MNIST dataset contains 70,000 handwritten digit images in total, split into a training set and a test set at a ratio of 6:1 (60,000 training images and 10,000 test images). Each image is 28×28 with a single channel, showing white digits on a black background; in the tensor, the black background is represented by 0 and the white strokes by floating-point numbers between 0 and 1.
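A minimal loading sketch using torchvision (the local data directory and batch size are assumptions):

```python
import torch
from torchvision import datasets, transforms

# ToTensor scales pixel values into [0, 1]: black background -> 0, white strokes -> floats up to 1
transform = transforms.ToTensor()

train_set = datasets.MNIST(root="./data", train=True, download=True, transform=transform)
test_set = datasets.MNIST(root="./data", train=False, download=True, transform=transform)

train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=64, shuffle=False)

images, labels = next(iter(train_loader))
print(images.shape)   # torch.Size([64, 1, 28, 28]): single channel, 28x28
```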

  • Detailed explanation of the LeNet and LeNet-5 network structure

LeNet-5 is a relatively simple convolutional neural network. Its structure is as follows: the input is a single-channel two-dimensional image, which first passes through two stages of convolution followed by pooling, then through fully connected layers, and finally reaches the output layer.
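A minimal PyTorch sketch of the LeNet-5 layout described above; padding the first convolution so the 28×28 MNIST input fits, and using ReLU instead of the original sigmoid/tanh activations, are assumptions of this sketch:

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2),   # C1: 1x28x28 -> 6x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # S2: 6x14x14
            nn.Conv2d(6, 16, kernel_size=5),             # C3: 16x10x10
            nn.ReLU(),
            nn.MaxPool2d(2),                             # S4: 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),                  # C5
            nn.ReLU(),
            nn.Linear(120, 84),                          # F6
            nn.ReLU(),
            nn.Linear(84, num_classes),                  # output layer
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = LeNet5()
print(model(torch.zeros(1, 1, 28, 28)).shape)   # torch.Size([1, 10])
```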

  • Common Problems and Techniques for Image Classification

For a single-label image classification problem (containing only one category plus the background), the evaluation indicators mainly include Accuracy, Precision, Recall, and F-score. Let TP denote positives predicted as positive, FP negatives predicted as positive, TN negatives predicted as negative, and FN positives predicted as negative. Then Accuracy = (TP + TN) / NUM, Precision = TP / (TP + FP), Recall = TP / (TP + FN), and F1 = 2 · Precision · Recall / (Precision + Recall).
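A minimal sketch computing these indicators from predicted and true labels; the toy label arrays are placeholders:

```python
import numpy as np

# Placeholder labels: 1 = positive class, 0 = negative class / background
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

TP = np.sum((y_pred == 1) & (y_true == 1))   # positives predicted as positive
FP = np.sum((y_pred == 1) & (y_true == 0))   # negatives predicted as positive
TN = np.sum((y_pred == 0) & (y_true == 0))   # negatives predicted as negative
FN = np.sum((y_pred == 0) & (y_true == 1))   # positives predicted as negative

accuracy = (TP + TN) / len(y_true)
precision = TP / (TP + FP)
recall = TP / (TP + FN)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)
```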

For image classification problems with more than one category, the evaluation indicators mainly include Accuracy and Class-wise Accuracy. Accuracy is the percentage of correctly predicted images over all categories out of the total number of images; Class-wise Accuracy computes the accuracy for each category separately and then averages the accuracies of all categories.
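A minimal sketch of the two multi-class indicators; the label arrays are placeholders:

```python
import numpy as np

# Placeholder multi-class labels (e.g. predicted MNIST digits)
y_true = np.array([0, 1, 2, 2, 1, 0, 2, 1])
y_pred = np.array([0, 1, 2, 1, 1, 0, 2, 2])

accuracy = np.mean(y_pred == y_true)   # correctly predicted images / total images

# Class-wise accuracy: accuracy within each category, then the mean over categories
classes = np.unique(y_true)
per_class = [np.mean(y_pred[y_true == c] == c) for c in classes]
class_wise_accuracy = np.mean(per_class)
print(accuracy, class_wise_accuracy)
```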

Download method

https://download.csdn.net/download/m0_55080712/87167411

It doesn't matter if you don't have points: after you like this article, send a screenshot to [email protected] to request it, and a reply will be given within 12 hours.


Origin blog.csdn.net/m0_55080712/article/details/128054723