MATLAB simulation of an animal recognition system based on a deep learning network, with a GUI interface

Table of contents

1. Algorithm simulation results

2. Overview of the theoretical knowledge involved in the algorithm

3. MATLAB core program

4. Complete algorithm code file


1. Algorithm simulation results

The MATLAB R2022a simulation results are as follows:

2. Overview of the theoretical knowledge involved in the algorithm

An animal recognition system based on a deep learning network uses deep learning techniques to identify and locate animals. The system analyzes images or videos with deep neural networks, identifies the animals they contain, and determines their locations.

At the heart of this system is a deep learning network, specifically a convolutional neural network (CNN). A CNN is a neural network particularly well suited to image data: it extracts and recognizes image features through a series of convolutional layers, pooling layers, and fully connected layers. For an animal recognition system, the CNN must be trained to recognize various animal features, including shape, color, and texture.

The training process usually requires a large amount of image data. First, images of various animals must be collected, covering different angles, lighting conditions, and backgrounds. After preprocessing, these images serve as the training set for the CNN. Training iterates repeatedly over the input images and their corresponding labels, continually adjusting the CNN's weights so that it achieves the best possible performance on the given task (animal recognition).
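A minimal sketch of this data-preparation step in MATLAB might look as follows; the folder name and split ratio are assumptions for illustration, and the same pattern appears in the core program in section 3:

% Assumed layout: one subfolder per animal class, e.g. dataset/cat, dataset/dog, ...
imds = imageDatastore('dataset', ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

% Hold out 20% of each class for validation (assumed split ratio)
[imdsTrain, imdsVal] = splitEachLabel(imds, 0.8, 'randomized');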

The trained CNN model can recognize the animal classes present in the training set and mark their positions in an image. This step involves image segmentation and object detection techniques. Typically, the CNN outputs a bounding box giving the animal's location together with a classification label.

The trained model can be integrated into various applications, such as camera surveillance systems, image editing software, games, and security systems. Users can upload pictures or videos, or use a live camera, to obtain animal recognition and localization results. The system can also provide visualizations, such as marking the identified animals on the original image or generating a table of animal information.

In general, an animal recognition system based on a deep learning network is a powerful tool that can help people better understand and protect animals, and it opens up new possibilities for scientific research, security, entertainment, and other fields.

A CNN model usually includes the following main parts (a minimal sketch follows the list):

(1) Input layer: receives the input image data.

(2) Convolutional layer: extracts image features through a series of convolution operations.

(3) Pooling layer: down-samples the features to reduce computation and help avoid overfitting.

(4) Fully connected layer: uses the extracted features for the final classification and localization tasks.
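As a minimal sketch, the four parts above could be expressed with MATLAB's Deep Learning Toolbox roughly as follows; the layer sizes and the number of classes are illustrative assumptions, not the values used in this project:

numClasses = 10;   % assumed number of animal classes

layers = [
    imageInputLayer([224 224 3])                    % (1) input layer: 224x224 RGB images
    convolution2dLayer(3, 16, 'Padding', 'same')    % (2) convolutional layer: feature extraction
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)               % (3) pooling layer: downsampling
    fullyConnectedLayer(numClasses)                 % (4) fully connected layer: classification
    softmaxLayer
    classificationLayer];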

  1. Loss function and optimizer
    When training a CNN model, a loss function must be defined to measure how far the model's predictions are from the target labels. Commonly used loss functions include cross-entropy loss (for classification tasks) and mean squared error loss (for regression tasks). The optimizer updates the model's weights to minimize the loss function; common optimizers include stochastic gradient descent (SGD) and Adam. (A combined configuration sketch for items 1-3 appears after this list.)

  2. Data augmentation and preprocessing
    To improve model performance, the training data is usually augmented and preprocessed. Data augmentation increases the effective amount of data through operations such as rotation, scaling, and cropping. Preprocessing includes operations such as normalization and denoising so that the data matches the model's input requirements.

  3. Model optimization techniques
    To further improve model performance, optimization techniques such as batch normalization, dropout (to prevent overfitting), and early stopping can be used.

  4. Object detection algorithms
    For the animal localization task, object detection algorithms such as YOLO or Faster R-CNN may be needed. These algorithms detect the locations and categories of objects in images, providing input for the animal recognition system. (A detection sketch follows this list.)
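To make items 1-3 concrete, the following sketch shows how an optimizer, data augmentation, and validation-based early stopping are typically configured in MATLAB with trainingOptions and imageDataAugmenter. It continues the imdsTrain / imdsVal / layers variables from the sketches above; the learning rate, epoch count, batch size, and augmentation ranges are illustrative assumptions, not the settings used in this project:

% Resize and augment the training images (assumed augmentation ranges)
augmenter = imageDataAugmenter( ...
    'RandRotation',    [-20 20], ...
    'RandScale',       [0.9 1.1], ...
    'RandXReflection', true);

augTrain = augmentedImageDatastore([224 224], imdsTrain, 'DataAugmentation', augmenter);
augVal   = augmentedImageDatastore([224 224], imdsVal);

% Optimizer, learning rate and early stopping; the cross-entropy loss is
% implied by the classificationLayer at the end of the layer stack
options = trainingOptions('adam', ...               % 'sgdm' would select SGD with momentum
    'InitialLearnRate',   1e-4, ...
    'MaxEpochs',          20, ...
    'MiniBatchSize',      32, ...
    'ValidationData',     augVal, ...
    'ValidationPatience', 5, ...                     % stop early when validation loss stops improving
    'Shuffle',            'every-epoch', ...
    'Plots',              'training-progress');

% net = trainNetwork(augTrain, layers, options);

Batch normalization and dropout from item 3 are added as layers in the network itself, e.g. batchNormalizationLayer and dropoutLayer(0.5), rather than as training options.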
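For item 4, a hedged sketch of running a pretrained object detector in MATLAB (Computer Vision Toolbox) is shown below; it assumes the YOLO v4 support package is installed and is independent of the GUI code in the next section:

% Assumes the "Computer Vision Toolbox Model for YOLO v4 Object Detection"
% support package is installed; the detector is pretrained on COCO, which
% contains several animal categories (dog, cat, horse, bird, ...)
detector = yolov4ObjectDetector("csp-darknet53-coco");

I = imread("peppers.png");                 % replace with an animal image
[bboxes, scores, labels] = detect(detector, I);

% Overlay the detected bounding boxes and class labels on the image
annotated = insertObjectAnnotation(I, "rectangle", bboxes, string(labels));
figure; imshow(annotated);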

3. MATLAB core program

% --- Executes just before tops is made visible.
function tops_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to tops (see VARARGIN)

% Choose default command line output for tops
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes tops wait for user response (see UIRESUME)
% uiwait(handles.figure1);


% --- Outputs from this function are returned to the command line.
function varargout = tops_OutputFcn(hObject, eventdata, handles) 
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;


% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global im;
global Predicted_Label;
cla (handles.axes1,'reset')
 
axes(handles.axes1);
set(handles.edit2,'string',num2str(0));
load gnet.mat   % load the trained network (expects a variable named 'net')

[filename,pathname]=uigetfile({'*.bmp;*.jpg;*.png;*.jpeg;*.tif'},'Select an image','F:\test');
str=[pathname filename];
% Check that a file was actually selected; this check is optional -
% the image could also be read in directly.
% im = imread(str);
% imshow(im)
if isequal(filename,0)||isequal(pathname,0)
    warndlg('please select a picture first!','warning');
    return;
else
    im = imread(str);
    imshow(im);
end
% Resize each color channel to the 224x224 input size expected by the network
II(:,:,1) = imresize(im(:,:,1),[224,224]);
II(:,:,2) = imresize(im(:,:,2),[224,224]);
II(:,:,3) = imresize(im(:,:,3),[224,224]);
% Classify the resized image with the trained network
[Predicted_Label, Probability] = classify(net, II);

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% global im;
%  
% 
% 
% [Predicted_Label, Probability] = classify(net, II);
% imshow(im);
%  
 
global im;
global Predicted_Label;
set(handles.edit2,'string',char(Predicted_Label));   % display the predicted class label
 

% --- Executes on button press in pushbutton3.
 


% --- Executes on button press in pushbutton5.
function pushbutton5_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
clc;
clear;
close all;


function edit1_Callback(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit1 as text
%        str2double(get(hObject,'String')) returns contents of edit1 as a double


% --- Executes during object creation, after setting all properties.
function edit1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end



function edit2_Callback(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit2 as text
%        str2double(get(hObject,'String')) returns contents of edit2 as a double


% --- Executes during object creation, after setting all properties.
function edit2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end


 

 

function edit5_Callback(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit5 as text
%        str2double(get(hObject,'String')) returns contents of edit5 as a double


% --- Executes during object creation, after setting all properties.
function edit5_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end



function edit6_Callback(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit6 as text
%        str2double(get(hObject,'String')) returns contents of edit6 as a double


% --- Executes during object creation, after setting all properties.
function edit6_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end


% --- Executes on button press in pushbutton6.
function pushbutton6_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)


Name1   = get(handles.edit7, 'String');             % dataset folder path
NEpochs = str2num(get(handles.edit8, 'String'));    % number of training epochs
NMB     = str2num(get(handles.edit9, 'String'));    % mini-batch size
LR      = str2num(get(handles.edit10, 'String'));   % initial learning rate
Rate    = str2num(get(handles.edit11, 'String'));   % fraction of data used for training


% Load the image dataset with imageDatastore
Dataset = imageDatastore(Name1, 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
% Split the dataset into training, validation and test sets
[Training_Dataset, Validation_Dataset, Testing_Dataset] = splitEachLabel(Dataset, Rate, (1-Rate)/2, (1-Rate)/2);
% Load the pretrained GoogLeNet network
load googlenet.mat
 
 
% Get the size of the network's input layer
Input_Layer_Size = net.Layers(1).InputSize(1:2);

% Resize the image datastores to the input size expected by the pretrained network
Resized_Training_Dataset   = augmentedImageDatastore(Input_Layer_Size ,Training_Dataset);
Resized_Validation_Dataset = augmentedImageDatastore(Input_Layer_Size ,Validation_Dataset);
Resized_Testing_Dataset    = augmentedImageDatastore(Input_Layer_Size ,Testing_Dataset);
...............................................................................

4. Complete algorithm code file



Origin: blog.csdn.net/hlayumi1234567/article/details/132652445