MATLAB simulation of a personnel smoking behavior detection system based on the Faster-RCNN network, with a GUI operation interface

Table of contents

1. Algorithm simulation effect

2. Overview of the theoretical knowledge involved in the algorithm

2.1. Faster-RCNN network introduction

2.2. Working principle of Faster-RCNN

2.3. Specific steps of Faster-RCNN

2.4. Application to the detection of personnel smoking behavior

3. MATLAB core program

4. Complete algorithm code file


1. Algorithm simulation effect

The MATLAB 2022a simulation results are as follows:

2. Overview of the theoretical knowledge involved in the algorithm

         Personnel smoking behavior detection systems are widely used in public places such as schools, hospitals, and public transportation. Such systems usually use image or video analysis to detect whether a person is smoking. Among them, smoking behavior detection based on Faster-RCNN network is a commonly used method. The principles, mathematical formulas, and details of such a system are described below.

2.1. Faster-RCNN network introduction

         Faster-RCNN is a popular deep learning target detection algorithm that achieves efficient and accurate target detection by using Region Proposal Network (RPN). Compared with other target detection algorithms, such as R-CNN and SPP-Net, Faster-RCNN has higher efficiency and accuracy.

2.2. Working principle of Faster-RCNN

Faster-RCNN consists of two main parts: RPN and RCNN.

  1. RPN: This network slides a small window over the convolutional feature map and predicts regions (called "proposals") where objects are likely to exist. At each sliding position it places a set of reference boxes called anchors; a classification branch scores each anchor as object or background, while a regression branch refines the anchor coordinates. For each likely region, RPN outputs a set of coordinates that locate the proposal on the original image.
  2. RCNN: This network receives proposals generated by RPN and performs feature extraction on each proposal using a convolutional neural network (CNN). These features are then fed into a fully connected layer to generate a classification (i.e., object or background) and bounding box (i.e., the location of the object in the image) for each proposal.
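In MATLAB's Computer Vision Toolbox, this two-stage pipeline is wrapped in a single detector object. A minimal sketch of running such a trained detector on one image follows; the MAT-file name `net015.mat` matches the program below, but the stored variable name `detector` and the test image name are assumptions that depend on how the network was saved:

```matlab
% Load a trained Faster R-CNN detector (assumes the MAT file stores it
% in a variable named "detector"; the actual variable name may differ)
S = load('net015.mat');
detector = S.detector;

% Read a test image and run the two-stage detection (RPN + RCNN head)
I = imread('test.jpg');
[bboxes, scores, labels] = detect(detector, I, 'Threshold', 0.5);

% Overlay each detected region with its confidence score
if ~isempty(bboxes)
    I = insertObjectAnnotation(I, 'rectangle', bboxes, scores);
end
imshow(I);
```

The `detect` function internally performs both stages: the RPN generates proposals, and the RCNN head classifies them and refines their bounding boxes.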

2.3. Specific steps of Faster-RCNN

  1. For each sliding position on the feature map, RPN classifies a set of anchor boxes to determine whether an object is likely to exist. This involves computing an objectness score for each anchor (object vs. background, normalized with a softmax) and regressing offsets that refine the anchor into a tighter proposal box.
  2. RCNN receives proposals generated by RPN and performs feature extraction on them using a convolutional neural network. This usually involves a series of convolutional layers, ReLU activation functions, and pooling layers to extract useful features from the image.
  3. These features are fed into fully connected layers that generate a classification and a bounding box for each proposal. The classification scores are typically normalized with a softmax function to produce the probability that each proposal belongs to the target class or the background; at the same time, the fully connected layers output bounding-box coordinates indicating the target's location in the image.
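As a toy numeric illustration of the softmax normalization in step 3 (the raw scores below are invented for the example):

```matlab
% Hypothetical raw network outputs for one proposal: [background, smoking]
scores = [1.2, 3.4];

% Softmax normalization: exponentiate and divide by the sum,
% so the outputs form a probability distribution over the classes
p = exp(scores) ./ sum(exp(scores));

% The predicted class is the one with the highest probability;
% p(2) is the probability that the proposal contains smoking behavior
[~, cls] = max(p);
```

The same normalization is applied to every proposal, and proposals whose background probability dominates are discarded.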

2.4. Application to the detection of personnel smoking behavior

        In the detection of people's smoking behavior, Faster-RCNN can be used to detect whether people in the video have smoking behavior. First, the algorithm generates object region proposals on video frames through the RPN network. Then, the RCNN network takes these proposals and performs feature extraction on them. Finally, a fully-connected layer generates a classification and bounding box for each proposal based on these features, thereby determining the presence or absence of smoking behavior.

       In addition, other technologies such as behavior recognition algorithms can be combined to improve detection accuracy. For example, by analyzing characteristics such as body movements and facial expressions of people in pictures, it can be determined whether they are smoking.

        In summary, the personnel smoking behavior detection system based on the Faster-RCNN network achieves efficient and accurate target detection through deep learning algorithms, and can be combined with other technologies to improve detection accuracy. The application of this method in public places will help improve the effectiveness of smoking control and maintain public health and safety.
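The per-frame video processing described above can be sketched as follows. This is a minimal sketch, not the program's actual video pipeline: the video file name, the confidence threshold, and the variable stored in `net015.mat` are illustrative assumptions.

```matlab
% Per-frame smoking detection on a video stream
S = load('net015.mat');          % assumes the detector is stored as S.detector
detector = S.detector;
v = VideoReader('surveillance.mp4');

while hasFrame(v)
    frame = readFrame(v);
    % RPN generates proposals; the RCNN head classifies and refines them
    [bboxes, scores, ~] = detect(detector, frame, 'Threshold', 0.6);
    if ~isempty(bboxes)
        % Mark detected smoking regions and report the timestamp
        frame = insertShape(frame, 'Rectangle', bboxes, 'LineWidth', 3);
        fprintf('Smoking detected at t = %.2f s\n', v.CurrentTime);
    end
    imshow(frame); drawnow;
end
```

In a deployed system, the per-frame detections would typically be smoothed over time (e.g., requiring detections in several consecutive frames) before raising an alarm, to reduce false positives.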

3. MATLAB core program

function varargout = tops(varargin)
% TOPS MATLAB code for tops.fig
%      TOPS, by itself, creates a new TOPS or raises the existing
%      singleton*.
%
%      H = TOPS returns the handle to a new TOPS or the handle to
%      the existing singleton*.
%
%      TOPS('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in TOPS.M with the given input arguments.
%
%      TOPS('Property','Value',...) creates a new TOPS or raises the
%      existing singleton*.  Starting from the left, property value pairs are
%      applied to the GUI before tops_OpeningFcn gets called.  An
%      unrecognized property name or invalid value makes property application
%      stop.  All inputs are passed to tops_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu.  Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help tops

% Last Modified by GUIDE v2.5 29-Aug-2023 13:49:55


% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @tops_OpeningFcn, ...
                   'gui_OutputFcn',  @tops_OutputFcn, ...
                   'gui_LayoutFcn',  [] , ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT


% --- Executes just before tops is made visible.
function tops_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to tops (see VARARGIN)

% Choose default command line output for tops
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes tops wait for user response (see UIRESUME)
% uiwait(handles.figure1);


% --- Outputs from this function are returned to the command line.
function varargout = tops_OutputFcn(hObject, eventdata, handles) 
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;


% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global im;
cla (handles.axes1,'reset')
cla (handles.axes2,'reset')

set(handles.edit2,'string',num2str(0));
set(handles.edit5,'string',num2str(0));

load net015.mat
axes(handles.axes1);
[filename,pathname]=uigetfile({'*.bmp;*.jpg;*.png;*.jpeg;*.tif'},'Select an image','F:\test');
str=[pathname filename];
% Check whether a file was actually selected; this check is optional -
% the image could also be read in directly:
% im = imread(str);
% imshow(im)
if isequal(filename,0)||isequal(pathname,0)
    warndlg('please select a picture first!','warning');
    return;
else
    im = imread(str);
    imshow(im);
end


..............................................................................

 

function edit5_Callback(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit5 as text
%        str2double(get(hObject,'String')) returns contents of edit5 as a double


% --- Executes during object creation, after setting all properties.
function edit5_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end



function edit6_Callback(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit6 as text
%        str2double(get(hObject,'String')) returns contents of edit6 as a double


% --- Executes during object creation, after setting all properties.
function edit6_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

4. Complete algorithm code file

Origin blog.csdn.net/hlayumi1234567/article/details/132787234