WeChat Jump Jump: MATLAB can play too~

     Presumably everyone has played WeChat's Jump Jump at some point. Since other languages can play the game automatically, curiosity demanded that MATLAB do it too, and of course it can do it well! Labeling, training, testing, and the final automatic (or manual) play are combined in one workflow, without a pile of code or the hassle of rooting the phone. To get a stable screenshot for each jump, I insert a 3-second pause; the recognition itself is still very fast (without this pause, the result would arrive before the screen has finished updating). There is essentially no for loop; wherever possible I rely on functions that ship with MATLAB.


Software requirements:

1. MATLAB R2017a or later.

2. Download and install the TeamViewer remote-control software on the computer.

3. Install the TeamViewer QuickSupport app (plus the Add-On plug-in for your phone model) on the Android phone, so the computer can control the phone in real time.


Ideas:

Identify the thrown piece (the "dice") and the target block on the game screen, compute the pixel distance between them, convert that distance into a press duration, and let TeamViewer relay the press to the phone in real time to make the jump. The ROI used for recognition excludes the top and bottom fifths of the phone screen; its width is the full screen width. There are many ways to detect the target blocks and the dice; the methods used below include ACF, cascade, template matching, and cues such as color and edges. If one detector fails, another detector takes over; if all of them fail, I fall back to manually clicking the two center points to obtain the distance. The distance is then multiplied by a coefficient to get the press time, and finally a Windows API call makes the mouse press on the (remote) screen for that duration.
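As a hedged sketch of this core idea (rate_time and the mouseclick mex helper come from the main script further down; the distance and cursor coordinates here are illustrative):

```matlab
% Core idea in a few lines: distance (pixels) -> press time (ms) -> simulated press.
% rate_time matches the main script below; mouseclick is the author's mex helper.
rate_time = 800/195;             % roughly 800 ms of pressing per 195 pixels
currentD  = 150;                 % example distance between dice and target block
pressMs   = rate_time*currentD;  % press duration in milliseconds

set(groot,'PointerLocation',[300,180]); % illustrative point inside the phone window
mouseclick(3, pressMs);                 % 3 = press-and-hold for pressMs milliseconds
```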


Steps:

1. Image annotation

(A download link is given in the appendix; readers can skip this step.)

First, I took screenshots of the game on my phone and saved them to the computer; I collected about 245 images. Import them into MATLAB's Image Labeler app (imageLabeler), label the two classes of ROIs as shown in Figure 1, and export the result to the workspace after labeling, as shown in Figure 2. Save the labeling result as weChartlabel.mat.


Figure 1 Image annotation


Figure 2 Annotation results
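A hedged sketch of inspecting and persisting the exported labels (the variable name mylabel matches the training code below; the table layout is an assumption consistent with that code):

```matlab
% Inspect and persist the labels exported from Image Labeler.
% mylabel is assumed to be a table: one row per image, first column the
% image file name, remaining columns the [x y w h] boxes per ROI class.
load weChartlabel.mat               % loads mylabel, saved after labeling
summary(mylabel)                    % check row count (~245) and class columns
save('weChartlabel.mat','mylabel')  % how the .mat file was written
```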

2. Image training

Train detectors for two object classes: the thrown dice and the target block (targetObject). For the dice I train a cascade detector, with ACF and template matching as alternatives, because the dice's size and color are fairly stable from throw to throw. For the target blocks I use ACF plus edge-based features, because the block you must jump to next is always near the top of the screen, and its upper vertex can almost always be found by edge detection. ACF training takes only a few lines of code, shown below, followed by the cascade training.

ACF training for the dice and target blocks:

    load weChartlabel.mat % labeling result file; contains the variable mylabel
    Detector  = trainACFObjectDetector(mylabel(1:100,[1,2])); % train the dice detector
    Detector2 = trainACFObjectDetector(mylabel(1:100,[1,3])); % train the target-block detector
    save weChartDetector.mat Detector Detector2

 Cascade training for the dice:

%% prepare
load weChartlabel.mat
positiveSamples = mylabel(1:200,1:2);
negativeSamples = imageDatastore('./negativeSamples');

%% train
trainCascadeObjectDetector('diceDetector.xml', positiveSamples, ...
    negativeSamples,'FalseAlarmRate',0.1,'NumCascadeStages',5);

In the ACF training above I used only 100 samples, and the cascade used 200; choose according to your own data. After training, the detectors are saved to weChartDetector.mat and diceDetector.xml in the current directory. The dice is also simple enough for template matching. MATLAB's built-in vision.TemplateMatcher computes the match pixel by pixel, which is relatively slow, so this part calls an OpenCV library function instead, wrapped as a mexFunction.
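For reference, a minimal sketch of the pure-MATLAB template-matching alternative mentioned above, using vision.TemplateMatcher ('screenshot.jpg' is a placeholder file name; the project's OpenCV mex version is faster, but the idea is the same):

```matlab
% Locate the dice by template matching with the Computer Vision Toolbox.
scene    = rgb2gray(imread('screenshot.jpg'));               % placeholder input
template = rgb2gray(imread('./matchTemplateDice/template.jpg'));
matcher  = vision.TemplateMatcher('Metric','Sum of squared differences');
loc      = step(matcher,scene,template);  % [x y] location of the best match
diceCenter = double(loc);                 % usable as the dice center point
```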

3. Image testing

This section mainly shows the template-matching result for the dice and the edge-detection result for the blocks. Figure 3 is the ROI of the original image, Figure 4 is the edge-detection map, and Figure 5 is the dice template-matching result.

                              

Figure 3 Original ROI

                       

Figure 4 Edge detection

                              

Figure 5 Template matching result graph
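To reproduce such test figures yourself, here is a minimal sketch for running the trained detectors on one screenshot ('screenshot.jpg' is a placeholder file name):

```matlab
% Sanity-check the trained ACF detectors on a single image.
load weChartDetector.mat                 % Detector (dice), Detector2 (blocks)
img = imread('screenshot.jpg');          % placeholder screenshot
[bboxes,scores] = detect(Detector,img);  % dice candidates with confidences
out = insertObjectAnnotation(img,'rectangle',bboxes,scores);
figure, imshow(out), title('dice detections')
```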

4. Automatic detection/manual detection

The program saves the intermediate images of each step. The image folders are emptied before each run, and during the run the raw screenshots and the annotated result images are saved automatically to save_Imgs and save_RGB in the current directory. An example is shown in Figure 6; manual annotation is shown in Figure 7.

                                       

                                     

                                         

Figure 6 Automatic detection


Figure 7 Manual center marking

Main source code:

% author:cuixing
% email:[email protected]
% date:2018-01-7
%

%% main
addpath('./matchTemplateDice')

%% Setting parameters
screenSize = get(groot,'ScreenSize');% Get the computer screen size
flagAUTO = true; % whether it is automatic, false is to manually click the center of the two to get the distance
rate_time = 800/195; % The approximate relationship between distance and time is: 800ms==195 pixels, you can adjust it yourself
videoObj = vision.VideoPlayer();
numFrame = 1;
cursorXRange = [262,411];% computer-screen x range; simulate the finger press within this range (estimate with get(groot,'PointerLocation'))
cursorYRange = [119,252];% computer-screen y range; same estimation method
xMag = cursorXRange(2)-cursorXRange(1);
yMag = cursorYRange(2)-cursorYRange(1);
templateImg = imread('./matchTemplateDice/template.jpg');

%% Ready to work
figurePosition = [screenSize(3)/2,screenSize(4)/3,screenSize(4)/2,screenSize(4)/2];
h = figure('Name','RGB','position',figurePosition);
saveImgPath = './save_Imgs';% for viewing only
saveRGBPath = './save_RGB';% for viewing only
if ~exist(saveImgPath,'dir')
    mkdir(saveImgPath)
end
if ~exist(saveRGBPath,'dir')
    mkdir(saveRGBPath)
end

% delete the last stored image
s1 = struct2cell(dir([saveImgPath,'/','*.jpg']))';
s2 = struct2cell(dir([saveRGBPath,'/','*.jpg']))';
if ~isempty(s1)
    cellTem = s1(:,1);
    cellfun(@(x)deleteImgs(x,saveImgPath),cellTem);
end
if ~isempty(s2)
    cellTem = s2(:,1);
    cellfun(@(x)deleteImgs(x,saveRGBPath),cellTem);
end

%% main program
constWidth = 396;% Set the fixed width of the image, which is related to the template image
while isvalid(h) % If you want to exit the loop and close the window
    % Take a screenshot to get the screen of the mobile phone
    set(groot,'PointerLocation',[410,65]);% The location of the screenshot window where the mouse is located, set it yourself
    flag1 =mouseclick(0); % 0 means click, left click to select the current mouse position, mex compile function
    Img = cropFunction(1);% with no argument, capture the whole screen; with one argument, capture the selected window
    imwrite(Img,['save_Imgs/',...
        datestr(now,'yyyy-mm-dd-HH-MM-SS'),...
        '_snapshot_',num2str(numFrame),'.jpg']);
    
    %% Calculate the distance based on the current mobile phone screen image
    rows = size(Img,1);
    cols = size(Img,2);
    ROI = [0,round(rows/5),cols,round(3*rows/5)];
    targetImg = imcrop(Img,ROI);
    targetHight = size(targetImg,1);
    targetWidth = size(targetImg,2);
    constHight = round(targetHight*constWidth/targetWidth); % proportional
    targetImg = imresize(targetImg,[constHight,constWidth]);% fixed size
    if flagAUTO
        [currentD,RGB,flag] = getD_jump(targetImg,templateImg);
        if flag % the dice or the block was not detected; pick the two centers manually
            imshow(targetImg);
            message = sprintf('%s\n%s\n','Please click the center of the dice,',...
                'then click the center of the target block');
            msgbox(message,'Dice or block not detected!');
            [x,y] = ginput(2);
            currentD = pdist([x,y],'euclidean');
            RGB = targetImg;
            close(gcf)
        end
    else
        h.Position = figurePosition;
        imshow(targetImg);
        message = sprintf('%s\n%s\n','Please click the center of the dice,',...
            'then click the center of the target block');
        msgbox(message,'Manual mode');
        [x,y] = ginput(2);
        currentD = pdist([x,y],'euclidean');
        RGB = targetImg;
        close(gcf)
    end
    
    imwrite(RGB,['save_RGB/',...
        datestr(now,'yyyy-mm-dd-HH-MM-SS'),...
        '_RGB_',num2str(numFrame),'.jpg']);
    h = figure(1);
    h.Position = figurePosition;
    imshow(RGB);
    fprintf('Frame %d: dice-to-block distance: %f pixels...\n',...
        numFrame,currentD);
    
    % Simulate finger clicking on the screen within a certain range, because people cannot click a point with high precision
    set(groot,'PointerLocation',...
        [cursorXRange(1)+xMag*rand(),cursorYRange(1)+yMag*rand()]);% random press point inside the screenshot window
    
    flag2 = mouseclick(3, rate_time*currentD); % 3 means left click does not move, the second parameter means pause time in milliseconds; mex compile function
    pause(3) % Pause for 3 seconds until the next screen is stable
    numFrame = numFrame+1;
end

%%
rmpath('./matchTemplateDice');
getD_jump.m is as follows:

function [distance,RGB,flag] = getD_jump(image,templateImg)
% Input: a phone screenshot and the dice template image.
% Output: the distance from the dice to the next target block, the annotated
% image RGB, and a flag: 0 = normal detection, 1 = dice not detected, 2 = block not detected

%% prepare data
% path = 'F:\imagesData\WeChat jump jump';
% imds = imageDatastore(path,'includesubfolders',true,...
%     'fileExtensions',{'.png'},...
%     'LabelSource','none');

%% train
load weChartDetector.mat % detector file; stores Detector (dice) and Detector2 (target block)
if ~exist('Detector','var') % retrain if the detector was not found in the file
    load weChartlabel.mat % labeling result file; contains the variable mylabel
    Detector  = trainACFObjectDetector(mylabel(1:100,[1,2])); % train the dice detector
    Detector2 = trainACFObjectDetector(mylabel(1:100,[1,3])); % train the target-block detector
    save weChartDetector.mat Detector Detector2
end

%% detect
% detect the dice
diceDetector = vision.CascadeObjectDetector('diceDetector.xml');
bboxes = step(diceDetector,image);% cascade by default
if isempty(bboxes) % use template matching, or acf detection
%     ROI = matchTemplateDice(image,templateImg);
%     box = ROI;
    [bboxes,scores] = detect(Detector,image);
    [~,maxIndex] = max(scores);
    box = bboxes(maxIndex,:);
else
    box = bboxes(1,:);
end
if isempty(box)
    fprintf('The thrower was not detected!\n')
    flag = 1;
    distance = 0;
    RGB = image;
    return;
end
center1 = [box(1)+box(3)/2,box(2)+box(4)/2];

% detect the target block
[bboxes2,~] = detect(Detector2,image);
[score2,minIndex2] = min(bboxes2(:,2));% the topmost block (smallest y)
box2 = bboxes2(minIndex2,:);
if isempty(box2) % acf does not detect the box, use canny to detect
    gray = rgb2gray(image);
    edgeLines = edge(gray,'canny');
    
    % Find the coordinates of the top white point
    [y, x] = find (edgeLines == 1);
    [~,index] = min(y);
    upperPoint = [x(index),y(index)];% The top point, this point is more reliable to find every time
    center2 = [upperPoint(1),upperPoint(2)+20];% about 20 pixels down
    box2 = [center2(1)-10,center2(2)-10,20,20];
    score2 = 1;
else
    center2 = [box2(1)+box2(3)/2,box2(2)+box2(4)/2];
end
% MinSize=[116,166];
% MaxSize=[193,193];
if isempty(box2)|| center2(2)>= center1(2)||box2(3)>193||box2(4)>116
    fprintf('Pawn not detected!\n')
    flag = 2;
    distance = 0;
    RGB = image;
    return;
end

%% annotation save
labels = cell(1,1);
labels{1} = sprintf('%s','dice');
labels2 = cell(1,1);
labels2{1} = sprintf('%s %s','Block,',['score:',num2str(score2)]);

distance = pdist([center1;center2],'euclidean');
putString = sprintf('%s\n',['Next Line,',...
    'distance:',num2str(distance)]);
RGB = insertText(image,[20,20],putString,'TextColor','red','FontSize',25);
RGB = insertShape(RGB,'line',[center1,center2],...
    'color','green','LineWidth',3);% draw the line, green
RGB = insertObjectAnnotation(RGB,'rectangle',...
    box,labels,'color','blue','LineWidth',3);% dice, blue box
RGB = insertObjectAnnotation(RGB,'rectangle',...
    box2,labels2,'color','red','LineWidth',3);% target block, red box
flag = 0;% normal detection





The program does not use the ADB tool. The simplest way I could think of to mirror the phone screen in real time is TeamViewer QS (the QuickSupport app), which is easy to install and almost foolproof to operate~ To get a high score, make sure the computer-to-phone link is responsive and tune the coefficient rate_time above. The mouse-control calls into the Windows API have already been compiled with mex and can be called directly as MATLAB functions. On other platforms such as Linux and macOS, you can re-mex the cpp files; the compiled extension differs per platform, but the usage is the same. Happy jumping~
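Recompiling the helpers on another platform is a one-line mex call per file (a sketch; the exact cpp file name comes from the project download, and the Windows-specific mouse calls would need platform equivalents on Linux/macOS):

```matlab
% Rebuild the compiled helpers for the current platform.
mex mouseclick.cpp   % produces mouseclick.<mexext>, e.g. .mexw64 on Windows
mexext               % prints this platform's mex file extension
```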

Attachment: download link for all programs and files: https://pan.baidu.com/s/1ht5IMyC Password: 8c74






