Crazy Rockets: Integrating Huawei HMS ML Kit Face Detection and Gesture Recognition to Build a Viral Mini Game

Preface

Like many of you, I regularly find my social feed flooded by mini games. These games are simple to play, suit all ages, and spread quickly, taking over your friends' feeds in minutes. I have long dreamed of building one that goes viral myself, but making such a popular mini game is no small task. So I started gathering information online and eventually discovered that the face detection and hand keypoint detection services in Huawei HMS ML Kit can provide exactly the kind of detection needed to make such a game fun.

Application scenarios

The HMS ML Kit face detection service detects up to 855 facial key points and returns the coordinates of the face contour, eyebrows, eyes, nose, mouth, ears, and other features, as well as the deflection angle of the face. With this information, developers can quickly build facial beautification applications, or overlay playful elements on the face to make images more engaging.

Hand keypoint recognition also has many everyday applications. For example, a short-video app that integrates this technology can generate cute or funny effects anchored to the key points of the hand, making videos more entertaining.

Crazy Rockets is a game built on both of these services, and it offers two ways to play. In one mode, you steer the rocket through the stone obstacles by moving your face up and down; in the other, you steer it by moving your hand up and down. Both modes detect face or hand key points and feed the results back to control the rocket's movement, which makes the game great fun. Let's look at the game first!
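To make the control loop concrete, here is a minimal sketch in plain Java of how a detected vertical position (a face or hand key point in the camera frame) could be mapped to the rocket's position on the game canvas. All names here are illustrative, not part of the ML Kit API:

```java
// Illustrative only: maps a detected y coordinate from the camera frame
// to a rocket y position on the game canvas.
class RocketController {

    private final float frameHeight;   // height of the camera preview frame
    private final float screenHeight;  // height of the game canvas

    RocketController(float frameHeight, float screenHeight) {
        this.frameHeight = frameHeight;
        this.screenHeight = screenHeight;
    }

    // Converts a detected y coordinate (0 = top of frame) into the
    // rocket's y coordinate on screen, clamped to the visible area.
    float rocketY(float detectedY) {
        float normalized = detectedY / frameHeight;           // 0.0 .. 1.0
        float clamped = Math.max(0f, Math.min(1f, normalized));
        return clamped * screenHeight;
    }
}
```

In the real game this mapping would be driven each frame by the detection results delivered to the transactor classes shown below.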

How about it? Exciting, right? Now follow along to see how to integrate the HMS ML Kit face detection capability and build Crazy Rockets.

Development combat

For detailed preparation steps, please refer to the Huawei Developer Alliance documentation:

https://developer.huawei.com/consumer/cn/doc/development/HMS-Guides/ml-process-4

Here are the key development steps.

I. Face Detection

1. Configure the Maven repository

Configure the Maven repository address of the HMS Core SDK under "allprojects > repositories".

allprojects {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

Configure the Maven repository address of the HMS Core SDK under "buildscript > repositories".

buildscript {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

Add the agcp configuration under "buildscript > dependencies".

buildscript {
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

2. Integrate the SDK

implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'

3. Create a face analyzer

MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer();

4. Create a processing class

public class FaceAnalyzerTransactor implements MLAnalyzer.MLTransactor<MLFace> {
    @Override
    public void transactResult(MLAnalyzer.Result<MLFace> results) {
        SparseArray<MLFace> items = results.getAnalyseList();
        // Process the recognition results as needed. Note that only the
        // detection results should be handled here; do not call other
        // detection-related APIs provided by ML Kit.
    }
    @Override
    public void destroy() {
        // Callback invoked when detection ends; used to release resources, etc.
    }
}
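The processing class must be attached to the analyzer via analyzer.setTransactor(new FaceAnalyzerTransactor()), just as the gesture section below does in its step 5. One practical point: the raw coordinates delivered each frame tend to jitter, so a game like this usually smooths them before moving the rocket. A minimal sketch in plain Java, using an exponential moving average (the class and names are illustrative, not part of the ML Kit API):

```java
// Illustrative only: smooths a per-frame y coordinate so the rocket
// doesn't shake when the detected face position jitters slightly.
class PositionSmoother {
    private final float alpha;   // 0..1; higher = follows raw input faster
    private Float smoothed;      // null until the first sample arrives

    PositionSmoother(float alpha) {
        this.alpha = alpha;
    }

    // Feed in the raw y coordinate for this frame; returns the smoothed value.
    float update(float rawY) {
        smoothed = (smoothed == null)
                ? rawY
                : alpha * rawY + (1 - alpha) * smoothed;
        return smoothed;
    }
}
```

Calling update() once per transactResult() callback with the detected face y coordinate gives a stable control signal for the rocket.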

5. Create a LensEngine to capture the camera's video stream and pass it to the analyzer

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
    .setLensType(LensEngine.BACK_LENS)
    .applyDisplayDimension(1440, 1080)
    .applyFps(30.0f)
    .enableAutomaticFocus(true)
    .create();

6. Call the run method to start the camera, read the video stream, and run detection

// Implement the other logic of the SurfaceView control yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

7. Release detection resources

if (analyzer != null) {
    try {
        analyzer.stop();
    } catch (IOException e) {
         // Exception handling.
    }
}
if (lensEngine != null) {
    lensEngine.release();
}

II. Gesture Recognition

1. Configure the Maven repository

Configure the Maven repository address of the HMS Core SDK under "allprojects > repositories".

allprojects {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

Configure the Maven repository address of the HMS Core SDK under "buildscript > repositories".

buildscript {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

Add the agcp configuration under "buildscript > dependencies".

buildscript {
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

2. Integrate the SDK

// Import the base SDK
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
// Import the hand keypoint detection model package
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'

3. Create a default gesture analyzer

MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();

4. Create a processing class

public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
    @Override
    public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
        SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
        // Process the recognition results as needed. Note that only the
        // detection results should be handled here; do not call other
        // detection-related APIs provided by ML Kit.
    }
    @Override
    public void destroy() {
        // Callback invoked when detection ends; used to release resources, etc.
    }
}
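Inside transactResult(), one simple way to turn a set of detected hand key points into a single control value for the rocket is to average their y coordinates. The sketch below uses plain Java with {x, y} float arrays standing in for the MLHandKeypoints coordinates, so the names and types are illustrative only:

```java
// Illustrative only: derives one control value from a set of detected
// hand key points by averaging their y coordinates.
class HandControl {

    // Each float[] is {x, y} for one detected key point (stand-in for
    // the coordinates carried by MLHandKeypoints).
    static float averageY(java.util.List<float[]> keypoints) {
        if (keypoints.isEmpty()) {
            return Float.NaN; // no hand detected in this frame
        }
        float sum = 0f;
        for (float[] p : keypoints) {
            sum += p[1];
        }
        return sum / keypoints.size();
    }
}
```

The NaN case lets the caller keep the rocket at its last position when no hand is visible in the frame.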

5. Set the processing class

analyzer.setTransactor(new HandKeypointTransactor());

6. Create a LensEngine

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
    .setLensType(LensEngine.BACK_LENS)
    .applyDisplayDimension(1280, 720)
    .applyFps(20.0f)
    .enableAutomaticFocus(true)
    .create();

7. Call the run method to start the camera, read the video stream, and run detection

// Implement the other logic of the SurfaceView control yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

8. Release detection resources

if (analyzer != null) {
    analyzer.stop();
}

if (lensEngine != null) {
    lensEngine.release();
}

For more details, please refer to:

Official website of Huawei Developer Alliance:

https://developer.huawei.com/consumer/cn/hms/huawei-mlkit

Obtain development guidance documents:

https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/service-introduction-0000001050040017

To participate in developer discussions, please go to the Reddit community: https://www.reddit.com/r/HuaweiDevelopers/

To download the demo and sample code, please go to GitHub: https://github.com/HMS-Core

To solve integration problems, please go to Stack Overflow:

https://stackoverflow.com/questions/tagged/huawei-mobile-services?tab=Newest


Original link:
https://developer.huawei.com/consumer/cn/forum/topic/0201388581574050067?fid=18&pid=0301388581574050321
Author: timer
