Integrate Huawei's machine learning service (ML Kit) to easily create viral mini games

When you browse WeChat Moments, you often come across fun little games. They are easy to play, suitable for all ages, spread quickly, and can take over your feed in minutes. Would you like to build a viral mini game of your own? The face detection and hand keypoint detection capabilities provided by Huawei's machine learning service (ML Kit) can help you do exactly that.

Crazy Rockets: this game integrates both face detection and hand keypoint detection, and offers two play modes. In one, you steer the rocket through Stonehenge by moving your face up and down; in the other, you steer it with up-and-down hand gestures. Both modes detect the keypoints of the face or hand and use that feedback to control the rocket's movement, which makes the game great fun!
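As a sketch of the face-controlled mode, the detected face's vertical position can be mapped linearly to the rocket's on-screen height. This is a minimal illustration with hypothetical names (mapFaceToRocketY is not an ML Kit API; the mapping and clamping logic are assumptions):

```java
// Hypothetical helper for the face-controlled mode: map the detected face
// center's y coordinate (in camera preview pixels) to a rocket y position
// (in screen pixels). Not part of ML Kit; a sketch of one possible mapping.
public class RocketController {
    static float mapFaceToRocketY(float faceCenterY, float previewHeight, float screenHeight) {
        // Clamp to [0, 1] so a face partially out of frame does not move
        // the rocket off screen, then scale to the screen height.
        float ratio = Math.max(0f, Math.min(1f, faceCenterY / previewHeight));
        return ratio * screenHeight;
    }
}
```

In a real game you would call this from the transactor's transactResult callback with the face bounding box center, and smooth the result to avoid jitter.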


Crazy Shopping Cart is built on the hand keypoint detection capability. Gesture detection moves the shopping cart left and right to catch the goods falling from above, and the falling speed increases every 15 seconds, giving players a different kind of shopping experience.
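The "speed up every 15 seconds" rule can be sketched as a pure function of elapsed time. The constants and names below are illustrative guesses, not values from the actual game:

```java
// Hypothetical sketch of the shopping cart's difficulty curve: items fall
// faster after every completed 15-second interval. BASE_SPEED and STEP are
// made-up tuning values, not taken from the game's source.
public class CartSpeed {
    static final float BASE_SPEED = 4f;   // falling speed (px per frame) at start
    static final float STEP = 1.5f;       // extra speed per 15-second interval

    static float speedAt(long elapsedMillis) {
        long intervals = elapsedMillis / 15_000; // completed 15 s intervals
        return BASE_SPEED + intervals * STEP;
    }
}
```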


Crazy Rockets Development Practice

(1) Face detection

1. Configure the Maven repository

  • Configure the Maven repository address of the HMS Core SDK in "allprojects > repositories".
allprojects {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}
  • Configure the Maven repository address of the HMS Core SDK in "buildscript > repositories".
buildscript {   
    repositories {       
       google()       
       jcenter()       
       maven {url 'https://developer.huawei.com/repo/'}   
    }
}
  • Add the agcp configuration in "buildscript > dependencies".
buildscript {
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

2. Integrate the SDK

implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'

3. Create a face analyzer

MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer();

4. Create a processing class

public class FaceAnalyzerTransactor implements MLAnalyzer.MLTransactor<MLFace> {
    @Override
    public void transactResult(MLAnalyzer.Result<MLFace> results) {
        SparseArray<MLFace> items = results.getAnalyseList();
        // Process the detection results as needed. Note that only the detection
        // results should be handled here; do not call other detection-related
        // APIs provided by ML Kit from this callback.
    }
    @Override
    public void destroy() {
        // Callback invoked when detection ends; release resources here.
    }
}
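Note that, as in the gesture sections later in this article, the processing class still has to be bound to the analyzer before starting the LensEngine, using the same setTransactor pattern:

```java
// Bind the result processor to the face analyzer (this step sits between
// steps 4 and 5 and mirrors the hand keypoint sections below).
analyzer.setTransactor(new FaceAnalyzerTransactor());
```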

5. Create LensEngine to capture the camera's dynamic video stream and pass it to the analyzer

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
    .setLensType(LensEngine.BACK_LENS)
    .applyDisplayDimension(1440, 1080)
    .applyFps(30.0f)
    .enableAutomaticFocus(true)
    .create();

6. Call the run method to start the camera, read the video stream, and begin recognition

// Implement the remaining logic of the SurfaceView control yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

7. Release detection resources

if (analyzer != null) {
    try {
        analyzer.stop();
    } catch (IOException e) {
        // Exception handling.
    }
}
if (lensEngine != null) {
    lensEngine.release();
}

(2) Gesture recognition

1. Configure the Maven repository

Configure the Maven repository address of the HMS Core SDK in "allprojects > repositories".

allprojects {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

Configure the Maven repository address of the HMS Core SDK in "buildscript > repositories".

buildscript {   
    repositories {       
       google()       
       jcenter()       
       maven {url 'https://developer.huawei.com/repo/'}   
    }
}

Add the agcp configuration in "buildscript > dependencies".

buildscript {
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

2. Integrate the SDK

// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'

3. Create a default gesture analyzer

MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();

4. Create a processing class

public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
    @Override
    public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
        SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
        // Process the detection results as needed. Note that only the detection
        // results should be handled here; do not call other detection-related
        // APIs provided by ML Kit from this callback.
    }
    @Override
    public void destroy() {
        // Callback invoked when detection ends; release resources here.
    }
}

5. Set the processing class

analyzer.setTransactor(new HandKeypointTransactor());

6. Create LensEngine

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
    .setLensType(LensEngine.BACK_LENS)
    .applyDisplayDimension(1280, 720)
    .applyFps(20.0f)
    .enableAutomaticFocus(true)
    .create();

7. Call the run method to start the camera, read the video stream, and begin recognition

// Implement the remaining logic of the SurfaceView control yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

8. Release detection resources

if (analyzer != null) {
    analyzer.stop();
}
if (lensEngine != null) {
    lensEngine.release();
}

(3) Crazy Shopping Cart Development Practice

1. Configure the Maven repository address

buildscript {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.4.1.300'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven {url 'https://developer.huawei.com/repo/'}
    }
}

2. Full SDK integration

dependencies {
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
    // Import the hand keypoint detection model package.
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'
}

After integrating the SDK in either way, add the plugin configuration to the file header:

Add apply plugin: 'com.huawei.agconnect' after apply plugin: 'com.android.application'.

3. Create a hand key point analyzer

MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();

4. Create recognition result processing class "HandKeypointTransactor"

public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
    @Override
    public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
        SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
        // Process the detection results as needed. Note that only the detection
        // results should be handled here; do not call other detection-related
        // APIs provided by ML Kit from this callback.
    }
    @Override
    public void destroy() {
        // Callback invoked when detection ends; release resources here.
    }
}

5. Set the recognition result processor to bind it to the analyzer

analyzer.setTransactor(new HandKeypointTransactor());
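Inside transactResult, the hand keypoints can be reduced to a simple left/right/hold decision for the cart. The following is a minimal sketch under assumed names and thresholds (none of these come from the game's source); a middle dead zone avoids jitter when the hand hovers near the center:

```java
// Hypothetical helper: decide the cart's direction from the hand's average
// x coordinate in the camera preview frame. The 0.4/0.6 dead-zone thresholds
// are illustrative tuning values, not from the actual game.
public class CartDirection {
    static final int LEFT = -1, HOLD = 0, RIGHT = 1;

    static int fromHandX(float avgX, float previewWidth) {
        float ratio = avgX / previewWidth;
        if (ratio < 0.4f) return LEFT;   // hand in the left 40% of the frame
        if (ratio > 0.6f) return RIGHT;  // hand in the right 40% of the frame
        return HOLD;                     // middle dead zone: keep the cart still
    }
}
```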

6. Create LensEngine

LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
    .setLensType(LensEngine.BACK_LENS)
    .applyDisplayDimension(1280, 720)
    .applyFps(20.0f)
    .enableAutomaticFocus(true)
    .create();

7. Call the run method to start the camera, read the video stream, and begin recognition

// Implement the remaining logic of the SurfaceView control yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

8. After the detection is complete, stop the analyzer and release the detection resources

if (analyzer != null) {
    analyzer.stop();
}
if (lensEngine != null) {
    lensEngine.release();
}

Now that you have seen the main development steps, doesn't the integration feel simple and fast? Beyond these two mini games, face detection and hand keypoint detection have many other real-life application scenarios. For example, a short-video app that integrates these capabilities can generate cute or funny effects based on hand keypoints to make videos more entertaining. In a smart-home scenario, you could define custom gestures as remote-control commands for smart appliances, enabling smarter human-computer interaction. Come and try building fun and interesting applications of your own!

For more details, please refer to:

Visit the Huawei Developer Alliance official website to get development guidance documents

To participate in developer discussions, please go to Reddit

To download demo and sample code, please go to Github

To solve integration problems, please go to Stack Overflow


Original link:
https://developer.huawei.com/consumer/cn/forum/topic/0204406585449080270?fid=18&pid=0304406585449080230

Author: Pepper
