Quickly Integrate Hand Keypoint Recognition on Android

Preface

In the previous article, "Using Huawei HMS ML Kit Human Skeleton Recognition Technology, Android Quickly Realize Human Posture Action Capture", we introduced HMS ML Kit human skeleton recognition, which can locate key points of the human body such as the top of the head, neck, shoulders, elbows, wrists, hips, knees, and ankles. In addition to human body key points, HMS ML Kit also provides hand keypoint recognition, which can locate 21 key points of the hand, including fingertips, knuckles, and wrist points, enabling richer human-computer interaction.

Application scenario

Hand keypoint recognition has many application scenarios. For example, a short-video app that integrates this capability can generate cute or funny special effects based on the key points of the hand, making videos more entertaining.

In a smart home scenario, you can also customize gestures as remote control commands for smart home appliances, enabling smarter human-computer interaction.

Development combat

The following describes how to quickly integrate Huawei HMS ML Kit hand keypoint recognition, using video stream recognition as an example.

1. Development Preparation

For detailed preparation steps, please refer to Huawei Developer Alliance:

https://developer.huawei.com/consumer/cn/doc/development/HMS-Guides/ml-process-4

Here are the key development steps.

1.1 Configure the Maven repository address in the project-level gradle

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

1.2 Configure the SDK dependencies in the app-level gradle

dependencies {
    // Import the base SDK
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.2.300'
    // Import the hand keypoint detection model package
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.2.300'
}

1.3 Add the following configuration to the file header of the app-level gradle

apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'

1.4 Add the following statement to the AndroidManifest.xml file to automatically update the machine learning model to the device

<meta-data
    android:name="com.huawei.hms.ml.DEPENDENCY"
    android:value="handkeypoint" />

1.5 Declare the camera permission and the permission to read local files

<!-- Camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Read permission -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

2. Code Development

2.1 Create a hand key point analyzer

MLHandKeypointAnalyzerSetting setting = new MLHandKeypointAnalyzerSetting.Factory()
        // MLHandKeypointAnalyzerSetting.TYPE_ALL: return all results.
        // MLHandKeypointAnalyzerSetting.TYPE_KEYPOINT_ONLY: return only hand keypoint information.
        // MLHandKeypointAnalyzerSetting.TYPE_RECT_ONLY: return only palm region information.
        .setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
        // Set the maximum number of hand regions that can be detected in one image. Up to 10 hand regions are detected by default.
        .setMaxHandResults(1)
        .create();
MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(setting);

2.2 Create the recognition result processing class "HandKeypointTransactor", which implements the MLAnalyzer.MLTransactor<T> interface, and use its "transactResult" method to obtain the detection results and implement your specific service. In addition to the coordinates of each finger point, the detection result includes the palm confidence and a confidence value for each point. Incorrectly recognized invalid palms can be filtered out based on these confidence values; in practice, set the threshold flexibly according to your tolerance for misrecognition (a filtering sketch follows the class below).

public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
    @Override
    public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
        SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
        // Process the recognition results as needed. Note that only the detection results
        // should be processed here; do not call other detection-related APIs provided by ML Kit.
    }

    @Override
    public void destroy() {
        // Callback invoked when detection ends; used to release resources, etc.
    }
}
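For reference, the snippet below sketches how the results could be filtered by confidence inside transactResult. The accessors getHandKeypoints() and getScore() and the 0.8f threshold are assumptions based on the result structure described above; verify them against the SDK version you use.

// Sketch only: iterate over the detected hands and skip low-confidence points.
// getHandKeypoints()/getScore() are assumed accessors; the threshold is an example value.
for (int i = 0; i < analyseList.size(); i++) {
    for (MLHandKeypoints hand : analyseList.valueAt(i)) {
        for (MLHandKeypoint keypoint : hand.getHandKeypoints()) {
            if (keypoint.getScore() >= 0.8f) { // tune to your tolerance for misrecognition
                // Use keypoint.getType(), keypoint.getPointX(), keypoint.getPointY() here.
            }
        }
    }
}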

2.3 Set the recognition result processor to bind the analyzer to the result processor.

analyzer.setTransactor(new HandKeypointTransactor());

2.4 Create a LensEngine. This class is provided by the ML Kit SDK and is used to capture the camera's dynamic video stream and pass it to the analyzer. It is recommended that the camera display size be no smaller than 320 x 320 pixels and no larger than 1920 x 1920 pixels.
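For example, the LensEngine could be created roughly as follows. This is a minimal sketch modeled on common ML Kit sample code; the lens type, preview size, and frame rate below are example values, not requirements.

// Sketch: create a LensEngine that feeds the camera video stream to the analyzer.
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
        .setLensType(LensEngine.BACK_LENS)    // example: use the rear camera
        .applyDisplayDimension(1280, 720)     // example preview size within the recommended range
        .applyFps(20.0f)                      // example frame rate
        .enableAutomaticFocus(true)
        .create();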

2.5 Call the run method to start the camera, read the video stream, and perform recognition.

// Implement the other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
    lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
    // Exception handling logic.
}

2.6 After the detection is complete, stop the analyzer and release the detection resources.

if (analyzer != null) {
    analyzer.stop();
}
if (lensEngine != null) {
    lensEngine.release();
}

Demo effect

The demo shows the effect of hand keypoint recognition for different gestures. Developers can extend it according to their actual development needs.

Github source code

https://github.com/HMS-Core/hms-ml-demo/blob/master/MLKit-Sample/module-body/src/main/java/com/mlkit/sample/activity/HandKeypointActivity.java

For more detailed development guidelines, please refer to the official website of the Huawei Developer Alliance:

https://developer.huawei.com/consumer/cn/hms/huawei-mlkit

For more details, please refer to:
Huawei Developer Alliance official website: https://developer.huawei.com/consumer/cn/hms
Development guidance documents: https://developer.huawei.com/consumer/cn/doc/development
Developer discussions on the Reddit community: https://www.reddit.com/r/HMSCore/
Demo and sample code on GitHub: https://github.com/HMS-Core
Integration problems on Stack Overflow: https://stackoverflow.com/questions/tagged/huawei-mobile-services?tab=Newest


Original link:https://developer.huawei.com/consumer/cn/forum/topicview?tid=0203346162792430439&fid=18
Author: leave leaves
