Kinect for Windows v2 gesture recognition tools: using Visual Gesture Builder (Part 2)

Reprinted at: https://www.parful.com/blog/article/108

Before using the Visual Gesture Builder tool (hereinafter referred to as VGB), you should be familiar with Kinect Studio. If you are not, please first read "Kinect for Windows v2 gesture recognition tools: using Kinect Studio".

VGB performs ML (machine learning) on the data-stream files (.xef) recorded with Kinect Studio; training produces a gesture database file with the .gba or .gbd extension that the Kinect SDK can load for gesture recognition.
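To make that pipeline concrete, here is a minimal sketch (not from the original article) of what consuming the training output could look like, assuming the Kinect SDK 2.0 native C++ headers Kinect.h and Kinect.VisualGestureBuilder.h; the file name Seated.gba matches the database built in step 15 below.

```cpp
// Hedged sketch: open a gesture database produced by VGB and list the gestures
// it contains, using the Kinect SDK 2.0 native VGB API.
#include <Kinect.h>
#include <Kinect.VisualGestureBuilder.h>
#include <cstdio>

int main()
{
    IVisualGestureBuilderDatabase* database = nullptr;
    // "Seated.gba" is the database file built in step 15 of this article.
    if (FAILED(CreateVisualGestureBuilderDatabaseInstanceFromFile(L"Seated.gba", &database)))
    {
        printf("could not open gesture database\n");
        return 1;
    }

    UINT count = 0;
    database->get_AvailableGesturesCount(&count);

    IGesture** gestures = new IGesture*[count];
    database->get_AvailableGestures(count, gestures);
    for (UINT i = 0; i < count; ++i)
    {
        WCHAR name[260] = {};
        GestureType type;
        gestures[i]->get_Name(260, name);
        gestures[i]->get_GestureType(&type);   // GestureType_Discrete or GestureType_Continuous
        wprintf(L"gesture %u: %s (type %d)\n", i, name, static_cast<int>(type));
        gestures[i]->Release();
    }
    delete[] gestures;
    database->Release();
    return 0;
}
```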

Steps for using VGB

  1. Open VGB and choose File -> New Solution. Enter a solution name (Seated); a VGB solution file with the .vgbsln extension will be generated.
  2. The interface at this point is shown in the figure below.

  3. Right-click Seated -> Create New Project With Wizard (if you are already familiar with the options, you can choose Create New Project directly).

  4. Next -> give your gesture a name; try to keep it consistent with the name of the action.

  5. Next -> choose whether your gesture depends on the lower-body joints (hips and below). Because my action is sitting down, it needs the joints from the hips down, so select Yes here.

  6. Next -> choose whether your gesture needs to detect hand states (OPEN | CLOSED | LASSO). My action is sitting down and does not need hand-state detection, so select No here.

  7. Next -> select the joint mask that matches your gesture; joints shown in green will be used for training, and gray ones are ignored. My action is sitting down, so I chose the mask in the lower-left corner: Ignore Left Arm, Ignore Right Arm, Ignore Hand State.

  8. Next -> choose whether you need to distinguish the left and right side of the action. You would select Yes if, for example, you want to know whether the player is playing table tennis with the left hand or the right hand. My action is sitting down and does not need left/right discrimination, so select No here.

  9. Next -> choose whether your gesture is symmetric. If you select Yes, Duplicate and Mirror will be set to True during training, which generates a large amount of extra data to reduce the factors affecting the gesture's symmetry. For the sit-down gesture, select No here.

  10. Next -> choose whether your gesture needs a progress value. Select Yes when detecting continuous gestures; the gesture to be detected here is a discrete gesture, so select No. (A short sketch of reading a continuous gesture's progress value appears at the end of this article.)

  11. Next -> confirm your gesture settings.

  12. Confirm -> give your project a name; mine is Seated. Two projects will be generated, Seated and Seated.a: Seated is used for training, and Seated.a is used for data analysis.

  13. Right-click the Seated project -> Add Clip, select the Seated.xef file recorded with Kinect Studio, and then select the clip that was added.

  14. In the control area, hold Shift+Left or Shift+Right to mark selections of frames. Press Enter for the frames that match the gesture and Space for the frames that do not (normally you only need to tag the matching frames). As shown in the figure below, three selections match the sit-down action; once tagged like this, the clip can be used for VGB's ML (machine learning).

  15. Right-click the Seated project -> Build, enter a name for the gesture database (Seated.gba in this example), and save.

  16. We can now watch the build process in the output window; a Seated.gba file is generated in the corresponding directory.

  17. At this point the gesture database file is finished; next, use Live Preview to check its feasibility and reliability in real time. Right-click the Seated project -> Live Preview -> select the Seated.gba file you just built. The height of the bar chart represents the confidence that the motion shown in the left pane is the seated gesture, with a value between 0 and 1. In this figure the confidence that the motion is seated is very high, because the gesture is quite simple; when you need a complex gesture, you should break it down into simple discrete gestures for training. (A code sketch of reading this confidence through the SDK follows right after this list.)
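The confidence bar shown by Live Preview corresponds to the per-frame confidence the SDK reports at runtime. Below is a hedged sketch of reading that value for one tracked body with the Kinect SDK 2.0 native VGB API; the function name PollSeatedConfidence and its parameters are illustrative, and in a real application the frame source and reader would be created once and polled every frame rather than rebuilt on each call.

```cpp
// Hedged sketch: attach one gesture from the database to a tracked body and
// poll its discrete result. 'sensor' must already be opened, 'gesture' comes
// from the database (see the earlier sketch), and 'trackingId' comes from the
// body frames. All names here are illustrative.
#include <Kinect.h>
#include <Kinect.VisualGestureBuilder.h>

float PollSeatedConfidence(IKinectSensor* sensor, IGesture* gesture, UINT64 trackingId)
{
    IVisualGestureBuilderFrameSource* source = nullptr;
    IVisualGestureBuilderFrameReader* reader = nullptr;
    float confidence = 0.0f;

    if (FAILED(CreateVisualGestureBuilderFrameSource(sensor, trackingId, &source)))
        return confidence;
    source->AddGesture(gesture);
    source->OpenReader(&reader);

    IVisualGestureBuilderFrame* frame = nullptr;
    if (reader && SUCCEEDED(reader->CalculateAndAcquireLatestFrame(&frame)) && frame)
    {
        IDiscreteGestureResult* result = nullptr;
        if (SUCCEEDED(frame->get_DiscreteGestureResult(gesture, &result)) && result)
        {
            BOOLEAN detected = FALSE;
            result->get_Detected(&detected);
            result->get_Confidence(&confidence);  // the 0-1 value shown by the Live Preview bar
            result->Release();
        }
        frame->Release();
    }
    if (reader) reader->Release();
    if (source) source->Release();
    return confidence;
}
```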

A few more words

That covers the basics of simple discrete gesture recognition. The next article will introduce using the VGB API to read the gesture database and match it against live skeleton data to obtain a matching confidence.
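Step 10 mentioned the progress value used by continuous gestures. For completeness, here is a hedged sketch of how such a progress value would be read, assuming the same Kinect SDK 2.0 native VGB API; the frame and gesture variables stand for an acquired IVisualGestureBuilderFrame and an IGesture obtained as in the previous sketch, and the gesture must have been trained as a continuous one.

```cpp
// Hedged sketch: for a gesture trained as continuous, the frame reports a
// progress value instead of a detected/confidence pair. 'frame' and 'gesture'
// are assumed to be obtained as in the previous sketch.
IContinuousGestureResult* result = nullptr;
if (SUCCEEDED(frame->get_ContinuousGestureResult(gesture, &result)) && result)
{
    float progress = 0.0f;
    result->get_Progress(&progress);   // how far through the continuous gesture the user is
    result->Release();
}
```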


Source: blog.csdn.net/ZDT_zdh/article/details/90473889