Reprinted from the article: http://www.pythonheidong.com/blog/article/4117/
At the recent TGC2016 show in Chengdu, we built a somatosensory (motion-sensing) game for the mobile game Naruto Mobile. It recreates the "Nine-Tails Assault" mode: the player becomes the Fourth Hokage and duels the Nine-Tails, and it attracted a large number of players. On the surface this game looks no different from other motion-sensing experiences, but it actually runs inside the Chrome browser. In other words, with the right front-end skills, you can develop web-based Kinect somatosensory games.
The principle behind developing a Kinect somatosensory game with HTML5 is actually very simple: the Kinect collects player and environment data, such as the human skeleton, and we expose that data to the browser in some way.
1. Data acquisition
Kinect has three lenses. The middle one is an ordinary color camera that captures the color image; the left and right lenses acquire depth data via infrared. Using the SDK provided by Microsoft, we can read the following kinds of data:
- Color data: the color image;
- Depth data: depth-of-field information;
- Skeleton data: human skeleton data computed from the data above.
2. Making the Kinect data accessible to the browser
After some research, I found several frameworks; they basically all let the browser communicate with a server process over sockets for data transfer:
- Kinect-HTML5: server side built with C#; provides color, depth, and skeleton data;
- ZigFu: supports H5, U3D, and Flash development; fairly complete API, but apparently paid;
- DepthJS: provides data access in the form of a browser plug-in;
- Node-Kinect2: server built with Node.js; provides fairly complete data and many examples.
I finally chose Node-Kinect2. It has no documentation, but it ships with many examples, it uses Node.js, which front-end engineers are familiar with, and the author responds quickly to feedback.
- Kinect: captures player data such as the depth image and color image;
- Node-Kinect2: obtains the data from the Kinect and performs secondary processing;
- Browser: listens on the port the Node application exposes, receives the player data, and implements the game logic.
1. System requirements:
These are hard requirements; I wasted a lot of time in environments that did not meet them.
- USB 3.0
- A graphics card that supports DX11
- Windows 8 or above
- A browser that supports Web Sockets
- And, of course, the indispensable Kinect v2 sensor
2. Environment setup:
- Connect the Kinect v2
- Install KinectSDK-v2.0
- Install Node.js
- Install Node-Kinect2
npm install kinect2
The example below shows how to obtain the human skeleton, highlight the mid-spine joint, and recognize hand gestures.
1. Server side
Create a web server and send the skeleton data to the browser. The key code is as follows:
```js
var Kinect2 = require('../../lib/kinect2'),
    express = require('express'),
    app = express(),
    server = require('http').createServer(app),
    io = require('socket.io').listen(server);

var kinect = new Kinect2();
// Open the Kinect
if (kinect.open()) {
    // Listen on port 8000
    server.listen(8000);
    // Serve the page at the root path
    app.get('/', function(req, res) {
        res.sendFile(__dirname + '/public/index.html');
    });
    // Forward skeleton data to the browser
    kinect.on('bodyFrame', function(bodyFrame) {
        io.sockets.emit('bodyFrame', bodyFrame);
    });
    // Start reading skeleton data
    kinect.openBodyReader();
}
```
2. Browser side
The browser receives the skeleton data and draws it with canvas. The key code is as follows:
```js
var socket = io.connect('/');
var ctx = canvas.getContext('2d');
socket.on('bodyFrame', function(bodyFrame) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    var index = 0;
    // Iterate over all skeleton data
    bodyFrame.bodies.forEach(function(body) {
        if (body.tracked) {
            for (var jointType in body.joints) {
                var joint = body.joints[jointType];
                ctx.fillStyle = colors[index];
                // Highlight the mid-spine joint
                if (jointType == 1) {
                    ctx.fillStyle = colors[2];
                }
                ctx.fillRect(joint.depthX * 512, joint.depthY * 424, 10, 10);
            }
            // Recognize left- and right-hand gestures
            updateHandState(body.leftHandState, body.joints[7]);
            updateHandState(body.rightHandState, body.joints[11]);
            index++;
        }
    });
});
```
With just a few lines of code we have captured the player's skeleton; anyone with a basic grasp of JavaScript should find it easy to follow. What is less obvious is: which data can we obtain? How do we obtain it? What are the skeleton joints called? node-kinect2 has no documentation to tell us any of this.
Since Node-Kinect2 provides no documentation, I have compiled my own test results below.
1. Data types the server can provide
kinect.on('bodyFrame', function(bodyFrame){}); // what other data types are there?
| Event | Description |
| --- | --- |
| bodyFrame | skeleton data |
| infraredFrame | infrared data |
| longExposureInfraredFrame | similar to infraredFrame; apparently higher precision, post-processed data |
| rawDepthFrame | unprocessed depth data |
| depthFrame | depth data |
| colorFrame | color image |
| multiSourceFrame | all data combined |
| audio | audio data (not tested) |
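Several of these sources can be requested in one stream via `openMultiSourceReader`, whose `frameTypes` option takes bit flags OR-ed together (as in the server code later in this article). As a sketch of how such flag masks combine, here is a minimal illustration; note the numeric values below are made up for this example, so in real code use the library's own `Kinect2.FrameType` constants:

```javascript
// Illustrative bit flags only -- NOT the real kinect2 values.
const FrameType = {
  color: 1,
  body: 2,
  depth: 4,
  infrared: 8,
};

// Combine the frame types you want into a single mask.
function combineFrameTypes(...types) {
  return types.reduce((mask, t) => mask | t, 0);
}

const mask = combineFrameTypes(FrameType.body, FrameType.color);
```

Checking `mask & FrameType.body` then tells you whether skeleton frames were requested.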
2. Joint types
body.joints[11] // which joints are included?
| Index | JointType | Joint |
| --- | --- | --- |
| 0 | spineBase | base of spine |
| 1 | spineMid | middle of spine |
| 2 | neck | neck |
| 3 | head | head |
| 4 | shoulderLeft | left shoulder |
| 5 | elbowLeft | left elbow |
| 6 | wristLeft | left wrist |
| 7 | handLeft | left hand |
| 8 | shoulderRight | right shoulder |
| 9 | elbowRight | right elbow |
| 10 | wristRight | right wrist |
| 11 | handRight | right hand |
| 12 | hipLeft | left hip |
| 13 | kneeLeft | left knee |
| 14 | ankleLeft | left ankle |
| 15 | footLeft | left foot |
| 16 | hipRight | right hip |
| 17 | kneeRight | right knee |
| 18 | ankleRight | right ankle |
| 19 | footRight | right foot |
| 20 | spineShoulder | spine at shoulder level |
| 21 | handTipLeft | left fingertips (index to little finger) |
| 22 | thumbLeft | left thumb |
| 23 | handTipRight | right fingertips |
| 24 | thumbRight | right thumb |
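For debugging it helps to log readable joint names instead of raw indices. The lookup below is our own convenience built from the table above, not part of node-kinect2:

```javascript
// Joint names indexed by JointType, following the table above.
const JOINT_NAMES = [
  'spineBase', 'spineMid', 'neck', 'head',
  'shoulderLeft', 'elbowLeft', 'wristLeft', 'handLeft',
  'shoulderRight', 'elbowRight', 'wristRight', 'handRight',
  'hipLeft', 'kneeLeft', 'ankleLeft', 'footLeft',
  'hipRight', 'kneeRight', 'ankleRight', 'footRight',
  'spineShoulder', 'handTipLeft', 'thumbLeft', 'handTipRight', 'thumbRight',
];

// Map a numeric joint type to its name; out-of-range values fall back to 'unknown'.
function jointName(jointType) {
  return JOINT_NAMES[jointType] || 'unknown';
}
```

For example, `jointName(1)` returns `'spineMid'`, the joint the drawing code highlights.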
3. Hand states. In my tests recognition is not very accurate; use it only where precision requirements are low.
| Value | State | Meaning |
| --- | --- | --- |
| 0 | unknown | unrecognizable |
| 1 | notTracked | not detected |
| 2 | open | open palm |
| 3 | closed | fist |
| 4 | lasso | "scissors" pose (index and middle fingers together) |
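Since recognition is flaky, one practical mitigation (our own suggestion, not something node-kinect2 provides) is to debounce: only accept a hand state once it has been reported for several consecutive frames. A minimal sketch:

```javascript
// Hand-state names indexed by the codes in the table above.
const HAND_STATES = ['unknown', 'notTracked', 'open', 'closed', 'lasso'];

// Returns a per-hand filter: feed it the raw state code each frame,
// and it yields the state name only after `requiredFrames` identical
// readings in a row; otherwise it yields null.
function makeHandStateFilter(requiredFrames) {
  let last = -1;
  let count = 0;
  return function (stateCode) {
    if (stateCode === last) {
      count++;
    } else {
      last = stateCode;
      count = 1;
    }
    return count >= requiredFrames ? HAND_STATES[stateCode] : null;
  };
}
```

With `requiredFrames` set to 3, a single spurious `closed` frame in the middle of an `open` run is ignored.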
4. Skeleton data
body [object] {
    bodyIndex [number]: index; up to 6 players can be tracked
    joints [array]: skeleton joints, including position and color-space information
    leftHandState [number]: left-hand state
    rightHandState [number]: right-hand state
    tracked [boolean]: whether the body is tracked
    trackingId
}
5. The kinect object
| Method | Description |
| --- | --- |
| on | listen for data |
| open | open the Kinect |
| close | close it |
| openBodyReader | read skeleton data |
| open**Reader | analogous methods for reading the other data types |
Next, I will summarize some of the problems we ran into while developing the TGC2016 Naruto Mobile somatosensory game.
1. Before the walkthrough, let's look at the game flow.
1.1. A hand gesture triggers the start of the game
1.2. The player, as the Fourth Hokage, runs left and right to dodge the Nine-Tails' attacks
1.3. Making the "Secret Technique" gesture triggers the Fourth's ultimate move
1.4. The player scans a QR code to get their on-site photo
2. Server side
The game needs the player's skeleton data (movement, gestures) and the color image data (a photo is taken when a certain gesture is made), so we have to send both to the client. Note that the color image data is very large and needs to be compressed.
```js
var emitColorFrame = false;
io.sockets.on('connection', function(socket) {
    socket.on('startColorFrame', function(data) {
        emitColorFrame = true;
    });
});
kinect.on('multiSourceFrame', function(frame) {
    // Send the player's skeleton data
    io.sockets.emit('bodyFrame', frame.body);
    // Take the player's photo
    if (emitColorFrame) {
        var compression = 1;
        var origWidth = 1920;
        var origHeight = 1080;
        var origLength = 4 * origWidth * origHeight;
        var compressedWidth = origWidth / compression;
        var compressedHeight = origHeight / compression;
        var resizedLength = 4 * compressedWidth * compressedHeight;
        var resizedBuffer = new Buffer(resizedLength);
        // ...
        // The photo data is too large; compress it to improve transfer performance
        zlib.deflate(resizedBuffer, function(err, result) {
            if (!err) {
                var buffer = result.toString('base64');
                io.sockets.emit('colorFrame', buffer);
            }
        });
        emitColorFrame = false;
    }
});
kinect.openMultiSourceReader({
    frameTypes: Kinect2.FrameType.body | Kinect2.FrameType.color
});
```
3. Client side
The client-side logic is fairly complex, so we will walk through the key steps.
3.1. When the photo is taken, the amount of data to process is large; to keep the page from stuttering, we use a Web Worker:
```js
(function() {
    importScripts('pako.inflate.min.js');
    var imageData;
    function init() {
        addEventListener('message', function(event) {
            switch (event.data.message) {
                case "setImageData":
                    imageData = event.data.imageData;
                    break;
                case "processImageData":
                    processImageData(event.data.imageBuffer);
                    break;
            }
        });
    }
    function processImageData(compressedData) {
        var imageBuffer = pako.inflate(atob(compressedData));
        var pixelArray = imageData.data;
        var newPixelData = new Uint8Array(imageBuffer);
        var imageDataSize = imageData.data.length;
        for (var i = 0; i < imageDataSize; i++) {
            imageData.data[i] = newPixelData[i];
        }
        for (var x = 0; x < 1920; x++) {
            for (var y = 0; y < 1080; y++) {
                var idx = (x + y * 1920) * 4;
                var r = imageData.data[idx + 0];
                var g = imageData.data[idx + 1];
                var b = imageData.data[idx + 2];
            }
        }
        self.postMessage({ "message": "imageReady", "imageData": imageData });
    }
    init();
})();
```
3.2. When connected to a projector, a large rendered area can cause a white screen; disabling the browser's hardware acceleration fixes this.
3.3. The venue lighting is dim and other players can interfere, so the tracked trajectory may jitter. We need to filter out the interference: when a suddenly large displacement appears, drop that sample.
```js
var tracks = this.tracks;
var len = tracks.length;
// Filter out interference
if (tracks[len - 1] !== window.undefined) {
    if (Math.abs(n - tracks[len - 1]) > 0.2) {
        return;
    }
}
this.tracks.push(n);
```
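The same filtering rule can be packaged as a pure function, which makes it easy to test outside the game loop (the 0.2 threshold matches the snippet above; the function itself is our own refactoring, not code from the game):

```javascript
// Append sample `n` to `tracks` unless it jumps more than `threshold`
// from the previous sample, in which case it is treated as interference
// and dropped. Returns the (possibly unchanged) tracks array.
function pushFiltered(tracks, n, threshold) {
  const last = tracks[tracks.length - 1];
  if (last !== undefined && Math.abs(n - last) > threshold) {
    return tracks; // interference: discard this sample
  }
  tracks.push(n);
  return tracks;
}
```

For example, after a sample at 0.5, a reading of 0.9 is rejected while 0.6 is accepted.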
3.4. When the player is standing and only swaying slightly left and right, we treat them as standing still.
```js
// Keep the last 5 samples
if (this.tracks.length > 5) {
    this.tracks.shift();
} else {
    return;
}
// Total displacement
var dis = 0;
for (var i = 1; i < this.tracks.length; i++) {
    dis += this.tracks[i] - this.tracks[i - 1];
}
if (Math.abs(dis) < 0.01) {
    this.stand();
} else {
    if (this.tracks[4] > this.tracks[3]) {
        this.turnRight();
    } else {
        this.turnLeft();
    }
    this.run();
}
```
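The stand-or-run decision can likewise be extracted into a pure function over the last five samples: total displacement under 0.01 counts as standing, otherwise the sign of the most recent step picks the running direction. This is our own testable restatement of the logic, with the game's `stand()`/`run()` side effects replaced by a return value:

```javascript
// Classify the player's motion from a track of horizontal positions.
// Returns null until 5 samples are available, then 'stand', 'runRight',
// or 'runLeft' using the thresholds from the game code above.
function classifyMotion(tracks) {
  if (tracks.length < 5) return null; // not enough data yet
  const win = tracks.slice(-5);
  // Total displacement across the window (telescopes to last - first).
  let dis = 0;
  for (let i = 1; i < win.length; i++) {
    dis += win[i] - win[i - 1];
  }
  if (Math.abs(dis) < 0.01) return 'stand';
  return win[4] > win[3] ? 'runRight' : 'runLeft';
}
```

Small symmetric sway cancels out in the sum, so the player is not flagged as running.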
1. Developing Kinect somatosensory games with HTML5 lowers the technical barrier: front-end engineers can build somatosensory games with ease;
2. Plenty of frameworks can be applied, such as jQuery, CreateJS, or Three.js (three different rendering approaches);
3. The room for imagination is unlimited: picture somatosensory games combined with WebAR, Web Audio, or mobile devices. There is so much to explore; isn't it exciting just to think about?