Leap Motion Development (4): Leap Motion + Unity3D Scene Roaming

Gestures are recognized with the Leap Motion controller (hereinafter LM) and used to rotate the Unity3D scene to the left or right and to teleport within it, providing scene roaming.

Results

(image: demo animations)
Raise the little finger of the left hand to rotate the scene to the left; raise the little finger of the right hand to rotate the scene to the right. Make a thumbs-up ("like") gesture with the left hand, and a ray is cast from the direction of the hand into the scene, with a yellow ball generated at the landing point to mark it. When the thumbs-up gesture is released (that is, when the left hand changes to any other gesture), the camera teleports to the point where the ray landed. A check is made here so that teleportation is only possible on the floor object named "Ground".

Explanation

(image: scene hierarchy)

MainCamera and LMProvider

Mount the Leap XR Service Provider on the Main Camera (this setup is for an LM mounted on a VR headset or glasses; for desktop use, just create a new empty GameObject and mount LeapServiceProvider on it instead). Under Advanced, set Device Offset Mode to Transform so that the provider follows the camera: when the camera moves, the provider's position changes with it. If Default is selected instead, the provider stays at the camera's original position when the camera moves, so the recognized and rendered hands also stay behind and no longer appear in front of the camera.
DeviceOrigin can be assigned a child Transform under MainCamera to apply an offset (in testing it also works fine without one).
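A hedged sketch of the same setup done from code, for configuring the provider at runtime instead of in the Inspector. The property names `deviceOffsetMode` and `deviceOrigin` are assumptions based on the Leap Unity Modules version I have seen and may differ between SDK versions.

    using Leap.Unity;
    using UnityEngine;

    // Attach to a child object of the Main Camera that serves as the device origin.
    public class ProviderSetup : MonoBehaviour
    {
        void Awake()
        {
            var provider = Camera.main.GetComponent<LeapXRServiceProvider>();
            // Same as choosing Transform under Advanced > Device Offset Mode
            provider.deviceOffsetMode = LeapXRServiceProvider.DeviceOffsetMode.Transform;
            provider.deviceOrigin = transform; // the offset Transform under MainCamera
        }
    }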

HandModel

HandModelManager has the HandModelManager script attached, and the hand models use the LoPolyRiggedHand prefabs.

Scene rotation function

RotateHandCheck uses the ExtendedFingerDetector from the Unity SDK provided by LM (usually ExtendedFingerDetector, FingerDirectionDetector, and DetectorLogicGate are combined to recognize gestures, but the gestures here are simple, so only ExtendedFingerDetector is used).
ExtendedFingerDetector recognizes simple gestures by detecting whether each of the five fingers is extended or bent.
(image: detector configuration for the monitored gesture)
For example, the configuration above monitors a gesture in which the little finger is extended and the other four fingers are bent. When the gesture is detected, the functions bound to OnActivate are triggered; when the gesture is released, the functions bound to OnDeactivate are triggered.
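Instead of wiring the callbacks in the Inspector, OnActivate and OnDeactivate can also be subscribed from code, since the Detector base class exposes them as UnityEvents. A minimal sketch (the `detector` field and handler names here are my own, not from the SDK):

    using Leap.Unity;
    using UnityEngine;

    public class PinkyGestureBinder : MonoBehaviour
    {
        public ExtendedFingerDetector detector; // assign in the Inspector

        void OnEnable()
        {
            detector.OnActivate.AddListener(OnPinkyUp);
            detector.OnDeactivate.AddListener(OnPinkyDown);
        }

        void OnDisable()
        {
            detector.OnActivate.RemoveListener(OnPinkyUp);
            detector.OnDeactivate.RemoveListener(OnPinkyDown);
        }

        void OnPinkyUp()   { Debug.Log("pinky gesture started"); }
        void OnPinkyDown() { Debug.Log("pinky gesture ended"); }
    }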
ExtendedFingerDetector inherits from Detector (provided by LM) and implements its virtual functions Activate and Deactivate. These can be overloaded with different parameter lists, and the overloads can be called from OnActivate and OnDeactivate. (However, a custom type such as Hand cannot be passed as a parameter. A likely reason is that OnActivate and OnDeactivate are UnityEvents: events wired in the Inspector only accept parameters of type int, float, string, bool, or UnityEngine.Object, and Hand is a plain C# class.)

    public virtual void Activate() {
      if (!IsActive) {
        _isActive = true;
        OnActivate.Invoke();
      }
    }

    // Overload with a different parameter list — this works
    public virtual void Activate(int x) {
      if (!IsActive) {
        _isActive = true;
        OnActivate.Invoke();
      }
    }

    // Does not work: Hand cannot be used as the parameter
    public virtual void Activate(Hand hand) {
      if (!IsActive) {
        _isActive = true;
        OnActivate.Invoke();
      }
    }

Add two ExtendedFingerDetectors in RotateHandLogic to monitor the left and right hands respectively, and have them call methods in your own RotateLogic script once the specific gestures are detected.
RotateActivate receives 1 (left hand) or 2 (right hand) as its argument and flips RotateSpeed to a negative or positive value so that the camera turns left or right. RotateActivate then schedules StartRotate with InvokeRepeating at fifty calls per second (every 0.02 s).

    public void RotateActivate(int index)
    {
        // index: 1 = left hand, 2 = right hand
        if (index == 1 && rotateSpeed > 0)
        {
            rotateSpeed = -rotateSpeed;
        }
        if (index == 2 && rotateSpeed < 0)
        {
            rotateSpeed = -rotateSpeed;
        }
        Debug.Log("Start rotating: " + index);
        InvokeRepeating("StartRotate", 0, 0.02f);
    }

    private void StartRotate()
    {
        camera.transform.Rotate(0, rotateSpeed * Time.deltaTime, 0, Space.Self);
    }

    public void RotateDeActivate()
    {
        Debug.Log("Rotation stopped");
        CancelInvoke("StartRotate");
    }

It is worth noting that the response to the gesture can also be completed directly inside ExtendedFingerDetector (the Period, i.e. the detection interval, then needs to be set to a smaller value in the Inspector; with the default of 0.1 s, responding to the gesture there directly looks choppy).

    if (HandModel.IsTracked && fingerState)
    {
        Activate();
        // Responding to the gesture directly in this function is also possible
    }
    else if (!HandModel.IsTracked || !fingerState)
    {
        Deactivate();
        // Debug.Log("gesture ended");
    }

In addition, there is no need to check in code whether a hand is the left or the right hand, because the hand is already specified in the Inspector. The following statement obtains the corresponding hand directly; once the hand is obtained in ExtendedFingerDetector, HandModelBase can be used in other scripts to read the hand's pose information.

    public HandModelBase HandModel = null;

    if (HandModel != null && HandModel.IsTracked)
    {
        Hand hand = HandModel.GetLeapHand();
        if (hand != null)
        {
            // corresponding operations on the hand
        }
    }
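Once the Leap `Hand` is obtained this way, its pose can be read in any script. A small sketch under the assumption that the standard Hand API of the LM C# SDK is available (`PalmPosition`, `Direction`, `GrabStrength`):

    Hand hand = HandModel.GetLeapHand();
    if (hand != null)
    {
        Vector3 palm = hand.PalmPosition.ToVector3(); // palm position in Unity space
        Vector3 dir  = hand.Direction.ToVector3();    // from the palm toward the fingers
        float grab   = hand.GrabStrength;             // 0 = open hand, 1 = fist
        Debug.Log("palm " + palm + " direction " + dir + " grab " + grab);
    }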

Displacement in the scene

After testing, the thumbs-up gesture was chosen as the command gesture for displacement. Slightly more complex gestures, such as the "six meridian divine sword" pointing gesture or the pistol gesture, are prone to misdetection during monitoring because of self-occlusion, which makes the scene teleport uncontrollably. So with this gesture-monitoring mechanism, displacement should use the simplest gestures possible.
When the thumbs-up gesture on the left hand is detected, the method TransActivate in this script is called: a ray is cast from the direction of the hand, and a small ball marks the landing point. When the thumbs-up gesture is released, the method TransDeActivate is called to move the camera.
The ray is drawn with a LineRenderer component. Because no material is assigned, the configured color is not displayed and the line appears magenta, the color Unity uses for a missing material.
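Since the magenta color indicates a missing material, assigning any simple material to the LineRenderer should make the configured colors show. A hedged sketch (the field name `line` follows the script below):

    // Give the LineRenderer a material so its colors render;
    // Sprites/Default is a built-in shader that supports vertex colors.
    line.material = new Material(Shader.Find("Sprites/Default"));
    line.startColor = Color.yellow;
    line.endColor = Color.yellow;
    line.startWidth = 0.01f;
    line.endWidth = 0.01f;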

    // This function starts the repeated ray casting
    public void TransActivate()
    {
        // No need to check whether this is the left or right hand:
        // the public HandModelBase HandModel is set to the left hand,
        // so HandModel can only ever return the left hand, never the right.
        //for (int i = 0; i < 2; i++)
        //{
        //    Frame curFrame = LeapProvider.CurrentFrame.TransformedCopy(LeapTransform.Identity);
        //}
        if (HandModel != null && HandModel.IsTracked)
        {
            hand = HandModel.GetLeapHand();
            if (hand != null)
            {
                InvokeRepeating("CastRay", 0, 0.02f);
            }
        }
    }

    private void CastRay()
    {
        Destroy(rayPoint);

        line.enabled = true;
        // Ray ray = new Ray(hand.Fingers[1].Bone(Bone.BoneType.TYPE_METACARPAL).NextJoint.ToVector3(), hand.Fingers[1].Direction.ToVector3());
        Ray ray = new Ray(hand.StabilizedPalmPosition.ToVector3(), hand.Direction.ToVector3());
        RaycastHit hit;
        Physics.Raycast(ray, out hit);
        line.SetPositions(new Vector3[] {
            hand.Fingers[1].Bone(Bone.BoneType.TYPE_METACARPAL).NextJoint.ToVector3(), hit.point });

        rayPoint = Object.Instantiate(_rayPoint);
        rayPoint.transform.position = hit.point;

        // Check that hit.transform exists before reading its info
        if (hit.transform != null && hit.transform.name == "Ground")
        {
            Debug.Log("Pointing at the ground!");
            transLoc = hit.point;
        }

        // OnRayCast(hand.StabilizedPalmPosition.ToVector3());
    }



    public void TransDeActivate()
    {
        if (rayPoint != null)
        {
            Destroy(rayPoint);
        }
        CancelInvoke("CastRay");

        if (transLoc != Vector3.zero)
        {
            transLoc.y = 1.8f;
            Vector3 transDis = transLoc - camera.transform.position;
            camera.transform.position = transLoc;
            // hand.SetTransform(transLoc, Vector3.zero);
            Debug.Log("The camera moved; its position is: " + camera.transform.position);
            Debug.Log("Position of the hand / leap provider: " + hand.PalmPosition);
            // Debug.Log("Position of the Leap: " + leap.transform.position);
        }
        else
        {
            Debug.Log("Was not pointing at the ground before");
        }
    }

BUG

The ball moves toward the user along the ray.
There is a stray line segment in the scene, and the line does not render with its configured color, appearing magenta for lack of a material (I am not yet very familiar with the LineRenderer component).

Note: in use, I am not sure whether it is because the software version is too old or a hardware interface problem, but there were cases where connecting with version 4.1.0 of the software failed while the latest software connected successfully. In that situation the Leap Motion cannot be connected in this program and a "Leap Service not connected" error appears (probably because the SDK version is too old and only supports the old version's connection).


Source: blog.csdn.net/qq_39006214/article/details/124881093