Snapdragon Spaces Development Guide (8)


5.4.3 XR Interaction Toolkit Example

This sample demonstrates the usage of the action-based input system and XR Interaction Toolkit components. For basic information on the new input system and the XR Interaction Toolkit package, see the Unity documentation.

How the example works

This sample demonstrates how to interact with UI and other game objects in the scene.

Floating UI panels provide interactive common UI elements, such as buttons and scroll bars. While buttons work with gaze and device pointers, the scroll bars in the example only work with device pointers. Additionally, another UI element in the scene shows input from the host controller's touchpad.

Finally, the interactive cube object can be grabbed using the device pointer to demonstrate 3D object interaction. To be interactable, the object needs the following components (a setup sketch follows the list):

  • Collider
  • Rigidbody
  • XR Grab Interactable (see XR Interaction Manager)
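
The following is a minimal setup sketch, not code from the sample: it shows how a cube could be given these three components at runtime. The class name GrabbableCubeSetup is hypothetical.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class GrabbableCubeSetup : MonoBehaviour {
    private void Start() {
        // Collider: required so the interactor's ray cast can hit the object.
        gameObject.AddComponent<BoxCollider>();

        // Rigidbody: required by XR Grab Interactable; kinematic so gravity
        // does not pull the cube away while it is not being held.
        var body = gameObject.AddComponent<Rigidbody>();
        body.isKinematic = true;

        // XR Grab Interactable: registers the object with the XR Interaction
        // Manager so it can be grabbed via the device pointer.
        gameObject.AddComponent<XRGrabInteractable>();
    }
}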

device pointer controller

The device pointer prefab included in the sample consists of three main components:

  • The XR Controller (action-based) component is responsible for receiving input from XR input devices. In the example, the input action references from the action map are assigned to Position, Rotation, Select, Activate, and UI Press, as shown in the figure below.
  • The XR Ray Interactor is one of several types of interactors. Using ray casts, it can interact with Unity UI elements as well as interactable objects in the scene. The XR Ray Interactor should reference the current XR Interaction Manager in the scene.
  • The XR Interactor Line Visual and its corresponding Line Renderer component render a line from the controller's origin along the controller's pointing direction. By default the line is colored red, but as soon as it hits a valid interactable object, it changes to green; a sketch of configuring these colors follows below.
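
The red/green behavior described above can be configured through the XR Interactor Line Visual's gradient properties. Below is a minimal sketch, assuming the component sits on the same GameObject; the class name PointerLineColors is hypothetical.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class PointerLineColors : MonoBehaviour {
    private void Start() {
        var lineVisual = GetComponent<XRInteractorLineVisual>();
        lineVisual.invalidColorGradient = SolidGradient(Color.red);  // no valid target hit
        lineVisual.validColorGradient = SolidGradient(Color.green);  // hovering a valid interactable
    }

    // Builds a single-color, fully opaque gradient for the line visual.
    private static Gradient SolidGradient(Color color) {
        var gradient = new Gradient();
        gradient.SetKeys(
            new[] { new GradientColorKey(color, 0f), new GradientColorKey(color, 1f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(1f, 1f) });
        return gradient;
    }
}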

5.5 Composition Layer Components

The Interaction Prefab contains a Gaze Pointer prefab that utilizes a composition layer to render its content.

Gaze Pointer

The gaze pointer consists of the following game objects:


The XR Gaze Interactor game object has a Spaces Composition Layer component, which creates a quad layer for displaying view-locked content. Combined with the UI Overlay Camera GameObject, this renders the gaze pointer UI elements in a way that improves pointer stability and clarity, at the cost of some performance.

Supported Renderers
The Spaces Composition Layer component is only supported when using OpenGL ES 3 as the graphics API, not Vulkan. The currently selected graphics API can be viewed under Project Settings > Player > Other Settings > Rendering > Graphics APIs.


  • Layer Textures: The textures that will be rendered to the view-locked quad layer.
  • Extent: The size (in meters) of the quad layer to be rendered.
  • Orientation: The orientation of the quad layer relative to the main camera's view. The quad only has a front face and will not be visible if it faces away from the main camera.
  • Position: The position of the quad layer relative to the main camera's view.
  • Sort Order: The rendering order of quad layers. Lower numbers are composited beneath higher numbers. See sort order.

Configuration example

In the Snapdragon Spaces SDK sample, the view-locked gaze pointer is configured as follows:

  • The Spaces Composition Layer consists of a 10 cm × 10 cm quad, composited at a position 2 m in front of the main camera.
  • It draws the contents of the UI Overlay render texture onto this quad.
  • The content of this texture is captured by the UI Overlay Camera via its Target Texture field.
  • The camera uses its Culling Mask to capture only content on the UI Overlay layer.
  • The only elements on the UI Overlay layer are the Reticle Canvas GameObject and its children.
  • The Render Camera property of the Reticle Canvas is set to the UI Overlay Camera.

Spaces Composition Layer components can be configured with static or dynamic textures as needed. Disabling or enabling the component hides or shows the content it renders, as in the sketch below.
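
This guide does not give the component's C# class name; the sketch below assumes it is SpacesCompositionLayer in the Qualcomm.Snapdragon.Spaces namespace. It simply toggles the component to show or hide the rendered quad.

using Qualcomm.Snapdragon.Spaces; // assumed namespace of the component
using UnityEngine;

public class GazePointerVisibility : MonoBehaviour {
    // Assumed class name for the Spaces Composition Layer component.
    public SpacesCompositionLayer CompositionLayer;

    public void SetPointerVisible(bool visible) {
        // Disabling the component hides the content it renders;
        // re-enabling it shows the content again.
        CompositionLayer.enabled = visible;
    }
}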

5.6 Perception Samples

5.6.1 Anchor

This example demonstrates how to create and destroy local anchors to accurately track points in the real world. For basic information on anchors and the AR Foundation AR Anchor Manager component, see the Unity documentation. In order to use this feature, it must be enabled in the OpenXR plug-in settings located under Project Settings > XR Plug-in Management > OpenXR (> Android Tab).

How the example works

First, make sure Spatial Anchors is enabled in the OpenXR project settings.

For placement, a transparent placement gizmo floats at a distance of 1 meter from the center of the camera. Each frame, a ray is cast forward from the center of the head to position the gizmo relative to real-world planes, and the placement gizmo turns yellow if a hit is detected (a sketch of this ray cast follows the snippet below). After tapping the touchpad on the host controller, or interacting via gaze with the UI panel (visible if the gaze interactor is selected), an empty GameObject and an ARAnchor GameObject are instantiated. The empty GameObject carries a transparent gizmo mesh that the AR session will track. The anchor's gizmo is updated via the ARAnchorManager's anchorsChanged event to indicate its tracking state:

public GameObject GizmoTrackedAnchor;
public GameObject GizmoUntrackedAnchor;

private void Start() {
    // Subscribe to anchor changes reported by the ARAnchorManager in the scene.
    FindObjectOfType<ARAnchorManager>().anchorsChanged += OnAnchorsChanged;
}

private void OnAnchorsChanged(ARAnchorsChangedEventArgs args) {
    foreach (var anchor in args.added) {
        ...
    }

    foreach (var anchor in args.updated) {
        // Replace the anchor's child gizmo to reflect its current tracking state.
        Destroy(anchor.transform.GetChild(0).gameObject);
        var newGizmo = Instantiate(anchor.trackingState == TrackingState.None ? GizmoUntrackedAnchor : GizmoTrackedAnchor);
        newGizmo.transform.SetParent(anchor.transform, false);
    }

    foreach (var anchor in args.removed) {
        ...
    }
}
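
The per-frame placement described above can be sketched as follows. This is assumed logic rather than the sample's exact code, and the class and field names are hypothetical; ARRaycastManager performs the ray cast against detected planes.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class PlacementGizmo : MonoBehaviour {
    public ARRaycastManager RaycastManager; // assumed Inspector references
    public Transform Gizmo;
    public Renderer GizmoRenderer;

    private static readonly List<ARRaycastHit> Hits = new List<ARRaycastHit>();

    private void Update() {
        var head = Camera.main.transform;
        var ray = new Ray(head.position, head.forward);

        if (RaycastManager.Raycast(ray, Hits, TrackableType.Planes)) {
            // Snap the gizmo onto the real-world plane and tint it yellow.
            Gizmo.position = Hits[0].pose.position;
            GizmoRenderer.material.color = Color.yellow;
        } else {
            // No hit: float 1 meter in front of the camera in the default color.
            Gizmo.position = head.position + head.forward;
            GizmoRenderer.material.color = Color.white;
        }
    }
}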

Destroy All Anchors

All anchors and gizmos can be destroyed with a single click on the UI. The delete command is issued with a short delay to prevent new anchors from being created with the Select button after everything should have been deleted. A minimal sketch follows.
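
The sketch below assumes the method is wired to the UI button's OnClick event; the 0.5 second delay is an illustrative value, not the sample's.

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorCleaner : MonoBehaviour {
    // Wired to the "Destroy All Anchors" UI button.
    public void DestroyAllAnchors() {
        StartCoroutine(DestroyAfterDelay(0.5f));
    }

    private IEnumerator DestroyAfterDelay(float seconds) {
        // Delay so the Select press that triggered the click cannot
        // immediately create a new anchor.
        yield return new WaitForSeconds(seconds);
        foreach (var anchor in FindObjectsOfType<ARAnchor>()) {
            Destroy(anchor.gameObject);
        }
    }
}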

Spaces Anchor Store

WARNING
Make sure to look around your environment to generate better tracking maps and reduce save and load times. Saving multiple anchors at once blocks the main thread, so a callback should be used to save any subsequent anchors.

By adding the Spaces Anchor Store component alongside the AR Anchor Manager, anchors can be saved locally so that they can be identified and tracked in future sessions. The component provides the following API for loading and saving anchors, deleting saved anchors, and clearing the local anchor storage.

namespace Qualcomm.Snapdragon.Spaces
{
    public class SpacesAnchorStore
    {
        public void ClearStore();

        public void SaveAnchor(ARAnchor anchor, string anchorName, Action<bool> onSavedCallback = null);
        public void SaveAnchor(ARAnchor anchor, Action<bool> onSavedCallback = null);

        public void DeleteSavedAnchor(string anchorName);

        public void LoadSavedAnchor(string anchorName, Action<bool> onLoadedCallback = null);
        public void LoadAllSavedAnchors(Action<bool> onLoadedCallback = null);

        public string[] GetSavedAnchorNames();
        public string GetSavedAnchorNameFromARAnchor(ARAnchor anchor);
    }
}

Before this information is moved to the scripting API, here is a short description of the methods:

  • ClearStore
    • Clears the local anchor storage.
  • SaveAnchor
    • Saves an ARAnchor object under a given name or a generated hash. A callback can be invoked upon completion.
  • DeleteSavedAnchor
    • Deletes a saved anchor by name from local storage.
  • LoadSavedAnchor
    • Loads an anchor from local storage and attempts to position it in the scene. If the anchor is found, an ARAnchor object is instantiated. Loaded anchors will be listed in the added list of the ARAnchorManager's anchorsChanged event. The names of saved anchors can be retrieved using GetSavedAnchorNames. A callback can be invoked upon completion (see the sketch after this list).
  • LoadAllSavedAnchors
    • Loads all anchors from storage and tries to find them in the scene. As with LoadSavedAnchor, an ARAnchor object is instantiated upon recognition.
  • GetSavedAnchorNames
    • Returns the names of all saved anchors.
  • GetSavedAnchorNameFromARAnchor
    • If the tracked ARAnchor object was previously saved, this method returns its name; otherwise it returns an empty string. This can be used to check whether an anchor has been saved.
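
A minimal usage sketch based on the method list above; the Inspector reference and the logging are assumptions.

using Qualcomm.Snapdragon.Spaces;
using UnityEngine;

public class AnchorLoader : MonoBehaviour {
    public SpacesAnchorStore AnchorStore; // assumed Inspector reference

    public void LoadEverySavedAnchor() {
        foreach (var anchorName in AnchorStore.GetSavedAnchorNames()) {
            // Once found, loaded anchors show up in the 'added' list of the
            // ARAnchorManager's anchorsChanged event.
            AnchorStore.LoadSavedAnchor(anchorName, success =>
                Debug.Log("Anchor '" + anchorName + "' loaded: " + success));
        }
    }
}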

Save, delete and load anchors in the example

By enabling the Save new anchors to local store toggle, every newly created anchor is saved to the app's local storage. This means that, as long as an AR Anchor Manager exists in the scene, a saved anchor can be recreated and tracked like any other regular anchor. To differentiate regular anchors from saved anchors, an additional cube mesh is generated at the center of saved anchors: a red cube means the saved anchor is not being tracked, a white cube means it is. Clicking Load All Saved Anchors loads all anchors in local storage, and the underlying functionality will try to find them. Clicking Clear Store, on the other hand, deletes all anchors saved in local storage; this action does not destroy any existing anchors loaded from the store. Because saving blocks the main thread (see the warning above), subsequent saves should be chained in the completion callback, as in the sketch below.
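
A minimal sketch using the SaveAnchor overloads listed earlier; the class and field names are hypothetical.

using Qualcomm.Snapdragon.Spaces;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorSaver : MonoBehaviour {
    public SpacesAnchorStore AnchorStore; // assumed Inspector reference

    public void SaveSequentially(ARAnchor first, ARAnchor second) {
        AnchorStore.SaveAnchor(first, "FirstAnchor", saved => {
            if (!saved) {
                Debug.LogWarning("Failed to save FirstAnchor.");
                return;
            }
            // Start the second save only after the first has completed,
            // since saving blocks the main thread.
            AnchorStore.SaveAnchor(second, "SecondAnchor", savedSecond =>
                Debug.Log("SecondAnchor saved: " + savedSecond));
        });
    }
}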

5.6.2 Hand Tracking

Warning
The Spaces Hand component has been deprecated in favor of components from the QCHT package and will be removed in a future release. The hand tracking scene currently included in the sample application comes from the QCHT package, which must be imported for the sample to work properly.

Spaces Hand Manager

Warning
The Spaces Hand Manager component has been deprecated in favor of components from the QCHT package and will be removed in a future release.

The Spaces Hand Manager component is of type ARTrackableManager and is programmed like all other AR Foundation managers: by providing callback functions to receive changes in the form of added, updated, and removed items.

public void Start() {
    spacesHandManager.handsChanged += OnHandsChanged;
}

...

private void OnHandsChanged(SpacesHandsChangedEventArgs args) {
    foreach (var hand in args.added) {
        ...
    }

    foreach (var hand in args.updated) {
        ...
    }

    foreach (var hand in args.removed) {
        ...
    }
}

It also provides an Inspector field to define the default prefab that should be instantiated when hands are detected. The example prefab, Default Spaces Hand, consists of two additional components, as shown in the image below. The following sections describe them in detail.

Spaces Hand Component

Warning
The Spaces Hand component has been deprecated in favor of components from the QCHT package and will be removed in a future release.

This component is a generic interface for retrieving all hand-related data. It is of type ARTrackable and thus has common properties such as TrackableId, TrackingState, and Pose, the latter being defined by the wrist joint of the tracked hand.

It also provides three additional properties:

  • IsLeft is a boolean that returns true if the tracked hand is the left hand and false otherwise.
  • Joints is an array of Qualcomm.Snapdragon.Spaces.SpacesHand.Joint items. This type has the following properties:
    • Pose is a Unity Pose type that returns the pose of the hand joint.
    • Type is of type Qualcomm.Snapdragon.Spaces.SpacesHand.JointType and returns an enum value identifying the hand joint.
  • Gesture is of type Qualcomm.Snapdragon.Spaces.SpacesHand.Gesture and has the following properties:
    • Type is of type Qualcomm.Snapdragon.Spaces.SpacesHand.GestureType and returns an enum value representing the detected hand gesture.
    • GestureRatio is a floating-point value between 0 and 1 indicating the degree to which the gesture is applied.
    • FlipRatio is a floating-point value between -1 and 1 indicating whether the gesture was detected from the back (-1), from the front (1), or somewhere in between.

For more information on gestures, see the Interaction Gestures documentation.

namespace Qualcomm.Snapdragon.Spaces.SpacesHand
{
    public enum JointType
    {
        PALM = 0,
        WRIST = 1,
        THUMB_METACARPAL = 2,
        THUMB_PROXIMAL = 3,
        THUMB_DISTAL = 4,
        THUMB_TIP = 5,
        INDEX_METACARPAL = 6,
        INDEX_PROXIMAL = 7,
        INDEX_INTERMEDIATE = 8,
        INDEX_DISTAL = 9,
        INDEX_TIP = 10,
        MIDDLE_METACARPAL = 11,
        MIDDLE_PROXIMAL = 12,
        MIDDLE_INTERMEDIATE = 13,
        MIDDLE_DISTAL = 14,
        MIDDLE_TIP = 15,
        RING_METACARPAL = 16,
        RING_PROXIMAL = 17,
        RING_INTERMEDIATE = 18,
        RING_DISTAL = 19,
        RING_TIP = 20,
        LITTLE_METACARPAL = 21,
        LITTLE_PROXIMAL = 22,
        LITTLE_INTERMEDIATE = 23,
        LITTLE_DISTAL = 24,
        LITTLE_TIP = 25
    }
}

namespace Qualcomm.Snapdragon.Spaces.SpacesHand
{
    public enum GestureType
    {
        UNKNOWN = -1,
        OPEN_HAND = 0,
        //FLIP = 1,
        GRAB = 2,
        //UP = 3,
        //DOWN = 4,
        //SWIPE = 5,
        //SWIPE_OUT = 6,
        PINCH = 7,
        POINT = 8,
        VICTORY = 9,
        //CALL = 10,
        METAL = 11
    }
}
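
As an illustration of the data above, the following sketch assumes the component class is SpacesHand with nested Gesture and GestureType members as laid out in this section; that layout is inferred from the property descriptions, not confirmed API. It logs pinch gestures every frame.

using Qualcomm.Snapdragon.Spaces;
using UnityEngine;

public class HandGestureLogger : MonoBehaviour {
    private SpacesHand _hand; // assumed class name of the deprecated Spaces Hand component

    private void Start() {
        _hand = GetComponent<SpacesHand>();
    }

    private void Update() {
        // Gesture.Type and GestureRatio as described in the property list above.
        if (_hand.Gesture.Type == SpacesHand.GestureType.PINCH) {
            Debug.Log((_hand.IsLeft ? "Left" : "Right") +
                      " hand pinch, ratio " + _hand.Gesture.GestureRatio);
        }
    }
}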

Spaces Hand Joint Visualizer Component

WARNING
The Spaces Hand Joint Visualizer component has been deprecated in favor of components from the QCHT package and will be removed in a future release.

This component provides several properties to change the appearance of the joint visualization:

  • JointMesh is the mesh that should be instantiated for each joint.
  • JointMaterial is the material that should be applied to the mesh.
  • JointMeshScale is a float value between 0.005 and 0.05 that defines the scaling applied to the mesh.
  • UseNormalizedColors is a boolean value. If set to true, the component colors the joints by writing to the _Color property of the applied material's shader.

In the example, the simple sphere mesh included with UnityEngine is set as the JointMesh and the default material is set as the JointMaterial.
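
A sketch mirroring that configuration at runtime; the class name SpacesHandJointVisualizer and the ability to assign these properties from script are assumptions based on the property list above.

using Qualcomm.Snapdragon.Spaces;
using UnityEngine;

public class JointVisualizerSetup : MonoBehaviour {
    public SpacesHandJointVisualizer Visualizer; // assumed class name
    public Mesh SphereMesh;                      // e.g. the built-in sphere mesh
    public Material DefaultMaterial;             // e.g. the default material

    private void Start() {
        Visualizer.JointMesh = SphereMesh;
        Visualizer.JointMaterial = DefaultMaterial;
        Visualizer.JointMeshScale = 0.01f;      // within the documented 0.005-0.05 range
        Visualizer.UseNormalizedColors = false; // keep the material's own color
    }
}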
