(3) Unity Development for Vision Pro - Getting Started

3. Getting Started

This section covers several important topics to help you accelerate visionOS platform development. Here you'll find links to step-by-step guides for building your first Unity PolySpatial XR application, as well as development best practices for PolySpatial XR.

1. Development and iteration

For information about prerequisites, development, iteration, deployment, and debugging, see Development and Iteration.

2. Create a new project

These guides provide step-by-step instructions for getting started with visionOS.

  1. In Starting a new visionOS project from scratch, you'll find a step-by-step tutorial that guides you through installing, setting up, and deploying a simple Unity application targeting visionOS and Apple Vision Pro from scratch.
  2. In Starting a new visionOS project from the Immersive Apps template, you'll find a step-by-step tutorial for setting up a new project using the Immersive Apps template.
  3. In Sample Content: Learn How to Use visionOS and Application Samples, you'll find a variety of vertical-slice demo projects that show how to use PolySpatial technology for visionOS development.

3. Migrate existing projects

When porting an existing Unity project to visionOS, there are several factors to consider. The biggest limitation is that some core Unity features are not supported, while others offer a reduced feature set. Additionally, input processing and the set of supported components vary depending on the app mode. In some cases, you must develop your own systems to support your project's unique functionality and work around these limitations.

You can find information about porting VR experiences to visionOS, learn which Unity features and components are currently supported for immersive applications, and see how to use project validation for in-editor assistance when porting your project. For more information about input and other development topics, check out the reference documentation.

3.1 Create a visionOS project from scratch

1. For hardware and Unity version requirements, see Requirements.

2. Make sure to switch to the visionOS build platform (experimental).
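
The platform switch can also be scripted. Below is a minimal editor sketch; it assumes an editor version that ships visionOS build support, where the BuildTarget.VisionOS and BuildTargetGroup.VisionOS enum values exist:

```csharp
#if UNITY_EDITOR
using UnityEditor;

public static class VisionOSBuildSwitcher
{
    // Performs the same switch as selecting visionOS in
    // File > Build Settings... and clicking Switch Platform.
    [MenuItem("Tools/Switch Build Target to visionOS")]
    static void SwitchToVisionOS()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.VisionOS, BuildTarget.VisionOS);
    }
}
#endif
```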

3.1.1 Fully immersive virtual reality

Make sure you have the com.unity.xr.visionos package installed

1) Select Edit > Project Settings...

2) Open the XR Plug-in Management menu

3) Check the Apple visionOS checkbox

4) Select File > Build Settings...

>Add scene (SampleScene)

>Select Build.

Your application will render a fully immersive space, and you should see the Unity skybox (or your application) running in the Apple Vision Pro simulator.
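
If the simulator launches but renders nothing, it helps to confirm that the visionOS XR loader actually initialized. A minimal runtime check, using the standard XR Plug-in Management API:

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Logs which XR loader (if any) initialized at startup. With the visionOS
// plug-in enabled in XR Plug-in Management, an active loader should appear.
public class XRLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var settings = XRGeneralSettings.Instance;
        if (settings == null || settings.Manager == null || settings.Manager.activeLoader == null)
        {
            Debug.LogWarning("No active XR loader - check XR Plug-in Management settings.");
            return;
        }
        Debug.Log($"Active XR loader: {settings.Manager.activeLoader.name}");
    }
}
```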

For more information, see the Fully Immersive VR documentation

3.1.2 Mixed reality and shared space

Make sure the com.unity.polyspatial, com.unity.polyspatial.visionos and com.unity.polyspatial.xr packages are installed

1. Create a volume camera in the scene
a. Open the XR Building Blocks menu and click Volume Camera, or
b. Create an empty GameObject and add a Volume Camera component

2. Configure the volume camera in Bounded or Unbounded mode and adjust its dimensions (a code sketch follows this list)
a. The dimensions adjust the rendering scale of the content
b. For bounded applications, make sure some content is visible within the volume camera's dimensions

3. Open Project Settings > PolySpatial

>Check the Enable PolySpatial runtime box
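
The volume camera setup above can also be sketched in code. Note that this is only a sketch: VolumeCamera lives in the Unity.PolySpatial namespace, but its configuration API has changed across PolySpatial releases (newer versions move the Bounded/Unbounded choice into a VolumeCameraWindowConfiguration asset), so treat the property access below as an assumption for your package version:

```csharp
using Unity.PolySpatial;
using UnityEngine;

// Creates a volume camera from code (the equivalent of step 1b above) and
// sets its dimensions (step 2). The Bounded/Unbounded mode is assumed to be
// configured in the Inspector or via a window configuration asset.
public class VolumeCameraSetup : MonoBehaviour
{
    void Awake()
    {
        var go = new GameObject("Volume Camera");
        var volumeCamera = go.AddComponent<VolumeCamera>();

        // The dimensions define the world-space box mapped into the volume;
        // in Bounded mode, content outside this box is clipped.
        volumeCamera.Dimensions = new Vector3(1f, 1f, 1f);
    }
}
```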

Unbounded applications

For unbounded applications that want to use ARKit features, you need to enable visionOS in the XR Plug-in Management settings and ensure that the AR Foundation package is in your project. For ARKit hand tracking, make sure the XR Hands package is also installed.
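
As a concrete example of consuming ARKit data in an unbounded app, here is a minimal AR Foundation sketch that logs detected planes. It assumes an ARPlaneManager on the same GameObject (typically under an XR Origin); as noted below, the simulator returns no ARKit data, so this only produces output on device:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Subscribes to plane-detection changes and logs newly detected planes.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager m_PlaneManager;

    void OnEnable()
    {
        m_PlaneManager = GetComponent<ARPlaneManager>();
        m_PlaneManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable() => m_PlaneManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, alignment: {plane.alignment}");
    }
}
```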

4. Select File > Build Settings...

>Add scene (SampleScene)

>Select Build.

A bounded app can coexist with other apps in the shared space; an unbounded app is the only thing visible.

Note: The Apple Vision Pro simulator does not provide any ARKit data, so planes, meshes, hand tracking, etc. will not work there.

For more information, see the PolySpatial MR application documentation

3.2 Samples

Unity's PolySpatial samples provide a starting point for visionOS development in Unity, covering specific use cases for both bounded-volume and unbounded experiences.

3.2.1 Bounded volume samples

3.2.1.1 Targeted Input - Balloon Gallery

Targeted Input - Balloon Gallery is a mini-game that demonstrates how to use indirect pinch and direct (poke) input to target content in a bounded volume scene.
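
The targeting pattern the scene is built around looks roughly like the sketch below, which polls PolySpatial's spatial pointer data through the Input System's EnhancedTouch API; treat the exact type and property names as assumptions if your PolySpatial version differs:

```csharp
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Logs which object a spatial pointer interaction began on, and whether the
// interaction was a direct poke or an indirect (gaze + pinch) input.
public class SpatialTapLogger : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // PolySpatial attaches spatial data (interaction kind, target
            // object, interaction position) to each Input System touch.
            SpatialPointerState pointer = EnhancedSpatialPointerSupport.GetPointerState(touch);

            if (touch.phase == TouchPhase.Began && pointer.targetObject != null)
                Debug.Log($"{pointer.Kind} began on {pointer.targetObject.name}");
        }
    }
}
```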

3.2.1.2 Dynamic Volume Camera - Character Runner

Dynamic Volume Camera - Character Runner is a mini-game that demonstrates dynamically repositioning a volume camera within a bounded volume. The mini-game follows a character as it navigates an environment larger than the bounded volume.
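
At its core, the technique is just moving the volume camera's transform each frame. A minimal, hypothetical follow script (target and offset are placeholders for your own character setup), attached to the Volume Camera's GameObject:

```csharp
using UnityEngine;

// Keeps the bounded volume centered on a character that roams an environment
// larger than the volume itself.
public class VolumeFollow : MonoBehaviour
{
    [SerializeField] Transform target;      // character to follow (placeholder)
    [SerializeField] Vector3 offset;        // framing offset inside the volume
    [SerializeField] float smoothing = 5f;  // higher values track more tightly

    void LateUpdate()
    {
        // Moving the volume camera changes which slice of the world
        // appears inside the bounded volume.
        transform.position = Vector3.Lerp(
            transform.position,
            target.position + offset,
            smoothing * Time.deltaTime);
    }
}
```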

3.2.1.3 Debug UI

The Debug UI scene allows users to test various input types (direct poke, direct pinch, indirect pinch) and analyze the data using the debug UI.

3.2.1.4 Manipulation

The Manipulation scene allows users to manipulate a variety of objects with different collider shapes within a bounded volume.

3.2.1.5 User interface

The UI scene provides examples of common spatial UI used in bounded applications, including elements such as buttons, sliders, toggles, and drop-down menus.

3.2.1.6 Project Launcher

The Project Launcher scene allows users to launch the various sample scenes from a bounded volume using a carousel-style spatial UI.
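
A carousel button in such a launcher ultimately just loads another scene. A minimal sketch of the handler (the scene name is hypothetical and must be added to Build Settings):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Wire the Launch method to a spatial UI Button's OnClick event in the
// Inspector, passing the target scene's name as the string argument.
public class SceneLauncher : MonoBehaviour
{
    public void Launch(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```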

3.2.2 Unbounded samples

3.2.2.1 Image tracking

The image tracking scene allows users to spawn content using pre-defined, unique image markers in unbounded applications.
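
The underlying mechanism is AR Foundation image tracking. Here is a minimal sketch that spawns content on newly detected reference images; it assumes an ARTrackedImageManager with a configured reference image library, and contentPrefab is a placeholder for your own asset:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Spawns a prefab on each newly detected reference image and parents it to
// the tracked image so the content follows the marker.
[RequireComponent(typeof(ARTrackedImageManager))]
public class ImageSpawner : MonoBehaviour
{
    [SerializeField] GameObject contentPrefab; // placeholder prefab

    ARTrackedImageManager m_Manager;

    void OnEnable()
    {
        m_Manager = GetComponent<ARTrackedImageManager>();
        m_Manager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable() => m_Manager.trackedImagesChanged -= OnTrackedImagesChanged;

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            Instantiate(contentPrefab, image.transform);
    }
}
```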

3.2.2.2 Mixed reality

The mixed reality scene allows users to spawn content using custom hand gestures (built on ARKit hand data) in unbounded applications. It also visualizes ARKit plane data in the physical environment.
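
Gestures like these are typically built on joint data from the XR Hands package. A minimal pinch-detection sketch, assuming the XR Hands 1.x API (the 2 cm threshold is an arbitrary choice, not a value from the sample):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Detects a simple right-hand pinch by measuring the distance between the
// thumb tip and index tip joints reported by the hand subsystem.
public class PinchDetector : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            // Grab the first running hand subsystem, if any.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);
        if (thumb.TryGetPose(out Pose thumbPose) && index.TryGetPose(out Pose indexPose)
            && Vector3.Distance(thumbPose.position, indexPose.position) < 0.02f)
        {
            Debug.Log("Pinch detected - spawn content here.");
        }
    }
}
```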

Source: blog.csdn.net/humilezr/article/details/132271501