Recently I needed to present a model in VR, but the target hardware isn't just one device: there is the HTC Vive and the Oculus Quest. If I built separate projects, I could use SteamVR for the Vive and Oculus Integration for the Quest, but maintaining two codebases is a lot of work and makes later updates quite inconvenient. I also tried adding both plug-ins to the same project, but they conflicted and SteamVR simply wouldn't start, so I had to give up on that idea.
Then I heard that Unity already has a plug-in compatible with all XR hardware: the XR Interaction Toolkit. After looking into it, I decided to give it a try. However, because the plug-in is still in preview, there are few references from developers in China; the available material is limited, and I mostly relied on translated web pages to learn from the experience of developers abroad. This post mainly records my process of using the XR Interaction Toolkit, so that I can come back to it if I need it again. It would be great if it also helps someone else in need.
Unity has open-sourced a set of demos on GitHub (click to jump). Once the demo project is imported and runs normally in Unity, the VR interactions work out of the box. Of course, a series of preparation steps is needed before that, which I will explain one by one.
First, my development environment:
Operating System: Windows 10
Development platform: Unity 2019.3.4f1
Plug-in: XR Interaction Toolkit preview 0.9.4
Release platform: HTC Vive, Oculus Quest
On the Windows side, Steam and SteamVR still need to be installed. I won't go into account registration and configuration here; there are plenty of tutorials online.
One thing to note: on Windows, the XR Interaction Toolkit uses SteamVR by default. Just as with Unity's SteamVR plug-in, steamVR.exe launches when you run the Unity project. So after you successfully import the demos into Unity, if SteamVR fails to start at runtime, first check whether your SteamVR environment is set up correctly; I won't repeat that here.
There are also a few small points to watch out for on the Quest. First, the Quest needs to be linked to the Oculus app to turn on developer mode; this step may require a VPN in China, and online tutorials cover it well. Because the Quest is essentially an Android device, Unity also needs its Android build environment configured. A data cable is needed for transferring builds; if you develop on a laptop, you can also transfer over the same network. Finally, Android development and testing has always been a hassle: you can't see the log output directly, and you have to put on the headset for every test, which is quite awkward, so I looked for a way to solve both problems.
First, to view the log output I used the standard Android approach: adb + DDMS. As long as the Quest connects successfully, DDMS shows its logs just like an Android phone's. For the screen output I use scrcpy: once adb is installed and the Quest is connected, the headset screen can be mirrored directly. As for the Quest only keeping its display on while worn, that is easy to solve: there is an infrared proximity sensor near the nose inside the headset, and covering it with a piece of paper keeps the display on continuously. Note, however, that if the screen never turns off, the controllers keep draining their batteries; I once forgot to remove the sticker, and a controller ran out of power even after a fresh battery was installed.
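The workflow above boils down to a few shell commands. This is a minimal sketch, assuming adb and scrcpy are already installed and on PATH; to avoid hanging a terminal (logcat streams until interrupted), the commands are collected into variables and printed, and you run each one manually:

```shell
# Sketch of the Quest debugging setup, assuming adb and scrcpy are installed
# and the headset is connected via USB with developer mode enabled.

DEVICE_CHECK="adb devices"         # the Quest should appear as an authorized device
UNITY_LOGS="adb logcat -s Unity"   # Unity on Android logs under the "Unity" tag;
                                   # -s silences all other tags (streams until Ctrl+C)
SCREEN_MIRROR="scrcpy"             # mirrors the headset screen via the adb connection

# Print the commands to run, in order.
printf '%s\n' "$DEVICE_CHECK" "$UNITY_LOGS" "$SCREEN_MIRROR"
```

For laptop development over the same network, adb can also connect wirelessly by the headset's IP address before the steps above; the exact address depends on your network.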
When everything is set up properly, scrcpy and DDMS display as shown in the two pictures below.
With that, the preparation work is basically complete, and development in Unity can begin.