VR Interaction
For VR interactions, we will use a standard Unity package called the "XR Interaction Toolkit", which implements a comprehensive interaction and navigation system.
It is installed by default when you create a new Unity project with the "VR (Core)" preset. Create a new project; if the "VR (Core)" preset does not appear in the template list, find it there and install it first.
The project opens to a scene named "SampleScene" containing several examples of possible interactions. I recommend loading a more complete scene. In the "Project" window, search for "DemoScene" and then double-click on "DemoScene" to launch the scene.
The task of this part of the workshop is simply to experiment with all that the demo scene offers and understand how it is implemented. We will use what we learn here in the next parts.
If you want this scene to be the one that appears in the headset when launched from a build of your project, you need to specify it. Go to "File" > "Build Settings"; the upper part of the window that appears contains the scene list. Click "Add Open Scenes", then uncheck "Scenes/SampleScene" and check the newly added scene instead.
Scene in build settings
The details just above are mostly a factor when debugging on headsets that do not interface well with Unity, where a new build must be uploaded to the device every time (e.g., some Android-based standalone devices).
For today's workshop (July 8), go directly here.
Instructions for Pico Headsets
Go to this address and download the "PICO Unity Integration SDK".
Once downloaded, extract the archive where you want.
In Unity, open the "Window" menu in the top bar, then "Package Manager". In the window that appears, click the "+" symbol and choose "Add package from disk". Navigate to the folder you extracted and select the file named "package.json" to import the package into your project.
Once this is done, a window may automatically appear ("Platform settings"); follow its recommendations.
Open "Edit" > "Project Settings" > "XR Plugin Management", and ensure that PICO is the only selected item.
In the lower left, click on "Project Validation" and then click on "Fix All". If web pages open (in Chinese), you can ignore them.
General VR Instructions for Android-based devices
Only do the following if you are using a standalone XR headset running Android.
If not, go directly here.
Open "Edit" > "Project Settings" > "Player". Look at the options in "Other Settings", paying particular attention to having the correct configuration for the following items:
Connect the Headset
Make sure the Pico headset is turned on, then connect it to the computer with a sufficiently long USB cable. We will now create a build of the project: an application that runs on the headset independently of the Unity editor.
To do this, look for "File" > "Build Settings" in the upper left. You will see the window below.
Most standalone VR headsets today are Android devices like mobile phones.
When connecting the headset to the computer, it is recognized as such, and Unity can directly load a VR application onto the headset.
Click on "Refresh" (see image below) and then choose the option starting with "Pico" from the dropdown list.
If the Pico device does not appear in the list, ensure it is properly connected to the computer; you may also need to click on a notification in the headset asking to accept USB debugging.
Launching the Application on the Headset
Use the "Patch" and "Patch And Run" buttons: the first builds the application and loads it onto the headset, while the second also launches it automatically. For "Patch" and "Patch And Run" to be available, the "Development Build" option just below must be checked.
In contrast, the "Build" button creates an .apk file (Android application) on the computer instead. An .apk file can be distributed to share your application.
No headset? Run in the simulator!
If you do not have a headset handy right now, you can run the XR Interaction Simulator instead. As the name suggests, it emulates an XR device that you control with the keyboard and mouse.
To enable this option, go to "Edit" > "Project Settings", find "XR Interaction Toolkit" in the left panel of the window that opens, and toggle "Use XR Interaction Simulator in scenes".
The simulator is then created whenever a scene is run; some keyboard shortcuts control the camera, while others control the controllers.
Means of Interaction
The demo scene of the "XR Interaction Toolkit" offers several examples of interaction using the controllers. Explore the different interactions in the headset.
Callback
The toolkit relies heavily on the "callback" system (UnityEvent). A callback is a function that is called when an event is triggered, for example when the user presses a button on the controller or interacts with an object.
In this example, when a user presses button [5] (see image below; "grab"), it corresponds to the "select" action that will call a pre-chosen function. Here, a sound is produced.
This callback method managed in the editor is very convenient for reducing the need for programming in your project.
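The same kind of callback can also be registered from code rather than in the Inspector. Below is a minimal sketch (the class name and field are hypothetical) that plays a sound on the "select" action of a grabbable object; depending on your XR Interaction Toolkit version, the interactable class may live in a slightly different namespace.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical example: plays a sound when this object is grabbed ("select"),
// mirroring what the demo scene configures through the Inspector.
// Attach it to a GameObject that has an XRGrabInteractable component.
public class PlaySoundOnSelect : MonoBehaviour
{
    public AudioSource sound; // assign in the Inspector

    void OnEnable()
    {
        // Register the callback when the component becomes active.
        GetComponent<XRGrabInteractable>().selectEntered.AddListener(OnSelectEntered);
    }

    void OnDisable()
    {
        // Always unregister to avoid calls on a disabled object.
        GetComponent<XRGrabInteractable>().selectEntered.RemoveListener(OnSelectEntered);
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        sound.Play();
    }
}
```

Registering in code does the same thing as dragging the function into the UnityEvent slot in the editor; the editor route simply avoids writing the script.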
Programming
If you want to check with code whether a button has been pressed/released, use the following method.
// Insert at the very beginning of your file (not within the class)
using UnityEngine.XR.Interaction.Toolkit.Inputs.Readers;
[...]
// Insert in the body of the class (not inside a method/function)
public XRInputButtonReader button;
[...]
// Insert where you need to check if a button on the controller was just released during this frame
button.ReadWasCompletedThisFrame()
Since "button" is public, it will appear in the editor, specifically in the Inspector of the GameObject that carries this script.
Add the missing elements as shown in the image below. In this example, we are waiting for a click of button [6] (trigger).
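Putting the fragments above together, a complete script might look like the following sketch (the class name and log messages are hypothetical; the reader methods are those of XRInputButtonReader):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Readers;

// Hypothetical complete script assembling the fragments above:
// logs a message when the configured controller button is pressed or released.
public class ButtonWatcher : MonoBehaviour
{
    // Public, so it shows up in the Inspector; configure it there
    // (e.g., bind it to the trigger, button [6]).
    public XRInputButtonReader button;

    void Update()
    {
        // True only on the frame the button was pressed down.
        if (button.ReadWasPerformedThisFrame())
            Debug.Log("Button pressed this frame");

        // True only on the frame the button was released.
        if (button.ReadWasCompletedThisFrame())
            Debug.Log("Button released this frame");
    }
}
```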
Means of Navigation
Using the controller held in the left hand, manipulating the element marked [1] in the image below (the joystick, or thumbstick) moves you (translation) in the horizontal plane. The same element [1] on the right-hand controller rotates your view.
Still in the right hand, if you push the joystick [1] and hold it forward, you will activate the teleportation feature.
A blue beam extending from the VR controller will appear for you to aim at the teleportation point.
There are two modes of teleportation:
- Constrained: represented by small anchor zones (white circle in the image below); a constrained teleportation allows a single relocation point, at the zone's center.
- Free: relocation is possible anywhere on a predetermined surface (white horizontal plane in the image below).