
Targeting as many VR platforms as you can gives your application the widest possible audience. Using the Unity 2018.3 features outlined in this article, you can make a lean, mean app that works across a wide range of desktop and mobile VR devices without having to painstakingly build for each platform. Read to the end to find references to open source projects that show these features in action!

Note: This article applies to the following SDKs: OpenVR, Oculus, WindowsMR, Daydream, and GearVR. It does not apply to PSVR.

Build Target Platform and VR SDK choice

First, the basics.

Depending on what devices your application is built for, you may have to change your build target platform and compile multiple times. If you’re targeting the Oculus Rift, for example, your build target platform should be “PC, Mac, & Linux Standalone.” If you are targeting Oculus Go, your build target platform should be “Android.” If you want to target both the Oculus Rift and Oculus Go, you will need to build once for each target platform.

You can read up on changing build target platforms on the Build Settings page in Unity Documentation.

Under each build target platform, you must target the correct virtual reality SDK. Sometimes you can target multiple SDKs with a single executable, and sometimes you must build separately. When targeting “PC, Mac, & Linux Standalone,” you can build a single executable that targets both the Oculus and OpenVR virtual reality SDKs. When targeting “Android,” you must build separately for the Daydream and Oculus virtual reality SDKs.

Be sure to do your homework on each device you plan to target to identify the proper build target platform and VR SDK selection!

You can read up on enabling VR and selecting SDKs for your build in Unity Documentation’s VR Overview.
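Because multi-platform support usually means building more than once, it can help to script the process. Here is a minimal editor sketch of that idea; the scene list, output paths, and menu names are placeholders of my own, and it assumes the appropriate VR SDKs are already enabled in XR Settings for each target group:

```csharp
// Editor/MultiPlatformBuild.cs -- illustrative sketch, not a complete build pipeline.
// Scene paths and output locations are placeholders for your own project.
using UnityEditor;

public static class MultiPlatformBuild
{
    static readonly string[] Scenes = { "Assets/Scenes/Main.unity" };

    [MenuItem("Build/Standalone (Oculus + OpenVR)")]
    public static void BuildStandalone()
    {
        // One desktop executable can serve both the Oculus and OpenVR SDKs.
        BuildPipeline.BuildPlayer(new BuildPlayerOptions
        {
            scenes = Scenes,
            target = BuildTarget.StandaloneWindows64,
            locationPathName = "Builds/Desktop/MyApp.exe",
            options = BuildOptions.None
        });
    }

    [MenuItem("Build/Android (Oculus Mobile)")]
    public static void BuildAndroidOculus()
    {
        // Daydream and Oculus mobile need separate Android builds, each with the
        // matching VR SDK enabled under XR Settings before building.
        BuildPipeline.BuildPlayer(new BuildPlayerOptions
        {
            scenes = Scenes,
            target = BuildTarget.Android,
            locationPathName = "Builds/Android/MyApp-Oculus.apk",
            options = BuildOptions.None
        });
    }
}
```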

Tracked Pose Driver

Tracked Pose Driver is a nifty component that will drive a Transform’s position and rotation based on a VR device.

This screenshot shows an XR Rig that uses Tracked Pose Driver. The “Main Camera,” “Left Hand,” and “Right Hand” GameObjects of the XR Rig each have a TrackedPoseDriver component, which drives that GameObject’s Transform based on the selected Device/Pose Source combination. You can play with this XR Rig yourself by creating a new project using the VR Lightweight RP template.
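If you’d rather wire a rig up from script instead of the Inspector, a sketch along these lines sets up the same Device/Pose Source combinations as the template rig. The field names and the helper method are my own; TrackedPoseDriver lives in the UnityEngine.SpatialTracking namespace:

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Illustrative rig setup: drives a head camera and two hand anchors from the
// HMD and controller poses. Attach to the rig root and assign the children.
public class SimpleXRRig : MonoBehaviour
{
    public Camera headCamera;      // e.g. the "Main Camera" child
    public Transform leftHand;     // e.g. the "Left Hand" child
    public Transform rightHand;    // e.g. the "Right Hand" child

    void Awake()
    {
        Add(headCamera.gameObject, TrackedPoseDriver.DeviceType.GenericXRDevice,
            TrackedPoseDriver.TrackedPose.Center);
        Add(leftHand.gameObject, TrackedPoseDriver.DeviceType.GenericXRController,
            TrackedPoseDriver.TrackedPose.LeftPose);
        Add(rightHand.gameObject, TrackedPoseDriver.DeviceType.GenericXRController,
            TrackedPoseDriver.TrackedPose.RightPose);
    }

    static void Add(GameObject go, TrackedPoseDriver.DeviceType device,
                    TrackedPoseDriver.TrackedPose pose)
    {
        var driver = go.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(device, pose); // selects the Device/Pose Source combination
    }
}
```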

InputManager

Hey, did you know that you can get input from VR controllers the same way you get input for non-VR controllers? It’s true! Using the Input Megachart available on our XR Input documentation page, you can reference input such as triggers, thumbsticks, grip buttons, and more!

For example, here is how you set up the Input Manager for the left-hand trigger axis:

And here is how you set up the Input Manager for the left-hand Primary Button.
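With those entries defined, reading them at runtime uses the familiar Input API. Here is a minimal sketch, assuming you named the entries “LeftTrigger” and “LeftPrimaryButton” and mapped them to the axis and button listed in the chart for the left hand (the entry names are just examples):

```csharp
using UnityEngine;

// Reads the left-hand trigger and primary button through the classic Input Manager.
// Assumes Input Manager entries named "LeftTrigger" (mapped to the left trigger axis)
// and "LeftPrimaryButton" (mapped to the left primary button) exist, per the XR Input chart.
public class LeftHandInput : MonoBehaviour
{
    void Update()
    {
        float trigger = Input.GetAxis("LeftTrigger");              // 0..1 analog pull
        bool primaryDown = Input.GetButtonDown("LeftPrimaryButton");

        if (trigger > 0.75f)
            Debug.Log("Left trigger squeezed: " + trigger);

        if (primaryDown)
            Debug.Log("Left primary button pressed");
    }
}
```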

XRNodes

If you need a greater understanding of specific devices, there are several handy tools in the InputTracking class.

InputTracking.GetNodeStates() will give you a list of what we call XRNodeStates. XRNode types include abstract representations such as Head, LeftHand, RightHand, TrackingReference (for example, Oculus Rift cameras or Vive Lighthouses), and several more. Each XRNodeState provides physical information such as the position and rotation of a node. It also tells you the node type, whether or not the node is currently tracked, and gives you a uniqueID.

InputTracking also provides events that fire when a node is added or removed and when tracking on a node is lost or acquired.
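Here is a rough sketch that pulls both together: polling node states each frame and listening for the node events. The logging and the focus on the left hand are just for illustration:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative polling of XRNodeStates plus two of the InputTracking node events.
public class NodeMonitor : MonoBehaviour
{
    readonly List<XRNodeState> states = new List<XRNodeState>();

    void OnEnable()
    {
        InputTracking.nodeAdded += OnNodeAdded;
        InputTracking.trackingLost += OnTrackingLost;
    }

    void OnDisable()
    {
        InputTracking.nodeAdded -= OnNodeAdded;
        InputTracking.trackingLost -= OnTrackingLost;
    }

    void Update()
    {
        InputTracking.GetNodeStates(states);
        foreach (var state in states)
        {
            if (state.nodeType != XRNode.LeftHand || !state.tracked)
                continue;

            // Position and rotation are only valid when the Try* calls succeed.
            Vector3 pos;
            Quaternion rot;
            if (state.TryGetPosition(out pos) && state.TryGetRotation(out rot))
                Debug.Log("Left hand at " + pos + ", rotation " + rot + ", id " + state.uniqueID);
        }
    }

    void OnNodeAdded(XRNodeState state)    { Debug.Log("Node added: " + state.nodeType); }
    void OnTrackingLost(XRNodeState state) { Debug.Log("Tracking lost: " + state.nodeType); }
}
```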

Enabling SDK-Specific Details

Sometimes user experience concerns demand that you implement SDK-specific details.  Don’t worry, we’ve got your back.

XRSettings provides global XR-related settings, such as which VR SDKs are supported by your built application and which one is currently active. XRSettings.supportedDevices lists the VR SDKs that were selected at build time. XRSettings.loadedDeviceName tells you which SDK is currently driving input. Using this information, you can enable platform-specific behavior, such as choosing different inputs to drive user actions.


Using XRSettings along with XRNodes, you can control UX decisions across all VR SDKs and their respective devices.
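Here is a small sketch of what that can look like in practice. The device-name strings and the input choices in the switch are illustrative; print XRSettings.supportedDevices at runtime to confirm the exact names your SDKs report:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative per-SDK tweaks driven by XRSettings. The name strings below
// ("Oculus", "OpenVR", "daydream") are assumptions -- verify them on-device.
public class PlatformTuning : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Supported SDKs: " + string.Join(", ", XRSettings.supportedDevices));
        Debug.Log("Active SDK: " + XRSettings.loadedDeviceName);

        switch (XRSettings.loadedDeviceName)
        {
            case "Oculus":
                // e.g. bind teleport to the thumbstick click
                break;
            case "OpenVR":
                // e.g. bind teleport to the trackpad press
                break;
            case "daydream":
                // e.g. single-controller input scheme
                break;
        }
    }
}
```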

Resources

Are you excited to get your hands on the features discussed in this article? Check out these two projects to get started:

Onslaught is a <400kB project on GitHub that supports OpenVR, Oculus Desktop, Windows Mixed Reality Immersive Headsets, Oculus Mobile (GearVR, Go), and Daydream.  Try it now with Unity Editor 2018.3.0b3+

XR Input Tests is the multi-platform test project that Unity XR QA uses every day to verify these features and more.

10 Comments


  1. For the Vive Focus (or rather the Wave Platform, which includes other HMDs) there’s the following blog entry:
    https://community.viveport.com/t5/Developer-Blog/bg-p/devblog

    To clarify, it wasn’t a matter of opting to not use the Unity VR path so much as that path not really being open to all platforms. And if you’re going to abstract input at the application level rather than inside the Unity XR framework, then you may as well also consider the third-party plugins (like VIU) that can also target multiple platforms.

    And then there’s the new OpenVR (and OpenXR, I suppose) input system to consider as well…

  2. What about Vive Focus?

    1. Hi Danila, I must admit that I do not have first-hand experience with the Vive Focus, but here is what I was able to find out for you:

      I downloaded the SDK and loaded it into an empty Unity project based on these steps: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html

      After walking through the steps, it appears that HTC opted to write their own stack on top of Android. They do not set PlayerSettings -> XR Settings -> Virtual Reality Supported, which means that Unity’s VR path does not have a window into the HTC Wave platform.

      Unfortunately, this means that the features outlined in this article don’t seem to apply to the Vive Focus. I do not have a Vive Focus to verify my findings – could someone who does chime in?

  3. Please stop spamming these blog posts.

  4. Seems like old news. Maybe the 2 GitHub projects are new?

    1. Yeah, afaik all of these features have been around for a while. However, looking through the repo I spotted a native Unity API for haptics which I’ve never seen before. So maybe that’s new?

    2. Hi Sean, based on my interactions with my local VR community I thought it would be helpful for some if this information was brought together and broadcast as a blog entry. In addition to the features themselves, I hope that you gain some value from seeing the features in action in the two projects, which were produced as part of our internal testing process.

      This article gives an overview of recommended features to use for 2018.3 (not including nice packages such as SteamVR, VRTK, etc.). There are some nice new features coming out for 2019.1 – look for a follow-up blog entry introducing these features and explaining our thought process behind them.

  5. Love the idea of the Tracked Pose Driver (I’ve known about it for a while). The only thing keeping me away from using Unity’s native stuff is the lack of a player physics system (I’m stuck using VRTK’s body physics till I have the time to make my own) and the lack of haptic rumble (though I think this was added already, I just found some code examples I’m going to test later).

    It’s cool about the inputs too. I already knew about it, but I’ve always preferred to use KeyCodes; I hate extra setup steps (I’d rather do it automatically in code). Setting up all the inputs for Vive and Oculus in the Input Manager becomes a problem since there are a lot of them (so much so that some people simply share the input settings file to avoid having to go through the pain of setting up the inputs all over again. In my case, I just ended up making my own VR KeyCodes for Vive and Oculus separately, which my Input methods’ overloads then use to know what inputs to check from each SDK). Having separate keycode enums makes it easy to select stuff in the editor or even in UI.

    1. Hi Alverik,

      Thanks for the read. In the Onslaught project at the end of the blog post, you will see that I had to abstract the input for one or two platforms as well due to feature differences. As for setup, I hope you’ll appreciate some new features that are cooking for 2019.1! Look out for a follow-up blog post detailing these features when 2019.1 is in beta.

    2. Hi again Alverik,

      Sorry for the delay, but here is a bit of a blog post that talks about collider/rigidbody setups. I hope it will help you with a body physics system if that’s something you’re interested in making:

      http://jackpritz.com/blog/unity-colliderrigidbody-setups-in-xr