
VR is all about immersion, and the ability to track the user’s position in space is a key element of it. However, to date this has only been available in desktop and console VR, even though modern smartphones already incorporate the essential technology to make it possible in mobile VR too. This blog explains how to achieve inside-out tracking in mobile VR using only Unity and AR SDKs with today’s handsets.

Note that this particular method of implementing inside-out tracking is not an officially supported Unity feature, nor is it on our immediate roadmap. We learned that Roberto from Arm was doing something cool with some of our integrated platforms and wanted to share it with you.

If you have ever tried a room-scale VR console or desktop game, then you will understand how badly I wanted to implement mobile inside-out VR tracking. The problem was that there were no SDKs or phones to try it on. I saw the first opportunity at CES at the beginning of the year, when the news about the new ASUS release supporting AR functionality and Daydream became public. This turned out to be an accidental leak, because the ASUS device was not actually available until June. Only then could I create my first inside-out mobile VR tracking project in Unity for Daydream, using the early AR SDK for Android. When I got it working, it was amazing to walk around in the real world and see my camera in VR move around the virtual objects in the same way. It felt so natural, and it is something you need to experience yourself.

The second chance to implement inside-out tracking came when Google released the ARCore SDK and, on the same day, Unity released a version supporting it. I was so excited I couldn’t wait! So, that weekend I got my second inside-out mobile VR tracking project working in Unity, this time for the Samsung Gear VR, using the Google ARCore SDK on a Samsung Galaxy S8. This mobile device has an Arm Mali-G71 MP20 GPU capable of delivering high image quality in VR using 8x MSAA while running consistently at 60 FPS.

This blog is intended to share my experience developing inside-out mobile VR tracking apps and to make it available to Unity developers. The Unity integration with the ARCore SDK is not yet prepared to do inside-out mobile VR tracking out of the box (nor was it intended to), so I hope this blog will save you some time and pain.

I hope you will experience the same satisfaction I had when you implement your own Unity mobile VR project with inside-out tracking.  I will explain step by step how to do it with the ARCore SDK. 

Mobile inside-out VR tracking using the Google ARCore SDK in Unity

I won’t point out all the steps you need to follow to get Unity working. I assume you have Unity 2017.2.0b9 or later and have the entire environment prepared to build Android apps. Additionally, you’ll need a Samsung Galaxy S8; unfortunately, you can only try inside-out VR tracking based on Google ARCore on this phone, the Google Pixel, and the Pixel XL so far.

The first step is to download the Unity package of the Google ARCore SDK for Unity (arcore-unity-sdk-preview.unitypackage) and import it into your project. A simple project will be enough; just a sphere, a cylinder and a cube on a plane.

You will also need to download the Google ARCore service. It is an APK file (arcore-preview.apk), and you need to install it on your device.

At this point you should have a folder in your project called “GoogleARCore” containing a session configuration asset, an example, the prefabs, and the SDK.

Figure 1. The Google ARCore SDK folders after being imported into Unity.

We can now start integrating ARCore in our sample. Drag and drop the ARCore Device prefab, which you will find in the Prefabs folder, into the scene hierarchy. This prefab includes a First-Person Camera. My initial thought was to keep this camera, which automatically converts to the VR camera when you tick the “Virtual Reality Supported” box in Player Settings. I understood later that this is a bad decision, because this is the camera used for AR, that is, the camera used to render the phone camera input together with the virtual objects we add to the “real world scene”. I have identified three big inconveniences so far:

  • You need to manually comment out the line that calls _SetupVideoOverlay() in the SessionComponent script, because if you untick the “Enable AR Background” option in the session settings asset (see Fig. 3), the camera pose tracking doesn’t work at all (see the sketch after this list).
  • You can’t apply any scale factor you may need to map the real world to your virtual world, and you can’t always use a 1:1 mapping.
  • After selecting the Single-pass Stereo Rendering option, I got the left eye rendered correctly but not-so-good rendering in the right eye. Single-pass Stereo Rendering is something we need to use to reduce the load on the CPU and accommodate the additional load that ARCore tracking brings.
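
Below is a minimal sketch of the edit mentioned in the first bullet, in case you do keep the prefab camera. Only the commented-out _SetupVideoOverlay() call comes from the description above; the enclosing method name and surrounding code are assumptions about the preview SDK and may differ in your copy of SessionComponent.cs:

    // SessionComponent.cs (Google ARCore SDK for Unity, preview). Sketch only;
    // the enclosing method name is an assumption. Locate the place in your
    // version of SessionComponent that calls _SetupVideoOverlay().
    private void _ApplySessionConfiguration()
    {
        // ... existing session set-up code ...

        // Commented out so the phone camera feed is never rendered behind the
        // virtual scene. “Enable AR Background” must stay ticked in the session
        // settings asset (see Fig. 3) or camera pose tracking stops working.
        // _SetupVideoOverlay();
    }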

So, we will use our own camera. As we are working on a VR project, place the camera as a child of a game object (GO) so we can change the camera coordinates according to the tracking pose data from the ARCore subsystem. It is important to note here that the ARCore subsystem provides both the camera position and orientation, but I decided to use only the camera position and let the VR subsystem work as expected. The head orientation tracking the VR subsystem provides is in sync with the timewarp process, and we don’t want to disrupt this sync.

The next step is to configure the ARCore session to use only what we need for tracking. Click on the ARCore Device GO and you will see in the Inspector the scripts attached to it, as in the picture below:

Figure 2. The ARCore Device game object and the scripts attached to it.

Double click on Default SessionConfig to open the configuration options and untick the “Plane Finding” and “Point Cloud” options; we don’t need them and they add a substantial load on the CPU. We need to leave “Enable AR Background” (passthrough mode) ticked, otherwise the AR Session component won’t work and we won’t get any camera pose tracking.

Figure 3. The session settings as we need them.

The next step is to add our own ARCore controller. Create a new GO called ARCoreController and attach to it the HelloARController.cs script, which we will borrow from the GoogleARCore/HelloARExample/Scripts folder. I renamed it to ARTrackingController and removed some items we don’t need. My ARCoreController looks like the picture below; I have also attached a script to it that calculates the FPS.

Figure 4. The ARCoreController GO.

The Update function of the ARTrackingController script will look like the one below:

 

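The exact listing depends on your SDK version; as a minimal sketch, assuming the preview ARCore API (Frame.TrackingState, Frame.Pose), the trimmed-down Update can look like this. The connection-error and tracking-state checks come from HelloARController.cs, while the m_vrCameraParent member is an illustrative name of my own:

    public void Update()
    {
        // Quit with an on-screen message if the ARCore session failed to connect.
        _QuitOnConnectionErrors();

        // Only move the camera while ARCore is actually tracking the device pose.
        if (Frame.TrackingState != FrameTrackingState.Tracking)
        {
            m_camPoseText.text = "Tracking lost...";
            return;
        }

        // Phone camera position in the real world, as reported by ARCore.
        Vector3 arPosition = Frame.Pose.position;

        // Apply the real-to-virtual scale factors and move the VR camera parent.
        // Orientation is deliberately left to the VR subsystem so head tracking
        // stays in sync with timewarp.
        Vector3 vrPosition = new Vector3(arPosition.x * m_scaleFactorXZ,
                                         arPosition.y * m_scaleFactorY,
                                         arPosition.z * m_scaleFactorXZ);
        m_vrCameraParent.localPosition = vrPosition;

        // On-screen debug output: raw ARCore pose and scaled VR camera position.
        m_camPoseText.text = string.Format("AR pose: {0}\nVR cam: {1}",
                                           arPosition, vrPosition);
    }
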
I removed everything except the check for connection errors and the check for the correct tracking state. I replaced the original class members with the ones below:
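
A sketch of the replacement members follows. The m_camPoseText member and the scale factors are named in this blog; the camera parent transform is an illustrative name of my own, and note that the UnityEngine.UI namespace is needed for the Text type:

    using UnityEngine;
    using UnityEngine.UI;   // required for the public Text member below
    using GoogleARCore;

    public class ARTrackingController : MonoBehaviour
    {
        // Parent of the VR camera; its position is driven by the ARCore pose.
        public Transform m_vrCameraParent;

        // On-screen text for debugging: errors, tracking state and positions.
        public Text m_camPoseText;

        // Real-to-virtual scale factors (see the formula further down).
        public float m_scaleFactorXZ = 1.0f;
        public float m_scaleFactorY = 1.0f;

        // Update() and _QuitOnConnectionErrors() as sketched in this blog.
    }
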

You then need to populate the public members in the Inspector. The camPoseText member is used to show on-screen data for debugging: errors, a message when tracking is lost, the phone camera position obtained from the Frame, and the virtual camera position after applying the scale factors.

As I mentioned before, you will rarely be able to map your real environment one-to-one to the virtual scene, and this is the reason I have introduced a couple of scale factors: one for movement on the XZ plane and one for the Y axis (up-down).

The scale factor depends on the virtual size (vSize) we want to walk through and the actual space we can use in the real world. If the average step length is 0.762 m and we know we have room in the real world to take only nSteps, then a first approximation to the XZ scale factor will be:

scaleFactorXZ = vSize / (nSteps x 0.762 m)
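
As a purely hypothetical example, if the virtual space you want to walk through is 10 m across (vSize = 10 m) and the real room only allows five steps (nSteps = 5), then scaleFactorXZ = 10 / (5 x 0.762) ≈ 2.62, so every metre walked in the real world moves the VR camera roughly 2.6 m in the virtual world.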

I kept the _QuitOnConnectionErrors() class method and only changed the message output to use the Text component m_camPoseText.
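
For reference, here is a sketch of what that method can look like after the change. The connection-state checks mirror the ones shipped with the preview SDK’s HelloARController; the exact enum names may differ in later SDK versions:

    private void _QuitOnConnectionErrors()
    {
        // Report fatal session problems on screen instead of as Android toasts.
        if (Session.ConnectionState == SessionConnectionState.DeviceNotSupported)
        {
            m_camPoseText.text = "This device does not support ARCore.";
            Application.Quit();
        }
        else if (Session.ConnectionState == SessionConnectionState.UserRejectedNeededPermission)
        {
            m_camPoseText.text = "Camera permission is needed to run this application.";
            Application.Quit();
        }
        else if (Session.ConnectionState == SessionConnectionState.ConnectToServiceFailed)
        {
            m_camPoseText.text = "ARCore encountered a problem connecting. Please restart the app.";
            Application.Quit();
        }
    }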

After all this is working, your hierarchy (besides your geometry) should look like the picture below:

Figure 5. The needed ARCore game objects as listed in the hierarchy.

As the camera in my project collides with the chess pieces in a chess room (an old demo I use every time I need to show something quickly), I have added a CharacterController component to it.

At this point we are almost ready. We just need to set up the Player Settings. Besides the standard settings we commonly use for Android, Google recommends:

   Other Settings -> Multithreaded Rendering: Off

   Other Settings -> Minimum API Level: Android 7.0 or higher

   Other Settings -> Target API Level: Android 7.0 or 7.1

   XR Settings -> ARCore Supported: On

Below you can see a capture of my XR Settings. It is important to set the Single-pass option to reduce the number of draw calls issued to the driver (they are almost halved).

Figure 6. The XR Settings.

If you build your project following the steps described above, you should get mobile VR inside-out tracking working. For my project, the picture below shows the rendering result. The first line of text shows the phone camera position in the world supplied by Frame.Pose. The second line shows the FPS, and the third line shows the position of the VR camera in the virtual world.

Although the scene is not very complex in terms of geometry, the chess pieces are rendered with reflections based on local cubemaps, and there are camera–chess piece and chess piece–chess room collisions. I am using 8x MSAA to achieve high image quality. Additionally, the ARCore tracking subsystem is running, and with all this the Samsung Galaxy S8 CPU and Arm Mali-G71 MP20 GPU render the scene at a steady 60 FPS.

Figure 7. A screenshot from a Samsung Galaxy S8 running VR in developer mode with inside-out tracking.

Conclusions

At this point, I hope you have been able to follow this blog and build your own mobile VR Unity project with inside-out tracking and above all, experience walking around a virtual object while doing the same in the real world. You will hopefully agree with me that it feels very natural and adds even more sense of immersion to the VR experience.

Just a few words about the quality of the tracking. I haven’t performed rigorous measurements, and these are only my first impressions after some tests and the feedback of colleagues who have tried my apps. I have tried both implementations indoors and outdoors, and they were pretty stable in both scenarios. The loop closing was also very good, with no noticeable difference when coming back to the initial spot. When using Google ARCore I was able to walk out of the room and the tracking still worked correctly. Nevertheless, formal tests need to be performed to determine the tracking error and stability.

Up to now we have been bound to a chair, moving the virtual camera by means of some interface and controlling only the camera orientation with our head. Now, however, we are in total control of the camera in the same way we control our eyes and body: we are able to move the virtual camera by replicating our movements in the real world. The consequences of this new “6DoF power” are really important. Soon, we should be able to play new types of games on our mobile phones that up to now have only been possible in the console and desktop space. Other potential applications of mobile inside-out VR tracking, in training and education for example, will soon be possible as well with just a mobile phone and a VR headset.

As always, I really appreciate your feedback on these blogs, and please share any comments on your own inside-out mobile VR experience.

 

About the Author

After a decade working in nuclear physics, Roberto discovered his real passion for 3D graphics in 1995 and has been working in leading companies ever since. In 2012 Roberto joined Arm and has been working closely with the ecosystem on developing optimized rendering techniques for mobile devices. He also regularly publishes graphics-related blogs and delivers talks and workshops at game-related events.

41 Comments

  1. Very nice!

    Question: is the tracking CPU or GPU dependent, and how much of either is consumed for the tracking alone?

    1. Roberto Lopez Mendez, Arm

      December 1, 2017 6:27 pm

      Hi Sean: it depends on the ARCore implementation. You can either use both or just the CPU, but that is a question to address to Google.

  2. We did something similar using ARCore+GearVR, but we also used the camera feed. https://www.youtube.com/watch?v=odRtAstoJKo
    It helps when you can see where you are going… but we had to edit the background webcam shader, to correct the image.

    Maybe a better solution would be to decide on an area in a room before the game and, using an image marker (with Vuforia or something else), set the boundaries of the room and show them in VR when you are too close.

    1. Roberto Lopez Mendez, Arm

      October 27, 2017 10:03 am

      Hi de-Panther; very cool what you have done. There could be different ways of alerting the user when they are about to hit an obstacle in the real world without rendering the camera feed, as that works against the sense of immersion, which is a key element in VR.

      1. Can you think of such ways?
        The only one I can currently think of is the one with a marker – but in the end, the ultimate goal is to use only the headset, without extra objects.
        Maybe if ARCore could remember the area, like Tango does, we could define a boundary.

  3. Misleading headline. This isn’t available on *my* phone. It should be “mobile inside-out VR tracking now ready on one single phone by one manufacturer with Unity” (which may or may not be supported some day by other phones).

  4. Hey, anybody have an idea why I can’t build my project? For ARCore you need Unity 2017.2.0b9 or later (I also tried it with 2017.3.0b5 and 2017.2.0f3 but nothing changed). Whenever I try to build my project, it says “Unable to list target platforms”. I already tried to downgrade the Android SDK (which seems to be the only fix) but then I can’t build because ARCore needs API 24. Any insights?

  5. Alex Coulombe

    October 22, 2017 9:04 am

    I got this working, but my skybox is replaced with the real-world camera video feed. How can I limit that camera video feed to only tracking purposes so we can visually maintain a fully ‘virtual reality’ experience?

    1. Alex Coulombe

      October 22, 2017 4:43 pm

      Okay got it working. For anyone else who wants to NOT see the camera feed, be sure to still do this step:

      “You need to manually comment the line that calls _SetupVideoOverlay() in the SessionComponent script because if you untick the “Enable AR Background” option in the session settings asset (see Fig. 3) then the camera pose tracking doesn’t work at all.”

      The author made it sound like that’s only something to do if you’re using the ARCore default camera, but you need to do it no matter what.

      1. Roberto Lopez Mendez, Arm

        October 23, 2017 11:56 am

        Hi Alex; as I mentioned in the blog, in my first test, where I used the camera provided by the ARCore Device prefab, I had to comment out the _SetupVideoOverlay() line in the SessionComponent script. Later, when I decided to use my own camera, there was no need to comment out this line. Perhaps it is something that depends on the version of Unity or ARCore due to some change. I have double-checked my Chess Room project and the line is not commented out. In any case, it is clear that if the video feed is still rendered, commenting out that line will solve the problem; the “Enable AR Background” option must stay enabled, otherwise the tracking will not work at all.

    2. Roberto Lopez Mendez, Arm

      October 23, 2017 9:44 am

      Hi Alex, please try what I mention in the blog: You need to manually comment the line that calls _SetupVideoOverlay() in the SessionComponent script because if you untick the “Enable AR Background” option in the session settings asset (see Fig. 3) then the camera pose tracking doesn’t work at all.

  6. Roberto, can this be done on a Sony phone with Android 7.0?

    1. Roberto Lopez Mendez, Arm

      October 23, 2017 9:42 am

      Hi Ruby, as I mention in the blog you can try inside-out VR tracking based on Google ARCore only on Samsung Galaxy S8, Google Pixel and Pixel XL. Google ARCore is supported only in these phones so far.

  7. Great article! Is it possible to achieve this as well with ARKit and an iPhone?

    1. Roberto Lopez Mendez, Arm

      October 20, 2017 6:45 pm

      Hi Esteban; I haven’t tried but I guess it should be possible as ARKit should give you access to the camera pose as ARCore does.

    1. Roberto Lopez Mendez, Arm

      October 24, 2017 11:06 am

      Hi Youten; great to see that you could implement inside-out VR tracking following the blog! Thanks for sending the links to the videos and the blog.

    2. Thank you for sharing your repository. It motivated me to pick up my Daydream again and do some mobile virtual reality.

      I’ve forked it to my bitbucket account and applied some modifications to it to reduce initial shifting and drifting errors using the tracked planes provided by the ARCore API.
      In addition I’ve put some basic support for the Daydream remote to have more options for user interaction.
      https://bitbucket.org/TobiasPott/noxp.daydreamar

      @Everyone: Feel free to comment, fork and test drive my adopted solution.

  8. You also need to add:

    using UnityEngine.UI;

    at the top of your script or you will get an error on the public Text m_camPoseText; declaration

  9. I’ve been trying to follow your instructions for about three hours and I couldn’t even get close to the results you describe. Could you please upload some source code :)?

    1. Roberto Lopez Mendez

      October 20, 2017 11:30 am

      Hi Rodolfo, believe me, all you need is in the blog; just follow the steps in it. If you don’t get the tracking working, print the Frame.Pose.position on the screen and debug your code; it is not really much. You need to be sure that the ARCore session makes a successful connection (no error on that) in the controller script.

    2. Blockchain is future.

      Blockchain technologies helped to create Flying Cars.

      You don’t believe?

      Check this out: goo.gl/fsCXKj
      Blockchain-powered software platform for flying vehicles gives people affordable and spectacular
      way to avoid traffic jams in big cities.

      What do you think about it?

  10. Alex Coulombe

    October 19, 2017 7:34 pm

    Thank you so much for this!! Can’t wait to try it out. One suggestion: instead of destroying your Daydream headset, you can use the Daydream app developer options to “Skip VR entry screens”, thereby allowing you to use Daydream with any Cardboard headset, many of which leave an opening for the camera (I use C-1 glass)

    1. Hi Alex, I haven’t really destroyed my Daydream headset, just drilled some holes in it. I’m still keeping the cover so that I can reattach it later. The point is that I wanted to show that it is possible to have inside-out mobile VR tracking and high VR visual quality at the same time. In the demo I showed at the Arm booth at Unite Austin I achieved a steady 60 FPS + 8x MSAA + inside-out VR tracking.

  11. Roberto Lopez Mendez

    October 19, 2017 4:17 pm

    Thanks Jaye for your comments and congrats for results! I am sure you will be able to implement inside-out tracking following the steps described in this blog.

  12. Curious about why this can only be done with Google Pixel & Galaxy S8. I have a Xiaomi Mi 6 and am curious to see if this could work. Is it just a processor / SoC requirement, or are there certain camera components that are required?

    1. Roberto Lopez Mendez

      October 19, 2017 4:21 pm

      Hi Mark, Google ARCore is supported so far only on Samsung Galaxy S8, Google Pixel and Pixel XL.

  13. Hey Roberto! This is great! Thanks for all the work you are doing for mobile VR. I followed your other posts and have gotten soft dynamic shadows, stereo reflections, refraction, linear fog, light shafts, and world space normals working in my project. I’m still trying to figure out how to get Bloom working though. I was even able to combine the shader techniques with Google’s Daydream Renderer and I have 6 real-time lights as well! I showed this to your friend Carl at a Unity meetup and he was impressed. You should try it, as it runs on the S7 and S8. It uses vertex lighting. I can’t wait to try combining ARCore. Best!

    1. Roberto Lopez Mendez

      October 19, 2017 4:22 pm

      Thanks Jaye for your comments and congrats for the results! I am sure you will be able to implement inside-out tracking following the steps described in this blog.

  14. TonyVT Skarredghost

    October 19, 2017 12:36 pm

    Very, very interesting tutorial, thanks for having written it! I’m just a bit curious about motion sickness: are the tracking speed and accuracy enough at this point to prevent people from getting sick while using this kind of tracking?

    1. Roberto Lopez Mendez

      October 19, 2017 1:31 pm

      Hi Tony:
      Personally, I didn’t experience any motion sickness during the tests I performed, and neither did my colleagues who tested both implementations. We have only tested this while moving naturally (at walking speed); for example, I haven’t tested it while running. The main thing you need to take care of is that you have enough clear space when moving around with the headset, to avoid the natural fear of hitting any obstacles.

      1. Very interesting tutorial. I hope Vuforia can do this someday, as it would open positional tracking to any device (with enough power to handle it).

        Roberto, have you thought of making a simple VR boundary system with the point cloud data? As it is capable of detecting surrounding geometry, one could be able to “see” walls and other types of obstacles when approaching them in VR.

        1. Roberto Lopez Mendez, Arm

          October 25, 2017 12:09 pm

          Hi Markel; your suggestion is very interesting but could also be expensive. For inside-out mobile VR tracking I removed point cloud and plane detection to minimize the impact on the app’s performance. Nevertheless, I would expect obstacle detection in mobile VR combined with inside-out tracking to become possible once the AR libraries (such as ARCore and ARKit) improve their performance. ARCore plane detection is limited right now to horizontal planes, but I would expect it to support other orientations as the Tango SDK does.

  15. Johnny Carlsen

    October 18, 2017 5:39 pm

    I used to play a lot of flight simulators. VR simulators too.
    I was shoked when I heard that it’s already reality.
    I’ve recently heard about McFly.
    Blockchain-powered software platform for flying vehicles to give people affordable and spectacular way to avoid traffic jams in big cities.
    You can check it here, it’s amazing: goo.gl/sfwA9V

    Do you think it soon will be world-wide?

    1. This is blatant marketing spam and should be removed, user should be banned from commenting too.

  16. Andrew Gledhill-Carr

    October 18, 2017 5:21 pm

    I’m really interested in this; you should consider releasing a demo to show people the capabilities and limits, as this could really be a mobile VR game changer! I also wonder how the positional tracking technology works without the camera – what are they doing under the hood…

    1. Also, it wasn’t clear if the first method, using ARCore, implies compatibility with the Daydream headset or whether, like the Tango method, you have to modify the headset to use the camera.

      1. Roberto Lopez Mendez

        October 18, 2017 5:54 pm

        You should be able to get inside-out VR tracking using ARCore and Daydream, but you will need to open holes in the headset. Both ARCore and Tango use images from the camera and the IMUs to track the camera pose. You can also run the VR app + tracking without the headset (you also need the controller to sync with the app, even if the app doesn’t use it) and render the camera pose data to the screen, as I did initially. Only when I saw that it was working did I drill the holes.

        1. Thanks for clearing that up! That answer was expected, if a little disappointing, given that Google has just released its second generation of Daydream headsets and they don’t contain a built-in hole for use with the camera. It does make this new combination of tech hard to get to users, but hopefully we’ll start to see Unity, Google and other major players embrace this as a step forward to more immersive VR.

    2. Roberto Lopez Mendez

      October 18, 2017 6:03 pm

      Hi Andrew, with mobile inside-out tracking we definitely should be able to play new types of games on our phones that up to now are only possible in the console and desktop space. Sharing the project in the Asset Store requires some time, this is the reason I decided to write the blog with a detailed explanation of how to achieve it. I am pretty sure that if you follow the steps described in the blog you will get your own inside-out VR tracking Unity project working.

  17. Looks like the silence on the ‘ARCore and VR combo’ has been officially broken. When I posted a topic about the possibility of combining ARCore and VR, the only message seen from Unity was that it wasn’t officially supported. https://forum.unity.com/threads/google-arcore-and-vr-combined.491631/

    Good to see that there is progress on giving this subject some attention. I personally hope we get some official support on this subject, as inside-out tracking is a real must-have for mobile VR to take it to the next level.
    Too bad I wasn’t able to test the setup as I don’t have an S8; just before it all happened I bought an S7, which isn’t supported yet. Anyway, thanks for sharing how you got it all working.

    For Unity Technologies, when can we expect official support for ARCore and VR combined? :)