
Blend virtual content and the real world with Unity’s AR Foundation, now supporting the ARCore Depth API

June 25, 2020

Unity’s AR Foundation 4.1 supports Google’s new ARCore Depth API. With the addition of this capability, AR Foundation developers can now deliver experiences that blend digital content with the physical world more realistically than ever before.

With its extensive feature set and vast reach, Google’s ARCore is one of the most popular and powerful SDKs available to developers of augmented reality (AR) experiences. We have been working closely with Google to ensure Unity users have swift access to newly released ARCore features. The release of the ARCore Depth API is a significant milestone as it enables enhanced understanding of physical surroundings as well as more realistic visuals in AR Foundation-based experiences. 

ARCore can take advantage of multiple types of sensors to generate depth images. On phones with only RGB cameras, ARCore employs depth-from-motion algorithms that compare successive camera images as the phone moves to estimate the distance to every pixel. This method allows for depth data to be available on hundreds of millions of Android phones. And on devices that include a Time of Flight camera, the depth data is even more precise.

AR Foundation now includes the following new features:

  • Automatic occlusion
  • Access to depth images

Occlusion made easy

The most obvious effect of ARCore’s depth information is the ability to realistically blend digital content and real-world objects.

We’ve expanded AR Foundation’s existing support for pass-through video to include per-pixel depth information provided by ARCore so that occlusion “just works” on supported devices. Simply add the AR Occlusion Manager to the same GameObject that holds the AR Camera and AR Background Renderer components, and the shader automatically evaluates the depth data to create this blending effect.
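For illustration, here is a minimal sketch of that setup done from script rather than in the Inspector. It assumes the AR Foundation 4.1 API names (AROcclusionManager, requestedEnvironmentDepthMode, EnvironmentDepthMode); the class name is just an example, and in most projects you would simply add and configure the component in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative example: attach to the AR Camera GameObject described above
// to enable environment depth occlusion at runtime.
[RequireComponent(typeof(Camera))]
public class EnableDepthOcclusion : MonoBehaviour
{
    void Awake()
    {
        // Add the occlusion manager if it wasn't already added in the Inspector.
        var occlusion = GetComponent<AROcclusionManager>();
        if (occlusion == null)
            occlusion = gameObject.AddComponent<AROcclusionManager>();

        // Request the best environment depth the device can provide; the manager
        // exposes the mode actually in use via currentEnvironmentDepthMode.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }
}
```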

When occlusion is combined with AR Foundation’s existing support for ARCore’s Lighting Estimation capabilities, augmented reality apps can achieve almost seamless visual quality.
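As a rough sketch of the lighting half of that combination, the example below subscribes to camera frames and drives a directional light from the estimates. It assumes the AR Foundation 4.x names ARCameraManager.frameReceived, LightEstimation, and ARLightEstimationData; the class name is illustrative, and each estimation field is nullable because not every platform or frame provides it.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightEstimationExample : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;
    [SerializeField] Light mainLight;

    void OnEnable()
    {
        // Ask the platform for ambient intensity and color estimates.
        cameraManager.requestedLightEstimation =
            LightEstimation.AmbientIntensity | LightEstimation.AmbientColor;
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        ARLightEstimationData estimate = args.lightEstimation;

        // Only apply values the platform actually supplied this frame.
        if (estimate.averageBrightness.HasValue)
            mainLight.intensity = estimate.averageBrightness.Value;

        if (estimate.averageColorTemperature.HasValue)
        {
            mainLight.useColorTemperature = true;
            mainLight.colorTemperature = estimate.averageColorTemperature.Value;
        }

        if (estimate.colorCorrection.HasValue)
            mainLight.color = estimate.colorCorrection.Value;
    }
}
```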

Interact with the world using the depth image

AR Foundation provides developers with convenient access to the same per-pixel depth data it uses for automatic occlusion. Depth data is a powerful tool for adding rich interactions with the user’s surroundings. For example, the depth data could be used to build a representation of real-world objects that can be fed to Unity’s physics system. This creates the opportunity for digital content to appear to respond to and interact with the physical surroundings.
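The depth image can be pulled onto the CPU for this kind of work. The sketch below is one hedged example, assuming AR Foundation 4.1’s AROcclusionManager.TryAcquireEnvironmentDepthCpuImage and the XRCpuImage depth formats; the center-pixel read at the end is purely illustrative, and a real app would process the whole image (for example, to build a physics representation).

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DepthSampleReader : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Update()
    {
        // Grab the most recent environment depth image; it must be disposed after use.
        if (!occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depthImage))
            return;

        using (depthImage)
        {
            // Depth values live in plane 0. Depending on the device, the format is
            // typically DepthUint16 (millimeters) or DepthFloat32 (meters);
            // this sketch only handles the float case.
            if (depthImage.format != XRCpuImage.Format.DepthFloat32)
                return;

            XRCpuImage.Plane plane = depthImage.GetPlane(0);

            // Reinterpret the raw byte buffer as floats (the argument is the source element size).
            var depths = plane.data.Reinterpret<float>(1);

            // Illustrative only: read the estimated distance at the center of the image.
            int x = depthImage.width / 2;
            int y = depthImage.height / 2;
            float meters = depths[(y * plane.rowStride + x * plane.pixelStride) / sizeof(float)];
            Debug.Log($"Estimated distance at image center: {meters:F2} m");
        }
    }
}
```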

This capability opens the door to novel AR game experiences such as SKATRIX by Reality Crisis. This upcoming title leverages the ARCore Depth API to generate meshes that transform the physical surroundings into an AR skatepark.

Having access to the raw depth data gives developers the tools to create unique interactive AR experiences that weren’t previously possible.

Try it today

The 4.1 versions of the AR Foundation and ARCore XR Plugin packages contain everything you need to get started and are compatible with Unity 2019 LTS and later. Samples demonstrating how to set up automatic occlusion and depth data are located in AR Foundation Samples on GitHub.

We’re excited to see the enhanced visuals and rich experiences made possible by the ARCore Depth API. And we look forward to continuing our close collaboration with Google to bring more awesome AR functionality to AR Foundation developers.

For more information, please check out Google’s ARCore Depth API announcement and the Depth Lab app to see examples of this tech made in Unity. Finally, join us on the Unity Handheld AR forums as you try out this latest version of AR Foundation. We’d love to hear about what you’ve created using the new features, and we welcome your feedback.

6 replies on “Blend virtual content and the real world with Unity’s AR Foundation, now supporting the ARCore Depth API”

Really cool! I’ve been building out a couple of test scenes, and it works really nicely! Looking at the Google documentation, it seems that there are different ways of accessing the depth data. Can we do that through AR Foundation yet? And if so, how?

What about iOS?
Android & PC, huh???
What about Mac & iOS?
I don’t get it… why does Unity always go its own way…
Why not release these things cross-platform???
It’s been in the iOS SDK longer than in Google’s :-(
Unity, you’re making it easier for us to switch!
If you only listen to the big AA studios who don’t pay for the development… who only ask for special features that nobody needs… like you did in the past… then Epic will overrun Unity soon…
This inconsistent pushing of Android & PC is not good at all…
Better to drop Apple support and focus on your lovely Android and Windows…
Unity has forgotten its roots!
The beginning of the end…

You may have missed our blog post yesterday regarding AR Foundation support for Apple’s latest ARKit 4 features. We’re excited to have released support for both ARKit’s and ARCore’s latest features soon after their respective launches this week. AR Foundation now supports depth and real-time occlusion in AR on both platforms.

More info regarding AR Foundation support of ARKit 4: https://blogs.unity3d.com/2020/06/24/ar-foundation-support-for-arkit-4-depth/

It might be a good idea to update the compatibility table on GitHub. In the article you explicitly say that 4.1 is compatible with 2019 LTS. The table on GH does not imply that at all.

AR Foundation is compatible with 2019 LTS. The table on GitHub provides guidance on which Unity versions *should* be used based on the latest verified versions of AR Foundation. The latest verified version of AR Foundation in Unity 2019 LTS is AR Foundation 2.1; that won’t change. You can use AR Foundation 4.1 in 2019 LTS but it’s still in preview and in a beta state.
