
Unity’s Handheld AR Ecosystem: AR Foundation, ARCore and ARKit

December 18, 2018

A lot has happened since we first announced the AR Foundation package for multi-platform handheld AR development. We want to take this opportunity to share how the package has evolved since developers started using it, and where it’s headed in the future.

We also want to provide some resources to help you better understand how AR Foundation fits into the handheld AR development ecosystem and how to use it to build great handheld AR applications.


We recently made significant updates to AR Foundation and other XR packages.

LWRP Support

You can now take more control of rendering in ARCore and ARKit apps built with AR Foundation by using the Lightweight Render Pipeline (LWRP).

This also opens up the ability to use Unity’s Shader Graph to create interesting effects through a visual node editor.

Camera Image APIs

We now provide low-level access to the camera image on the CPU, as well as optimized conversion utilities to convert the image to RGB or grayscale. This is ideal for developers looking to do their own image processing for custom computer-vision algorithms.
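As a rough illustration, a component like the following could grab the latest camera image and convert it to a grayscale buffer on the CPU. This is a sketch based on the `CameraImage` API as documented for this release; names such as `TryGetLatestImage` and `CameraImageConversionParams` may differ in later package versions, and the `unsafe` block requires enabling unsafe code in the project settings:

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARExtensions;

public class CameraImageGrabber : MonoBehaviour
{
    void Update()
    {
        var cameraSubsystem = ARSubsystemManager.cameraSubsystem;
        if (cameraSubsystem == null)
            return;

        // Acquire the latest camera image; it must be disposed when done,
        // which the using block handles for us.
        CameraImage image;
        if (!cameraSubsystem.TryGetLatestImage(out image))
            return;

        using (image)
        {
            // Request a single-channel (grayscale) conversion at full resolution.
            var conversionParams = new CameraImageConversionParams(
                image, TextureFormat.R8, CameraImageTransformation.None);

            int size = image.GetConvertedDataSize(conversionParams);
            var buffer = new NativeArray<byte>(size, Allocator.Temp);

            unsafe
            {
                // Convert the native YUV image into our buffer on the CPU.
                image.Convert(conversionParams,
                    new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
            }

            // buffer now holds R8 grayscale pixels, ready for custom
            // computer-vision processing.
            buffer.Dispose();
        }
    }
}
```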

See the AR Foundation manual and samples repo on GitHub for a sample scene and source code.

World Map (ARKit)

We added support for ARKit’s ARWorldMap feature, which allows you to create persistent and multi-user AR experiences. Note that this only works on ARKit-enabled iOS devices.
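Because ARWorldMap is ARKit-only, you reach it through the ARKit-specific session subsystem rather than the cross-platform API. A simplified sketch, loosely following the world-map controller in the samples repo (method names like `GetARWorldMapAsync` reflect this release of the package and may change later):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

public class WorldMapSaver : MonoBehaviour
{
    public void SaveWorldMap()
    {
        // ARWorldMap is ARKit-only, so cast to the ARKit session subsystem.
        var sessionSubsystem =
            ARSubsystemManager.sessionSubsystem as ARKitSessionSubsystem;
        if (sessionSubsystem == null)
        {
            Debug.Log("ARWorldMap requires an ARKit session on an iOS device.");
            return;
        }

        // Request the current world map asynchronously.
        sessionSubsystem.GetARWorldMapAsync(OnWorldMapReceived);
    }

    void OnWorldMapReceived(ARWorldMapRequestStatus status, ARWorldMap worldMap)
    {
        if (status.IsSuccess())
        {
            // The map can be serialized to bytes and saved or sent to another
            // device, then restored later with ApplyWorldMap to bring back
            // anchors for persistent or shared experiences.
            Debug.Log("World map captured.");
        }
        worldMap.Dispose();
    }
}
```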

Face Tracking (ARKit)

AR Foundation now includes support for ARKit’s face tracking feature, which lets you track a face and access blend shapes for several facial features.

Which package should I use?

Today, AR Foundation provides a platform-agnostic scripting API and MonoBehaviours for making ARCore and ARKit apps that use the core functionality shared between both platforms. This lets you develop your app once and deploy to both platforms without any changes. For a full list of the features AR Foundation currently supports, refer to the chart below.
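To illustrate the platform-agnostic workflow, a single component written against AR Foundation’s MonoBehaviours runs unchanged on both ARCore and ARKit. A minimal sketch (the `planeAdded` event name matches this release of the package; later versions renamed these events):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs every detected plane; the same code deploys to ARCore and ARKit.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager m_PlaneManager;

    void OnEnable()
    {
        m_PlaneManager = GetComponent<ARPlaneManager>();
        m_PlaneManager.planeAdded += OnPlaneAdded;
    }

    void OnDisable()
    {
        m_PlaneManager.planeAdded -= OnPlaneAdded;
    }

    void OnPlaneAdded(ARPlaneAddedEventArgs eventArgs)
    {
        // ARPlane wraps the underlying ARCore/ARKit plane in a common type.
        Debug.Log($"Plane detected at {eventArgs.plane.transform.position}");
    }
}
```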

However, AR Foundation does not yet implement every ARKit and ARCore feature, so if your app depends on a specific feature that isn’t yet exposed, you can use the platform-specific SDKs directly. We are constantly adding features and hope that AR Foundation will eventually serve all the needs of developers targeting ARCore or ARKit.

If you are only targeting ARCore and want the full feature set, Google maintains an SDK for Unity. If you are only targeting ARKit and want the full feature set, we still maintain the original ARKit plugin for Unity.

The charts below summarize the differences:




Feature comparison and roadmap


A major feature we are testing and hope to roll out next year is remoting, which is the ability to stream sensor data from a device running ARCore or ARKit to the Mac or PC Editor. This should improve iteration time and aid in debugging your AR apps.


In addition to remoting, we are adding in-Editor simulation. This will let you develop and test an AR app without ever connecting an Android or iOS device to your computer. This can dramatically improve development time and debugging.

More platforms

In 2019, we are going to expand platform support beyond handheld AR to include wearable AR devices.

How do I get started?

We created a sample repository containing a Unity project and scene with the AR Foundation packages already included. There are scripts for visualizing planes and feature points, placing objects on detected planes, and using light estimation. We recently added some UX features to the samples repository, including animations that guide the user to find planes and place objects, and that fade out planes once they are no longer being updated. Check out the SampleUXScene for all of these features and more!
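The object-placement script in the samples boils down to a raycast from a screen touch against detected planes. Roughly, simplified from the repo’s tap-to-place style script (the component and field names here are illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARSessionOrigin))]
public class TapToPlace : MonoBehaviour
{
    public GameObject placedPrefab;   // assign the object to place in the Inspector

    ARSessionOrigin m_SessionOrigin;
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Awake()
    {
        m_SessionOrigin = GetComponent<ARSessionOrigin>();
    }

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);

        // Raycast from the touch point against the interior of detected planes.
        if (m_SessionOrigin.Raycast(touch.position, s_Hits,
                TrackableType.PlaneWithinPolygon))
        {
            // Place the prefab at the pose of the closest hit.
            var hitPose = s_Hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```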


Download the AR Foundation Samples repo and join the Handheld AR forums to learn more about building handheld AR apps with AR Foundation. Share your apps on social media with the hashtag #madewithunity so that we can see all your creations!

18 replies on “Unity’s Handheld AR Ecosystem: AR Foundation, ARCore and ARKit”

Great news regarding support for World Maps and really nice work on this article! Do you have a rough estimate when you will be able to follow up with Cloud Anchor support (Q1, Q2, …)? On the arfoundation samples github it was mentioned that you were shooting for feature parity with ARCore by end of 2018, so I hope this isn’t too far out anymore? Keep up the good work!

I am waiting for Cloud Anchor support too. Will it then be possible to create multiplayer cross-platform AR experiences?

The Camera Image API is an amazing feature that I’ve been waiting for. But there is no way to access the camera’s intrinsic parameters via the AR Foundation API. Most computer-vision algorithms require this information. I hope this feature will be added soon.

Will image recognition only be supported on ARCore/ARKit devices? Would it also be possible to add OpenCV to the same project to use as a fallback for those devices (in a separate scene, of course)?

At initial release, only ARCore and ARKit are supported via AR Foundation, so the abstraction we provide would use either of those SDKs’ implementations of image tracking. But nothing would stop an app from falling back to an OpenCV-based solution – it simply wouldn’t be available via the AR Foundation abstractions for image tracking.

I’ve been most impressed with the Thomas the Tank AR experience. It’s not as useful, but being able to place a virtual track that you built on your desk or floor and then watch as the trains drive around and hot air balloons fly through the room is pretty cool. You can also change the size so it will fit on an end table or be the size of a whole room.


Do you know if AR Foundation for iOS would work with Magic Leap on Unity in multi-user, cross-platform mode?

AR Foundation does not support Magic Leap at this time but you should be able to write a connected experience between separate apps you write for iOS and Magic Leap.

Great article! Nice to see AR Foundation come this far.
A small rectification: ARCore 1.5 introduced an API for adding images to an Augmented Image database at runtime, while the table says “static only”. Is that what is meant by “static only”, the ability to create image-tracking targets only from the Editor?

Vuforia has not been the go-to AR SDK for quite some time now. Yes, it was integrated into Unity, but developers quickly realized it has huge limitations. Aside from the developer limitations, the royalties are absurd. Vuforia is good for learning the basic functions of AR, but that’s it. The SDK is not as efficient as various other free-to-use and royalty-free SDKs out there. Users also are not very attracted to target-based experiences. 8th Wall is a free SDK that builds on ARCore and ARKit. I recommend it for performance and future support, and since it’s built on the top two SDKs out there, it will be around for a while.

Aye, their licensing model was a bit odd. Thanks for the heads-up – when I first started on AR projects about two years ago, Vuforia was the only real option out there, and then it was endorsed by Unity recently, so it was a surprise to hear of this change of plan. I’ll check out 8th Wall too, thanks :)
