
A lot has happened since we first announced the AR Foundation package for multi-platform handheld AR development. We want to take this opportunity to share how the package has evolved since developers started using it, and where it’s headed in the future.

We also want to provide some resources to help you better understand how AR Foundation fits into the handheld AR development ecosystem and how to use it to build great handheld AR applications.

Updates

We recently made significant updates to AR Foundation and other XR packages.

LWRP Support

You now have more control over rendering by using the Lightweight Render Pipeline (LWRP) in ARCore and ARKit apps built with AR Foundation.

This also opens up the ability to use Unity’s Shader Graph to create interesting effects through a visual node editor.

Camera Image APIs

We now provide low-level access to the camera image on the CPU, as well as optimized conversion utilities to convert the image to RGB or grayscale. This is ideal for developers who want to do their own image processing for custom computer vision algorithms.

See the AR Foundation manual and samples repo on GitHub for a sample scene and source code.
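As a rough illustration, a script along these lines grabs the latest camera image on the CPU and converts it to RGBA. This is a minimal sketch, not the canonical sample: the type names follow more recent AR Foundation releases (where the image type is `XRCpuImage`; earlier versions used `XRCameraImage` and `TryGetLatestImage`), the class name is ours, and the conversion call requires "Allow unsafe code" in Player settings.

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CameraImageGrabber : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    unsafe void Update()
    {
        // Acquire the latest camera image on the CPU. The image wraps a
        // native resource and must be disposed when we are done with it.
        if (!m_CameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)
        {
            // Ask the conversion utility for an RGBA32 buffer, flipped
            // vertically to match Unity's texture orientation.
            var conversionParams = new XRCpuImage.ConversionParams
            {
                inputRect = new RectInt(0, 0, image.width, image.height),
                outputDimensions = new Vector2Int(image.width, image.height),
                outputFormat = TextureFormat.RGBA32,
                transformation = XRCpuImage.Transformation.MirrorY
            };

            int size = image.GetConvertedDataSize(conversionParams);
            var buffer = new NativeArray<byte>(size, Allocator.Temp);
            image.Convert(conversionParams,
                new IntPtr(buffer.GetUnsafePtr()), buffer.Length);

            // buffer now holds raw RGBA pixels; hand them to your own
            // computer vision code here.

            buffer.Dispose();
        }
    }
}
```

See the manual linked below for the full, version-accurate API.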

World Map (ARKit)

We added support for ARKit’s ARWorldMap feature, which allows you to create persistent and multi-user AR experiences. Note that this will only work on ARKit enabled iOS devices.
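For illustration, saving a world map might look roughly like this. It is a condensed sketch of the pattern in the AR Foundation samples: it assumes the ARKit XR plugin's `ARKitSessionSubsystem`, and the class name, coroutine, and save path are ours.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_IOS
using Unity.Collections;
using UnityEngine.XR.ARKit;
#endif

public class WorldMapSaver : MonoBehaviour
{
    [SerializeField] ARSession m_Session;

#if UNITY_IOS
    public IEnumerator Save(string path)
    {
        // ARWorldMap is ARKit-only, so we go through the ARKit-specific
        // session subsystem rather than the cross-platform API.
        var sessionSubsystem = (ARKitSessionSubsystem)m_Session.subsystem;
        var request = sessionSubsystem.GetARWorldMapAsync();
        while (!request.status.IsDone())
            yield return null;

        if (request.status.IsError())
            yield break;

        ARWorldMap worldMap = request.GetWorldMap();
        request.Dispose();

        // Serialize the map to bytes and write them to disk; the same
        // bytes can be sent over the network for multi-user experiences.
        using (var data = worldMap.Serialize(Allocator.Temp))
            System.IO.File.WriteAllBytes(path, data.ToArray());

        worldMap.Dispose();
    }
#endif
}
```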

Face Tracking (ARKit)

AR Foundation now includes support for ARKit’s face tracking feature, which lets you track a face and access blend shapes for several facial features.
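As a sketch, reading the blend shape coefficients for a tracked face could look like this (assuming the ARKit face subsystem API shown in the AR Foundation samples; the class name is ours):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_IOS
using UnityEngine.XR.ARKit;
#endif

[RequireComponent(typeof(ARFace))]
public class BlendShapeLogger : MonoBehaviour
{
#if UNITY_IOS
    void Update()
    {
        var face = GetComponent<ARFace>();
        var faceManager = FindObjectOfType<ARFaceManager>();
        var subsystem = (ARKitFaceSubsystem)faceManager.subsystem;

        // Each coefficient pairs a blend shape location (e.g. jaw open)
        // with a weight between 0 and 1.
        using (var coefficients = subsystem.GetBlendShapeCoefficients(
                   face.trackableId, Allocator.Temp))
        {
            foreach (var c in coefficients)
                Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
        }
    }
#endif
}
```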

Which package should I use?

Today, AR Foundation provides a platform-agnostic scripting API and MonoBehaviours for making ARCore and ARKit apps that use the core functionality shared between both platforms. This lets you develop your app once and deploy it to both platforms without any changes. For a full list of the features currently supported in AR Foundation, refer to the chart below.
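For example, the same plane-detection script runs unchanged on both platforms: AR Foundation's managers raise cross-platform events, and the ARCore and ARKit packages supply the implementation underneath. A minimal sketch using `ARPlaneManager`'s `planesChanged` event (the class name is ours):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager m_PlaneManager;

    void OnEnable()
    {
        m_PlaneManager = GetComponent<ARPlaneManager>();
        m_PlaneManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        m_PlaneManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also carries updated and removed planes.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId}");
    }
}
```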

However, AR Foundation does not yet implement every ARKit and ARCore feature, so if your app depends on a specific feature that isn’t yet in AR Foundation, you can use the platform-specific SDKs directly. We are constantly adding features to AR Foundation and hope it will eventually serve all the needs of developers targeting ARCore or ARKit.

If you are only targeting ARCore and want the full feature set, Google maintains an SDK for Unity. If you are only targeting ARKit and want the full feature set, we still maintain the original ARKit plugin for Unity.

The charts below summarize the differences between the packages, with links to documentation, forums, and deployment requirements for each.

Feature comparison and roadmap

Remoting

A major feature we are testing and hope to roll out next year is remoting, which is the ability to stream sensor data from a device running ARCore or ARKit to the Mac or PC Editor. This should improve iteration time and aid in debugging your AR apps. See this forum post for details and to try it out today!

Simulation

In addition to remoting, we are adding in-Editor simulation. This will let you develop and test an AR app without ever connecting an Android or iOS device to your computer. This can dramatically improve development time and debugging.

More platforms

In 2019, we are going to expand platform support beyond handheld AR to include wearable AR devices as well.

How do I get started?

We created a sample repository that has a Unity project and scene with the AR Foundation packages already included. There are scripts available for visualizing planes and feature points, placing objects on found planes, and using light estimation. We recently added some UX features to the samples repository, including various animations that guide the user to find planes, place objects, and fade out planes when they are no longer being updated. Check out the SampleUXScene for all of these features and more!
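As a rough sketch of the object-placement pattern used in the samples: raycast from the touch point against detected planes and instantiate at the first hit. This uses `ARRaycastManager`, which appears in later AR Foundation releases (earlier versions raycast via `ARSessionOrigin`); the class name and prefab field are ours.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject m_Prefab;   // object to place on a plane

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();
    ARRaycastManager m_RaycastManager;

    void Awake()
    {
        m_RaycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes; hits are sorted by distance,
        // so the first hit is the closest surface.
        if (m_RaycastManager.Raycast(touch.position, s_Hits,
                TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = s_Hits[0].pose;
            Instantiate(m_Prefab, hitPose.position, hitPose.rotation);
        }
    }
}
```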


Download the AR Foundation Samples repo and join the Handheld AR forums to learn more about building handheld AR apps with AR Foundation. Share your apps on social media with the hashtag #madewithunity so that we can see all your creations!

Comments

  1. Great news regarding support for World Maps, and really nice work on this article! Do you have a rough estimate of when you will be able to follow up with Cloud Anchor support (Q1, Q2, …)? On the AR Foundation samples GitHub it was mentioned that you were shooting for feature parity with ARCore by the end of 2018, so I hope this isn’t too far out anymore? Keep up the good work!

  2. The Camera Image API is an amazing feature that I have been waiting for. But there is no way to access the camera’s intrinsic parameters via the AR Foundation API. Most computer vision algorithms require this information. I hope this feature will be added soon.

    1. Thanks for the feedback. Support for camera intrinsics is currently being reviewed and should make it into the next release. I’ll make an announcement on the handheld AR forum when it’s ready: https://forum.unity.com/forums/handheld-ar.159/

  3. Will image recognition only be supported on ARCore/ARKit devices? Would it also be possible to add OpenCV to the same project to use as a fallback for those devices (in a separate scene, of course)?

  4. Amazing!

  5. I’ve been most impressed with the Thomas the Tank AR experience. It’s not as useful, but being able to place a virtual track that you built on your desk or floor and then watch as the trains drive around and hot air balloons fly through the room is pretty cool. You can also change the size so it will fit on an end table or be the size of a whole room.


  6. Can we use the Camera Image API in other platforms? WebGL specifically.

  7. Do you know if AR Foundation for iOS would work with Magic Leap on Unity in multi-user, cross-platform mode?

  8. Great article! Nice to see AR Foundation come this far.
    A small rectification: ARCore 1.5 introduced an API for adding images to an Augmented Image database at runtime, while the table says “static only”. Is that what is meant by “static only”, the ability to create image tracking targets only from the editor?

    1. Static only in this case refers to the fact that ARKit supports recognizing and tracking images as they move. Currently ARCore only supports recognizing an image and will re-recognize it once it has stopped moving again.

  9. So where does that leave Vuforia?

    1. Vuforia has not been the go-to AR SDK for quite some time now. Yes, it was integrated into Unity, but developers quickly realized it has huge limitations. Aside from the developer limitations, the royalties are absurd. Vuforia is good for learning the basic functions of AR, but that’s it. The SDK is not as efficient as various other free-to-use, royalty-free SDKs out there. Users also are not very attracted to target-based experiences. 8th Wall is a free SDK that builds on ARCore and ARKit. I recommend it for performance and future support, and since it’s built on the top two SDKs out there, it will be around for a while.

      1. Aye, their licensing model was a bit odd. Thanks for the heads-up – when I first started on AR projects about two years ago, Vuforia was the only real option out there, and they were endorsed by Unity recently, so it was a surprise to hear of this change of plan. I’ll check out 8th Wall too, thanks :)

  10. Those charts are fantastic, thank you. It’s hard to keep all this stuff straight between release notes in packages. Please do another blog post when more things change!

    Better yet, put it in the package’s documentation! https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@1.0/manual/index.html