
What’s new in Unity ARKit Plugin for ARKit 2

June 14, 2018 in Technology | 7 min. read

Apple announced exciting news for AR developers last week at WWDC, including ARKit 2. Unity has worked closely with Apple to allow our developers instant access to all of these new features with an update to Unity ARKit Plugin. In this blog, we’ll get into the technical details of ARKit 2’s new features and how to access them via the Unity ARKit Plugin. Please download the plugin from Bitbucket to follow along.

ARWorldMap

ARWorldMap is a useful new feature in ARKit 2 that enables both persistent AR experiences and shared multiplayer AR experiences. (Read more on ARWorldMap here.)

See example in plugin: Examples/ARKit2.0/UnityARWorldMap/UnityARWorldMap.unity

Every session builds up an ARWorldMap as you move around and detect more feature points. You can get the current ARWorldMap from a session in C# and save it somewhere under Application.persistentDataPath.
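
Here is a minimal sketch of that flow, modeled on the plugin's UnityARWorldMap example; the GetCurrentWorldMapAsync callback API and ARWorldMap.Save come from the plugin's UnityEngine.XR.iOS namespace, and the file name is arbitrary:

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.XR.iOS;

public class WorldMapSaver : MonoBehaviour
{
    // Arbitrary file name under persistentDataPath.
    string path
    {
        get { return Path.Combine(Application.persistentDataPath, "myFirstWorldMap.worldmap"); }
    }

    UnityARSessionNativeInterface session
    {
        get { return UnityARSessionNativeInterface.GetARSessionNativeInterface(); }
    }

    // Hook this up to a UI button to capture the current world map.
    public void Save()
    {
        session.GetCurrentWorldMapAsync(OnWorldMap);
    }

    void OnWorldMap(ARWorldMap worldMap)
    {
        if (worldMap != null)
        {
            worldMap.Save(path);
            Debug.LogFormat("ARWorldMap saved to {0}", path);
        }
    }
}
```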

You may also load a saved ARWorldMap from where you saved it. This allows virtual objects to persist in the same coordinate space even if you leave a session and come back to it later.

ARWorldMap can also be serialized to a byte array and sent to another device via WiFi, Bluetooth, or some other means of sharing. It can then be deserialized on the receiving device and used to relocalize it to the same world mapping as the first device, giving you a shared multiplayer experience.
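
A sketch of that round trip, assuming the SerializeToByteArray/SerializeFromByteArray methods used by the plugin's SharedSpheres example; SendToOtherDevice is a hypothetical stand-in for whatever transport you use:

```csharp
// Sender: serialize the current map for transport.
byte[] mapBytes = worldMap.SerializeToByteArray();
SendToOtherDevice(mapBytes);  // hypothetical transport call (UNet, Bluetooth, ...)

// Receiver: reconstruct the map from the received bytes.
ARWorldMap receivedMap = ARWorldMap.SerializeFromByteArray(mapBytes);
```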

Once you have the ARWorldMap, whether loaded from disk, kept in memory, or received from another device, your device can share its coordinate system with that map by setting it as a parameter in the session configuration and resetting the ARSession with that configuration.

This resets the session, and as you move around, ARKit tries to match the feature points in the ARWorldMap to the feature points it detects in your environment. When they match, the device's coordinates are relocalized to the coordinates that were saved in the ARWorldMap.
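
Putting it together, here is a hedged sketch of load-and-relocalize, again following the UnityARWorldMap example; it assumes a UnityARCameraManager in the scene that exposes its sessionConfiguration, as in that example:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class WorldMapLoader : MonoBehaviour
{
    [SerializeField] UnityARCameraManager m_ARCameraManager;

    public void Load(string path)
    {
        ARWorldMap worldMap = ARWorldMap.Load(path);
        if (worldMap == null)
            return;

        // Put the saved map into the configuration and rerun the session;
        // ARKit will then try to relocalize against the stored feature points.
        ARKitWorldTrackingSessionConfiguration config = m_ARCameraManager.sessionConfiguration;
        config.worldMap = worldMap;

        UnityARSessionRunOption runOption =
            UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors |
            UnityARSessionRunOption.ARSessionRunOptionResetTracking;

        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfigAndOptions(config, runOption);
    }
}
```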

Here is a video of a more complete example called SharedSpheres which uses Unity’s Multiplayer Networking feature UNet to transmit the ARWorldMap to another device to sync up their coordinate systems:

[Video: SharedSpheres demo]

ARReferenceObject and ARObjectAnchor

Similar to the ARReferenceImage and ARImageAnchor for image recognition that existed in ARKit 1.5, we now have ARReferenceObject and ARObjectAnchor for object recognition. This was the feature prominently shown off by Lego in the WWDC keynote, recognizing their real playset and enhancing it with a virtual playset overlaid on top of the real one.

UnityARObjectAnchor example

How to use it in Unity: See the Examples/ARKit2.0/UnityARObjectAnchor/UnityARObjectAnchor.unity scene in the plugin.

This example assumes that you already have .arobject files describing the objects you want to recognize. You can create .arobject files either with the UnityObjectScanner example described below or with Apple's ARKit Object Scanner app; both produce the same file format for use in this workflow.

Again, very much like ARReferenceImage, we set up an ARReferenceObjectsSetAsset, which contains references to ARReferenceObjectAssets. We then add a reference to that ARReferenceObjectsSetAsset to the ARSession configuration so that the session tries to detect the ARReferenceObjects in that set.

All this can be done in the Unity Editor.

Whenever an object is recognized, an ARObjectAnchor is created, and just like for other anchors, you can subscribe to an event that tells you when these anchors are added, updated or removed.

When that event is triggered, you can decide what you want to do (e.g., instantiate a prefab at that location).
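
For illustration, here is a hedged sketch of such a handler. It assumes the object anchor events follow the same pattern as the plugin's image anchor events (ARObjectAnchorAddedEvent and friends), and prefabToSpawn is a placeholder for your own content:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class ObjectAnchorSpawner : MonoBehaviour
{
    public GameObject prefabToSpawn;  // placeholder for your own content

    void Start()
    {
        UnityARSessionNativeInterface.ARObjectAnchorAddedEvent += OnObjectAnchorAdded;
        UnityARSessionNativeInterface.ARObjectAnchorRemovedEvent += OnObjectAnchorRemoved;
    }

    void OnObjectAnchorAdded(ARObjectAnchor anchorData)
    {
        // Convert the anchor's ARKit transform to Unity world space
        // and spawn content there.
        Vector3 position = UnityARMatrixOps.GetPosition(anchorData.transform);
        Quaternion rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
        Instantiate(prefabToSpawn, position, rotation);
    }

    void OnObjectAnchorRemoved(ARObjectAnchor anchorData)
    {
        // Clean up spawned content here.
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARObjectAnchorAddedEvent -= OnObjectAnchorAdded;
        UnityARSessionNativeInterface.ARObjectAnchorRemovedEvent -= OnObjectAnchorRemoved;
    }
}
```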

UnityObjectScanner example

Examples/ARKit2.0/UnityARObjectScanner/UnityARObjectScanner.unity is a more complete example that does both object creation, using a pickable bounding box, and object detection.

You can save the .arobject files that you have scanned with this example and then use iTunes File Sharing to transfer them to your Mac. Once the files are on the Mac, you can rename them before putting them into your Unity project.

This example has two modes: scanning and detecting. Scanning mode uses an ARKitObjectScanningSessionConfiguration, which does a more detailed exploration of the scene but uses more CPU and power, so its use should be limited.

Using this configuration, tap on a plane near the object you want to scan to produce a red bounding box covering it. Manipulate the box so that it just covers the object of interest, then scan all around the box to capture as many feature points on the object as possible. Finally, tap a button to create an ARReferenceObject; the created object is saved to a list.

Pressing the Detect button switches to detecting mode, which works like the object anchor example above but dynamically adds the scanned ARReferenceObjects to the set of objects to detect.

Pressing the Save button writes all the objects scanned so far to a folder on the device as .arobject files, which you can transfer to your Mac via iTunes File Sharing as described above.

AREnvironmentProbeAnchor

AREnvironmentProbeAnchor is a new kind of anchor that can either be generated automatically or created where you specify. This anchor creates and updates a reflective environment map of the area around it based on ARKit video frames and world tracking data. It also uses a machine learning algorithm, trained on thousands of environments, to approximate the environment texture for parts of the scene it has not yet seen.

How to use it in Unity: See the Examples/ARKit2.0/UnityAREnvironmentTexture folder for examples

There is a new parameter on the session configuration that controls this feature, and that can have one of three values: UnityAREnvironmentTexturingNone, UnityAREnvironmentTexturingManual or UnityAREnvironmentTexturingAutomatic.

With the UnityAREnvironmentTexturingManual mode, you will have to create an AREnvironmentProbeAnchor yourself, but ARKit will update the texture that is captured from the environment.

If you use UnityAREnvironmentTexturingAutomatic mode instead, ARKit will generate the AREnvironmentProbeAnchors in procedurally spaced intervals according to the data it infers from your session and your movement through the space.
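
As a sketch, enabling automatic mode might look like this; the environmentTexturing field name is an assumption based on the plugin's configuration struct:

```csharp
using UnityEngine.XR.iOS;

public static class EnvironmentTexturingSetup
{
    // Enable automatic environment probe generation before running the session.
    // The environmentTexturing field name is assumed from the plugin's
    // ARKitWorldTrackingSessionConfiguration.
    public static void RunWithAutomaticProbes(ARKitWorldTrackingSessionConfiguration config)
    {
        config.environmentTexturing =
            UnityAREnvironmentTexturing.UnityAREnvironmentTexturingAutomatic;
        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
    }
}
```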

Both of the examples instantiate a prefab containing a Unity ReflectionProbe component and update it with the environment texture from the AREnvironmentProbeAnchor. This ReflectionProbe participates in the standard Unity rendering pipeline and enhances any GameObject that uses it.

Image tracking

Reference images work the same as in ARKit 1.5, but now, instead of just recognizing images, ARKit allows you to track them: when you move a reference image, its associated ARImageAnchor moves with it, so any content anchored to that image moves too.

There is one extra parameter on the session configuration that enables this: the number of images you want to track simultaneously during the session. The existing example has been updated to use this new feature, and the sketch below shows the relevant setting.
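
This assumes the plugin exposes the same maximumNumberOfTrackedImages property as the native ARKit 2 API:

```csharp
using UnityEngine.XR.iOS;

public static class ImageTrackingSetup
{
    // maximumNumberOfTrackedImages mirrors the native ARKit 2 property;
    // it is assumed to be exposed the same way on the plugin's config.
    public static void RunTrackingImages(ARKitWorldTrackingSessionConfiguration config,
                                         int trackedImageCount)
    {
        config.maximumNumberOfTrackedImages = trackedImageCount;
        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
    }
}
```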

Face Tracking improvements

ARKit 2 also improves face tracking on iPhone X with some new features. First, there is one extra blendshape coefficient, TongueOut, which returns a value between 0.0 and 1.0 depending on how far you have stuck out your tongue, as perceived by ARKit 2 face tracking. Apple showed this off at WWDC on their Animojis, and it appeared to be very popular with the audience.

The other improvement is eye gaze tracking: you receive a transform describing where each eye on the face is pointing, as well as the position of the object being looked at.

Take a look at Examples/ARKit2.0/UnityTongueAndEyes/UnityTongueAndEyes.unity for an example of how to make use of this new data coming in from the face anchors.
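
As a rough sketch of reading that data from the face anchor updated event (the blendshape key string and the lookAtPoint property name are assumptions based on the native ARKit API; check the UnityTongueAndEyes scene for the exact names):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class TongueAndEyes : MonoBehaviour
{
    const string TongueOut = "tongueOut";  // assumed blendshape key

    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Blendshape coefficients arrive as a name -> [0..1] dictionary.
        float tongue;
        if (anchorData.blendShapes.TryGetValue(TongueOut, out tongue))
        {
            Debug.LogFormat("Tongue out: {0:F2}", tongue);
        }

        // Eye gaze data; lookAtPoint is an assumed property name mirroring
        // ARKit's native lookAtPoint on ARFaceAnchor.
        Vector3 lookAt = anchorData.lookAtPoint;
        Debug.LogFormat("Looking at {0}", lookAt);
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }
}
```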

We’ll leave you with this creepy video of the author demonstrating this example:

https://www.youtube.com/watch?v=ZBw_f_my5-M

Make some great AR apps!

For more technical details on the features above and how to use them from Unity, take a look at What’s New In Unity ARKit Plugin for ARKit 2. Please download the latest version of the plugin from Bitbucket and try building the examples to your iOS devices. Come to the forums with your questions. Then take the next step and create your own amazing AR experiences with the easy-to-use tools that we have provided. Most of all, have fun doing it!
