
ARKit Remote: Now with face tracking!

January 16, 2018 in Technology | 3 min. read

When developing any application, it is essential to iterate as quickly as possible. Having to build to the device every time you want to test functionality is frustrating, and it can dramatically increase development time and cost.

At Unity, we strive to make your job as a developer easier and more efficient, and since the release of Apple’s ARKit in mid-2017, we have been working hard to streamline AR development for ARKit with our ARKit plugin and the ARKit Remote. ARKit Remote allows developers to iterate on ARKit experiences right inside the Unity Editor, without building to the device each time. Today we are happy to announce that you can now access ARKit Remote functionality for Face Tracking on iPhone X by downloading or updating the ARKit plugin for Unity.

Build ARKit Remote

To use ARKit Remote for Face Tracking, you will first need to build the ARKit Remote scene as an app on your iPhone X. You need an iPhone X because it is currently the only device with the front-facing TrueDepth camera, which is required for Face Tracking. Follow these steps to build the app to the device:

1. Get the latest Unity ARKit Plugin project from Bitbucket or Asset Store and load it up in the Unity Editor.

2. Open the “Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote” scene.

3. Select the “Assets/UnityARKitPlugin/Resources/UnityARKitPlugin/ARKitSettings” asset and enable the “ARKit Uses Facetracking” checkbox.

4. Select PlayerSettings (menu: Edit/Project Settings/Player) and make sure the “Camera Usage Description” field contains some text.

5. Select BuildSettings (in menu File/Build Settings...) and check the Development Build checkbox.

6. Now build this scene to your iPhone X as you would normally build an app via Xcode. (A scripted sketch of steps 4-6 follows this list.)
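
If you prefer to script the Editor side of this, here is a minimal sketch of steps 4-6 as an Editor menu item. It is only an illustration: the scene path follows step 2 above, while the menu name, output path, and camera usage string are placeholder values, and the “ARKit Uses Facetracking” checkbox from step 3 still needs to be set on the ARKitSettings asset by hand.

```csharp
// Editor-only sketch of steps 4-6. The scene path follows step 2; the menu
// name, output path, and usage string are placeholders for illustration.
#if UNITY_EDITOR
using UnityEditor;

public static class ARKitRemoteBuild
{
    [MenuItem("Tools/Build ARKit Remote (iOS)")]
    public static void Build()
    {
        // Step 4: the front camera needs a usage description for Face Tracking.
        PlayerSettings.iOS.cameraUsageDescription = "Uses the camera for AR Face Tracking.";

        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote.unity" },
            locationPathName = "Builds/ARKitRemote",   // output folder for the generated Xcode project
            target = BuildTarget.iOS,
            options = BuildOptions.Development         // Step 5: Development Build
        };

        // Step 6: generate the Xcode project, then build and deploy it to the iPhone X from Xcode.
        BuildPipeline.BuildPlayer(options);
    }
}
#endif
```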

Here’s a video of the steps needed for building the ARKit Remote.

[Embedded video: building the ARKit Remote]

Connect Editor to ARKit Remote

The steps in the previous section need only be done once to build ARKit Remote to your device. The following steps can be used over and over again to iterate on the ARKit Face Tracking in the editor:

1. Connect the iPhone X to your Mac development machine via USB.

2. Start up the ARKit Remote app on the device. You should see a “Waiting for connection..” screen.

3. In the Unity Editor, connect to your iPhone X by opening the Console window and selecting the iPhone X that is connected via USB.

4. Load up one of the FaceTracking examples in the project, e.g. “Assets/UnityARKitPlugin/Examples/FaceTracking/FaceAnchorScene”, and press Play in the Editor.

5. You should see a green screen with a button on top that says “Start ARKit Face Tracking Session.” Press that button, and you should see the front camera video feed in the Editor’s “Game” window. If your face is in view, the device will also send ARKit Face Tracking data to the Editor. (A rough sketch of what this button kicks off in code follows this list.)
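
For context, the “Start ARKit Face Tracking Session” button starts a session roughly like the sketch below. It follows the pattern used by the plugin’s FaceTracking example scripts (UnityARSessionNativeInterface plus ARKitFaceTrackingConfiguration); exact type and field names may differ slightly between plugin versions.

```csharp
// Rough sketch of starting an ARKit Face Tracking session, modeled on the
// plugin's FaceTracking examples. Exact names may vary between plugin versions.
using UnityEngine;
using UnityEngine.XR.iOS;   // Unity ARKit Plugin namespace

public class FaceTrackingStarter : MonoBehaviour
{
    // Hook this up to the "Start ARKit Face Tracking Session" button, for example.
    public void StartFaceTracking()
    {
        var config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        // IsSupported is false on devices without the TrueDepth camera.
        if (config.IsSupported)
        {
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }
}
```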

Here is a video that demonstrates the connection steps:

[Embedded video: connecting the Editor to the ARKit Remote]

Play with ARKit Face Tracking Data

Once you have connected your ARKit Face Tracking scene to ARKit Remote, all the Face Tracking data (face anchor, face mesh, blendshapes, directional lighting) is sent from the device to the Editor. You can then manipulate that data in the Editor to affect the scene immediately. Here are a couple of videos to demonstrate this:

[Embedded videos: manipulating ARKit Face Tracking data live in the Editor]
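
If you want to tap into that stream yourself, a component along the lines of the sketch below will do it. It subscribes to the plugin’s face anchor update event and reads one blendshape coefficient; the event and property names follow the plugin’s FaceTracking examples, and the “jawOpen” key string is an assumption you should check against the plugin’s blendshape constants.

```csharp
// Minimal sketch of consuming the streamed face data in the Editor. The
// "jawOpen" key is an assumed example; check the plugin's blendshape constants.
using UnityEngine;
using UnityEngine.XR.iOS;

public class BlendshapeLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // The anchor carries the face pose, the face mesh geometry, and a
        // dictionary of blendshape coefficients in the 0..1 range.
        float jawOpen;
        if (anchor.blendShapes.TryGetValue("jawOpen", out jawOpen))
        {
            Debug.Log("jawOpen: " + jawOpen);
        }
    }
}
```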

New, Streamlined ARKit Remote Workflow!

As part of adding Face Tracking functionality to ARKit Remote, we also made it much easier to work with ARKit Remote without altering your original ARKit scene in the Unity Editor. Previously, you had to add a GameObject to your scene that connects it to the ARKit Remote. Now, we detect when you are initializing an ARKit configuration from the Editor and automatically add the RemoteConnection GameObject to your scene at runtime.
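
Conceptually, that check behaves something like the snippet below. This is only an illustration of the idea, not the plugin’s actual implementation; the GameObject name, the prefab lookup, and its location under a Resources folder are all hypothetical stand-ins.

```csharp
// Conceptual illustration only; not the plugin's actual code. When running from
// the Editor, spawn the remote connection object automatically instead of
// requiring it to be placed in every scene. Names here are hypothetical.
#if UNITY_EDITOR
using UnityEngine;

public static class RemoteConnectionBootstrap
{
    public static void EnsureRemoteConnection()
    {
        if (!Application.isEditor)
            return;

        // Skip if a connection object is already present in the scene.
        if (GameObject.Find("ARKitRemoteConnection") != null)
            return;

        // Hypothetical prefab lookup; the real plugin wires this up internally.
        var prefab = Resources.Load<GameObject>("ARKitRemoteConnection");
        if (prefab != null)
        {
            Object.Instantiate(prefab);
        }
    }
}
#endif
```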

Have fun playing around with ARKit Face Tracking in the Unity Editor!
