
When developing any application, it is essential to iterate as quickly as possible, and having to build to the device to test functionality is frustrating and can dramatically increase development time and cost.

At Unity, we strive to make your job as a developer easier and more efficient, and since the release of Apple’s ARKit in mid-2017, we have been working hard to streamline AR development for ARKit with our ARKit plugin and the ARKit Remote. ARKit Remote allows developers to iterate on ARKit experiences right inside the Unity Editor, without building to the device each time. Today we are happy to announce that you can now access ARKit Remote functionality for Face Tracking on iPhone X by downloading or updating the ARKit plugin for Unity.

Build ARKit Remote

To use ARKit Remote for Face Tracking, you will first need to build the ARKit Remote scene as an app to your iPhone X. You will need an iPhone X because it is currently the only device with the front-facing TrueDepth camera, which is required for Face Tracking. Follow these steps to build the app to the device:

1. Get the latest Unity ARKit Plugin project from Bitbucket or Asset Store and load it up in the Unity Editor.

2. Open the “Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote” scene.

3. Select the “Assets/UnityARKitPlugin/Resources/UnityARKitPlugin/ARKitSettings” file and activate the “ARKit Uses Facetracking” check box.

4. Select PlayerSettings (in the menu: Edit/Project Settings/Player) and make sure you have some text in the entry “Camera Usage Description.”

5. Select BuildSettings (in menu File/Build Settings…) and check the Development Build checkbox.

6. Now build this scene to your iPhone X as you would normally build an app via Xcode.
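If you prefer to script the build rather than click through the Editor UI, the steps above can be captured in a small editor build script. This is an illustrative sketch, not part of the plugin: the menu entry, output folder, and class name are assumptions.

```csharp
// Editor/BuildARKitRemote.cs -- illustrative sketch; menu name and output path are assumptions.
using UnityEditor;

public static class BuildARKitRemote
{
    [MenuItem("Build/Build ARKit Remote (iOS)")] // hypothetical menu entry
    public static void Build()
    {
        // The ARKit Remote scene that ships with the plugin.
        var scenes = new[] { "Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote.unity" };

        // BuildOptions.Development matches the "Development Build" checkbox,
        // which is required for the Editor to connect to the app.
        BuildPipeline.BuildPlayer(
            scenes,
            "Builds/ARKitRemote",   // Xcode project output folder (assumed)
            BuildTarget.iOS,
            BuildOptions.Development);
    }
}
```

Remember that the resulting Xcode project still needs to be built and deployed to the iPhone X from Xcode, as in step 6.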

Here’s a video of the steps needed for building the ARKit Remote.

Connect Editor to ARKit Remote

The steps in the previous section need only be done once to build ARKit Remote to your device. The following steps can be used over and over again to iterate on the ARKit Face Tracking in the editor:

1. Connect the iPhone X to your Mac development machine via USB.

2. Start up the ARKit Remote app on the device. You should see a “Waiting for connection..” screen.

3. In the Unity Editor, connect to your iPhone X by going to your Console Window and selecting the iPhone X connected via USB.

4. Load up one of the FaceTracking examples in the project e.g. “Assets/UnityARKitPlugin/Examples/FaceTracking/FaceAnchorScene” and press Play in the Editor.

5. You should see a green screen with a button on top that says “Start ARKit Face Tracking Session.” Press that button, and you should see your front camera video feed in your Editor “Game” window. If your face is in view, the device will also be sending ARKit Face Tracking data to the Editor.

Here is a video that demonstrates the connection steps:

Play with ARKit Face Tracking Data

Once you have connected your ARKit Face Tracking scene to ARKit Remote, all the Face Tracking data (face anchor, face mesh, blendshapes, directional lighting) is sent from device to Editor. You can then manipulate that data in the Editor to affect the scene immediately. Here are a couple of videos to demonstrate this:
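As a sketch of how you might consume that data, the plugin exposes face-anchor events on `UnityARSessionNativeInterface`, and these fire the same way whether the data comes from the device directly or via ARKit Remote. The blendshape-to-mesh mapping below (the blendshape index and the target renderer) is an illustrative assumption, not part of the plugin:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit Plugin namespace

public class BlendshapeDriver : MonoBehaviour
{
    // A mesh with a jaw-open blendshape at index 0 (assumed setup).
    public SkinnedMeshRenderer faceRenderer;

    void Start()
    {
        // Subscribe to face anchor updates; works identically with ARKit Remote.
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // blendShapes maps coefficient names to values in [0, 1].
        float jawOpen;
        if (anchor.blendShapes.TryGetValue(ARBlendShapeLocation.JawOpen, out jawOpen))
        {
            // SetBlendShapeWeight expects a percentage, hence the * 100.
            faceRenderer.SetBlendShapeWeight(0, jawOpen * 100f);
        }
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }
}
```

Because the events are the same in the Editor and on device, a script like this lets you tweak the mapping live while ARKit Remote streams real face data.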

New, Streamlined ARKit Remote Workflow!

As part of adding Face Tracking functionality to the ARKit Remote, we also made it much easier to work with ARKit Remote without altering your original ARKit scene in the Unity Editor. Previously, you had to add a GameObject to your scene that connected it to the ARKit Remote. Now, the plugin detects when you initialize an ARKit configuration from the Editor and automatically adds the RemoteConnection GameObject to your scene at runtime.
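Conceptually, that check works something like the following sketch. This is not the plugin's actual internal code; it only assumes the `ConnectToEditor` component that ships in the plugin's ARKitRemote folder, and the component name `ARSessionStarter` is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class ARSessionStarter : MonoBehaviour // hypothetical name
{
    void Awake()
    {
#if UNITY_EDITOR
        // In the Editor there is no native ARKit session, so instead of
        // initializing one, spin up the remote connection and let the
        // device stream ARKit data into the Editor.
        if (FindObjectOfType<ConnectToEditor>() == null)
        {
            var go = new GameObject("ARKitRemoteConnection");
            go.AddComponent<ConnectToEditor>();
        }
#endif
    }
}
```

The practical upshot is that your scene no longer needs any remote-specific objects checked into it; the connection object exists only while playing in the Editor.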

Have fun playing around with ARKit Face Tracking in the Unity Editor!

Comments are closed.

  1. Anyone know why mouse click with the newest Unity beta does not work in play mode on a MacBook Pro?

  2. How can I save the keyframes of the animations?

  3. Awesome! I know what I’m working on tonight!

  4. This looks fantastic. Two questions:
    – Does it work only live, or can you also record the facial animations?
    – In what format is the animation saved? Is it possible to apply the same recorded animation to multiple characters?
    – If so, what’s the best pipeline of tools for recording and retargeting? (directly in Unity, or better to use Maya)


    1. I’m with you, I would like to see an asset that would plug the data directly into an animation file to be used on external meshes

  5. great work – we can use this to make animations!

  6. Will this work with a PC instead of a Mac? For example, connect the iPhone X to a PC development machine via USB and have the ARKit features working?

  7. Awesome news Jimmy! I’m really looking forward to trying this. Doing an iOS build each time was a bit frustrating.
    Thanks for the update!

  8. This is fantastic news, I can gather face meshes and face textures on my server and after a few month, make money by selling those on the black market.