ARKit Face Tracking on iPhone X

November 3, 2017 in Technology | 6 min. read

Since its announcement at WWDC in June, ARKit has established itself as a reliable way to deliver stable, consumer-level AR. More recently, during the iPhone X announcement, Apple revealed that ARKit will include face tracking features, available only on the iPhone X, that use the front camera array, which includes a depth camera.

Unity worked closely with Apple from the beginning to deliver the Unity ARKit plugin alongside the ARKit announcement, so that Unity developers could start using its features as soon as they were available. Since then, we have continued to work closely with Apple to deliver the face tracking features of ARKit as part of the Unity ARKit plugin.

New code and examples for these features are integrated into the Unity ARKit plugin, which you can get from Bitbucket or from the Asset Store.

API additions in the plugin

There is now a new configuration called ARKitFaceTrackingConfiguration, which can be used when running on an iPhone X. There are new RunWithConfig and RunWithConfigAndOptions methods that take an ARKitFaceTrackingConfiguration to start the AR session.

public void RunWithConfig(ARKitFaceTrackingConfiguration config)

public void RunWithConfigAndOptions(ARKitFaceTrackingConfiguration config, UnityARSessionRunOption runOptions)
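
For example, a face tracking session might be started like the sketch below. This is a minimal sketch rather than the plugin's exact example code: the UnityEngine.XR.iOS namespace, the alignment and enableLightEstimation fields, and the IsSupported check are assumptions modeled on the plugin's other configurations.

// Minimal sketch of starting a face tracking session.
// Namespace, config fields, and the IsSupported check are assumptions
// modeled on the plugin's other ARKit configurations.
using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceTrackingStarter : MonoBehaviour
{
    void Start()
    {
        ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity; // assumed field
        config.enableLightEstimation = true;                         // assumed field

        if (config.IsSupported) // assumed check; face tracking requires the iPhone X front camera
        {
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }
}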

There are also event callbacks for when an ARFaceAnchor is added, updated, or removed:

public delegate void ARFaceAnchorAdded(ARFaceAnchor anchorData);
public static event ARFaceAnchorAdded ARFaceAnchorAddedEvent;

public delegate void ARFaceAnchorUpdated(ARFaceAnchor anchorData);
public static event ARFaceAnchorUpdated ARFaceAnchorUpdatedEvent;

public delegate void ARFaceAnchorRemoved(ARFaceAnchor anchorData);
public static event ARFaceAnchorRemoved ARFaceAnchorRemovedEvent;
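
A MonoBehaviour can subscribe to these events as in the sketch below. It assumes the events live on UnityARSessionNativeInterface, as the plugin's other session events do.

// Minimal sketch of subscribing to the face anchor events.
// Assumes the events are exposed on UnityARSessionNativeInterface.
using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceAnchorListener : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent -= FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData) { /* react to a new face */ }
    void FaceUpdated(ARFaceAnchor anchorData) { /* react to tracking updates */ }
    void FaceRemoved(ARFaceAnchor anchorData) { /* face is no longer tracked */ }
}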

Features Exposed by Face Tracking

There are four main features exposed by face tracking in ARKit. Each is described below, along with the example scene in the plugin that uses it.

Face Anchor

The basic feature of face tracking is to provide a face anchor when ARKit detects a face with the front camera on the iPhone X. This face anchor is similar to the plane anchor that ARKit usually returns, but it tracks the position and orientation of the center of the head as you move it around. This lets you use the movement of the face as input for your ARKit app, and it also lets you attach objects to the face or head via the anchor so that they move along with your head.

We have made an example scene called FaceAnchorScene to demonstrate this. It has a GameObject with the UnityARFaceAnchorManager component, which initializes ARKit with ARKitFaceTrackingConfiguration. It also hooks into the FaceAnchor added, updated, and removed events so that it can do the following:

  1. On face anchor creation, it enables a referenced GameObject (in this case, a model of three axes) and moves it to the position and orientation returned by the FaceAnchor (see the sketch below).
  2. On face anchor update, it updates the position and orientation of the GameObject.
  3. On face anchor removal, it disables the GameObject.

This scene also uses the ARCameraTracker component, which updates the main Unity camera via the FrameUpdatedEvent that a regular ARKit app uses.
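
A minimal sketch of those three handlers, plugged into the event subscriptions shown earlier, might look like this. The anchoredObject field is a hypothetical reference to the axes model, and the UnityARMatrixOps helpers and anchorData.transform matrix are assumed to match the rest of the plugin.

// Sketch of the added/updated/removed handlers described above.
// anchoredObject is a hypothetical reference to the axes model.
using UnityEngine;
using UnityEngine.XR.iOS;

public class SimpleFaceAnchorFollower : MonoBehaviour
{
    public GameObject anchoredObject; // e.g. the three-axes model, disabled by default

    void FaceAdded(ARFaceAnchor anchorData)
    {
        anchoredObject.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchoredObject.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
        anchoredObject.SetActive(true);
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        anchoredObject.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchoredObject.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }

    void FaceRemoved(ARFaceAnchor anchorData)
    {
        anchoredObject.SetActive(false);
    }
}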

Face Mesh Geometry

The face tracking API can also return the geometry of the detected face as a mesh. We can use the mesh vertices to create a corresponding mesh in Unity, and then apply a transparent texture to it to allow all sorts of face painting and masks. We can also put an occlusion material on this mesh when we attach things to the face anchor, so that the attachments occlude properly against the video of the face.

The example scene called FaceMeshScene shows how to display the face mesh geometry on top of your face with a default material on it (so it appears grey). It has the usual ARCameraTracker GameObject to move the camera in the scene. In addition, it has an ARFaceMeshManager GameObject, which has a standard Mesh Renderer and an empty Mesh Filter component. This GameObject also has the UnityARFaceMeshManager component, which does the following:

  1. Configures face tracking.
  2. Updates the anchor position and rotation on the transform of this GameObject.
  3. Extracts the mesh data from the anchor each frame, populates a Unity mesh with it, and sets the MeshFilter component to reference that mesh (sketched below).
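
A rough sketch of step 3 is shown below. The faceGeometry member names (vertices, textureCoordinates, triangleIndices) are assumptions about how the anchor exposes its geometry.

// Sketch of rebuilding a Unity mesh from the face geometry on each anchor update.
// The faceGeometry member names are assumptions about the anchor's data layout.
using UnityEngine;
using UnityEngine.XR.iOS;

public class SimpleFaceMeshUpdater : MonoBehaviour
{
    MeshFilter meshFilter;
    Mesh faceMesh;

    void Start()
    {
        meshFilter = GetComponent<MeshFilter>();
        faceMesh = new Mesh();
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        faceMesh.vertices = anchorData.faceGeometry.vertices;         // assumed member
        faceMesh.uv = anchorData.faceGeometry.textureCoordinates;     // assumed member
        faceMesh.triangles = anchorData.faceGeometry.triangleIndices; // assumed member
        faceMesh.RecalculateBounds();
        faceMesh.RecalculateNormals();
        meshFilter.mesh = faceMesh;
    }
}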

Blend Shapes

Another set of data we get from face tracking is a set of coefficients that describe the expression on your face. These can be mapped onto a virtual face so that it takes on an expression similar to yours.

Our example scene FaceBlendShapeScene shows this. In the UI, it displays the coefficients of the different blend shapes returned for the current expression on your face. See how they change as you change your expression!

This scene has the same GameObjects as the FaceMeshScene, but in addition it has a BlendshapeOutput GameObject, which contains a BlendshapePrinter component. This component extracts the blend shapes from the face anchor, if one exists, and outputs them to the screen UI.
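
A simplified version of that printer might look like the sketch below, assuming the anchor exposes the coefficients as a dictionary of name/weight pairs.

// Sketch of printing blend shape coefficients to the screen.
// The blendShapes dictionary member is an assumption about how the anchor
// exposes the coefficients.
using System.Collections.Generic;
using System.Text;
using UnityEngine;
using UnityEngine.XR.iOS;

public class SimpleBlendshapePrinter : MonoBehaviour
{
    string output = "";

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        StringBuilder sb = new StringBuilder();
        foreach (KeyValuePair<string, float> blendShape in anchorData.blendShapes) // assumed member
        {
            sb.AppendFormat("{0}: {1:0.00}\n", blendShape.Key, blendShape.Value);
        }
        output = sb.ToString();
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 400, Screen.height), output);
    }
}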

We are working on a more elaborate example where these values will be mapped onto the facial animation of a virtual head, to give a better idea of how this could work in your experience.

Directional Light Estimate

Another interesting set of data that you get with face tracking is a directional light estimate, generated by using your face as a light probe in the scene. The estimate contains three things:

  1. Primary light direction
  2. Primary light intensity
  3. Spherical harmonics coefficients of the estimated lighting environment in all directions

The last of these is very interesting for us in Unity, as it is the solution used for dynamic global illumination in our standard rendering pipeline. Knowing this information, we can take advantage of it in our example scene.

The FaceDirectionalLightEstimate scene has an ARCameraTracker and an ARFaceAnchorManager, which moves a standard grey sphere mesh around with your face. What’s new is the ARKitLightManager GameObject, which has the UnityARKitLightManager component that gets the spherical harmonics coefficients from the FrameUpdatedEvent and plugs them into all of the Unity light probes in the scene, including the ambient light probe (which is used when none of the light probes in the scene affect a mesh). This effectively lights the meshes in the scene dynamically with the estimated environment lighting.

Alternatively, if you wish to use your own mechanism to light the scene, you can get the raw spherical harmonics coefficients in Unity’s coordinate system via the FrameUpdatedEvent and plug them into your own lighting formulas. You may also want to light just with the primary light direction and intensity, which are available in the same manner.
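
As a rough sketch of that last point, the block below packs a flat array of 27 spherical harmonics coefficients (9 coefficients for each of 3 color channels) into Unity’s SphericalHarmonicsL2 and uses it as the ambient probe. How the 27 floats are delivered with the frame update, and their channel-major ordering, are assumptions here; ApplyCoefficients is a hypothetical entry point.

// Sketch of feeding estimated spherical harmonics into Unity's ambient probe.
// The caller is assumed to pass the 27 floats from the light estimate;
// the grouping of nine coefficients per color channel is also an assumption.
using UnityEngine;
using UnityEngine.Rendering;

public class AmbientProbeFromFaceLightEstimate : MonoBehaviour
{
    public void ApplyCoefficients(float[] shc) // hypothetical entry point; expects 27 floats
    {
        SphericalHarmonicsL2 probe = new SphericalHarmonicsL2();
        for (int channel = 0; channel < 3; channel++)
        {
            for (int coefficient = 0; coefficient < 9; coefficient++)
            {
                probe[channel, coefficient] = shc[channel * 9 + coefficient];
            }
        }

        RenderSettings.ambientMode = AmbientMode.Custom; // use the custom ambient probe
        RenderSettings.ambientProbe = probe;
    }
}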

Use Face Tracking in your Apps

As you can see, there are some nice ARKit features available with face tracking on the iPhone X. Unity’s ARKit plugin can help you to easily implement these features within your apps. As usual, show us your creations on @jimmy_jam_jam, and ask any questions on the forums.
