
Since its first announcement at WWDC in June, ARKit has established itself as a reliable way to deliver stable, consumer-level AR. More recently, during the iPhone X announcement, it was revealed that ARKit would include face tracking features available only on the iPhone X, using the front camera array, which includes a depth camera.

Unity has been working closely with Apple from the beginning to deliver a Unity ARKit plugin alongside the ARKit announcement, so that Unity developers could start using those features as soon as they were available. Since then, we have continued to work closely with Apple to deliver the face tracking features of ARKit as part of the Unity ARKit plugin.

New code and examples for these features are integrated into the Unity ARKit plugin, which you can get from Bitbucket or from the Asset Store.

API additions in the plugin

There is now a new configuration called ARKitFaceTrackingConfiguration which can be used when running on an iPhone X. There are new RunWithConfig and RunWithConfigAndOptions methods that take the ARKitFaceTrackingConfiguration to start the AR session.
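As a rough sketch of what starting a face tracking session might look like (the alignment and enableLightEstimation properties shown here mirror the plugin's other configurations and may differ between plugin versions):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit plugin

public class FaceTrackingStarter : MonoBehaviour
{
    void Start()
    {
        // Face tracking requires the front TrueDepth camera array (iPhone X).
        ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        if (config.IsSupported)
        {
            // Start the AR session with the face tracking configuration.
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }
}
```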

There are also event callbacks for when an ARFaceAnchor is added, removed, or updated.
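Subscribing to these events looks roughly like the sketch below; the event and delegate names follow the plugin's anchor-event pattern and may differ slightly between plugin versions.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit plugin

public class FaceAnchorEvents : MonoBehaviour
{
    void OnEnable()
    {
        // Static events raised by the plugin when ARKit reports face anchor changes.
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent -= FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData)   { /* a face was detected */ }
    void FaceUpdated(ARFaceAnchor anchorData) { /* tracking data changed */ }
    void FaceRemoved(ARFaceAnchor anchorData) { /* the face is no longer tracked */ }
}
```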

Features Exposed by Face Tracking

There are four main features exposed by face tracking in ARKit. They are described below, along with the example scenes that use them.

Face Anchor

The basic feature of face tracking is to provide a face anchor when ARKit detects a face with the front camera on the iPhone X. This face anchor is similar to the plane anchor that ARKit usually returns, but it tracks the position and orientation of the center of the head as you move it around. This lets you use the movement of the face as input for your ARKit app, and it also lets you attach objects to the face or head so that they move along with it.

We have made an example scene to demonstrate the use of this called FaceAnchorScene. It has a GameObject with the component UnityARFaceAnchorManager that initializes ARKit with ARKitFaceTrackingConfiguration. It also hooks into the FaceAnchor added, updated, and removed events so that it can do the following (see the sketch below):

  1. On face anchor creation, it enables a GameObject that is referenced, in this case a model of three axes, and moves it to the position and orientation that is returned by the FaceAnchor.
  2. On face anchor update, it updates the position and orientation of the GameObject.
  3. On face anchor removal, it disables the GameObject.

This scene also uses an ARCameraTracker component that updates the main Unity camera via the FrameUpdatedEvent, just as a regular ARKit app does.
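Here is a condensed sketch of that anchor handling, under the assumption that ARFaceAnchor exposes its pose as a matrix that the plugin's UnityARMatrixOps helper can convert (field and helper names may differ between plugin versions):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class SimpleFaceAnchorDriver : MonoBehaviour
{
    // The object to attach to the tracked face, e.g. the three-axes model from the example scene.
    public GameObject anchoredObject;

    void Start()
    {
        anchoredObject.SetActive(false);
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData)
    {
        anchoredObject.SetActive(true);   // 1. enable the referenced GameObject...
        UpdatePose(anchorData);           //    ...and move it to the anchor's pose
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        UpdatePose(anchorData);           // 2. keep following the face
    }

    void FaceRemoved(ARFaceAnchor anchorData)
    {
        anchoredObject.SetActive(false);  // 3. hide it when the face is lost
    }

    void UpdatePose(ARFaceAnchor anchorData)
    {
        // Convert the anchor's native transform into Unity's coordinate system.
        anchoredObject.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchoredObject.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }
}
```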

Face Mesh Geometry

The face tracking API can also return the geometry of the face it detects as a mesh. We can then use the mesh vertices to create a corresponding mesh in Unity. Then we can use the mesh in Unity with a transparent texture to allow all sorts of face painting and masks. We can also put an occlusion material on this mesh when we attach things to the face anchor so that the attachments occlude properly against the video of the face.

The example scene called FaceMeshScene shows how to display the face mesh geometry on top of your face with a default material on it (so it appears grey). It has the usual ARCameraTracker GameObject to move the camera in the scene. In addition, it has an ARFaceMeshManager GameObject, which has a standard Mesh Renderer and an empty Mesh Filter component. This GameObject also has a UnityARFaceMeshManager component, which does the following:

  1. Configuration of the face tracking
  2. Anchor position and rotation updates on the transform of this GameObject
  3. Extraction of the mesh data from the anchor each frame, populating a mesh with it and setting the MeshFilter component to reference that mesh (see the sketch after this list)
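A sketch of step 3, assuming the plugin exposes the ARKit geometry on the anchor as faceGeometry with vertices, textureCoordinates, and triangleIndices arrays (names may differ between plugin versions):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

[RequireComponent(typeof(MeshFilter))]
public class FaceMeshUpdater : MonoBehaviour
{
    Mesh faceMesh;

    void Start()
    {
        faceMesh = new Mesh();
        GetComponent<MeshFilter>().mesh = faceMesh;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Copy the ARKit face geometry into the Unity mesh whenever the anchor updates.
        faceMesh.vertices = anchorData.faceGeometry.vertices;
        faceMesh.uv = anchorData.faceGeometry.textureCoordinates;
        faceMesh.triangles = anchorData.faceGeometry.triangleIndices;
        faceMesh.RecalculateBounds();
        faceMesh.RecalculateNormals();

        // Keep this GameObject's transform in sync with the anchor.
        transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }
}
```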

Blend Shapes

Another set of data we get from face tracking is a set of coefficients that describe the expression on your face. These can be mapped onto a virtual face to give it an expression similar to yours.

Our example scene FaceBlendShapeScene shows this. In the UI, it shows the values of the different blend shape coefficients returned for the current expression on your face. See how they change when you change your expression!

This scene has the same GameObjects as the FaceMeshScene, but in addition it has a BlendshapeOutput GameObject, which contains a BlendshapePrinter component. This component extracts the blend shapes from the face anchor, if one exists, and outputs them to the screen UI.
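A sketch of how those coefficients might be read, assuming the plugin exposes them on the anchor as a blendShapes dictionary of names to values (treat the field name as an assumption if your plugin version differs):

```csharp
using System.Text;
using UnityEngine;
using UnityEngine.XR.iOS;

public class BlendShapeLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // blendShapes maps ARKit blend shape names (e.g. "jawOpen") to coefficients in 0..1.
        var sb = new StringBuilder();
        foreach (var shape in anchorData.blendShapes)
        {
            sb.AppendLine(shape.Key + ": " + shape.Value.ToString("F2"));
        }
        Debug.Log(sb.ToString());
    }
}
```

From there, driving a rigged character is mostly a matter of mapping each coefficient onto the matching blend shape weight on a SkinnedMeshRenderer.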

We are working on a more elaborate example where these values will be mapped onto the facial animation of a virtual head, to give a better idea of how this could work in your experience.

Directional Light Estimate

Another interesting set of data that you get with face tracking is a directional light estimate of the scene, based on using your face as a light probe in the scene. The estimate that is generated contains three things:

  1. Primary light direction
  2. Primary light intensity
  3. Spherical harmonics coefficients of the estimated lighting environment in all directions

The last of these is very interesting for us in Unity, as it is the solution used for dynamic global illumination in our standard rendering pipeline. Knowing this information, we can take advantage of it in our example scene.

The FaceDirectionalLightEstimate scene has an ARCameraTracker and an ARFaceAnchorManager, which moves a standard grey sphere mesh around with your face. What’s new is the ARKitLightManager GameObject, which has the UnityARKitLightManager component on it. This component gets the spherical harmonics coefficients from the FrameUpdated event and plugs them into all of the Unity light probes in the scene, including the ambient light probe (which is used when none of the light probes in the scene affect the mesh). This effectively lights the meshes in the scene dynamically with the estimated environment lighting.

Alternatively, if you wish to use your own mechanism to light the scene, you can get the raw spherical harmonics coefficients in Unity’s coordinate system via the FrameUpdatedEvent and plug them into your own lighting formulas. You may also just want to light with the primary light direction and intensity, which are available in the same manner.
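Here is a minimal sketch of feeding the estimate into Unity's ambient probe yourself. The exact path to the coefficients on the frame's light data, and the ordering of the 27 floats, are assumptions based on one version of the plugin, so check UnityARKitLightManager in your copy before relying on them:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.iOS;

public class AmbientProbeFromFace : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += FrameUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= FrameUpdated;
    }

    void FrameUpdated(UnityARCamera camera)
    {
        // ASSUMPTION: the 27 floats (9 spherical harmonics coefficients per RGB channel)
        // are reachable through the camera's light data like this, in channel-major order;
        // the field path and ordering may differ in your plugin version.
        float[] shc = camera.lightData.arDirectonalLightEstimate.sphericalHarmonicsCoefficients;
        if (shc == null || shc.Length < 27)
            return;

        var probe = new SphericalHarmonicsL2();
        for (int channel = 0; channel < 3; channel++)
        {
            for (int coefficient = 0; coefficient < 9; coefficient++)
            {
                probe[channel, coefficient] = shc[channel * 9 + coefficient];
            }
        }

        // Use the estimate as the ambient probe so meshes not covered by baked
        // light probes pick up the environment lighting.
        RenderSettings.ambientMode = AmbientMode.Custom;
        RenderSettings.ambientProbe = probe;
    }
}
```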

Use Face Tracking in your Apps

As you can see, there are some nice ARKit features available with face tracking on the iPhone X. Unity’s ARKit plugin can help you easily implement these features within your apps. As usual, show us your creations via @jimmy_jam_jam, and ask any questions on the forums.

Comments are closed.

  1. Hi!
    Thanks for adding this – we are actively using this at work.

    I’ve also noticed that HitTest results no longer work on iPhone X (for example, in UnityARKitScene). Do you have any insight regarding this issue?

  2. Is it possible to use face tracking (front camera) with AR view (back camera) at the same time using Unity ARKit plugin?

  3. Hi,

    How could I get the 3D positions of the eyes and mouth, so I can replace them with my customized model?

  4. Thanks for the wonderful post, Jimmy. It works great, except when I tried to run it through the ARKit Remote it did not work. Any tips on how to fix it?

    1. I am building an Xcode project with FaceMeshScene only and trying to debug it on my device, but when running on the device the app crashes and the log gives the message “Terminated due to signal 5”.
      Help me solve this, because the same issue also occurs for FaceAnchorScene, so I am not able to test even a single scene.

      1. Jimmy Alamparambil

        December 19, 2017 at 8:16 pm

        If you’re getting a crash, read the message from the crash – it’s most likely that you have not enabled face tracking (this is controlled by a new settings file): https://forum.unity.com/threads/submitting-arkit-apps-to-appstore-without-face-tracking.504572/#post-3297235

  5. Amazing work, Jimmy. Put me down as someone who can’t wait to see your facial blend shape sample. I’m currently using our avatars and the Oculus LipSync SDK, but I’m limited as I don’t have blendshapes for our avatars yet (nor do I have an iPhone X, but that’s a different concern). So, this is the best I can do right now: https://www.youtube.com/watch?v=PYfMFZ8MKZU

    1. If you’re curious, that is a synthesized version of my real voice, generated via https://lyrebird.ai and of course, my itsme avatar: https://vimeo.com/181236162

  6. Hey,
    simple question: where is the face in the project, and how do we create one?
    Is there a tutorial, or a specific number of blend shapes we have to make?

    thx
    best

  7. Thank you. I requested this feature a month ago and did not hear back from you. I am grateful that you have added this feature. I was literally about to start coding this myself.

  8. Jimmy Alamparambil

    November 3, 2017 at 7:10 pm

    Glad you enjoyed it, folks! Excited to see what y’all do with it…

  9. Stefan Hollekamp

    November 3, 2017 at 5:31 pm

    Great post!
    Because you support Mac & iOS so well… I’m a Pro Dev :-)
    This platform always gives me back the money I’ve spent, plus much more…
    I earn much more money with my few Apple games & tools than with all my Android apps.
    Android & Windows users don’t like to buy :-(
    thx
    S.

  10. Another great one, Jimmy! This is gonna be so much fun to play with!

  11. Just wish I didn’t have to pay yearly on top of buying a Mac just to develop Apple apps; it kind of takes the allure away somewhat.