
With the release of the iPhone X, Apple popularized “Animojis” for sending animated messages, and everyone was showing off their newfound karaoke animation skills. Less well known is that Apple also released a face tracking API for the iPhone X that lets you create your own animated emojis. Unity already supports ARKit face tracking, and in this blog we’ll show you how to use this API to create your own version of these animated messages, or even facial animations within your games or homemade videos.

As mentioned in the previous blog, ARKit face tracking returns coefficients for the expressions on your face. In the example included there, we printed the coefficients on the screen. In the new example described here, we use a virtual face set up in our scene to mimic our expressions and thus create our animated emoji.

Create blendshapes

The content needs to be created with the blendshape coefficients returned from face tracking in mind, since they serve as parameters for animating the virtual face. In our case, we created the head of a stylized sloth and manipulated its face to conform to the shape of each of the different coefficients. Then we set up all the different shapes to be blended together in our content creation software (e.g. Maya or Blender). We named each blendshape so that it could be easily identified and matched to the corresponding coefficient returned from the SDK.

In the case of Mr. Sloth, we used all 51 blendshape locations that face tracking gives us. We could have opted for fewer blendshapes that still convey the character of our virtual face. For example, a more stylized face might only react to shape locations like eyeLook, jawOpen, or mouthClose and ignore the more subtle ones.

For each blendshape coefficient we use, we create a blendshape that conveys the expression of that particular part of the face, based on the reference shape given in the ARKit SDK.

For example, the ARKit SDK gives us this reference image for jawOpen. For our sloth face, we create a blendshape called jawOpen that looks like this (left image is the base mesh, right image is the mesh with a fully open jaw):

Continue to do this for all the shapes you want to support. Next, create the mesh with the blendshapes, following the guide for the content creation software you’re using (e.g. follow these steps for Maya). Finally, export the whole sloth head mesh as an FBX file so it can be imported into Unity.

Set it up in Unity

In Unity, we drop the FBX file described above into an Assets folder, where it is imported into a Unity Mesh with a SkinnedMeshRenderer containing the list of blendshapes. We then use this mesh in a scene like FaceBlendshapeSloth, a new example scene in the Unity ARKit Plugin code.
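Before wiring anything up, it’s worth verifying that the blendshape names exported from your content creation software survived the FBX import, since the matching later on is done by name. A minimal sketch (the `BlendshapeLister` class name is our own, not part of the plugin; `Mesh.blendShapeCount` and `Mesh.GetBlendShapeName` are standard Unity APIs):

```csharp
using UnityEngine;

// Attach to the imported sloth head to log the blendshapes Unity found on the mesh.
// Useful for confirming the exported names match the ARKit coefficient names.
public class BlendshapeLister : MonoBehaviour
{
    void Start()
    {
        SkinnedMeshRenderer smr = GetComponent<SkinnedMeshRenderer>();
        Mesh mesh = smr.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
        {
            Debug.Log(i + ": " + mesh.GetBlendShapeName(i));
        }
    }
}
```

Note that FBX importers often prefix blendshape names with the mesh or group name, so check the logged names against the coefficient names you expect.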

We set a reference to the sloth mesh on the ARFaceAnchorManager GameObject in the scene; this GameObject keeps track of your face, placing and rotating the sloth head as you move your head around.
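The anchoring itself can be sketched as follows. This is a simplified illustration, not the plugin’s actual ARFaceAnchorManager source; the event and helper names (`UnityARSessionNativeInterface.ARFaceAnchorAddedEvent`, `UnityARMatrixOps`) follow the Unity ARKit Plugin, but verify them against the plugin code you have:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit Plugin namespace

// Keeps this GameObject glued to the detected face anchor.
public class FaceAnchorFollower : MonoBehaviour
{
    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceChanged;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceChanged;
    }

    void FaceChanged(ARFaceAnchor anchorData)
    {
        // Convert the ARKit anchor transform into Unity world space.
        transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }
}
```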

Then we put the BlendshapeDriver.cs script on the GameObject that has the SkinnedMeshRenderer component. This script takes the blendshape coefficients from face tracking and plugs each value (multiplied by 100 to convert ARKit fractions to Unity percentages) into the blendshape of the same name in the SkinnedMeshRenderer’s list.
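The core of that script can be sketched like this. It is a simplified version, not the shipped BlendshapeDriver.cs; we assume `anchorData.blendShapes` is the plugin’s name-to-coefficient dictionary, while `Mesh.GetBlendShapeIndex` and `SkinnedMeshRenderer.SetBlendShapeWeight` are standard Unity APIs:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit Plugin namespace

// Copies each ARKit blendshape coefficient onto the mesh blendshape of the same name.
public class BlendshapeDriverSketch : MonoBehaviour
{
    SkinnedMeshRenderer smr;

    void Start()
    {
        smr = GetComponent<SkinnedMeshRenderer>();
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        foreach (KeyValuePair<string, float> coefficient in anchorData.blendShapes)
        {
            int index = smr.sharedMesh.GetBlendShapeIndex(coefficient.Key);
            if (index >= 0)
            {
                // ARKit reports 0..1; Unity blendshape weights are 0..100.
                smr.SetBlendShapeWeight(index, coefficient.Value * 100f);
            }
        }
    }
}
```

The `index >= 0` check means any coefficient without a matching blendshape on the mesh is silently skipped, which is what lets a stylized face support only a subset of the 51 shapes.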

Now if you build out this scene to your iPhone X, you should be able to see Mr. Sloth’s head move with your head and have the same expression on its face as you do. You can use iOS video recording to send an animated Slothoji to your friends, or use the Sloth face to talk trash (slowly) at your rivals in your Unity game.

Have fun!

As you can see, setting up a virtual character whose facial animation is controlled by your face is pretty easy on iPhone X using Unity. You can have a lot of fun recording animated messages and movies for your friends and loved ones. Please tweet us your creations and slowjam karaokes to @jimmy_jam_jam, and send us any questions or suggestions on the forums.


  1. Are you having a hard time getting the frown value accurately represented?

  2. Nice demo! Is there a way to get access to a sample? Thank you

  3. Where can I find a model like this? I want to test the feature but I can’t create the model…

  4. Can you test using Unity Remote, or do you have to build every time to test?
    Does ARCore support face tracking too?

  5. It’s great to have this kind of article, but we are still waiting for an official post clarifying the state of Unity’s support for macOS High Sierra. A couple of months after its release there are still a lot of problems, and in the next weeks Apple will ship their new computers with High Sierra preinstalled, yet Unity does not work properly without workarounds. The information is only in the forum; there hasn’t been any blog post about this huge problem, which is affecting a considerable part of the community.

  6. How can it be uploaded to the App Store and made to work for iPhone X users?

  7. This would be perfect as a Messages app. Is there a way to embed a Unity app in iMessage?

    1. Hey yeah, I’d be interested to know this too.