Create your own animated emojis with Unity!

December 3, 2017 in Technology | 4 min. read

With the release of the iPhone X, Apple popularized the use of “Animojis” for sending animated messages, and everyone was showing off their newfound karaoke animation skills. Less well known was the fact that Apple had also released a face tracking API for the iPhone X that lets you create your own animated emojis. Unity already supports ARKit Face Tracking, and in this blog we’ll show you how to use this API to create your own version of these animated messages, or even facial animations for your games or homemade videos.

As mentioned in the previous blog, ARKit face tracking returns coefficients for the expressions on your face. In the example included previously, we simply printed those coefficients on the screen. In the new example we’re describing here, we use a virtual face set up in our scene to mimic our expressions and thus create our animated emoji.

Create blendshapes

The content needs to be created with the intention of using the blendshape coefficients returned from face tracking as parameters into the animation of the virtual face. In our case, we created the head of a stylized sloth whose face was manipulated to conform to the shape of each of the different coefficients we have. Then we set up all the different shapes to be blended together in our content creation software (e.g. Maya or Blender). We named each of the blendshapes such that they could be easily identified and matched to the coefficients returned from the SDK.

In the case of Mr. Sloth, we used all 51 blendshape locations that face tracking gives us. We could have opted for fewer blendshapes that would still convey the characteristics of our virtual face. For example, we could have used a more stylized face that reacts only to shape locations like eyeLook, jawOpen, or mouthClose, and not to the more subtle ones.

For each blendshape coefficient that we use, we would create a blendshape that would convey the expression of that particular part of the face based on the reference shape given in the ARKit SDK.

For example, ARKit SDK gives us this reference image for jawOpen. For our Sloth face, we create a blendshape called jawOpen that looks like this (left image is base mesh, right image is mesh with fully open jaw):

Continue to do this for all the shapes you want to support. Next, we create the mesh with the blendshapes, following the guide for the content creation software we’re using (e.g., follow these steps for Maya). Finally, we export the whole sloth head mesh as an FBX file so that we can import it into Unity.

Set it up in Unity

In Unity, we drop the FBX file described above into an Assets folder, where it gets imported and made into a Unity Mesh that has a SkinnedMeshRenderer containing a list of the blendshapes. We then use this mesh in a scene like FaceBlendshapeSloth which is a new example scene in the Unity ARKit Plugin code.
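To verify that the import worked and that the blendshape names match the ARKit coefficient names, you can log them from the imported mesh. This is a hypothetical helper script (not part of the plugin) using standard Unity APIs:

```csharp
using UnityEngine;

// Attach to the GameObject with the imported sloth mesh to list its
// blendshapes in the Console.
[RequireComponent(typeof(SkinnedMeshRenderer))]
public class ListBlendshapes : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<SkinnedMeshRenderer>().sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
        {
            Debug.Log(i + ": " + mesh.GetBlendShapeName(i));
        }
    }
}
```

Each logged name should correspond to one of the coefficients returned by face tracking, such as jawOpen.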

We need to set a reference to the sloth mesh in the scene on the ARFaceAnchorManager GameObject. This component keeps track of your face, placing and rotating the sloth face as you move your head around.
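The anchor-driven placement follows the usual pattern in the Unity ARKit Plugin: subscribe to the face anchor events and copy the anchor’s transform onto the virtual head. A minimal sketch, assuming the plugin’s UnityARSessionNativeInterface and UnityARMatrixOps types:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceAnchorFollower : MonoBehaviour
{
    public GameObject slothHead; // assign the sloth mesh in the Inspector

    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Place and rotate the virtual head to match the tracked face.
        slothHead.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        slothHead.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }
}
```

The ARFaceAnchorManager in the example scene does this for you; the sketch just shows where the position and rotation come from.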

Then we put the script BlendshapeDriver.cs on the GameObject that has the SkinnedMeshRenderer component. This script takes the blendshape coefficients from face tracking and plugs each value (multiplied by 100 to convert ARKit fractions to Unity percentages) into the blendshape with the same name on the SkinnedMeshRenderer’s list of blendshapes.
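In simplified form, the driving logic looks like this. This is a sketch of the idea rather than the plugin’s exact BlendshapeDriver.cs source, assuming the plugin’s ARFaceAnchor exposes its coefficients as a string-to-float dictionary:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class BlendshapeDriverSketch : MonoBehaviour
{
    SkinnedMeshRenderer skinnedMesh;

    void Start()
    {
        skinnedMesh = GetComponent<SkinnedMeshRenderer>();
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        foreach (KeyValuePair<string, float> coefficient in anchorData.blendShapes)
        {
            // Match the ARKit coefficient name to a blendshape on our mesh.
            int index = skinnedMesh.sharedMesh.GetBlendShapeIndex(coefficient.Key);
            if (index >= 0) // skip coefficients our mesh doesn't support
            {
                // ARKit reports 0..1; Unity blendshape weights are 0..100.
                skinnedMesh.SetBlendShapeWeight(index, coefficient.Value * 100f);
            }
        }
    }
}
```

Note that looking up each blendshape by name is also what makes the naming convention from the content creation step matter: a mismatched name simply never gets driven.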

Now if you build out this scene to your iPhone X, you should be able to see Mr. Sloth’s head move with your head and have the same expression on its face as you do. You can use iOS video recording to send an animated Slothoji to your friends, or use the Sloth face to talk trash (slowly) at your rivals in your Unity game.

Have fun!

As you can see, setting up a virtual character whose facial animation is controlled by your face is pretty easy on iPhone X using Unity. You can have a lot of fun recording animated messages and movies for your friends and loved ones. Please tweet us your creations and slowjam karaokes to @jimmy_jam_jam, and send us any questions or suggestions on the forums.
