
I am Krasimir Nechevski, and I am the animation director in Unity’s demo team. Even though I have a degree in software engineering, I have been working as a character animator for more than 15 years. Before joining Unity’s demo team a year ago, I was a lead animator at Crytek for 7 years.

Working on the short film “Adam” was both a privilege and an exciting opportunity to take part in a production involving a great original idea, cutting edge technologies, and an awesome team. I knew it would not be easy, because at times we had to sail in uncharted waters, but nevertheless the team carried out the task with confidence and enthusiasm.

You can read more about my teammates’ experiences during the production of the Adam demo in their blog posts.

In this blogpost, I will cover all areas of animation-related work. Most of it I did myself, and in some areas we were assisted by contractors whom I managed. These are the things I was directly responsible for: the previs, Adam’s rig, most of Adam’s animations, environment animations, camera animations and management of the mocap sessions. I also worked closely with our junior coder Dominykas Kiauleikis on designing and developing the crowd system. Last but not least, I did the assembly of the movie inside Unity’s in-development sequencing tool.


The plan was to keep everything flexible at all times. We wanted to be able to switch assets and iterate where necessary. As a result, the project was in a semi-previs stage for a lot of the time. We started by building a placeholder model for Adam.

The first version of Adam

The aim was to get the basic proportions right and also to iterate on the mechanics of the shoulder rig, because even though Adam’s shoulder looked humanoid, it was not a ball joint with many degrees of freedom but a combination of single-axis rotational joints.

An early version of Adam’s shoulder rig test

Another thing that was very important to us was to be able to work with the actor as early as possible. Our actor, Ovanes Torosian, had experience in both film and theatre, but hadn’t done motion capture before. We decided to use the Perception Neuron system, a very affordable markerless mocap solution, in the early previs stage. By doing so, we were able to give the actor some time to adjust to the technology while we tested different approaches and iterated on Adam’s pose, body language, and performance. We were also able to try out camera movements with a handheld DSLR rig.


Rehearsing Adam’s performance with a markerless solution (Neuron)

We started making the previs in Motionbuilder with tests of one of our signature scenes, where we tried a couple of camera cuts, ranging from a single seamless camera to fast-paced cuts.


Still from the first version of the previs assembled in Motionbuilder

Soon after we decided to move the previs into Unity. There, Vess, the Director, was able to add lights and materials, and to bring the previs closer to final quality while taking full advantage of working with a real-time engine. This iterative approach, and the very early assembly of the whole movie in Unity, proved to be very efficient in giving us a fast turnaround between different shot tests.


An early version of the previs in Unity

Adam – Rig

Vess wanted to achieve a realistic, documentary look for the film, and we needed Adam’s mechanics to look convincing. I did not want to be forced to cover or fake anything, such as bending hard surfaces or hiding them from the camera. Moreover, when we started, a lot of the functionality was not clear, so throughout the initial stage of production Adam had only his basic skeleton and some parts of his arms functioning. This structure and these proportions remained until the end, but many more parts were added later, while some were altered to fit the required range of motion. The changes needed to achieve the desired freedom of motion influenced the character’s concept, and there was a lot of back and forth between me, our production designer, Georgi, and our modeller, Paco, until we reached the final result.

Different stages of Adam's model

The rig which was exported to Unity was built in 3D Studio Max, using Character Studio for the main skeleton. On top of that, I added procedurally animated bones, e.g. the scapulae, clavicles, upper arm, and forearm. For the rubber parts around the belly and neck, I decided not to use blend shapes and instead baked simulated deformations onto a set of helper joints.


Adam’s rig

Another part of the rig I put a lot of attention into was the eyes. For them, I made a functioning rig resembling the aperture of a photo camera, with separate rotating blades for the iris. The eyelids were segmented, and each piece was individually pushed by the cornea.

The rig of the eyes

The rig was then imported into Motionbuilder. There, the Character Studio skeleton was characterized as a Human IK Biped. All of the motion-captured data was retargeted, cleaned, and altered inside of Motionbuilder. Most of the hand-keyed animation work was also done inside Motionbuilder, the exception being the eyes.

Adam’s rig inside Motionbuilder

After a shot was ready, it was exported back to 3DS Max, where all of the procedural animations were added, along with the hand-keyed animation for the eyes. The result was then baked and exported as an FBX to Unity, where it was imported as a generic rig.

Adam – Mocap and Animation

During the production, a lot of shots proved to be challenging – especially in the first part where Adam was falling from a stretcher and crawling on his knees trying to remove his mask. The challenge lay mostly in finding the perfect sync between the character and the camera’s movements. During our previs mocap sessions, the actor managed to iterate on the performance and also had the chance to understand the role in depth. At the same time, the director used the opportunity to give feedback about the performance and really fine-tune the movements, as well as iterate with the cameras to make sure all the elements fitted together.

Eventually, when we got to shoot the final performance in a proper mocap volume, we knew exactly what we wanted, so we were able to shoot all the initial takes we had in mind in a single day. The facility at studio Cinemotion in Sofia, Bulgaria, provided everything we needed in order to simultaneously capture Adam and the camera by using a virtual camera setup. Moreover, since the price levels were very reasonable, we could comfortably apply our iterative production approach, and return for reshoots, additional shot exploration and sudden last-minute creative ideas as necessary.

Final mocap session at studio Cinemotion, Sofia

After we shot the final performance, I used data from different takes and managed to stitch together the parts of the actor’s performance and camera capture that Vess liked the most. This wouldn’t have been possible without spending that extra time in the volume trying to nail all the parts.

Early version of one of the interior shots

After I retargeted and cleaned the data, it was time to add the finishing touches. I hand-keyed the gaps where it was not possible to capture the proper motion or pose, such as Adam hanging on the stretcher machine, and added proper hand movements. After that, I hand-keyed the fingers inside Motionbuilder and exported the shots to Max.

Eye performance study

During our mocap sessions, I separately captured some footage of our actor’s eyes with a head-mounted camera. Finally, in Max, and by using the footage as a reference, I added the eye movements. Adam was ready for export!

Proof of concept of the eyes in motion

Camera – Mocap and Animation

To achieve the desired documentary look, we needed the recognizable motion of a handheld camera. For that, we used a virtual camera setup operated by our cameraman, Mihail Moskov. Even though we captured the camera in the mocap volume, we knew that we would continue to iterate on the edit and that some shots might need to be added later in the process. To keep our options open for as long as possible, we also captured some cameras with generic movement, i.e. different pans and rotations. We then used Motionbuilder or Unity’s sequencer to blend those together to create a new camera which we hadn’t anticipated needing when we performed the mocap. This gave our director, Vess, a lot of flexibility – he wasn’t bound to the captured material.


Mihail and Ovanes in the mocap volume

Sebastian and Lu

Sebastian and Lu are the two strangers that appear in the second part of the movie. Capturing their performance was yet another challenge, because they walked on retractable stilts. Stanimir ‘Stunny’ Stamatov, a stuntman who had prior experience with stilts, did the performance for both characters.


Sebastian and Lu performed by Stanimir ‘Stunny’ Stamatov, stuntman — Studio Cinemotion, Sofia

By the time we shot the mocap session for these characters, Vess had several different ideas about their performance and hadn’t made a decision. We needed to adjust our method so he would be able to experiment with building and playing out various versions of their actions. So we used more of a game-like approach – we captured different animation loops and transitions which we could later assemble. We did a number of versions of some of the animations so we could pick the one we liked best at a later date. The animations utilized a set of loops – walk, idle, stop, descend, etc. – and some transitions. It proved flexible, but it did have drawbacks, as there were issues with the flow between some of the animations.
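As a rough illustration of that assembly approach, here is a minimal, language-agnostic Python sketch. The clip names and the transition table are hypothetical, not the actual project data – the point is only that loops are chained together via dedicated transition clips:

```python
# Hypothetical transition clips: one is needed whenever two adjacent
# loops don't share a common pose.
TRANSITION_CLIPS = {
    ("walk", "idle"): "walk_to_idle",
    ("idle", "walk"): "idle_to_walk",
    ("walk", "descend"): "walk_to_descend",
}

def assemble(loops):
    """Expand a list of loops into a playable clip sequence,
    inserting the required transition clip between each pair."""
    clips = [loops[0]]
    for prev, nxt in zip(loops, loops[1:]):
        if (prev, nxt) in TRANSITION_CLIPS:
            clips.append(TRANSITION_CLIPS[(prev, nxt)])
        clips.append(nxt)
    return clips

print(assemble(["walk", "idle", "walk", "descend"]))
# → ['walk', 'walk_to_idle', 'idle', 'idle_to_walk', 'walk',
#    'walk_to_descend', 'descend']
```

Because the loops are interchangeable building blocks, different versions of a performance can be assembled and compared without new capture sessions.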



Sebastian walk cycle

For the rigging and animation work on Sebastian and Lu, we contracted Bläck Studios and Pixelgrinder, two companies with whom we already had previous experience from our demo The Blacksmith (2015). They did the rigs in Maya and Motionbuilder with the help of HIK and a custom extension of the rig for the stilts. Adam’s functionality was not used for Sebastian and Lu because they had garments covering most of their bodies.

The cloth and rope simulations were done by our VFX artist Zdravko Pavlov with the help of CaronteFX – a Unity plugin developed by Next Limit Technologies and available on the Asset Store. We’ll soon publish a blogpost focused on Zdravko’s work on the production, where you can learn more about his work with CaronteFX and the other effects in the movie.


Sebastian and Lu rigs in Motionbuilder

The Guards

The guard rig was done in a manner very similar to Adam’s rig. It had a base Character Studio skeleton and additional procedural parts on top of that. The additional rigs took care of armor sliding and bending. The cloth below the waist and the pouches were done by Zdravko with CaronteFX.

Guard’s rig

For the guards’ motion capture session, we captured both Ovanes and Stanimir. The scene where one of the guards starts shooting at the crowd was a bit more complex than the rest, so I captured the performance in segments which I later stitched together. For the retargeting and cleaning work on the guards, we used Cinemotion’s animation services.

Ovanes, Stanimir and Mihail in Cinemotion mocap facility

The Crowd

The main challenge with making a crowd is the sheer amount of content that needs to be produced in order to achieve enough variety. On top of that, these mechanical, stumbling, confused characters needed to look sentient and as conscious as possible. Our approach was to make crowd variants which had unique behavior (i.e. sad, energetic, curious). For every variant of the crowd we had a very simple state machine which would drive each agent.
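The per-variant state machine idea can be sketched like this – a hypothetical Python illustration, where the production used a Unity Animator Controller and the state names here are invented:

```python
import random

# Invented states and transitions for one crowd variant (e.g. "curious").
# Each agent walks this tiny state machine to pick its next animation loop.
TRANSITIONS = {
    "idle":        ["walk", "look_around"],
    "walk":        ["idle", "stumble"],
    "look_around": ["idle"],
    "stumble":     ["walk"],
}

def next_state(current, rng):
    """Pick the agent's next animation state from the allowed transitions."""
    return rng.choice(TRANSITIONS[current])

# Each agent gets its own seeded RNG, so its behavior is reproducible.
rng = random.Random(42)
state = "idle"
history = [state]
for _ in range(5):
    state = next_state(state, rng)
    history.append(state)
print(history)
```

Giving each variant its own transition table is what makes one crowd read as sad, another as energetic, while the playback machinery stays identical.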


Animation Controller for the crowd

Ideally, we were aiming to have around 8 variants, all of them with the full set of the required animations. However, as the movie evolved and the edit gradually solidified, it became clear that each crowd variant would need to have around 90 seconds of animation. This proved to be more than we could handle and we ended up with only 4.


Crowd model and bones in Motionbuilder

We built three LOD versions of Adam, with 41, 28, and 13 bones, respectively. We ended up using only the two higher LODs, since the GPU skinning tech developed by our tech lead Torbjorn Laedre proved able to handle the character count we needed. The mocap performance was once again done by Ovanes and Stanimir, and after that the retargeting and cleaning were carried out with the help of Cinemotion’s animators.
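Distance-based LOD selection of this kind can be sketched as follows. The bone counts come from the production; the distance thresholds are made up for illustration:

```python
# (max_distance, bone_count) pairs, nearest LOD first.
# Bone counts are from the article; thresholds are invented.
LODS = [(10.0, 41), (30.0, 28), (float("inf"), 13)]

def pick_lod(distance):
    """Return the bone count to use for an agent at the given camera distance."""
    for max_dist, bones in LODS:
        if distance <= max_dist:
            return bones

print(pick_lod(5.0), pick_lod(20.0), pick_lod(100.0))  # → 41 28 13
```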

The Crowd System

The crowd system was developed by our tech lead Torbjorn Laedre, our junior programmer Dominykas Kiauleikis, and me. Torbjorn wrote all of the actual playback code: keyframe interpolation, GPU skinning, crowd instancing, and material variation. Dominykas wrote all of the vector field code and tools.

First of all, we needed to identify the tasks our crowd system would have to solve:

- The crowd simulation needed to be 100% deterministic.
- The system had to work with Unity’s sequencing tool and allow fast scrubbing.
- We needed a robust way of controlling the flow of the crowd as a whole.
- We also needed a way to set custom values for individual crowd agents.
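The determinism and fast-scrubbing requirements essentially mean that an agent’s state must be computable as a pure function of time and a seed, rather than by stepping a simulation frame by frame. A hypothetical, much-simplified Python sketch of that idea:

```python
import random

def agent_pose(agent_id, t, seed=1234):
    """Pure function of (agent_id, t): the same inputs always give the same
    result, so the timeline can be scrubbed to any time without replaying
    intermediate frames. Hypothetical model: each agent walks along x."""
    # A per-agent deterministic random stream derived from the global seed.
    rng = random.Random(seed * 100003 + agent_id)
    speed = 0.8 + rng.random() * 0.6   # fixed walking speed for this agent
    phase = rng.random()               # fixed walk-cycle offset
    x = speed * t                      # position at time t
    cycle = (t + phase) % 1.0          # normalized walk-cycle time
    return x, cycle

# Scrubbing back and forth gives identical results:
a = agent_pose(7, 3.0)
agent_pose(7, 10.0)                    # jump ahead on the timeline...
print(agent_pose(7, 3.0) == a)         # ...then back: same pose
```

Evaluating state this way is what lets the crowd cooperate with a sequencer: any frame can be rendered in isolation, in any order.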

To control the crowd’s flow, we decided that the system should use a vector field implementation, which would affect the crowd agents’ orientation in space as they walked through it. Dominykas made some simple yet robust vector field authoring tools in the form of splines. Each spline could align the field to its direction, attract the field, or repel it, and each had area-of-effect and strength parameters.
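A much-simplified Python sketch of the idea, using point influencers instead of real splines – all names and parameters here are illustrative, not the shipped tools:

```python
import math

def field_at(p, influencers):
    """Sum the steering vector at 2D position p from a list of influencers,
    each with a mode ("attract"/"repel"/"align"), a radius, and a strength."""
    fx = fy = 0.0
    for inf in influencers:
        dx, dy = inf["pos"][0] - p[0], inf["pos"][1] - p[1]
        dist = math.hypot(dx, dy)
        if dist > inf["radius"] or dist == 0.0:
            continue  # outside the area of effect (or exactly on it)
        falloff = (1.0 - dist / inf["radius"]) * inf["strength"]
        if inf["mode"] == "attract":      # pull agents toward the point
            fx += dx / dist * falloff
            fy += dy / dist * falloff
        elif inf["mode"] == "repel":      # push agents away from the point
            fx -= dx / dist * falloff
            fy -= dy / dist * falloff
        elif inf["mode"] == "align":      # steer along a fixed direction
            fx += inf["dir"][0] * falloff
            fy += inf["dir"][1] * falloff
    return fx, fy

field = [
    {"mode": "attract", "pos": (10.0, 0.0), "radius": 20.0, "strength": 1.0},
    {"mode": "repel",   "pos": (0.0, 5.0),  "radius": 8.0,  "strength": 2.0},
]
print(field_at((0.0, 0.0), field))  # → (0.5, -0.75)
```

An agent then simply blends its facing direction toward the field vector at its current position each frame.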


Vector field visualization in Unity

We added a feature for initial random distribution of the agents in an area. Each agent in this resulting distribution could later be altered by picking a starting point’s position and rotation, initial state in the state machine, and a delayed trigger. We also added a feature that captures a snapshot of a crowd’s state at any point on the timeline, and which could then be used as starting point for another shot.
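Both features boil down to deterministic seeding plus copying state. A hypothetical sketch, with invented field names:

```python
import random

def scatter(n, area, seed):
    """Deterministic initial distribution of n agents in a rectangular area:
    the same seed always produces the same crowd."""
    rng = random.Random(seed)
    (x0, y0), (x1, y1) = area
    return [
        {"id": i,
         "pos": (rng.uniform(x0, x1), rng.uniform(y0, y1)),
         "rot": rng.uniform(0.0, 360.0),
         "state": "idle"}
        for i in range(n)
    ]

def snapshot(agents):
    """Freeze the crowd at the current timeline position so another shot
    can start from exactly this configuration."""
    return [dict(a) for a in agents]

crowd = scatter(100, ((0, 0), (50, 30)), seed=7)
shot2_start = snapshot(crowd)   # reuse as the next shot's starting point
print(shot2_start == crowd)     # → True
```

Any individual agent in the scattered result can then be overridden by hand – position, rotation, initial state, or a trigger delay – exactly as described above.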


Crowd control panel

The Sequencing tool

One of the roles of the Demo team within Unity is to give a user’s perspective to the engine developers. For Adam, we worked very closely with the team developing the new sequencing tool in Unity. We used early prototypes and pre-alpha versions of the tool, constantly providing feedback. In that way, we were able to influence its design and development. Working with the sequencer on our film felt very comfortable and familiar as it resembled other sequencing tools widely used in the film industry.

For the character and camera animations, we used standard animation tracks, which we could trim, blend, and offset in world space.


Animation tracks (marked in red)

For the more advanced features driven by the sequencer, we used another type of track which enabled us to run custom scripts: Playables. Examples include the crowd system and a scene manager used to enable/disable objects, change lighting, switch cameras, etc.


Custom script tracks, a.k.a. Playables

Another really useful feature of the sequencing tool is the ability to record a change in any parameter. Using that in combination with the inline curve editor allowed me to easily animate camera movement, properties, fades etc.
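Under the hood, a recorded parameter is essentially a keyframe curve that gets evaluated at playback time. A minimal sketch with linear interpolation – real curve editors use richer tangent models, so this only illustrates the principle:

```python
def evaluate(curve, t):
    """Evaluate a keyframe curve at time t with linear interpolation.
    `curve` is a sorted list of (time, value) keys - a minimal stand-in
    for what recording a parameter change produces."""
    if t <= curve[0][0]:
        return curve[0][1]          # clamp before the first key
    if t >= curve[-1][0]:
        return curve[-1][1]         # clamp after the last key
    for (t0, v0), (t1, v1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return v0 + (v1 - v0) * u

# A fade from 0 to 1 over two seconds, recorded as two keys:
fade = [(0.0, 0.0), (2.0, 1.0)]
print(evaluate(fade, 1.0))  # → 0.5, halfway through the fade
```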


Inline curve editor

For the final outcome of the project, our first deliverable was a stage demo. We built the executable directly out of Unity and delivered it for presentation.

In order to provide a preview of the film for online audiences, we also made a video intended for YouTube. It was captured with the help of a small script we have, which we call BeautyShot. Unity’s R&D is currently looking into the possibility of implementing a video capturing solution directly in-engine.

Thanks for reading! Right now, we’re working on preparing a standalone for release, and there are more blog posts about how we made Adam in the pipeline, so stay tuned…

16 replies on “Adam – Animation for the real-time short film”

Hello Unity Technology,

We Hope , We Can Use of Tools WITHOUT import packages in Unity 5.6 :

01-cinematic sequencer
02-volumetric fog
03-Real-Time Polygonal-Light Shading with Linearly Transformed Cosines
04-Cinematic Image Effects in action
05-crowd system

Thanks a lot Unity Technology

Hey is it possible to get alpha/beta access to Director? Our company’s applications are essentially a series of cut-scenes (basically a “choose your own adventure” game for educational purposes) and we use our own custom “director”. It would be very interesting to try Unity’s version though! We have Unity Pro if that matters.

How amazing you took the time to give back to the community with this information – thank you:)

Adam made my artist friends sit up and take notice of Unity. I think the engine can be used for a lot of different applications besides games (im a game designer so i love unity for games of course!) and it’s a great way to reach out while showing off some of Unity’s potential.

Nice. Does sequencer allow real-time recording of parameter changes? Recording dragging values on a loop at half or quarter speed would be a better animation tool than… animation tools :)

for this part
“For the rubber parts around the belly and neck, I decided not to use blend shapes and baked simulated deformations onto a set of helper joints.”
How do you bake the simulation into the joints?

simulate>constraints all joints into simulated surface> bind skin lowpoly meshes into the joints helper?

Very nice, but very distant from my reality. All those mocap, sensors, expensive softwares, actors and cameras. Maybe explore how the illumination was made in Unity would be more realist for we, poor mortals. But really nice, congrats for the good job!


I’d also add that while the film is very cool, it doesn’t really deal with the realities of actual game development as much as it should. I think the Unity demo team’s time would be much better spent working on an actual game made with Unity instead of realtime movies like Adam or The Blacksmith.

If the Unity demo team was working on an actual game, they could much better drive the development of things like UNET, PlayablesAPI, Nested Prefabs, Terrain tools, Lightmapping tools, rendering in general, etc, etc….

These are the things that would help Unity users tremendously. I’m afraid a realtime movie like this doesn’t really do much for us.

Absolutely! Making something cool and making something that requires support and development for several years are two quite different workflows. I don’t really understand why nested (or better inheritable) prefabs is not a number one priority feature on the todo list. I think a major part of community would appreciate an article about its current state.
The movie is outstanding though:) I suspect there’s almost no chance Unity will turn it into real AAA title, since the amount of work to be done is enormous, but maybe there’s a chance you will sell the IP rights to some decent studio:)


Very cool and informative.

Is there any plan to release the project publicly? I would love to dig into all those tools you’ve made and see this running in real time with my own eyes!
