
We are continuing our series of blog posts from last year, this time featuring a technical, behind-the-scenes glimpse of how the visual effects in the Adam demo were created.

About me

My name is Zdravko Pavlov and I’m a visual effects artist with 11 years of experience in motion graphics and compositing for CG productions, VFX art for games, and 3D modeling, lighting and rendering. My responsibilities on the Adam demo included creating the particle effects and the dynamic simulations.

Particle Effects


I used PhoenixFD to set up a fluid simulation. The rendered sequence was used as a flipbook texture in Unity’s Shuriken particle system.

PhoenixFD simulation preview inside 3D Studio Max
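The flipbook lookup itself is simple: the rendered sequence is packed into a grid of tiles, and each particle's age selects one tile. Here is a minimal sketch in Python (illustrative only; the real lookup happens inside Shuriken and shader code, and the 4x4 layout in the example is an assumption, not the demo's actual sheet size):

```python
def flipbook_uv(frame, cols, rows):
    """Return (u_offset, v_offset, u_scale, v_scale) for one tile of a
    cols x rows flipbook laid out left-to-right, top-to-bottom."""
    frame %= cols * rows               # loop the animation
    col = frame % cols
    row = frame // cols
    u_scale, v_scale = 1.0 / cols, 1.0 / rows
    u_off = col * u_scale
    v_off = 1.0 - (row + 1) * v_scale  # texture V grows upward, so flip rows
    return (u_off, v_off, u_scale, v_scale)
```

For a 4x4 sheet, `flipbook_uv(0, 4, 4)` gives the top-left tile at `(0.0, 0.75)` with a quarter-size scale, and frame indices past 15 wrap around.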

The particles controlled the density of a fog volume that was affected and properly lit by all area light sources in the scene. The area light code was implemented by our graphics programmer Robert Cupisz, and is now available on GitHub.

Volume fog density controlled by the particles
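As a rough illustration of the idea of particles driving fog density (not the demo's actual volume fog, which was custom engine code), here is a toy nearest-cell splat of particle positions into a voxel density grid:

```python
def splat_density(particles, dims, cell_size, amount=1.0):
    """Nearest-cell splat of particle positions into a voxel density grid:
    a toy stand-in for particles controlling the density of a fog volume.
    `particles` are (x, y, z) positions; `dims` is (nx, ny, nz)."""
    nx, ny, nz = dims
    grid = [[[0.0] * nz for _ in range(ny)] for _ in range(nx)]
    for x, y, z in particles:
        i = int(x // cell_size)
        j = int(y // cell_size)
        k = int(z // cell_size)
        if 0 <= i < nx and 0 <= j < ny and 0 <= k < nz:
            grid[i][j][k] += amount   # each particle adds density to its cell
    return grid
```

A production version would splat with a smooth falloff kernel rather than a hard nearest-cell assignment, but the accumulation idea is the same.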


I also used PhoenixFD for the liquid simulation of the ripples in the puddles.

3DS Max viewport preview of the simulated liquid surface and the rendered texture

Shuriken particle system is triggered under Adam’s feet

The rendered sequence was once again combined into a flipbook normal map texture. The particles were spawned under the robot’s feet and rendered in an offscreen buffer. The result was used to modify the surface normals of the puddles.
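The step from a simulated ripple heightfield to a normal map can be sketched with central differences. This is the generic technique, not the exact PhoenixFD/Unity path used in the demo:

```python
import math

def height_to_normal(height, strength=1.0):
    """Convert a 2D heightfield (list of rows) into tangent-space normals
    packed to [0,1] RGB, using central differences (clamped at the edges)."""
    h, w = len(height), len(height[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * strength
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * strength
            inv = 1.0 / math.sqrt(dx * dx + dy * dy + 1.0)
            n = (-dx * inv, -dy * inv, inv)          # normalize(-dx, -dy, 1)
            row.append(tuple(0.5 * c + 0.5 for c in n))  # pack [-1,1] -> [0,1]
        out.append(row)
    return out
```

A flat heightfield packs to the familiar uniform blue `(0.5, 0.5, 1.0)` of an undisturbed normal map; ripples tilt the normals away from it.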

Dust and smoke

I used Unity’s native Shuriken particle system for the sparks and plumes of dust when the bullets hit the concrete in the shooting sequence, as well as for some of the secondary effects like tiny dust specks flowing through the air and columns of smoke rising from the vents of the city wall.

Some environmental effects examples

Dynamic simulations

In Adam we had motion-captured performances for all of our characters. Still, we needed to add a layer of secondary animation for all the clothes and accessories that weren’t physically present on the mocap set. Instead of animating those by hand, we set up a number of dynamic simulations to help us with the process.

From the very beginning we knew that we were going to have some closeups and slow-moving shots. The simulations needed to match the quality of the demo in terms of precision and visual fidelity, so we looked into some high-end offline simulation solvers. The calculated results were baked and played back as animations.
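Playing a baked simulation back amounts to sampling a per-frame vertex cache at an arbitrary time. A minimal sketch of that idea (real Alembic/FBX cache playback handles far more, but the core is the same linear interpolation between cached frames):

```python
def sample_baked(frames, fps, t):
    """Linearly interpolate a baked per-frame vertex cache at time t (seconds).
    frames: list of per-frame vertex tuples; fps: the cache sample rate.
    A toy stand-in for Alembic/FBX cache playback."""
    last = len(frames) - 1
    f = min(t * fps, float(last))   # clamp to the end of the cache
    i = int(f)
    a = f - i                       # blend factor between frame i and i+1
    j = min(i + 1, last)
    return [tuple(p0 + (p1 - p0) * a for p0, p1 in zip(v0, v1))
            for v0, v1 in zip(frames[i], frames[j])]
```

Sampling halfway between two cached frames returns vertices halfway between the cached positions; times past the end of the cache hold the last frame.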

The sleeve

The “sleeve shot”, as we were calling it throughout the production, was actually the very first shot I started working on. I looked into several different cloth solvers, as well as skinned-particle approaches.

Eventually I ended up using 3DS Max’s built-in cloth modifier for the final simulation. I set up some basic tests and showed them to the director.

Some of the very first cloth tests (3ds Max viewport preview)

After some iterations, I took Adam’s actual animation and geometry and started working on the final version of the simulation.

Cloth simulation (3DS Max viewport)

The results were baked and brought into the engine via the Alembic importer developed by Unity Japan, which is freely available on GitHub.

Different versions of the simulation imported as Alembic streams in Unity

The mask

For the damaged mask, I built a simple Thinking Particles setup.

Thinking Particles setup for the mask fracture

The procedural nature of TP and its integrated Volume Breaker helped me iterate faster and explore different options. I handed over the approved version to our 3D modeler Plamen (Paco) Tamnev who used it as a base for the final model.

Different Thinking Particles sim variations

Both the torn sleeve and the broken mask of the main character were not only important for the story; they also served as visual devices that help Adam stand out from the crowd later on.


CaronteFX

Caronte is a multiphysics simulation engine developed by Next Limit, and has been RealFlow’s rigid- and soft-body dynamics solver for quite some time now. CaronteFX brings the same high-quality physics solution to Unity through a seamless engine integration, which made most of our FX shots possible within the tight production timeframe.

It was still an early alpha version when we first picked it up. It didn’t have ropes and cloth integrated yet, but we started working in close collaboration with the CaronteFX team and they’ve put a tremendous effort into providing us with all the tools and options that we needed throughout the production. Currently all of these are available with CaronteFX on the Asset Store.

The cables

Early in the production, we weren’t sure about the cables that keep Adam attached to the machine. We were worried about potentially twitchy cable behaviour, geometry intersections and so on.

Fortunately, CaronteFX has quite a precise rope solver that calculates collisions based on the triangles of the geometry, thus taking the actual rope (cable) thickness into account.
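To make the thickness point concrete, here is a tiny 2D position-based rope sketch (my own toy example, not the CaronteFX solver): Verlet integration, segment-length constraints, and a ground collision that keeps every point at least one rope radius above the floor, so the rope's thickness enters the collision response:

```python
import math

def simulate_rope(points, steps=1, radius=0.1, dt=1.0 / 60.0, iters=10):
    """Toy position-based 2D rope: point 0 is pinned, the rest fall under
    gravity, and a ground plane at y=0 pushes points up to y=radius."""
    pts = [list(p) for p in points]
    prev = [list(p) for p in points]
    rest = [math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    for _ in range(steps):
        for i in range(1, len(pts)):           # Verlet step; point 0 is pinned
            vx = pts[i][0] - prev[i][0]
            vy = pts[i][1] - prev[i][1]
            prev[i] = pts[i][:]
            pts[i][0] += vx
            pts[i][1] += vy - 9.81 * dt * dt   # gravity
        for _ in range(iters):
            for i in range(len(pts) - 1):      # keep segment lengths
                dx = pts[i + 1][0] - pts[i][0]
                dy = pts[i + 1][1] - pts[i][1]
                d = math.hypot(dx, dy) or 1e-9
                corr = (d - rest[i]) / d * 0.5
                if i > 0:
                    pts[i][0] += dx * corr
                    pts[i][1] += dy * corr
                else:
                    corr *= 2.0                # pinned end takes no correction
                pts[i + 1][0] -= dx * corr
                pts[i + 1][1] -= dy * corr
            for p in pts[1:]:                  # thickness-aware ground collision
                if p[1] < radius:
                    p[1] = radius
    return pts
```

A triangle-accurate solver like the one described above resolves contacts against arbitrary geometry rather than a single plane, but the principle of offsetting contacts by the rope radius is the same.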

CaronteFX player in Unity 5

Baking a complex simulation like this one produces quite a lot of cached data. We used the automatic skinning option that was added to CaronteFX: a hierarchy of bones is generated, and the result can be exported as an FBX file.

Skinned animation playback in Unity 5

The shooting

There are quite a few things going on in this action sequence. Bullet impacts lifting plumes of dust and sparks off the ground. Concrete fragments flying everywhere. A robot’s leg and arm shredded to pieces.

Early Thinking Particles prototype

I started working on this shot quite early. I used Thinking Particles for some of the early versions. TP is incredibly flexible and robust, and I was able to quickly try out different scenarios. Later, when we adopted CaronteFX into our pipeline, I recreated this effect with it so I could use its Unity integration and the convenience of automatic skinning.

Early prototype of the shooting sequence inside Unity

CaronteFX has a very flexible fragmenting system. As a VFX artist, I was able to fracture assets into pieces without leaving the Unity Editor. The new assets are saved along with a new material for the newly generated geometry faces.
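The fragmenting idea can be illustrated with a toy Voronoi partition: scatter seed points inside the asset and assign every piece of geometry to its nearest seed, so each seed's region becomes one chunk. This is a generic sketch, not CaronteFX's actual algorithm:

```python
def fracture_cells(points, seeds):
    """Toy Voronoi-style fragmenting: assign each geometry point to its
    nearest seed; each resulting group of points is one pre-fractured chunk.
    `points` and `seeds` are coordinate tuples of matching dimension."""
    def nearest(p):
        return min(range(len(seeds)),
                   key=lambda s: sum((a - b) ** 2 for a, b in zip(p, seeds[s])))
    chunks = {i: [] for i in range(len(seeds))}
    for p in points:
        chunks[nearest(p)].append(p)
    return chunks
```

More seeds produce more, smaller fragments; clustering the seeds near an impact point concentrates the fragmentation there, which is a common trick in pre-fractured destruction.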

Ground bullet impacts preview made with CaronteFX

Sebastian and Lu

These two characters were particularly difficult to build: not only because of the sheer number of different types of dynamic objects (soft bodies, rigid bodies and cloth), but also, especially in Sebastian’s case, because of the layers of complex geometry stacked on top of one another.

The base animated model (left) and the objects set up for simulation (right)

I split Sebastian’s geometry into three separate groups: the mantle, the cape and the shards hanging on the leather straps in one group; the skirt with the tablet and the knife in a second; and the hood and the earrings in a third. After that, I repositioned the elements to avoid initial intersections and started modeling low-poly proxies to lighten the calculations.

Example of a proxy collision object for the knife, and a few joint helpers

Preview of Sebastian’s simulation in Unity 5

We have published the Sebastian character on the Asset Store and included a Caronte player in the package, so you can see how the simulation was set up.

Lu went through the same process, but there were fewer objects that needed to be simulated, so setting her up was a bit easier and more straightforward.

3DS Max viewport screenshot of Lu’s model prepared for simulation

CaronteFX player previewing the calculated sim, before baking

There was some back and forth between me and our production designer Georgi Simeonov until we got the right thick, heavy leather feel for Lu’s skirt.

In the end we used a soft body for the skirt instead of a single layer of cloth. The soft body keeps its volume better and doesn’t wave as much as cloth would, so it gave us the look and behaviour we were after.

Lu is available on the Asset Store as well, in a package with the characters Adam and Guard. I have remade their simulations to use Unity’s native cloth physics, to make it easier for you to examine the characters and play around with Unity’s cloth.

Have fun!

20 replies on “Adam – VFX in the real-time short film”

Thanks for a great post.
Was a bit disappointed at first, but then I realized that the professionals working on this need to use the tools they are most productive in, which results in the best visual quality. Unity seems to allow professionals to use such tools, which is a great strength.
But unfortunately, in this case, visual quality trumps everything else for most of the target audience. I would also have preferred to see the use of more core engine features.

Thank you for the great article!

But to be honest, it would be more interesting to see what can actually be done by the engine, not by “pre-rendered” meshes, offline simulations and so on.

But we can think of this demo as a “render engine abilities” demo, not a “game engine” demo.

Really great work. Seeing this I was stunned by the quality. I also appreciate you sharing the process for how this was built.

I do feel a bit misled though. As a demo for a game engine, it’s a bit misleading when even the lighting is custom. Also, the physics uses a 3rd-party plugin that could someday just not be supported.

Is Unity officially supporting CaronteFX? Are the area lights going to be officially supported, included in 5.6 or a future version? If you’re selling me Unity, you should show me what it can do with the features it includes out of the box.

I’m sorry, I know it’s hard to share things like this, and I don’t want it to feel like I’m being difficult. I just want to express that though I do understand the need for baking, creating flipbooks etc., in an engine demo I expect a bit more engine. I don’t think your intention was to mislead anyone, but it’s something to consider in the future, so false expectations are not set and people are not left upset.

CaronteFX is an asset from the store developed by Next Limit Technologies. We worked closely together on this demo and were very satisfied with it and what it could bring. No integration of it is planned in the engine.
Area lights will be integrated in the engine and we plan to support even more shapes (we only supported rectangular area lights at the time of the demo). Same with volumetric lighting. There is no official release date for this though. We have worked hard to offer both of them in an open-source package in the meantime, which also has its advantages.
Many features which were used in this demo and not available in the engine at that time are now already integrated (instancing, motion vectors, texture arrays) or will be available in future versions (Timeline).
If you look at things another way, this demo shows also that it is possible to customize the input formats and lighting in a quite advanced way. :)

It’s an incredible piece of work. I notice Adam’s face is held on by Hummel plastic grips (chevron logo). A hint at Unity’s Danish begyndelse (beginnings)?

This is beautiful work!

Thank you so much for sharing the intricate process, for all of us devs who are looking to create analogous effects and looking for examples like this to learn from.


So most of the VFX were done outside Unity, which means Unity was used as a renderer. Not bad, but it completely misses an opportunity to improve the engine.
Now you should try all the VFX as real-time sims, using only the engine.

I’d really like to see future demos use more of Unity’s own real-time systems instead of baking everything out in Max/Maya/Houdini/CaronteFX etc. These really should be real-time demos, not “everything’s offline-simulated and baked but we just render it in real time” demos, in my opinion.

Yes, it’s true that you won’t be able to achieve the same level of quality in the simulations if they’re fully real-time, but at least it’ll be more truthful to what one can accomplish when building games in Unity.

Regardless of these criticisms, I appreciate the effort you put into these Demos and into writing these blog posts.

My thoughts exactly. Almost every one of these posts has the same idea: do it in an actual VFX program and then just export geometry. It becomes unclear what benefit Unity provides in these scenarios. I was initially really excited about this demo, but as more posts come out and more is revealed to not actually be Unity, I’m definitely disappointed. Even the cables!

Hm, I don’t fully agree with you.
I really like these projects and dissections, and I liked this one in particular a lot.
Where I agree is that yes, it would be interesting to have another project some time made only inside Unity, with zero outside apps/tools used, but to me that would be interesting mostly on an academic level.
It would not feel any “more truthful to what one can accomplish when building games in Unity.” Actually, thinking more about it, it would feel less like what one can make with Unity to me, because it is common sense to use the best tools for every aspect and then bring them all together. That is what’s commonly done in most Unity projects, and one of Unity’s biggest strengths is that it allows bringing all of that together.
Most people would not try to make all their textures or animated biped models in the engine alone.
That brings me to the question of whether this project realistically shows what can be made with Unity: I think it is a great project.
I think it is an important lesson for each of us devs and designers that most of the audience does not care at all how something was made or runs behind the scenes; they care about how it looks and feels to them, and how it runs on their hardware, too.
(Besides a tiny fraction of exceptions who are geeks like us, usually… who also get very excited when an impressive 5kb demo is made all procedurally =) )
And until we all have way faster CPUs/GPUs, it’s just common sense that a lot of things can be done in a way nicer-looking and better-running form when made in dedicated tools, with the intermediate or end result running in quasi/semi pre-rendered form to some degree at runtime in the engine.

Yes, if one looks carefully one can clearly see the difference between a fully volumetric, all-GPU particle effect and one made in a “more pre-rendered” way with a bunch of billboards, but what will be more important to the user? Both will look fine, and one of the two will be far less taxing on the CPU/GPU, so it will likely run much better with many more instances.

The project is meant to show high-end visuals running in Unity; whether the models, textures, effects, physics sims etc. are done partially or entirely in another tool does not matter to the end user enjoying the content.
The Adam demo would not look or feel better to me if the cables were all made with joints inside Unity alone.
I think it’s very common, and understood by most, that textures for most games are not made in Unity, because there are other tools dedicated to making 2D graphics. So why should it be any less OK to make some other aspects outside of Unity and then bring them in?
I know, a geek side in many of us would ideally love to make everything in-engine, or even fully realtime with zero outside/pre-rendered/pre-created assets, maybe even completely procedurally, but for many types of creations that would actually lead to a much worse result in looks/performance and/or take longer to do.

And if some then wonder what the advantage of Unity is when one still uses many external tools: well, Unity allows you to bring all of this together, make it even cooler than the sum of its parts, and deploy it to many platforms in many ways, too. In your own projects you can decide how much you make inside Unity and how much in external tools or with plugins/editor extensions; the strength is that Unity allows all of that and lets you bring it together as you want.

Of course, in the long run one could consider which of those aspects of the whole pipeline, creation process, effects and shaders to integrate and implement inside Unity, and it seemed to me that this is being done, looking at some of the things shown at Unite that we’ll get this year.

I think it is a misconception that games don’t use baking techniques. Even cloth on realtime characters can be cooked when done smartly, and it can improve performance and look better. All sorts of secondary animation effects make sense to pre-bake. Sometimes you want the control and predictability baking gives you. That’s not to say there are no use cases for fully dynamic realtime simulation; of course there are, and in games that’s pretty common. The more open-world and dynamic the game is, the more you want that. But games cover a very wide range, from linear/cinematic to fully procedural.

Even for fully dynamic worlds, if you want to do destruction, most AAA games pre-fracture geometry offline to get the best-looking results and avoid heavy computation while the game is running.

And particularly cinematic games very often use baked physics, even for things that could totally be done with a realtime physics engine. For example, in the latest Tomb Raider, when the ice and rocks fall down, all of that is baked. It makes a lot of sense to pre-bake that into an animation, and with CaronteFX this can be done entirely inside Unity, which is very powerful.

Nobody is saying that baking should be avoided, but since the demo should showcase Unity’s capabilities, we would prefer something that showcases those capabilities fully. Otherwise one could just slap on a video texture and do everything outside Unity.
What you did showcase is only Unity’s graphics capabilities as a rendering engine, but Unity is supposed to do much more than simple 3D rendering.

I agree with Chris on this one. Pre-calculated things definitely have their place in games, but you want to showcase Unity, not just the results that come out of Max and other tools. The primary reason is that indies do not have access to those tools due to the cost barrier. The way it is, it’s more like “look what you can do if you have the money!”.
