
When we showed the Adam demo for the first time at GDC last year, everyone was blown away by the look and quality achieved with Unity. But soon, on everyone’s lips, including my hairy ones, was the question: “How did they do it?” So I decided to prepare a live demo explaining how to recreate an entire scene of Adam, starting from an empty scene.

I gave this talk at Unite Los Angeles 2016, which you can watch below; this blog post covers most of its details:


The first question I asked myself was: what do the assets look like in normal conditions? At the time I was working with the internal repository, while the Demo team was hard at work releasing some nicely wrapped-up asset packages containing indoor and outdoor environments and characters, which would let anyone play around with the Adam assets.

Clearly, when dropped into a new empty project, the raw assets did not look as they did in the Adam demo! The project first had to be set up the proper way (deferred rendering, linear color space, HDR camera) and some reflection probes added to get the “real material feeling”. This was a good lesson in how to work collaboratively with PBR assets, the importance of cross-industry compatible materials, and how key lookdev tools are in this process.
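For anyone setting up such a project from scratch, the same settings can be applied from an editor script. This is only an illustrative sketch using standard Unity 5.x APIs (the menu item name and probe size are made up for the example; in newer Unity versions `Camera.hdr` became `Camera.allowHDR`):

```csharp
using UnityEngine;
using UnityEditor;

public static class PbrProjectSetup
{
    [MenuItem("Tools/Setup PBR Rendering")]
    static void Setup()
    {
        // Linear color space is essential for PBR materials to shade correctly.
        PlayerSettings.colorSpace = ColorSpace.Linear;

        var cam = Camera.main;
        if (cam != null)
        {
            cam.renderingPath = RenderingPath.DeferredShading; // deferred rendering
            cam.hdr = true;                                    // HDR camera
        }

        // A baked reflection probe gives metals and glossy surfaces something to
        // reflect, which is what produces the "real material feeling".
        var probe = new GameObject("Reflection Probe").AddComponent<ReflectionProbe>();
        probe.mode = UnityEngine.Rendering.ReflectionProbeMode.Baked;
        probe.size = new Vector3(20f, 10f, 20f);
    }
}
```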

In these conditions, one could finally truly appreciate the quality of the asset creation and design.


Adam’s look is the combination of all the visual effects used, from particle effects to vertex cache animation, baked physical simulation, and an entire custom volumetric fog and lighting system. The Demo team’s VFX artist Zdravko Pavlov explains his work process in detail in the dedicated blog post about VFX in Adam. The great news was that by the time of Unite LA, most of these systems were already publicly available:

  • The particle effects used the built-in version of Shuriken and made intense use of flipbooks. The image sequences for the fog were created with Chaos Group’s Phoenix FD plugin for 3ds Max and rendered with V-Ray. Initially they were put together in Adobe After Effects, but as of Unite LA, the VFX Toolbox developed by our internal VFX artist was made available and open source. It allows much easier management and processing of image sequences. We also recently released a package of ready-to-use fire, smoke, and explosion sequences.

  • The Alembic plugin used in Adam to replay the cloth-ripping simulation is available and open source as well. Unity Japan, who developed it, is now working on a USD importer/exporter, also open source.

  • The volumetric lighting package, developed for Adam by the Demo team’s GFX programmer Robert Cupisz, is now publicly available. It includes fog lights, soft shadows and volumetric shadows, and the noise and wind simulation that creates most of the indoor scene’s atmosphere. More details can be found in Robert’s talk at Unite Europe 2016.


The lighting is probably what most distinguishes a short from a classic game. Just like in a movie, when doing a cinematic sequence, the lighting artist can light each shot in their own way, placing lights to get the best shadows and highlights for that particular camera angle. In my talk I wanted to show and explain how the lights were placed in one of the shots. Some shots can use many lights to get nice lighting on the walls, arms, hands, face, eyes, back of the head, chest, body shadows, etc.
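Such per-shot lighting can be driven by a very simple script. The sketch below is hypothetical (the `ShotLightRig` name and structure are mine, not from the Adam project): each shot gets its own parent object full of lights, and only the rig for the active shot is enabled.

```csharp
using UnityEngine;

public class ShotLightRig : MonoBehaviour
{
    // One parent GameObject of lights per shot, e.g. "Shot01_Lights", "Shot02_Lights".
    public GameObject[] shotRigs;

    // Enable only the light rig belonging to the current shot.
    public void ActivateShot(int index)
    {
        for (int i = 0; i < shotRigs.Length; i++)
            shotRigs[i].SetActive(i == index);
    }
}
```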

The lighting in Adam is particularly improved thanks to the new real-time area light technology presented by Unity at SIGGRAPH 2016. Our Demo team made the first implementation of this research and used it in production. Working with a light which comes not from a point but from a surface produces much more realistic results. Apart from the most obvious one – sharp specular highlights taking the proper rectangular shape of the light source – there is also a more subtle, yet arguably more important effect: replacing point lights, which don’t exist in the real world, so we’re not used to seeing materials affected by them, with area lights produces a much more familiar material response and increases perceived image quality. Combined with the PCSS soft shadows, volumetric shadows, tube lights, and fog lighting from the volumetric lighting package, I could recreate the exact same scene as presented in Adam in a few minutes.

Post processing

Finally, just as in any movie production, compositing and camera effects are at the heart of cinematography. During the Adam production, the team had to deal with effects from various sources (Standard Assets, Cinematic Effects, Keijiro’s effects, the Asset Store, a custom TAA with an alpha version of Unity supporting motion vectors…) to get the final result. The Demo team also developed its own tonemapping and color grading effect, as well as motion blur.

Moreover, the effects had to be combined in the right order and with the right setup to achieve a good result. Explaining all this would have taken an entire talk (which I did give a few weeks before!).

Fortunately, we released a new post-processing stack, assembling most of the needed effects with the latest improvements (e.g. HDR color grading) in an artist-friendly way. In a few clicks, one can enable temporal anti-aliasing, which gives the Adam demo its very sharp image; motion blur and depth of field, to get the feeling that it was shot through a real camera; and tonemapping and color grading, for the cinema look.

On top of that, grouping effects together allowed the stack to use a minimal number of passes and hence gain a lot of performance… so that it could all run on my 2014 MacBook Pro!
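Hooking the stack up to a camera is a one-component affair. As a rough sketch, assuming the post-processing stack v1 API (`UnityEngine.PostProcessing`), with the effects themselves toggled on the profile asset in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.PostProcessing;

[RequireComponent(typeof(Camera))]
public class CinematicCamera : MonoBehaviour
{
    // Profile asset created via Assets > Create > Post-Processing Profile;
    // temporal AA, motion blur, depth of field and color grading live on it.
    public PostProcessingProfile profile;

    void Start()
    {
        var behaviour = gameObject.AddComponent<PostProcessingBehaviour>();
        behaviour.profile = profile;
    }
}
```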


What used to require a lot of additional work and features from various sources when the Adam demo was originally produced is now possible for anyone to achieve with only publicly available assets and Unity 5.4.1.

What I did not have time to show was how to make all this come together as one big sequence and export it as a video. First, most animations for characters and cameras came from motion capture and camera tracking. So for my talk I decided to take advantage of the publicly released characters being rigged as Humanoids, and laid out the characters using inverse kinematics in the Editor. Second, at Unite LA our sequencer (Timeline) was still in private alpha, but it will be available publicly in the coming weeks.
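For exporting frames at a fixed step (rather than capturing the screen in real time), Unity’s `Time.captureFramerate` is the key API: it advances game time by a fixed amount per rendered frame, so the output timing is deterministic no matter how slowly each frame renders. A minimal sketch using the Unity 5.x screenshot API:

```csharp
using UnityEngine;

public class OfflineFrameCapture : MonoBehaviour
{
    public int frameRate = 30;

    void Start()
    {
        // Game time now advances exactly 1/30 s per rendered frame.
        Time.captureFramerate = frameRate;
    }

    void Update()
    {
        // Application.CaptureScreenshot in Unity 5.x
        // (renamed ScreenCapture.CaptureScreenshot in later versions).
        Application.CaptureScreenshot(string.Format("frame_{0:D5}.png", Time.frameCount));
    }
}
```

The resulting image sequence can then be assembled into a video with any external encoder.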

I hope to give a full cinematic talk one day, to inspire more people to make films and cinematics with Unity: “Creating a short film from an empty scene in one hour”. Maybe at one of the 2017 Unites!




  1. It’s interesting, you used more 3rd-party tools than Unity standard ones :)
    I really would like to play it on my HW, but the demo crashes on the loading screen (config: RX480/Win10/FX4350/8GB) :(

  2. Hi guys, has anyone used the area light from the Adam project?
    I have a problem with the Standard material (an alpha problem) when using the area light: if I use Cutout the texture looks OK, but if I use Transparent or Fade the texture looks black. I really want to use this light (I love the shadows); my project looks amazing with it, but I need Transparent because Cutout looks bad on close-up objects. The other problem is the shadow: if I use this material with Transparent, the object doesn’t cast a shadow at all.

  3. OK….. I have questions about Unity.

    I know a little bit about Maya, and Poser (and Mudbox/ZBrush). And AutoCAD (which was where I started in all this mess, making blueprints for Radio-Controlled Airplanes…. Which translated into doing Architectural Sculpture – AutoCAD, Maya, and Mudbox/ZBrush…. Which translated into doing work in 3D Printing, and Animation – Maya, Mudbox/ZBrush, and Poser). Specifically about Modeling, Rigging, Skinning, and then Animating the figure to make it look like it is moving properly (or unnaturally – think Lizard or Giant Cockroach wearing a human skin, or a robot moving mechanically).

    But I know next to nothing about Rendering (trying to learn), Lighting, Cameras, Particle Effects, etc.

    What is the big deal here with Unity? Why should I be concerned with it rather than doing all this same stuff in Maya, which seems to be able to do all of it (well, save the compositing thing…. That seems to be another “Big” topic that I don’t yet understand)?

    I am not trying to be flippant here, or dismissive. The short was really freaking amazing. But I really want to know what it was about it that makes it different than doing the same stuff in Maya, 3ds Max, or Softimage?

    1. Unity is a game engine, not a modelling or rendering package.

    2. Like Stan said, Unity is a Game Engine, which means that all of the stuff you see is calculated in realtime. You can download the executable and see for yourself, since you can stop it at any time, change the camera angle and play around with the light for a bit on the fly.

    3. Unity is real-time – its purpose is to create games that respond to user input, not movies, meaning the player can interact with on-screen items.

  4. Hi,
    Did you export to video by capturing from the screen in real time, or by rendering to a sequence of images with a fixed step, frame by frame?

    1. Sorry, I’ve just noticed there’s a link to . I guess it’s real-time then.

      1. how to install that plug in?

      2. On Adam, for the video, we used a custom frame capturer which, just as the FrameCapturer by Unity-Jp referenced in this article, is capturing frames directly in the engine.

  5. and… the volumetric lighting project doesn’t work in 5.5

    1. Hi Marge, I just tried on 5.5.0p4 and 550 and it seems to work fine. What problem do you experience?

  6. Pierre Schiller (January 11, 2017): Thank you so much for this post. 3GB of assets, and yes, it didn’t come out like the movie Adam. Thanks for breaking it down for us now.