Lighting tips & tricks in the Adam films

December 7, 2017 in Entertainment | 14 min. read

Are you curious about how Oats Studios created such high-fidelity sets, characters and costumes in real time for Adam: The Mirror and Adam: Episode 3? Read our in-depth, behind-the-scenes blog posts on lighting, Alembic support, clothing simulation, Timeline, shaders, real-time rendering and more.

My name is Jean-Philippe Leroux, and I am a senior technical artist specializing in lighting on the Made with Unity team. I have nearly 15 years of experience in lighting, 13 of them at Ubisoft Montreal, where I dedicated many years to lighting development for the Watch_Dogs brand. My role in the production of the Adam shorts was technical advisor, supporting Oats Studios’ lighting artists, who were intent on achieving some very impressive effects in real time.

Project background

The Adam: The Mirror and Adam: Episode 3 shorts were my first projects for Made with Unity. Production had already started when I began consulting with Oats Studios in July 2017. I enjoyed a smooth collaboration with the Oats team, including Youngbin Eun, Nathaniel Holroyd and Abhishek Joshi (lighting artists), Jim Spoto (Unity lead) and Chris Harvey (VFX supervisor). I worked remotely with them via Perforce until October, and I also spent a week with them at their studio, the coolest “man cave” I’ve ever seen.

Setting the stage in Adam: Lighting basics

Unity supports a large number of platforms, which also means there is a wide choice of lighting workflows. For a high-fidelity cinematic project like Adam, some of Oats’ basic choices were quite obvious.

To start, Oats set Color Space to Linear. This is always the first thing I set up in a new project, as it’s actually essential for proper lighting calculations. The only reason you would not use this color space would be if your target platforms are lower-tier devices that can’t support it.

They set their Rendering Path to Deferred. Not only did this allow Oats to use some cool rendering techniques like screen-space reflections, it greatly simplified their lighting workflow by letting them use any number of lights, which is especially important for set lighting, since much of it has huge geometry.
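If you want to apply these two choices from a script, for example when bootstrapping a new cinematic project, each is a single assignment. This is just a minimal editor-side sketch of the settings described above, not anything from the Adam project itself; it has to live in an Editor folder because of the PlayerSettings call.

```
// Editor-only sketch: the same two settings applied from code.
// In practice you would set these once in Player Settings and on your cameras.
using UnityEditor;
using UnityEngine;

public static class CinematicProjectSetup
{
    [MenuItem("Tools/Apply Cinematic Lighting Defaults")]
    static void Apply()
    {
        // Linear color space: essential for correct lighting calculations.
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // Deferred shading removes the per-object light limit on the camera.
        if (Camera.main != null)
            Camera.main.renderingPath = RenderingPath.DeferredShading;
    }
}
```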

For Tonemapping, Oats chose ACES from Unity’s Post-Processing Stack. ACES is the standard developed by the film and television industries as a common ground for all image sources. It provides the natural-looking contrast that modern televisions thrive on, and it keeps whites from clipping.

Since a big part of the Adam films is “humanizing” the characters, proper tonemapping is a crucial step in the look development (the art of conveying emotion to a character through lighting, shaders, etc.). Tonemapping also defines how you light and expose your shots. (Good news: the next Post-Processing Stack release will let you create your own tonemapping curve.)
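With the Post-Processing Stack (v1) that Adam used, ACES is just a dropdown in the profile’s Color Grading section, but you can also switch it from code. The field names below are my recollection of the v1 profile API, so treat them as assumptions and check them against your installed version.

```
using UnityEngine;
using UnityEngine.PostProcessing; // Post-Processing Stack v1

public class EnableAcesTonemapping : MonoBehaviour
{
    public PostProcessingProfile profile; // assign your project's profile asset

    void OnEnable()
    {
        // Copy, modify, and write back the color grading settings struct.
        var grading = profile.colorGrading.settings;
        grading.tonemapping.tonemapper = ColorGradingModel.Tonemapper.ACES;
        profile.colorGrading.settings = grading;
    }
}
```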

A scene in Adam: Episode 3 with and without tonemapping

The importance of Global Illumination (GI)

Without a doubt, Precomputed Realtime GI was a key choice for this project. It gave Oats proper light bounce: all their shots used dynamic lights for the characters, and the sun position was tweaked shot by shot, so together the sun and the dynamic lights provided the indirect lighting input they were after.

(First, a bit of background. If you’re not that familiar with real-time GI and are considering using it for one of your own projects, I highly recommend that you invest a few hours going through this great tutorial: Introduction to Precomputed Realtime GI. This is a must-read for every artist or developer as it will help you produce really high quality lighting effects and save you many hours of iterations.)

Back to Adam . . . One caveat of Unity’s real-time GI solution is that only directional light shadows are supported. All other light-source shadows are not considered in the calculation, so they pass through every object and wall. Because of that limitation, for example, we had to find a way to contain the GI created by the bright lights inside The Mirror’s room.
Oats employed a very simple trick to achieve this: emissive planes, culled from the camera view, that contributed to the GI without leaking light outside the room.
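Here is a minimal sketch of that camera-culling idea. The “GI Emitters” layer and script are illustrative, not Oats’ actual setup: the emissive planes stay in the scene so the Realtime GI keeps bouncing their light, but the shot camera never draws them.

```
using UnityEngine;

// Hides GI-only emissive geometry from a camera without removing its
// contribution to Precomputed Realtime GI.
public class HideGiEmittersFromCamera : MonoBehaviour
{
    public Camera shotCamera;
    public Renderer[] emissivePlanes;

    void OnEnable()
    {
        int giLayer = LayerMask.NameToLayer("GI Emitters"); // assumed project layer

        foreach (var plane in emissivePlanes)
            plane.gameObject.layer = giLayer;

        // Strip the layer from the camera's culling mask: the planes are never
        // rendered, but Realtime GI still uses their emission for bounce light.
        shotCamera.cullingMask &= ~(1 << giLayer);
    }
}
```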

The disposition of camera-culled emissive planes in the room

Solving the dynamic GI issue with emissive planes in The Mirror’s room

A Perforce problem slows the team down temporarily

During development, a problem arose when Oats started pushing updated versions of the GI to their Perforce server: everyone on the team was losing the GI and receiving error messages. It turned out the default Perforce configuration caused an unexpected issue. LightingData.asset files are binary, but by default the Perforce server was treating them as text, which corrupted them when they were integrated into the depot. To put production back on track, Oats corrected the filetype assigned to .asset files and rebuilt the GI, making sure the files were submitted as binary, since they already existed in the depot as text.
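If you hit the same problem, the usual fix is to tell Perforce up front that these assets are binary. Something along these lines works (the depot path is illustrative):

```
# Add a typemap entry so new lighting data is always stored as binary:
#   p4 typemap    ->  add:  binary //depot/.../LightingData.asset
# Files already stored in the depot as text must be reopened with the right type:
p4 edit -t binary //depot/.../LightingData.asset
```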

Handling new sun positions

Another issue Oats noticed was a visible pop every time the sun position changed. Oats designed their lighting per camera shot, as they would on a normal film production, which meant that moving the sun from shot to shot required a full update of the Realtime GI solution. Unfortunately, even with Realtime GI CPU usage set to Unlimited, the update was not ready by the time the first frame of a new shot was drawn, causing a noticeable pop at every sun-position change. To compensate, Oats improvised a temporary solution: in a higher Timeline Director, we added, shot by shot, a two-frame buffer that wasn’t output by the Frame Recorder. Good news: Unity 2017.2 now handles this automatically, using the DynamicGI.isConverged property in the Frame Recorder.
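If you are on an earlier Unity version, or rolling your own capture, the same wait is easy to script: hold the recording until DynamicGI.isConverged reports true after the sun has moved. The coroutine below is a generic sketch, not Oats’ Timeline setup.

```
using System.Collections;
using UnityEngine;

public class WaitForGiBeforeCapture : MonoBehaviour
{
    // Call this after moving the sun for a new shot, before capturing frames.
    public IEnumerator WaitForRealtimeGi()
    {
        // DynamicGI.isConverged turns true once the Realtime GI solution
        // has finished updating for the current lighting state.
        while (!DynamicGI.isConverged)
            yield return null;
    }
}
```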

Here you can see where we added two frames between cuts

Working with Alembic caches

“Now that Unity’s added alembic support – it’s a super key bridge for us. We need it to be able to cache large geometry datasets.” – Chris Harvey, VFX Supervisor, Oats Studios
Alembic caches, which Oats used for their cloth and facial simulations, are effectively meshes. In Adam, they sit at the origin of the world and evaluate the GI probes there. To ensure everything was handled properly, Oats simply set the renderer’s Anchor Override probe setting to the character’s pelvis. This way, the probes are evaluated based on the character’s position in the set, not at the pivot point of the cache. You can learn more about Alembic support in Adam by reading Sean Low’s blog post.
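Setting that override from script is a one-liner per renderer. The sketch below assumes the Alembic renderers and a pelvis transform are assigned in the Inspector; it simply mirrors the Anchor Override field shown below.

```
using UnityEngine;

// Evaluates light probes at the character's pelvis instead of at the
// Alembic cache's world-origin pivot.
public class ProbeAnchorToPelvis : MonoBehaviour
{
    public Transform pelvis;
    public Renderer[] alembicRenderers;

    void OnEnable()
    {
        foreach (var r in alembicRenderers)
            r.probeAnchor = pelvis; // same as the Anchor Override field in the Inspector
    }
}
```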

Setting the Anchor Override

Employing Dynamic Decals

In the third Adam episode, keep your eyes peeled for some cool graffiti, pebbles, dirt and other details. To add these, Oats pulled many goodies from the Dynamic Decals asset package. However, it didn’t initially work well when they introduced the real-time GI, because the decals didn’t get properly illuminated, appearing really dark in the shadows.
In deferred rendering, these effects are written to the buffer via the reflection pass, before the GI is resolved. The problem occurred because we used the Skybox as the Environment light source (if you use a gradient or a flat color, you won’t encounter this issue). Oats’ solution was to write their own ambient override for the Dynamic Decals and tweak the values until they matched the scene ambient as closely as possible, comparing a Lambert sphere against a sphere using a custom DecalAmbient shader.
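Oats’ actual override lives in their decal shader, but the idea is easy to sketch: expose an ambient color, push it to a global shader property that the decal shader adds on top of its lighting, and tweak it until the decal sphere matches the Lambert reference. The property and script names here are illustrative.

```
using UnityEngine;

// Pushes a hand-tuned ambient term to decal shaders via a global property.
[ExecuteInEditMode]
public class DecalAmbientOverride : MonoBehaviour
{
    public Color ambient = Color.gray; // tweak until the decal sphere matches the Lambert sphere

    void Update()
    {
        // The decal shader is assumed to read _DecalAmbientOverride and add it
        // to its diffuse term; the property name is illustrative.
        Shader.SetGlobalColor("_DecalAmbientOverride", ambient);
    }
}
```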

Oats’ custom Decal Ambient Override vs a Lambert sphere

Getting the shadows right

As you probably know, shadows are hugely important when we talk about the quality of light. And, to be honest, it’s an area that hasn’t seen enough functional breakthroughs in the last decade. Nevertheless, Adam’s lighting quality comes from a good understanding and mastery of the currently available techniques.
First, a little background: all lights should cast shadows, and those shadows should be fully opaque. Many developers use shadow transparency thinking it will fill the shadowed area, but in fact it breaks lighting continuity, because the lighting is still directional and back faces remain dark. See below how, once Oats used GI to fill shadowed areas, the need for shadow transparency disappeared, giving excellent results.

Note the difference between shadow strengths of 0.8 and 1

Oats also really boosted shadow quality with a simple modification to the spotlight shadow filtering: they pushed PCF filtering to a 7x7 kernel by overriding the internal deferred shading shader. If you want to try this trick, note that it only works for spotlights: point-light shadows don’t benefit from it.
Another Unity 2017.1 feature that Oats took advantage of was custom shadow resolution per light, which you can find in the Inspector’s Debug mode. The resolution must be a power of two and is capped at 8K.
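The per-light resolution can also be driven from script through Light.shadowCustomResolution; here is a small sketch (4096 is just an example value, and -1 restores the quality-settings default).

```
using UnityEngine;

public class KeyLightShadowSettings : MonoBehaviour
{
    void OnEnable()
    {
        var keyLight = GetComponent<Light>();

        // Power-of-two custom shadow map resolution for this light only;
        // set to -1 to fall back to the project's quality settings.
        keyLight.shadowCustomResolution = 4096;
    }
}
```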
If you’re going to try this, be careful with high-resolution shadow maps. Higher resolution does not equal better quality: the higher you go, the crisper your shadows will be, which is usually not what you want, because it moves you in the opposite direction from the high quality provided by area lights. Always remember that light quality means soft shadows and wide specular highlights. Here you can see the difference between shadow resolution and filtering at different settings in Adam.

The impact of shadow resolution vs filtering: Default PCF 3x3 (left) and PCF 7x7 (right)

Getting shadow bias right

Setting the shadow bias properly was key for Oats: they wanted their shadows to be connected, without causing shadow acne. In Unity, spotlight shadow bias is tied to the shadow near-plane value, so to ensure a perfect connection they needed to push the near plane as close as possible to the subject. A good starting point for character lighting is around two units. Check out the effect below.
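Those three numbers map directly to the light’s bias, normal bias, and shadow near plane. Here is a quick sketch applying the “proper” values from the comparison below; treat them as a starting point to tune per shot.

```
using UnityEngine;

public class CharacterSpotShadowBias : MonoBehaviour
{
    void OnEnable()
    {
        var spot = GetComponent<Light>();

        spot.shadowBias = 0.005f;     // small depth bias keeps shadows connected
        spot.shadowNormalBias = 0.1f; // low normal bias avoids peter-panning
        spot.shadowNearPlane = 5f;    // pushed toward the subject to fight acne
    }
}
```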

Proper shadow bias settings (0.005, 0.1, 5) vs default bias settings (0.05, 0.4, 0.2)

Setting cascades

Similarly, the cascade distribution settings of the directional lights had to be authored shot by shot, which Oats did using a custom Timeline track. They also tapped the SE Screen-Space Shadows package for some amazing shadow-quality enhancements: by pairing it with the cascaded shadow maps (CSM), they got perfect contact shadows that the best bias settings in the world could not achieve.
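Cascade splits are a project-wide quality setting rather than a per-light property, which is why Oats drove them from a custom Timeline track. Whatever the track looks like, the underlying call is just the QualitySettings split vector; this is a generic sketch, not their track code.

```
using UnityEngine;

// Applies a per-shot cascade distribution for a four-cascade setup.
public class ShotCascadeSettings : MonoBehaviour
{
    // Normalized split positions for the first three of the four cascades.
    public Vector3 cascadeSplits = new Vector3(0.05f, 0.15f, 0.4f);

    void OnEnable()
    {
        QualitySettings.shadowCascade4Split = cascadeSplits;
    }
}
```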

A scene from Adam with and without Screen-Space Shadows

Using Timeline for shot-by-shot lighting

Once the lighting lookdev was approved for a sequence, Oats broke it apart in Timeline, enabling them to easily do shot-by-shot adjustments. They basically duplicated the lookdev lighting rig for each shot and activated it via Timeline.
Since using the basic Switch Active track would overpopulate the Timeline, Oats created a custom Switch Active Asset clip that let them activate two groups of lights per clip, all on the same track. That kept it clean and simple.
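Oats’ actual clip isn’t public, but the general shape of a custom Timeline clip like this is straightforward: a PlayableAsset holding two exposed references and a PlayableBehaviour that toggles them while the clip plays. Names and structure below are illustrative.

```
using UnityEngine;
using UnityEngine.Playables;

// A simplified take on Oats' "Switch Active Asset" idea: one clip that
// activates two light groups while it plays.
public class SwitchActiveClip : PlayableAsset
{
    public ExposedReference<GameObject> lightGroupA;
    public ExposedReference<GameObject> lightGroupB;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<SwitchActiveBehaviour>.Create(graph);
        var behaviour = playable.GetBehaviour();
        behaviour.groupA = lightGroupA.Resolve(graph.GetResolver());
        behaviour.groupB = lightGroupB.Resolve(graph.GetResolver());
        return playable;
    }
}

public class SwitchActiveBehaviour : PlayableBehaviour
{
    public GameObject groupA;
    public GameObject groupB;

    public override void OnBehaviourPlay(Playable playable, FrameData info)
    {
        if (groupA != null) groupA.SetActive(true);
        if (groupB != null) groupB.SetActive(true);
    }

    public override void OnBehaviourPause(Playable playable, FrameData info)
    {
        if (groupA != null) groupA.SetActive(false);
        if (groupB != null) groupB.SetActive(false);
    }
}
```

Dropped onto a Playable Track, each clip then activates its two light groups for the duration of its shot and deactivates them when the clip ends.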

Oats' custom Switch Active Asset setup

Organizing their lighting per shot let Oats do small adjustments without the risk of impacting any other shots. If you’re going to try this, I strongly recommend using a clean naming convention that follows your shot numbers and group ordering. This will make you more efficient and will greatly help others working with you, as Oats found.

Shot-by-shot breakdown of a lighting sequence in Timeline

You can learn more about Timeline in Adam by reading Sean Low’s blog post.

Getting perfect control over punctual lights

Early in production, Oats expressed the need for better control over punctual lights: specifically, more control over the falloff and the spread of the lights. John Parsaie, a Made with Unity software engineer, created an asset for them that takes advantage of the Cookie slot. It gave Oats curve control over both parameters, plus a smoothness clamp, and the cool thing was that all the lights still contributed to GI and reflections.
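The asset itself isn’t public, but the core trick of driving falloff through the Cookie slot can be reproduced: bake an AnimationCurve into a small grayscale texture and assign it as the spotlight’s cookie. The sketch below is a generic version built on that assumption, not John’s asset.

```
using UnityEngine;

// Bakes a falloff curve into a cookie texture for a spotlight,
// giving curve-based control over how the light fades toward its edge.
[RequireComponent(typeof(Light))]
public class CurveDrivenSpotCookie : MonoBehaviour
{
    public AnimationCurve falloff = AnimationCurve.EaseInOut(0f, 1f, 1f, 0f);
    public int resolution = 256;

    void OnEnable()
    {
        var cookie = new Texture2D(resolution, resolution, TextureFormat.ARGB32, false);
        cookie.wrapMode = TextureWrapMode.Clamp;

        Vector2 center = new Vector2(resolution - 1, resolution - 1) * 0.5f;
        float maxRadius = resolution * 0.5f;

        for (int y = 0; y < resolution; y++)
        {
            for (int x = 0; x < resolution; x++)
            {
                // Distance from the cookie center, normalized to 0..1.
                float r = Vector2.Distance(new Vector2(x, y), center) / maxRadius;
                float a = Mathf.Clamp01(falloff.Evaluate(Mathf.Clamp01(r)));
                cookie.SetPixel(x, y, new Color(a, a, a, a)); // spot cookies sample the alpha channel
            }
        }

        cookie.Apply();
        GetComponent<Light>().cookie = cookie;
    }
}
```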

A custom script gave Oats better control over attenuation and falloff

Adding flickering lights

To achieve the dynamic, realistic look of Adam’s fires and flares, Oats created a comprehensive script that modulates light position, intensity, and hue. This script really added to the fidelity of these combined visual effects. They attached it to an empty object, and it propagates to all light and FX children.
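Oats’ script is more elaborate (it also drives the FX children), but a stripped-down version of the same idea, with Perlin noise modulating position, intensity, and hue across all child lights, looks roughly like this:

```
using UnityEngine;

// Minimal stand-in for the fire-flicker idea: Perlin noise modulates the
// position, intensity and hue of every Light under this empty parent object.
public class LightFireFlicker : MonoBehaviour
{
    public float positionJitter = 0.15f; // meters
    public float intensityRange = 0.5f;  // +/- around the base intensity
    public float hueRange = 0.03f;       // small warm/cool hue wobble
    public float speed = 7f;

    Light[] lights;
    Vector3[] basePositions;
    float[] baseIntensities;
    Color[] baseColors;

    void OnEnable()
    {
        lights = GetComponentsInChildren<Light>();
        basePositions = new Vector3[lights.Length];
        baseIntensities = new float[lights.Length];
        baseColors = new Color[lights.Length];
        for (int i = 0; i < lights.Length; i++)
        {
            basePositions[i] = lights[i].transform.localPosition;
            baseIntensities[i] = lights[i].intensity;
            baseColors[i] = lights[i].color;
        }
    }

    void Update()
    {
        float t = Time.time * speed;
        for (int i = 0; i < lights.Length; i++)
        {
            // Offset the noise per light so children don't flicker in sync.
            float n = Mathf.PerlinNoise(t, i * 13.7f);
            float n2 = Mathf.PerlinNoise(t, i * 7.1f);

            lights[i].transform.localPosition =
                basePositions[i] + new Vector3(n - 0.5f, n2 - 0.5f, 0f) * positionJitter;
            lights[i].intensity = baseIntensities[i] + (n - 0.5f) * 2f * intensityRange;

            float h, s, v;
            Color.RGBToHSV(baseColors[i], out h, out s, out v);
            lights[i].color = Color.HSVToRGB(Mathf.Repeat(h + (n - 0.5f) * hueRange, 1f), s, v);
        }
    }
}
```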

Light Fire Flicker: Oats’ script to set position, intensity, and hue modulation

The result was also visible in the Scene view, which is always great when developing looks. We ran into a small glitch with this one, though: it initially prevented the baking of cubemaps and GI from completing. Once we realized the script was causing the problem, we simply used another command to disable all the flickering in the scene at bake time.

Where there’s smoke, there’s fire

Burning up the night sky: The opening of The Mirror

When Adam: The Mirror begins, a series of very impressive oil fires provide a lot of drama and tension in the scene. Will the prisoners succumb to the “heat” as they proceed through a deadly night-time landscape? Many people want to know how Oats accomplished this impressive effect. According to Stephen Cooney at Oats, they combined a number of cool industry tricks and did a lot of layering:
We used animated texture atlases for most of the fire elements. We generated some in Houdini and we also used some stock Unity texture atlases. Fire was treated as “heat” in the texture and then remapped in the shader to any intensity and graded with a power curve.

(L) Looping ground fire made in Houdini (R) Frame-to-frame flowmap (relatively low detail works well for this)

We combined this with a tiling detail fire texture to add extra grit. After the final “heat” was calculated, it was remapped between two colors (one for cool, one for hot). We used optical flow maps to smoothly blend between frames, which was useful for the large plume fire atlas that had only a few frames.
The smaller fires comprised very long (but lower-resolution) texture atlases, while the plume component was a very low-frame-count but high-resolution explosion (flow maps made this easy). We did the large plumes as a combination of the small ground-fire elements with the large plume puffs tacked on.

An explosion texture used for plume (red for heat, green for lighting) and its flowmap

We achieved a major part of the look by managing the visualization order manually. Also, since the fire was treated as “heat” and then remapped to actual values (with the detail texture on top), it made it much easier to get visually appealing levels of brightness. These play well with Unity’s Post-Processing System without the need to revisit the texture.
Oats’ particle lighting system, which enables particles to look lit (useful for smoke), helped us tie the various particle systems together. Because the light could be animated, the smoke’s brightness would follow and the ground would take the same brightness.
We also used this for the flare in order to “light” the smoke. Basically, it allowed our emissive particle systems to have a way to connect to the custom diffuse systems.
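The heat-to-color remap Stephen describes is simple enough to write out. In production it lives in the fire material’s shader, but expressed as a small utility, with placeholder parameters, it boils down to this:

```
using UnityEngine;

public static class FireHeatRemap
{
    // Remaps a 0..1 "heat" sample to an HDR fire color: a power curve grades
    // the heat, then it blends between a cool and a hot color and scales intensity.
    public static Color Remap(float heat, Color cool, Color hot, float power, float intensity)
    {
        float graded = Mathf.Pow(Mathf.Clamp01(heat), power);
        return Color.Lerp(cool, hot, graded) * intensity;
    }
}
```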

A dystopian campfire: Adam prisoners listening to Needalus explain who they are

Final reflections

This project was a great introduction for me to the Made with Unity world, as I collaborated with an accomplished team of lighting and graphics experts who were creating lifelike CG shorts in our off-the-shelf game engine. I think that over the course of the project, Oats proved that Unity has what it takes to deliver outstanding film-quality effects while keeping iterations and costs down.
And we learned a lot too, seeing them constantly pushing the boundaries of VFX and storytelling, and all the imagination and hard work that requires. I’ll let Abhishek Joshi, Oats’ CG supervisor, have the last word:
Lighting in real time in Unity was a huge leap for us creatively. Coming from offline raytraced renders, the speed and interactivity allowed us complete creative freedom and iteration speed unheard of with a non-real-time workflow. This is the future of creating animated content.

Learn more about Unity

Find out how Unity 2017 and its features like Timeline, Cinemachine, Post-Processing Stack, and real-time rendering at 30 FPS help teams like Oats Studios change the future of filmmaking.
