
In March, Unity announced real-time ray tracing support for NVIDIA RTX technology. Real-time ray tracing brings photorealistic lighting qualities to Unity’s High Definition Render Pipeline (HDRP), unlocking new potential in Unity’s visual capabilities. A preview release of NVIDIA RTX support in Unity is planned for 2019.3.

To show off this new lighting technology, we took on the challenge of matching a CG, Unity-rendered BMW to a real BMW 8 Series Coupe that we filmed in a warehouse. The final project incorporated both a live interactive demo and a pre-rendered 4K video that cuts between the real car and the CG car, challenging viewers to distinguish between the two. Our final video featured 11 CG BMW shots.


Both the advertising and automotive industries thrive on fast turnaround. We produced a sample commercial featuring a BMW to demonstrate how decisions made during the advertisement creation process can benefit from the versatility of a real-time engine.

We launched the demo and made official announcements at the Game Developers Conference (GDC) as well as at NVIDIA’s GPU Technology Conference (GTC).

This blog post will take you through our production process.

Pre-production

Previsualization was essential in helping us prepare for the shoot, and in creating a polished, professional ad. We worked directly in Unity alongside Director of Photography Christoph Iwanow to create a full CG version of the film. For this initial take, we used simple lighting and shaders and focused on nailing down pacing, camera placement, shot framing, depth of field, and areas of focus.

Using Unity’s Physical Camera feature, we matched the real-life camera and lenses used on set down to the most precise detail: sensor size, ISO, shutter speed, lens distortion, and more. This let us reproduce the look and feel of the real camera and achieve a 1:1 match at every level.
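As a rough illustration, the core physical properties can be driven from a script like the one below. This is a minimal sketch with example values, not the settings from our shoot; ISO, shutter speed, and aperture live in HDRP’s physical camera settings, whose exact API location varies by version.

```csharp
using UnityEngine;

// Minimal sketch: driving Unity's Physical Camera from code so it matches
// a real camera body and lens. The values are examples, not the actual
// settings used on the BMW shoot.
public class PhysicalCameraMatch : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.usePhysicalProperties = true;

        cam.sensorSize = new Vector2(24.89f, 18.66f); // sensor width/height in mm
        cam.focalLength = 35f;                        // lens focal length in mm
        cam.gateFit = Camera.GateFitMode.Horizontal;  // lock to the horizontal gate

        // ISO, shutter speed, and aperture are set through HDRP's physical
        // camera settings, where they also drive exposure and depth of field;
        // where they live in the API depends on the HDRP version.
    }
}
```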

Cinemachine enabled us to easily create believable camera movements while retaining rapid iteration speeds.

The previsualization stage was also our lighting experimentation playground. The real-life lighting equipment we planned to use was also replicated in Unity in every detail: the shapes and dimensions of the lights, temperature, and intensity. We could instantly see what light tubes hanging from the ceiling would look like and refine their placement, intensity, and color; we could generate perfect reflections on the car. This process would otherwise take up several hours of precious on-set time.

By digitally replicating the real-world lighting and camera setups, Christoph was able to experiment with more combinations and find the look he wanted in advance of filming, so we could use filming time more effectively.

Post-production

The day after filming, we could start refining shots right away. Since the previz was done in Unity, we didn’t need to migrate assets. We used the lighting references we took on set, as well as referencing the real car, to ensure that our asset reacted realistically in look development. The technology now allows us to use image textures on area lights, so we were able to use photos of the actual light sources as textures for more realistic lighting and reflections.
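In HDRP, this amounts to assigning the photo as a cookie on a rectangular area light. A minimal sketch, assuming HDAdditionalLightData exposes an areaLightCookie property (the property name and namespace differ across HDRP versions):

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition; // HDRP (namespace differs in older versions)

// Minimal sketch: using a photo of the real on-set fixture as the texture
// of an HDRP rectangular area light. Assumes HDAdditionalLightData exposes
// an areaLightCookie property, which may vary across HDRP versions.
public class TexturedAreaLight : MonoBehaviour
{
    public Texture lightSourcePhoto; // photo of the actual light source

    void Start()
    {
        var hdLight = GetComponent<HDAdditionalLightData>();
        hdLight.areaLightCookie = lightSourcePhoto;
    }
}
```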

With the power of Unity, we were able to render in 4K ultra high definition from day one. This is an obvious advantage over offline rendering, where in many cases work-in-progress deliveries are rendered at lower resolutions to save costs and time. By delivering in 4K from the very first iterations, we were able to fine-tune small details in look dev and lighting. Many more iterations on renders in a short time allowed us to obtain strong visuals with a small art team. It also meant the results in the final renders were predictable.

For the shots using plate integrations, the footage was tracked outside of Unity, and the camera and track data were then imported into Unity as an FBX file.

As for the other cameras that we created directly in Unity, two Cinemachine features were essential in creating believable and realistic camera movements:

Cinemachine Storyboard extension: Among its many features, the Storyboard extension allows you to align camera angles. It was an essential tool for us in easily replicating the specific camera movements required to recreate a shot completely in CG. We used a frame from the on-set camera footage as an overlay to act as a guide to align the CG camera. This was done for the first, middle and last frames of some shots.

Cinemachine noise: Applying procedural noise to our camera moves made it easy to eliminate the unnatural perfection of CG cameras by adding convincing micro-movements to the camera’s motion. We could ensure the movements were interesting without being obviously repetitive.
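As a minimal sketch, that noise can be dialed in from code through the virtual camera’s Basic Multi Channel Perlin component (the gain values here are illustrative, not the ones from the film):

```csharp
using UnityEngine;
using Cinemachine;

// Minimal sketch: adding subtle procedural shake to a Cinemachine virtual
// camera through its Basic Multi Channel Perlin noise component. Assumes
// the component (with a noise profile) was added to the vcam in the
// Inspector; the gain values are illustrative.
public class HandheldFeel : MonoBehaviour
{
    void Start()
    {
        var vcam = GetComponent<CinemachineVirtualCamera>();
        var noise = vcam.GetCinemachineComponent<CinemachineBasicMultiChannelPerlin>();

        noise.m_AmplitudeGain = 0.15f; // small displacements: break CG-perfect motion
        noise.m_FrequencyGain = 0.5f;  // slow drift rather than visible shake
    }
}
```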

Our original concept featured a CG BMW with a different paint color than the real BMW. As the project evolved, we felt it was a stronger statement to have both cars be the same color so we could cut between them seamlessly. Changing the color of the car was a late-stage decision that could be made organically in Unity, as lighting and look dev artists could work on updates concurrently. A similar project in an offline renderer would have dailies with notes like “rotate the wheel 20 degrees to the right” or “drop the key light down a stop.” Instead of needing a full day turnaround for small technical changes, we could work out these changes interactively and focus our energy on creative decisions.

For the two shots using plate integrations, we used Shader Graph to create a screen-space projected material with the original footage at HD resolution. We used this shader on the ground and walls around the car so that we could have realistic reflections from the plate onto the car. We supplemented with additional lighting, rendered the shots out of Unity, and then finished the final ground integration with the full-res plate using external compositing software.
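The Shader Graph itself can’t be reproduced as text, but the underlying math is simple: each surface point samples the plate at its own screen-space position, so the footage and the geometry line up by construction. A minimal C# sketch of the equivalent computation (a Screen Position node does the same per pixel on the GPU):

```csharp
using UnityEngine;

// Minimal sketch of the screen-space projection behind the plate shader:
// a surface point samples the filmed plate at the viewport coordinate it
// projects to, so plate and geometry line up by construction.
public static class ScreenSpaceProjection
{
    // UV at which 'worldPos' should sample the plate texture.
    public static Vector2 PlateUV(Camera cam, Vector3 worldPos)
    {
        Vector3 vp = cam.WorldToViewportPoint(worldPos); // x, y in [0, 1]
        return new Vector2(vp.x, vp.y);
    }
}
```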

Real-time ray tracing

The usual technique used in game production to simulate reflections relies on a set of reflection probes placed at various locations, combined with screen-space ray tracing. This usually results in light leaks and a coarse approximation of surface properties. With real-time ray tracing, we can now correctly reflect what is offscreen without any setup from the artists. However, such a reflection effect requires some adjustments in the rendering engine: in traditional game production, everything outside the camera frustum tends to be disabled, but now it is possible to reflect objects illuminated and shadowed by sources that are never visible onscreen. In addition, accurately simulating metal objects requires multiple ray bounces, which isn’t affordable within our performance constraints. We chose to trace only one bounce and approximated the result of further bounces by multiplying the metal’s color by the current indirect diffuse lighting.
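In code terms, the fallback amounts to something like this sketch (illustrative pseudocode of the idea described above, not Unity’s shader implementation):

```csharp
using UnityEngine;

// Sketch of the fallback described above: when a reflection ray would need
// a further bounce (e.g. metal reflecting metal), we stop tracing and
// approximate the missing bounces by tinting the current indirect diffuse
// lighting with the metal's base color.
public static class ReflectionFallback
{
    public static Color ApproxFurtherBounces(Color metalBaseColor, Color indirectDiffuse)
    {
        return metalBaseColor * indirectDiffuse; // cheap stand-in for bounces 2..n
    }
}
```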

Traditional offline renderers are good at rendering large textured area lights. However, doing so is costly and produces a lot of noise (the more rays you use, the less noise, but the higher the rendering cost per frame). To achieve a real-time frame rate while upholding quality, our Unity Labs researchers developed an algorithm in conjunction with Lucasfilm and NVIDIA (see the paper they produced, Combining Analytic Direct Illumination and Stochastic Shadows). With this approach, the visibility (area shadow) can be evaluated separately from the direct lighting while the visual result remains intact. Coupled with a denoising technique applied separately to these two components, we were able to launch very few rays (just four in the real-time demo) for large textured area lights and still achieve our 30 fps target.
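Conceptually, the split factors the area-light response into a noise-free analytic term and a separately denoised visibility term that are multiplied back together. A minimal sketch of that recombination (illustrative; the real work happens in GPU passes):

```csharp
using UnityEngine;

// Sketch of the split from "Combining Analytic Direct Illumination and
// Stochastic Shadows": the unshadowed area-light response is evaluated
// analytically (noise-free), visibility is ray traced with very few rays
// and denoised on its own, and the two terms are recombined.
public static class StochasticAreaShadows
{
    public static Color Shade(
        Color analyticUnshadowed, // noise-free analytic area-light lighting
        float denoisedShadow)     // denoised visibility term in [0, 1]
    {
        return analyticUnshadowed * denoisedShadow;
    }
}
```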

Indirect diffuse lighting, or diffuse light bouncing, enhances the lighting of a scene by grounding objects and reacting to changing lighting conditions. The usual game-production workflow is painful, relying on Light Probes placed throughout the scene or on lightmaps. For the movie, we used a brute-force one-bounce indirect diffuse approach with ray tracing: several rays are launched, giving us the desired light-bleeding effect.

Such an approach gives artists immense freedom, without them having to set up anything. However, it is costly.
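A brute-force one-bounce gather looks roughly like this (an illustrative CPU-style sketch; in practice this runs in HDRP’s GPU ray tracing passes, and the trace delegate is a hypothetical stand-in for a ray query):

```csharp
using UnityEngine;

// Illustrative sketch of brute-force one-bounce indirect diffuse: launch
// several rays over the hemisphere around the normal, shade what they hit,
// and average. With cosine-weighted directions, the average estimates the
// diffuse gather (the surface albedo is applied afterwards).
public static class IndirectDiffuseSketch
{
    // Hypothetical stand-in for a GPU ray query that returns hit radiance.
    public delegate Color TraceAndShade(Vector3 origin, Vector3 dir);

    public static Color OneBounce(
        Vector3 position, Vector3 normal, int rayCount, TraceAndShade trace)
    {
        Color sum = Color.black;
        for (int i = 0; i < rayCount; i++)
        {
            // Cosine-weighted hemisphere direction: offset a uniform sphere
            // sample by the normal, then normalize.
            Vector3 dir = (normal + Random.onUnitSphere).normalized;
            sum += trace(position, dir);
        }
        return sum / rayCount;
    }
}
```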

For the real-time version, we selected a cheaper approach: with ray tracing, we were able to dynamically rebake a set of light probes every frame, where traditionally we would have used pre-baked light probes or baked lightmaps.
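A minimal sketch of rebaking a probe each frame (simplified to a single ambient color; real light probes store spherical harmonics, and the trace delegate is again a hypothetical stand-in):

```csharp
using UnityEngine;

// Illustrative sketch of relighting a probe every frame: trace a handful
// of rays in all directions and average the result. Simplified to a single
// ambient color; real light probes store spherical harmonics.
public class DynamicProbeSketch
{
    public delegate Color TraceAndShade(Vector3 origin, Vector3 dir);

    public Vector3 position;
    public Color ambient; // order-0 stand-in for the probe's SH data

    public void RebakeThisFrame(int rayCount, TraceAndShade trace)
    {
        Color sum = Color.black;
        for (int i = 0; i < rayCount; i++)
            sum += trace(position, Random.onUnitSphere); // sample the full sphere
        ambient = sum / rayCount;
    }
}
```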

Just as ray-traced reflections can replace the screen-space technique, real-time ray tracing can generate ambient occlusion comparable to that produced by the widely used screen-space technique. The resource-friendly indirect diffuse method mentioned above can be enhanced with ray-traced ambient occlusion to better handle the light leaks it can introduce. For performance reasons, we chose not to support transparent objects, which would have required handling the transmission of light through them.
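Ray-traced ambient occlusion itself reduces to a simple hemisphere visibility test, sketched below (illustrative; note that offscreen occluders count, which is exactly what the screen-space technique misses):

```csharp
using UnityEngine;

// Illustrative sketch of ray-traced ambient occlusion: cast short rays over
// the hemisphere above the surface and darken by the fraction that hit
// nearby geometry. Unlike screen-space AO, offscreen occluders count.
public static class RayTracedAOSketch
{
    // Hypothetical stand-in for a GPU occlusion ray query.
    public delegate bool Occluded(Vector3 origin, Vector3 dir, float maxDistance);

    public static float Occlusion(
        Vector3 position, Vector3 normal, int rayCount, float radius, Occluded trace)
    {
        int hits = 0;
        for (int i = 0; i < rayCount; i++)
        {
            Vector3 dir = (normal + Random.onUnitSphere).normalized;
            if (trace(position, dir, radius)) hits++;
        }
        return 1f - (float)hits / rayCount; // 1 = fully open, 0 = fully occluded
    }
}
```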

Real-time ray tracing is the only technique able to render photorealistic headlights in real time. The shape of a headlight and its multiple lens and reflector optics result in complex light interactions that are challenging to simulate. We added support for multiple successive smooth reflective and transmissive rays, which allows the light beam to bend and shift as it does in the real world. The fine details can be controlled with detail textures that influence the ray direction.
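The transmissive step of that chain is standard Snell refraction, sketched here (Unity’s Vector3 provides Reflect but no Refract, so this helper is an illustrative assumption, not an engine API):

```csharp
using UnityEngine;

// Sketch of the transmissive step in the headlight light path. A detail
// texture would perturb the normal 'n' before this call to model the fine
// optic patterns mentioned above.
public static class HeadlightOptics
{
    // i: incident direction, n: surface normal, eta: n1/n2.
    // Returns false on total internal reflection.
    public static bool Refract(Vector3 i, Vector3 n, float eta, out Vector3 t)
    {
        float cosI = -Vector3.Dot(n, i);
        float k = 1f - eta * eta * (1f - cosI * cosI);
        if (k < 0f) { t = Vector3.zero; return false; } // total internal reflection
        t = eta * i + (eta * cosI - Mathf.Sqrt(k)) * n;
        return true;
    }
}
```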

A new reality

invent yourself and then reinvent yourself,
don’t swim in the same slough.
invent yourself and then reinvent yourself
and
stay out of the clutches of mediocrity.

– Charles Bukowski

Unity’s real-time ray tracing is a new reality. We aren’t trying to rebuild traditional production pipelines from the ground up, but we are removing some of the pain points typically associated with a project like this. Having the power to change shots interactively and get immediate feedback from the creative director and director of photography is invaluable. Because we built this film in Unity, we could potentially migrate this work to other projects with ease and create a diverse yet cohesive campaign across multiple mediums. Real-time ray tracing lets us refine the traditional automotive advertising production pipeline to work in a more creative, collaborative, and affordable way.

---

You can explore NVIDIA RTX and Unity today with this experimental release. Please note that this is a prototype and the final implementation of DXR will be different from this version.
