
Achieve beautiful, scalable, and performant graphics with the Universal Render Pipeline

February 10, 2020 in Engine & platform | 15 min. read

Universal Render Pipeline is a powerful, ready-to-use solution with a full suite of artist tools for content creation. You should use this rendering pipeline if you want to make a game that has full Unity platform reach with best-in-class visual quality and performance. We covered the benefits of the Universal Render Pipeline in a previous blog post. In this post, we'll dive into how the Universal Render Pipeline was used to create the vertical slice Boat Attack demo.

We first created the Boat Attack demo to help us validate and test the Universal Render Pipeline (which, at the time, was known as the Lightweight Render Pipeline). Producing a vertical slice as part of our development process was also an exercise in using real-world production processes in our feature development.

We have upgraded the Boat Attack demo considerably since we first created it. It now uses many of the Universal Render Pipeline's new graphical features, along with recent Unity features such as the C# Job System, Burst compiler, Shader Graph, Input System, and more. You can download the Boat Attack demo now, and start using it today with Unity 2019.3.

The Demo

The Boat Attack demo is a small vertical slice of a boat-racing game. It is playable and we are continually adjusting it to take full advantage of the latest Unity features.

The demo is designed to work well on a wide variety of platforms: mid- to high-range mobile devices, all current consoles and standalone apps. We demonstrated Boat Attack live at Unite Copenhagen 2019 on a range of devices, from the iPhone 7 to the PlayStation 4.

To use the demo, we suggest you install the latest version of Unity 2019.3, and then grab the project from GitHub (make sure to read the readme for usage instructions).

Shader Graph

Shader Graph is an artist-friendly interface for creating shaders. It’s a powerful prototyping tool for technical artists. We used Shader Graph to create some of the unique shading effects in the Boat Attack demo.

Using Shader Graph allowed us to create great shading effects, and then painlessly maintain them across many versions of the Lightweight Render Pipeline and the Universal Render Pipeline.

The cliff shader in Boat Attack demonstrates the effects you can achieve using mesh data – it’s easy to get data from a mesh in Shader Graph. We use the normal vector of the mesh to draw grass on the parts of the cliff face that are flat and facing upwards, and we use the world space height of the mesh to ensure that cliffs and rocks close to the water level will not have grass.
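The demo builds this from Shader Graph nodes, but the mask logic can be sketched as plain C#. The helper name and the threshold values below are assumptions for illustration, not the demo's actual numbers:

```csharp
using UnityEngine;

public static class CliffMaskSketch
{
    // Returns 0..1 grass coverage for a point on the cliff mesh.
    // normalY: world space normal Y component; worldY: world space height.
    public static float GrassMask(float normalY, float worldY, float waterLevel)
    {
        // Grass only where the surface is flat and facing upwards...
        float slopeMask = Mathf.InverseLerp(0.5f, 0.9f, normalY);
        // ...and only above a band near the water level.
        float heightMask = Mathf.InverseLerp(waterLevel, waterLevel + 2f, worldY);
        // Multiplying combines the two remapped masks, as in the
        // "height + normal mask remapped" step shown below.
        return slopeMask * heightMask;
    }
}
```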

From left to right: Y height mask, Y normal mask, height + normal mask remapped, final shader.

Vegetation shading

The vegetation in Boat Attack was initially a custom vertex/fragment shader, but this was painful to maintain when the render pipeline was in early development and code was changing frequently. Recreating the shader in Shader Graph let us take advantage of Shader Graph’s easy upgradeability.

This Shader Graph effect is based on an implementation from Tiago Sousa of Crytek, which makes great use of vertex colors to control wind animation via vertex displacement. In Boat Attack, we created a Sub-graph to house all the required nodes needed for calculating the wind effect. The Sub-graph contains nested Sub-graphs, which are a collection of utility graphs that perform repeating math calculations.
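A rough sketch of the per-vertex wind displacement, expressed as C# for readability (the real effect lives in Shader Graph; all frequencies and amplitudes here are made-up illustration values):

```csharp
using UnityEngine;

public static class WindSketch
{
    // One vertex's wind offset. Channel weights come from painted vertex
    // colors: r = leaf edge, b = branch, g = per-element phase offset.
    public static Vector3 WindOffset(Vector3 worldPos, Color vertexColor, float time)
    {
        float phase = vertexColor.g * Mathf.PI * 2f;     // phase offset from green channel
        float mainBend = worldPos.magnitude * 0.02f;     // main bending grows with distance to origin
        float leafEdge = vertexColor.r * Mathf.Sin(time * 4f + phase) * 0.05f;
        float branch = vertexColor.b * Mathf.Sin(time * 2f + phase) * 0.1f;
        float sway = Mathf.Sin(time) * mainBend;
        return new Vector3(sway + leafEdge, branch, sway);
    }
}
```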

Individual vertex animations and their masks. From left to right: main bending from distance to origin, leaf edge from vertex color Red channel, and branches from vertex color Blue using vertex color Green channel for phase offset.

Another big part of creating believable vegetation is subsurface scattering (SSS), which is not currently available out of the box in the Universal Render Pipeline. However, you can use Shader Graph's Custom Function node to retrieve lighting information from the Universal Render Pipeline and build your own SSS-like effect.

Node layout. The SSS Mask is made from the vertex color Green (leaf phase) and the albedo texture map.

The custom function node gives you a lot of creative freedom. You can read up on custom rendering techniques here, or simply grab the code for the node in the Boat Attack repository to try out your own custom lighting ideas.

From left to right: without SSS, SSS only, final shader.

Boat customization

The boats needed multiple color variations. We painted two livery masks in Substance Painter and stored them in a packed texture containing Metallic (red), Smoothness (green), Livery 1 (blue), and Livery 2 (alpha). Using these masks in Shader Graph, we can selectively apply coloring to the masked areas.
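The overlay blending mentioned below is a standard per-channel formula; a sketch in C# (helper names are hypothetical, but the blend math is the common overlay definition):

```csharp
using UnityEngine;

public static class LiverySketch
{
    // Standard overlay blend for one channel: darkens darks and brightens
    // brights, which is why subtle detail in the base albedo shows through.
    static float Overlay(float baseC, float blendC)
    {
        return baseC < 0.5f
            ? 2f * baseC * blendC
            : 1f - 2f * (1f - baseC) * (1f - blendC);
    }

    // Apply a livery color only where its mask channel is set.
    public static Color ApplyLivery(Color albedo, Color livery, float mask)
    {
        var tinted = new Color(
            Overlay(albedo.r, livery.r),
            Overlay(albedo.g, livery.g),
            Overlay(albedo.b, livery.b),
            albedo.a);
        return Color.Lerp(albedo, tinted, mask);
    }
}
```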

An overview of how the boats are colored. Using overlay blending allows subtle coloring to come through the base albedo map.
The node layout in Shader Graph, wrapped into a Sub-graph for easy use in the parent RaceBoats graph.

Houses

Boat Attack covers a full day/night cycle. To enhance this illusion, we created a Shader Graph for the windows of the buildings throughout the level. The Shader Graph lights up the windows at dusk and switches them off at dawn.

We achieved this using a simple emission texture that was mapped to a day/night value. We added an effect to slightly randomize the order, using the objects’ positions, so that the houses would light up at different times.
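The idea can be sketched like this; the hash constants and the day/night convention are assumptions for illustration, not the demo's actual graph:

```csharp
using UnityEngine;

public static class WindowEmissionSketch
{
    // dayNight: 0 at midday, 1 at midnight (hypothetical convention).
    // Each house derives a small threshold offset from its position, so
    // windows switch on at slightly different times.
    public static float EmissionStrength(Vector3 objectPosition, float dayNight)
    {
        // Cheap positional hash in 0..1, in the style of common shader hashes.
        float hash = Mathf.Abs(Mathf.Sin(Vector3.Dot(objectPosition,
            new Vector3(12.9898f, 78.233f, 37.719f))) * 43758.5453f % 1f);
        float threshold = 0.5f + (hash - 0.5f) * 0.2f; // dusk, randomized per object
        return dayNight > threshold ? 1f : 0f;
    }
}
```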

The node map that enables random emissions.

Clouds

Now that we have added changing lighting to Boat Attack, a static high-dynamic-range image (HDRI) skybox is no longer sufficient. The clouds should be lit dynamically by the lighting in the Scene.

But rendering big puffy clouds in real-time is demanding, especially with the need to run on mobile hardware. Because we don’t need to see the clouds from many angles, we decided to use cards with textures to save on performance.

The whole current graph for rendering the clouds.

Shader Graph was crucial in prototyping the look. We baked out some volumetric cloud data from Houdini, and created fully custom lighting in Shader Graph. These clouds are still a work in progress, but they prove that a wide range of surfaces can be created with the node-based editor.

Rendering from API for seamless Planar Reflections

Unity’s goal with Scriptable Render Pipelines was to allow users to customize rendering code, instead of hiding it in a black box. Rather than simply opening up our existing rendering code, we pushed our rendering tech with new APIs and hardware in mind. 

The Universal Render Pipeline lets you extend its out-of-the-box rendering capabilities with your own C# code. It exposes four hooks:

  • RenderPipelineManager.beginFrameRendering
  • RenderPipelineManager.beginCameraRendering
  • RenderPipelineManager.endCameraRendering
  • RenderPipelineManager.endFrameRendering

These hooks let you easily run your own code before rendering the Scene or before rendering certain Cameras. In Boat Attack, we used these hooks to implement Planar Reflections by rendering the Scene into a texture before the main frame is rendered.

private void OnEnable()
{
   RenderPipelineManager.beginCameraRendering += ExecutePlanarReflections;
}

Because this is a callback we subscribe to, we also unsubscribe from it in OnDisable.
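The matching unsubscription looks like this:

```csharp
private void OnDisable()
{
    RenderPipelineManager.beginCameraRendering -= ExecutePlanarReflections;
}
```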

Here we can see the entry point in the Planar Reflection script. This code lets us call a custom method every time Universal Render Pipeline goes to render a camera. The method we call is our ExecutePlanarReflections method:

public void ExecutePlanarReflections(ScriptableRenderContext context, Camera camera)
{
    //rendering code....
}

Because we are using the [beginCameraRendering] callback, our method must take a [ScriptableRenderContext] and a [Camera] as its parameters. This data is piped through with the callback, and it will let us know which Camera is about to render.

For the most part, the code here is the same code as you would normally use to implement planar reflections: you are dealing with cameras and matrices. The only difference is that Universal Render Pipeline provides a new API for rendering a camera. 
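For a horizontal water plane, the mirroring can be sketched with a simple matrix. This is a hypothetical simplification: the actual PlanarReflection script on GitHub handles the general case and also builds an oblique projection to clip geometry below the plane:

```csharp
using UnityEngine;

public static class PlanarReflectionSketch
{
    // Mirror world space across the horizontal plane y = height:
    // y' = -y + 2 * height, with x and z unchanged.
    public static Matrix4x4 ReflectY(float height)
    {
        var m = Matrix4x4.identity;
        m.m11 = -1f;          // flip y
        m.m13 = 2f * height;  // translate back above the plane
        return m;
    }
}

// usage (hypothetical):
// reflectionCamera.worldToCameraMatrix =
//     mainCamera.worldToCameraMatrix * PlanarReflectionSketch.ReflectY(waterHeight);
```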

The full method for implementing planar reflections is as follows:

private void ExecutePlanarReflections(ScriptableRenderContext context, Camera camera)
{
    // we don't want to render planar reflections in reflections or previews
   if (camera.cameraType == CameraType.Reflection || camera.cameraType == CameraType.Preview)
       return;

   UpdateReflectionCamera(camera); // create reflected camera
   PlanarReflectionTexture(camera); // create and assign RenderTexture

   var data = new PlanarReflectionSettingData(); // save quality settings and lower them for the planar reflections

   beginPlanarReflections?.Invoke(context, m_ReflectionCamera); // callback Action for PlanarReflection
   UniversalRenderPipeline.RenderSingleCamera(context, m_ReflectionCamera); // render planar reflections

   data.Restore(); // restore the quality settings
   Shader.SetGlobalTexture(planarReflectionTextureID, m_ReflectionTexture); // Assign texture to water shader
}

Here we use the new [UniversalRenderPipeline.RenderSingleCamera()] method to render a given camera. In this case, the camera is our Planar Reflection Camera.

Since this camera renders to a texture (which we set using [Camera.targetTexture]), we now get a RenderTexture we can use in our water shading later in the rendering. Check out the whole PlanarReflection script on the GitHub page.

Planar reflection composition. From left to right: raw planar reflection camera output, fresnel darkening and normal offsetting, final water shader, water shader without planar reflections.

These callbacks are used here to invoke some rendering, but they can be used for several things. For example, we also use them to disable shadows on the Planar Reflection Camera, or choose which Renderer to use for a camera. Rather than hard coding the behavior in the Scene or a Prefab, using an API allows you to handle more complexity with greater control.

Injecting Custom Render Passes for specialized effects

In the Universal Render Pipeline, rendering is based on ScriptableRenderPasses: instruction sets that define what to render and how to render it. Many ScriptableRenderPasses are queued together to create what we call a ScriptableRenderer.

Another part of the Universal Render Pipeline is the ScriptableRendererFeature. These are essentially data containers for custom ScriptableRenderPasses; each feature can contain any number of passes, along with any data those passes need.

Out of the box, there are two ScriptableRenderers: the ForwardRenderer and the 2DRenderer. The ForwardRenderer supports injecting ScriptableRendererFeatures.

To make it easier to create ScriptableRendererFeatures, we added the ability to start with a template file, much like we do for C# MonoBehaviour scripts. Simply right-click in the Project view and choose [Create/Rendering/Universal Pipeline/Renderer Feature]. This creates a template to help you get started. Once created, you can add your ScriptableRendererFeature to the Render Feature list on the ForwardRendererData asset.
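A minimal feature, similar in shape to what the template generates (class names here are placeholders), looks like this:

```csharp
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class MyRendererFeature : ScriptableRendererFeature
{
    class MyRenderPass : ScriptableRenderPass
    {
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            // custom rendering commands go here
        }
    }

    MyRenderPass m_Pass;

    // Called when the feature is created or its settings change.
    public override void Create()
    {
        m_Pass = new MyRenderPass();
        m_Pass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    }

    // Called once per camera; enqueue the pass into the renderer.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_Pass);
    }
}
```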

In the Boat Attack demo, we used ScriptableRendererFeatures to add two extra rendering passes for the water rendering: one for caustics and one called WaterEffects.

Caustics

The Caustics ScriptableRendererFeature adds a pass that renders a custom caustics shader over the scene between the Opaque and Transparent passes. Rather than a full-screen pass, it renders a large quad aligned with the water, which avoids shading pixels that are in the sky. The quad follows the camera but is snapped to the water height, and the shader is rendered additively over the output of the opaque pass.

Caustic Render Pass compositing. From left to right: depth texture, world space position reconstruction from depth, caustics texture mapped with world space position, and final blending with Opaque pass.

Using [CommandBuffer.DrawMesh], you can draw the quad, supply a matrix to position the mesh (based on water and camera coordinates), and set up the caustics material. The code looks like this:

public class WaterCausticsPass : ScriptableRenderPass
{
   const string k_RenderWaterCausticsTag = "Render Water Caustics";
   public Material m_WaterCausticMaterial;
   public Mesh m_mesh;

   public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
   {
       var cam = renderingData.cameraData.camera;
       if(cam.cameraType == CameraType.Preview) // Stop the pass rendering in the preview
           return;

       // Create the matrix to position the caustics mesh.
       Vector3 position = cam.transform.position;
       position.y = 0; // TODO should read a global 'water height' variable.
       Matrix4x4 matrix = Matrix4x4.TRS(position, Quaternion.identity, Vector3.one);

       // Setup the CommandBuffer and draw the mesh with the caustic material and matrix
       CommandBuffer cmd = CommandBufferPool.Get(k_RenderWaterCausticsTag);
        cmd.DrawMesh(m_mesh, matrix, m_WaterCausticMaterial, 0, 0);
       context.ExecuteCommandBuffer(cmd);
       CommandBufferPool.Release(cmd);
   }
}

Water effects

Split view of the WaterFXPass in action. Left, the final render; right, a debug view showing only the result of the pass on the water.

The WaterFXPass is a bit more complex. The goal for this effect was to have objects affect the water, such as making waves and foam. To achieve this, we render certain objects to an offscreen RenderTexture, using a custom shader that writes different information into each channel of the texture: a foam mask into the red channel, normal offsets X and Z into green and blue, and water displacement into the alpha channel.

WaterFXPass compositing. From left to right: final output, the green and blue channels used to create world space normals, the red channel used for a foam mask, and the alpha channel used for creating water displacement (red positive, black no change, blue negative).

First, we need a texture to render into, which we create at half resolution. Next, we create a filter for any transparent objects that have a shader pass called WaterFX. After this, we use [ScriptableRenderContext.DrawRenderers] to render those objects into the texture. The final code looks like this:

class WaterFXPass : ScriptableRenderPass
{
   const string k_RenderWaterFXTag = "Render Water FX";
   private readonly ShaderTagId m_WaterFXShaderTag = new ShaderTagId("WaterFX");
   private readonly Color m_ClearColor = new Color(0.0f, 0.5f, 0.5f, 0.5f); //r = foam mask, g = normal.x, b = normal.z, a = displacement
   private FilteringSettings m_FilteringSettings;
   RenderTargetHandle m_WaterFX = RenderTargetHandle.CameraTarget;

   public WaterFXPass()
   {
       m_WaterFX.Init("_WaterFXMap");
        // we only want to render transparent objects
       m_FilteringSettings = new FilteringSettings(RenderQueueRange.transparent);
   }

    // Override Configure since we want to render into a RenderTexture and control the clear
   public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
   {
       // no need for a depth buffer
       cameraTextureDescriptor.depthBufferBits = 0;
       // Half resolution
       cameraTextureDescriptor.width /= 2;
       cameraTextureDescriptor.height /= 2;
       // default format TODO research usefulness of HDR format
       cameraTextureDescriptor.colorFormat = RenderTextureFormat.Default;
       // get a temp RT for rendering into
       cmd.GetTemporaryRT(m_WaterFX.id, cameraTextureDescriptor, FilterMode.Bilinear);
       ConfigureTarget(m_WaterFX.Identifier());
       // clear the screen with a specific color for the packed data
       ConfigureClear(ClearFlag.Color, m_ClearColor);
   }

   public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
   {
       CommandBuffer cmd = CommandBufferPool.Get(k_RenderWaterFXTag);
        using (new ProfilingSample(cmd, k_RenderWaterFXTag)) // wraps the work in a named profiler sample
       {
           context.ExecuteCommandBuffer(cmd);
           cmd.Clear();

            // here we choose renderers based on the "WaterFX" shader pass and also sort back to front
           var drawSettings = CreateDrawingSettings(m_WaterFXShaderTag, ref renderingData,
               SortingCriteria.CommonTransparent);

            // draw all the renderers matching the rules we set up
           context.DrawRenderers(renderingData.cullResults, ref drawSettings, ref m_FilteringSettings);
       }
       context.ExecuteCommandBuffer(cmd);
       CommandBufferPool.Release(cmd);
   }

   public override void FrameCleanup(CommandBuffer cmd)
   {
        // the texture is only needed while this camera renders, so release the temporary RT afterwards
       cmd.ReleaseTemporaryRT(m_WaterFX.id);
   }
}

Both of these ScriptableRenderPasses live in a single ScriptableRendererFeature. This feature contains a [Create()] function that you can use to set up resources and also pass along settings from the UI. Since the two passes are always used together when rendering water, a single feature can add them both to the ForwardRendererData. You can see the full code on GitHub.
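A sketch of how such a feature might wire up both passes (the class name, pass events, and field names here are assumptions; see the repository for the real version):

```csharp
using UnityEngine.Rendering.Universal;

public class WaterEffectsFeature : ScriptableRendererFeature
{
    WaterFXPass m_WaterFXPass;
    WaterCausticsPass m_CausticsPass;

    public override void Create()
    {
        // WaterFX renders early so the water shader can sample its texture;
        // caustics render between the opaque and transparent passes.
        m_WaterFXPass = new WaterFXPass { renderPassEvent = RenderPassEvent.BeforeRenderingOpaques };
        m_CausticsPass = new WaterCausticsPass { renderPassEvent = RenderPassEvent.AfterRenderingOpaques };
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_WaterFXPass);
        renderer.EnqueuePass(m_CausticsPass);
    }
}
```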

Future plans

We will continue to update this project throughout the Unity 2019 cycle, including 2019.4 LTS. As of Unity 2020.1, we intend to maintain the project to make sure it runs, but we will not add any new features.

Some of the planned improvements include:

  • Finish day/night cycle (this requires more features to be integrated into Universal Render Pipeline to reduce the need for customization)
  • Refine Water UX/UI 
  • Implement Imposters
  • Continue code cleanup and performance tweaks

Useful links

Boat Attack GitHub repository 

Full 2019.3 project link (if you don’t want to use GitHub)

Universal Render Pipeline manual

Universal Render Pipeline and High Definition Render Pipeline

The Universal Render Pipeline does not replace or encompass the High Definition Render Pipeline (HDRP).

The Universal Render Pipeline aims to be the future default render pipeline for Unity. Develop once, deploy everywhere. It is more flexible and extensible, it delivers higher performance than the built-in render pipeline, and it is scalable across platforms. It also has fantastic graphics quality. 

HDRP delivers state-of-the-art graphics on high-end platforms. HDRP is the best choice if your goal is more targeted: pushing graphics on high-end hardware and delivering powerful, high-fidelity visuals with strong performance.

You should choose which render pipeline to use based on the feature and platform requirements of your project.

Start using the Universal Render Pipeline

You can start taking advantage of all the production-ready features and performance benefits today. Upgrade your projects using the upgrade tooling, or start a new project using our Universal Project Template from the Unity Hub.

Please send us feedback in the Universal Render Pipeline forum!
