
Achieve beautiful, scalable, and performant graphics with the Universal Render Pipeline

February 10, 2020

Universal Render Pipeline is a powerful, ready-to-use solution with a full suite of artist tools for content creation. It's the rendering pipeline to use if you want to make a game that has full Unity platform reach with best-in-class visual quality and performance. We covered the benefits of the Universal Render Pipeline in a previous blog post. Here, we'll dive into how the Universal Render Pipeline was used to create the Boat Attack vertical slice demo.

We first created the Boat Attack demo to help us validate and test the Universal Render Pipeline (which, at the time, was known as the Lightweight Render Pipeline). Producing a vertical slice as part of our development process was also an exercise in using real-world production processes in our feature development.

We have upgraded the Boat Attack demo considerably since we first created it. It now uses many of the Universal Render Pipeline’s new graphical features, along with recent Unity features such as the C# Job System, Burst compiler, Shader Graph, Input System, and more.

You can download the Boat Attack demo now, and start using it today with Unity 2019.3.

The Demo

The Boat Attack demo is a small vertical slice of a boat-racing game. It is playable and we are continually adjusting it to take full advantage of the latest Unity features.

The demo is designed to work well on a wide variety of platforms: mid- to high-range mobile devices, all current consoles and standalone apps. We demonstrated Boat Attack live at Unite Copenhagen 2019 on a range of devices, from the iPhone 7 to the PlayStation 4.

To use the demo, we suggest you install the latest version of Unity 2019.3, and then grab the project from GitHub (make sure to read the readme for usage instructions).

Shader Graph

Shader Graph is an artist-friendly interface for creating shaders. It’s a powerful prototyping tool for technical artists. We used Shader Graph to create some of the unique shading effects in the Boat Attack demo.

Using Shader Graph allowed us to create great shading effects, and then painlessly maintain them across many versions of the Lightweight Render Pipeline and the Universal Render Pipeline.


The cliff shader in Boat Attack demonstrates the effects you can achieve using mesh data – it’s easy to get data from a mesh in Shader Graph. We use the normal vector of the mesh to draw grass on the parts of the cliff face that are flat and facing upwards, and we use the world space height of the mesh to ensure that cliffs and rocks close to the water level will not have grass.

From left to right: Y height mask, Y normal mask, height + normal mask remapped, final shader.

Vegetation shading


The vegetation in Boat Attack was initially a custom vertex/fragment shader, but this was painful to maintain when the render pipeline was in early development and code was changing frequently. Recreating the shader in Shader Graph let us take advantage of Shader Graph’s easy upgradeability.

This Shader Graph effect is based on an implementation from Tiago Sousa of Crytek, which makes great use of vertex colors to control wind animation via vertex displacement. In Boat Attack, we created a Sub-graph to house all the nodes needed to calculate the wind effect. That Sub-graph contains nested Sub-graphs: a collection of utility graphs that perform repeated math calculations.


Individual vertex animations and their masks. From left to right: main bending from distance to origin, leaf edge from vertex color Red channel, and branches from vertex color Blue using vertex color Green channel for phase offset.

Another big part of creating believable vegetation is subsurface scattering (SSS), which is not currently available in the Universal Render Pipeline. However, you can use Shader Graph's custom function node to retrieve lighting information from the Universal Render Pipeline and build your own SSS-like effect.

Node layout. The SSS Mask is made from the vertex color Green (leaf phase) and the albedo texture map.

The custom function node gives you a lot of creative freedom. You can read up on custom rendering techniques here, or simply grab the code for the node in the Boat Attack repository to try out your own custom lighting ideas.

From left to right: without SSS, SSS only, final shader.

Boat customization


The boats needed multiple color variations. In Substance Painter, we painted two livery masks and stored them in a packed texture containing Metallic (red), Smoothness (green), Livery 1 (blue), and Livery 2 (alpha). Sampling these masks in Shader Graph lets us selectively apply coloring to the masked areas.

An overview of how the boats are colored. Using overlay blending allows subtle coloring to come through the base albedo map.

The node layout in Shader Graph, wrapped into a Sub-graph for easy use in the parent RaceBoats graph.



Boat Attack covers a full day/night cycle. To enhance the illusion, we created a Shader Graph for the windows of the buildings throughout the level; it lights up the windows at dusk and switches them off at dawn.

We achieved this using a simple emission texture that was mapped to a day/night value. We added an effect to slightly randomize the order, using the objects’ positions, so that the houses would light up at different times.

The node map that enables random emissions.



Now that Boat Attack has changing lighting, a simple high-dynamic-range image (HDRI) skybox is no longer sufficient: the clouds should be dynamically lit by the lighting in the Scene.

But rendering big puffy clouds in real time is demanding, especially when the demo also needs to run on mobile hardware. Because we don't need to see the clouds from many angles, we decided to render them as textured cards to save on performance.

The whole current graph for rendering the clouds.

Shader Graph was crucial in prototyping the look. We baked out some volumetric cloud data from Houdini, and created fully custom lighting in Shader Graph. These clouds are still a work in progress, but they prove that a wide range of surfaces can be created with the node-based editor.

Rendering from API for seamless Planar Reflections

Unity’s goal with Scriptable Render Pipelines was to allow users to customize rendering code, instead of hiding it in a black box. Rather than simply opening up our existing rendering code, we pushed our rendering tech with new APIs and hardware in mind. 

The Universal Render Pipeline lets you extend its out-of-the-box rendering capabilities with your own C# code. It exposes four hooks:

  • RenderPipelineManager.beginFrameRendering
  • RenderPipelineManager.beginCameraRendering
  • RenderPipelineManager.endCameraRendering
  • RenderPipelineManager.endFrameRendering

These hooks let you easily run your own code before rendering the Scene or before rendering certain Cameras. In Boat Attack, we used these hooks to implement Planar Reflections by rendering the Scene into a texture before the main frame is rendered.

Here we can see the entry point in the Planar Reflection script. Subscribing to the [beginCameraRendering] event lets us call a custom method every time the Universal Render Pipeline is about to render a camera; because this is a callback we subscribe to, we also unsubscribe from it in OnDisable. The method we call is our ExecutePlanarReflections method:
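A minimal sketch of that entry point, with the class simplified for illustration (the full PlanarReflections script in the Boat Attack repository handles considerably more setup):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[ExecuteAlways]
public class PlanarReflections : MonoBehaviour
{
    private void OnEnable()
    {
        // Run our reflection rendering before each camera renders.
        RenderPipelineManager.beginCameraRendering += ExecutePlanarReflections;
    }

    private void OnDisable()
    {
        // Always unsubscribe to avoid dangling callbacks.
        RenderPipelineManager.beginCameraRendering -= ExecutePlanarReflections;
    }

    private void ExecutePlanarReflections(ScriptableRenderContext context, Camera camera)
    {
        // Render the mirrored Scene into an offscreen texture here.
    }
}
```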

Because we are using the [beginCameraRendering] callback, our method must take a [ScriptableRenderContext] and a [Camera] as its parameters. This data is piped through with the callback, and it will let us know which Camera is about to render.

For the most part, the code here is the same code as you would normally use to implement planar reflections: you are dealing with cameras and matrices. The only difference is that Universal Render Pipeline provides a new API for rendering a camera. 

The full method for implementing planar reflections is as follows:
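A condensed sketch of that method; the helper names (UpdateReflectionCamera, m_ReflectionCamera, m_ReflectionTexture, _PlanarReflectionTexture) are illustrative stand-ins for the full implementation in the repository:

```csharp
private void ExecutePlanarReflections(ScriptableRenderContext context, Camera camera)
{
    // Skip reflection and preview cameras to avoid recursive rendering.
    if (camera.cameraType == CameraType.Reflection ||
        camera.cameraType == CameraType.Preview)
        return;

    // Mirror the current camera's position/rotation about the water plane.
    UpdateReflectionCamera(camera);

    // Render into an offscreen texture rather than the screen.
    m_ReflectionCamera.targetTexture = m_ReflectionTexture;
    UniversalRenderPipeline.RenderSingleCamera(context, m_ReflectionCamera);

    // Make the result available to the water shader later in the frame.
    Shader.SetGlobalTexture("_PlanarReflectionTexture", m_ReflectionTexture);
}
```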

Here we use the new [UniversalRenderPipeline.RenderSingleCamera()] method to render a given camera. In this case, the camera is our Planar Reflection Camera.

Since this camera renders to a texture (which we set using [Camera.targetTexture]), we now get a RenderTexture we can use in our water shading later in the rendering. Check out the whole PlanarReflection script on the GitHub page.

Planar reflection composition. From left to right: raw planar reflection camera output, fresnel darkening and normal offsetting, final water shader, water shader without planar reflections.

These callbacks are used here to invoke some rendering, but they can be used for many other things. For example, we also use them to disable shadows on the Planar Reflection Camera and to choose which Renderer a camera uses. Rather than hard-coding behavior in the Scene or a Prefab, driving it through the API lets you handle more complexity with greater control.

Injecting Custom Render Passes for specialized effects

In the Universal Render Pipeline, rendering is based on ScriptableRenderPasses: instruction sets describing what to render and how to render it. Many ScriptableRenderPasses are queued together to form what we call a ScriptableRenderer.

Another part of the Universal Render Pipeline is ScriptableRendererFeatures. These are essentially containers for custom ScriptableRenderPasses, and they can hold any number of passes along with any data those passes need.

Out of the box, the Universal Render Pipeline ships with two ScriptableRenderers: the ForwardRenderer and the 2DRenderer. The ForwardRenderer supports injecting ScriptableRendererFeatures.

To make it easier to create ScriptableRendererFeatures, we added the ability to start from a template file, much like you do for C# MonoBehaviour scripts. Simply right-click in the Project view and choose [Create/Rendering/Universal Pipeline/Renderer Feature] to generate a template that helps you get started. Once created, you can add your ScriptableRendererFeature to the Renderer Features list on a ForwardRendererData asset.
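As a sketch, a renderer feature built from that template has roughly this shape (class and pass names here are placeholders):

```csharp
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class MyRendererFeature : ScriptableRendererFeature
{
    class MyRenderPass : ScriptableRenderPass
    {
        public override void Execute(ScriptableRenderContext context,
            ref RenderingData renderingData)
        {
            // Issue draw calls via CommandBuffers or the context here.
        }
    }

    MyRenderPass m_Pass;

    public override void Create()
    {
        m_Pass = new MyRenderPass();
        // Choose where in the frame the pass runs.
        m_Pass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
        ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_Pass);
    }
}
```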

In the Boat Attack demo, we used ScriptableRendererFeatures to add two extra rendering passes for the water rendering: one for caustics and one called WaterEffects.



The Caustics ScriptableRendererFeature adds a pass that renders a custom caustics shader over the scene between the opaque and transparent passes. Rather than a full-screen pass, we render a large quad aligned with the water surface, which avoids shading pixels that end up in the sky. The quad follows the camera but snaps to the water height, and the shader blends additively over the results of the opaque pass.

Caustic Render Pass compositing. From left to right: depth texture, world space position reconstruction from depth, caustics texture mapped with world space position, and final blending with Opaque pass.

Using [CommandBuffer.DrawMesh], you can draw the quad, supply a matrix to position the mesh (based on water and camera coordinates), and set up the caustics material. The code looks like this:
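A sketch of such a pass; the quad mesh, material, and water level would be supplied by the ScriptableRendererFeature, and the names here are illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

class WaterCausticsPass : ScriptableRenderPass
{
    Mesh m_QuadMesh;
    Material m_CausticsMaterial;
    float m_WaterLevel;

    public WaterCausticsPass(Mesh quad, Material material, float waterLevel)
    {
        m_QuadMesh = quad;
        m_CausticsMaterial = material;
        m_WaterLevel = waterLevel;
        // Run between the opaque and transparent passes.
        renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    }

    public override void Execute(ScriptableRenderContext context,
        ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get("WaterCaustics");

        // Position the large quad at the water height, following the camera on X/Z.
        Vector3 pos = renderingData.cameraData.camera.transform.position;
        pos.y = m_WaterLevel;
        Matrix4x4 matrix = Matrix4x4.TRS(pos, Quaternion.identity, Vector3.one);

        // The caustics material blends additively over the opaque result.
        cmd.DrawMesh(m_QuadMesh, matrix, m_CausticsMaterial);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}
```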

Water effects


Split view of the WaterFXPass in action. Left, the final render; right, a debug view showing only the result of the pass on the water.

The WaterFXPass is a bit more complex. The goal for this effect was to let objects affect the water, for example by making waves and foam. To achieve this, we render certain objects into an offscreen RenderTexture, using a custom shader that writes different information into each channel of the texture: a foam mask in the red channel, normal offsets X and Z in the green and blue channels, and water displacement in the alpha channel.

WaterFXPass compositing. From left to right: final output, the green and blue channels used to create world space normals, the red channel used for a foam mask, and the alpha channel used for creating water displacement (red positive, black no change, blue negative).

First, we need a texture to render into, which we create at half resolution. Next, we create a filter for any transparent objects that have a shader pass called WaterFX. After this, we use [ScriptableRenderContext.DrawRenderers] to render those objects into the texture. The final code looks like this:
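A sketch of how those three steps fit into a ScriptableRenderPass (texture name, clear color, and other identifiers are illustrative, not the exact values from the repository):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

class WaterFXPass : ScriptableRenderPass
{
    // Only shaders with a pass tagged "WaterFX" take part in this pass.
    static readonly ShaderTagId k_ShaderTag = new ShaderTagId("WaterFX");
    RenderTargetHandle m_WaterFX;
    FilteringSettings m_Filtering;

    public WaterFXPass()
    {
        m_WaterFX.Init("_WaterFXMap");
        // Filter down to transparent-queue objects.
        m_Filtering = new FilteringSettings(RenderQueueRange.transparent);
    }

    public override void Configure(CommandBuffer cmd,
        RenderTextureDescriptor cameraTextureDescriptor)
    {
        // Half resolution is enough for soft foam/displacement data.
        RenderTextureDescriptor desc = cameraTextureDescriptor;
        desc.width /= 2;
        desc.height /= 2;
        cmd.GetTemporaryRT(m_WaterFX.id, desc);
        ConfigureTarget(m_WaterFX.Identifier());
        // Neutral clear: no foam, no normal offset, mid-point displacement.
        ConfigureClear(ClearFlag.Color, new Color(0f, 0f, 0f, 0.5f));
    }

    public override void Execute(ScriptableRenderContext context,
        ref RenderingData renderingData)
    {
        DrawingSettings drawing = CreateDrawingSettings(k_ShaderTag,
            ref renderingData, SortingCriteria.CommonTransparent);
        context.DrawRenderers(renderingData.cullResults, ref drawing, ref m_Filtering);
    }
}
```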

Both of these ScriptableRenderPasses live in a single ScriptableRendererFeature. The feature's [Create()] function sets up resources and passes along settings from the UI. Since the two passes are always used together when rendering water, a single feature can add both to the ForwardRendererData. You can see the full code on GitHub.

Future plans

We will continue to update this project throughout the Unity 2019 cycle, including 2019.4 LTS. From Unity 2020.1 onward, we intend to maintain the project so that it keeps running, but we will not add any new features.

Some of the planned improvements include:

  • Finish day/night cycle (this requires more features to be integrated into Universal Render Pipeline to reduce the need for customization)
  • Refine Water UX/UI 
  • Implement Imposters
  • Continue code cleanup and performance tweaks

Useful links

Boat Attack GitHub repository 

Full 2019.3 project link (if you don’t want to use GitHub)

Universal Render Pipeline manual

Universal Render Pipeline and High Definition Render Pipeline

The Universal Render Pipeline does not replace or encompass the High Definition Render Pipeline (HDRP).

The Universal Render Pipeline aims to be the future default render pipeline for Unity. Develop once, deploy everywhere. It is more flexible and extensible, it delivers higher performance than the built-in render pipeline, and it is scalable across platforms. It also has fantastic graphics quality. 

HDRP delivers state-of-the-art graphics on high-end platforms. HDRP is the best choice if your goal is more targeted: pushing graphics on high-end hardware and delivering powerful, high-fidelity visuals with strong performance.

You should choose which render pipeline to use based on the feature and platform requirements of your project.

Start using the Universal Render Pipeline

You can start taking advantage of all the production-ready features and performance benefits today. Upgrade your projects using the upgrade tooling, or start a new project using our Universal Project Template from the Unity Hub.

Please send us feedback in the Universal Render Pipeline forum!

31 replies on “Achieve beautiful, scalable, and performant graphics with the Universal Render Pipeline”

This is a nice post. First thing I tried was to see how well the water system works in VR. Looking forward to when you have it working — It looks promising.

Very good progress, this URP.
I’m missing:
— realtime GI
— volumetric fog (from HDRP. I know it’s expensive)
— lens flares…?

Also I couldn’t get any great performance with URP on HoloLens2 for some reason (<50fps in release mode for a few cubes on low quality).

Frankly the biggest hold back for me is waiting on various assets to start supporting URP, most don’t want to touch it because they can’t trust it isn’t going to change, or bugs will hinder progress causing wasted time etc….

It’s a work in progress.


This is nice and all, but why not integrate things like Planar Reflection and Water into the Render Pipeline itself, or at least have them as an optional “extras” package?

Great post, but I would argue that URP isn’t really ready to use unless you don’t really care about control or quality.

Until breaking changes and bugs stop happening, until it performs well on XR, and until custom post-processing and camera stacking are supported in a way that makes sense and doesn’t make custom FX really difficult to do, it’s not really a ready-to-use solution for anyone serious about making games.

Loving the work so far, but you guys need to stop constantly presenting your work as ready. It’s not ready; it’s very much a work in progress, as evidenced by pretty much everything anybody working on URP has ever said publicly.

The Demo Island scene always crashes for me in 2019.3.0f6 and 2020.1.0a22, both with a fresh git clone and with the zip file above. I tried that on multiple different machines in different locations. It also throws hundreds of exceptions on project launch (before I even try to open that scene).

(Case 1218663) [Editor Crash][Boat Attack Demo] Editor crashes on selecting “open demo_island”
(Case 1218678) [Editor][Boat Attack Demo] Errors in the console on clean project
(Case 1218693) [Editor] Boat Demo crashes on start (from a different machine)

My main complaint is that URP is not getting great support for XR. The Boat Attack demo mostly doesn’t render at all in XR if you pop a VR camera into the scene.

I have spent ages reporting bugs with URP for Oculus Quest and Go which are platforms that deserve to make the most of these performance improvements also!

Thanks for the post, it’s an interesting read from start to end!

I always wondered how caustics can be implemented in an efficient way. After reading your post, I finally know. It’s so simple! :)

Not any time soon; the built-in render pipeline will stay around for many years to come as far as I’m aware. I’m pretty sure they have already talked about this in a couple of previous posts.
