
Normal map compositing using the surface gradient framework in Shader Graph

November 20, 2019

A recent Unity Labs paper introduces a new framework for blending normal maps that is easy and intuitive for both technical artists and graphics engineers. This approach overcomes several limitations of traditional methods.

Since the introduction of normal mapping in real-time computer graphics, combining or blending normal maps in a mathematically correct way to get visually pleasing results has been a difficult problem for even very experienced graphics engineers. Historically, people have often blended normals in world space, which produces incorrect and less than satisfactory results. The paper “Surface Gradient Based Bump Mapping Framework” introduces a new framework that addresses the limitations of traditional normal mapping. This new approach is easy and intuitive for both technical artists and graphics engineers, even when different forms of bump maps are combined.

In modern computer graphics, material layering is critical to achieve rich and complex environments. To do this, we need support for bump mapping across multiple sets of texture coordinates as well as blending between multiple bump maps. Additionally, we want to be able to adjust the bump scale on object space normal maps and be able to composite/blend these maps correctly with tangent space normal maps and with triplanar projection.

Traditionally, real-time graphics has supported bump mapping on only one set of texture coordinates. Bump mapping requires data for every vertex to be pre-calculated and stored (tangent space). Supporting additional sets of texture coordinates would require proportional amounts of extra storage per vertex. Further, doing so does not support procedural geometry/texture coordinates or advanced deformation of the geometry.

The High Definition Rendering Pipeline (HDRP) solves this problem by leveraging the surface gradient-based framework for bump mapping. In HDRP, traditional tangent space, per vertex, is used for the first set of texture coordinates to support strict compliance with MikkTSpace. This is required for difficult cases such as baked normal maps made for low-polygonal hard surface geometry.

For all subsequent sets of texture coordinates, we calculate tangent space on the fly in the pixel shader. Doing so allows us to support normal mapping across all sets of texture coordinates as well as using it with procedural geometry and advanced deformers beyond simple skinning. 

Correct blending is achieved by accumulating surface gradients as described in the paper.

Up until now, this framework has been available only when using shaders that are built into HDRP. However, a prototype version made for Shader Graph is now available on GitHub in a sample scene made with Unity 2019.3.0b4 and Shader Graph version 7.1.2. The framework itself is implemented entirely as subgraphs for Shader Graph, and each subgraph is made with only built-in nodes.

The method involves the following steps:

  1. Establish a surface gradient for each bump map.
  2. Scale each surface gradient by a user-defined bump scale.
  3. Add (or blend) all the surface gradients into one.
  4. Resolve to produce the final bump mapped normal.

Within this framework, every form of bump map produces a surface gradient, which allows for uniform processing. This includes tangent/object space normal maps, planar projection, triplanar projection, and even bump maps in the form of a procedural 3D texture. This makes correct blending much easier.

The sample comes with several graphs that use the framework. Each graph illustrates a different use-case of the framework. We will go through some of these in the following sections.

Basic normal mapping

The graph for the basic shader illustrates the flow, and also shows the difference between using the vertex tangent space and a procedural tangent space.

  1. The subgraph basisMikkTS.shadersubgraph produces the conventional tangent and bitangent. This works for UV0 only.
  2. A procedural tangent and bitangent are obtained with the subgraph GenBasisTB.shadersubgraph using any texture coordinate.

Since this sample uses UV0 specifically, a Boolean property on the Blackboard serves as a toggle. For any other UV set, the shader would have to use the second method.

Note that a special subgraph – tex ts norm to deriv.shadersubgraph – is used to sample the tangent space normal map. Rather than returning a vector3, it returns a vector2 called a derivative. You can add or blend derivatives when you sample them using the same UV set. However, to support adding and blending when using different UV sets or even different forms of bump maps, you need to add or blend surface gradients.
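As a rough sketch of what such a conversion does (assuming the usual negated-xy-over-z convention for height derivatives; the helper names here are hypothetical, not the subgraph's internals) and of why derivatives sampled with the same UV set can be combined directly:

```c
#include <math.h>

typedef struct { float x, y; } deriv2;

/* Convert an unpacked tangent space normal (components in [-1,1]) to a
   height derivative: the horizontal components negated and divided by z. */
static deriv2 ts_normal_to_deriv(float nx, float ny, float nz) {
    deriv2 d = { -nx / nz, -ny / nz };
    return d;
}

/* Derivatives are linear, so bump contributions sampled with the same UV
   set can simply be summed (or lerped) before any tangent frame is applied. */
static deriv2 add_derivs(deriv2 a, deriv2 b) {
    deriv2 d = { a.x + b.x, a.y + b.y };
    return d;
}
```

Unit normals do not have this property, which is one reason naive normal blending misbehaves.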

To produce a surface gradient, use the subgraph called Surfgrad TBN.shadersubgraph, shown above.

To adjust the bump scale on a surface gradient, use a simple multiply node. Alternatively, you can use a subgraph called Apply Bump Scale.shadersubgraph.

Then you can convert the surface gradient into a final bump mapped normal by using the subgraph Resolve Surfgrad.shadersubgraph.

Object space normal maps

Object space normal maps are also integrated into the surface gradient-based workflow. This lets you adjust the bump mapping intensity and blend/add object space normal maps with other forms of bump maps.

Below is the graph called OS nmap basic.

First, transform the sampled normal from object space into world space. Ideally, this vector would be transformed as a normal (using the inverse transpose of the object-to-world matrix), but Shader Graph’s built-in transformations do not support this, so transforming it as a direction is the best available option. Then, convert the normal into a surface gradient by using the subgraph called Normal to surfgrad.shadersubgraph. Once the bump contribution is a surface gradient, proceed as you would for other forms of bump maps – adjusting the bump scale as described in the previous section and adding/blending multiple surface gradients until you resolve at the end to produce the final normal.

Triplanar projection

Triplanar projection represents a special case of a bump map defined as a 3D texture. The paper “Surface Gradient Based Bump Mapping Framework” describes the full calculations needed to do this correctly. Below, see conventional normal map blending (left) compared with the surface gradient-based approach (right), from a Unity presentation at SIGGRAPH 2018.


The graph Triplanar uses this method for blending by using the subgraph Triplanar to surfgrad.shadersubgraph to produce a surface gradient from a triplanar projection. As before, you can modulate the surface gradient using a bump scale, blend/add the surface gradient to other surface gradients, and then resolve to obtain the final normal.
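For a bump map defined as a 3D function (which triplanar projection approximates), the 3D (volume) gradient is reduced to a surface gradient by projecting out the component along the vertex normal. A minimal sketch of that projection follows; the assembly of the volume gradient from the three weighted planar samples is omitted (see the paper for the full weighting), and the function name is hypothetical:

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

static vec3 v3(float x, float y, float z) { vec3 r = { x, y, z }; return r; }
static vec3 sub3(vec3 a, vec3 b)  { return v3(a.x - b.x, a.y - b.y, a.z - b.z); }
static vec3 mul3(vec3 a, float s) { return v3(a.x * s, a.y * s, a.z * s); }
static float dot3(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Project a volume gradient onto the tangent plane of the surface:
   g_surf = g_vol - dot(n, g_vol) * n. The result lies in the tangent
   plane, so it blends with surface gradients from any other source. */
static vec3 surfgrad_from_volume_gradient(vec3 n, vec3 gvol) {
    return sub3(gvol, mul3(n, dot3(n, gvol)));
}
```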

Blending/adding multiple bump maps

One of the most valuable aspects of this framework is that it allows the correct blending of any category and number of bump maps, across any number of sets of texture coordinates, including procedural UVs. The example below shows three kinds of bump maps being blended: a tileable tangent space normal map, an object space normal map, and a bump map as a procedural 3D texture.

The graph Mixing performs blending in this way (see below). Note how each bump map results in a surface gradient that is modulated by its respective bump map intensity and also how the surface gradients can be combined by adding or blending them. In the end, the combined surface gradient is resolved to produce the final normal.

More examples

The Unity sample scene includes several other examples such as triplanar projection applied to a normal mapped surface, detail normal mapping applied after parallax correction or parallax occlusion mapping (POM), bump mapping from a height map, bump mapping using a procedural 3D texture, and more.

Outstanding issues for future work

The fact that this entire framework can be implemented using nothing but subgraphs built from out-of-the-box nodes speaks to Shader Graph’s strength and flexibility. That being said, the framework does not comply fully with MikkTSpace for UV0 because of a few issues that we are actively working to address:

  1. Shader Graph does not currently provide access to the unnormalized (post-interpolation) tangent, bitangent, and normal, which are required for full compliance with MikkTSpace.
  2. Shader Graph requires the final bumped normal to be delivered in the tangent space of UV0. This is problematic because we do not have access to a compliant transformation in Shader Graph.

Since Shader Graph requires the bumped normal to be delivered to the Master Node in tangent space of UV0, this is also a problem when using object space normal maps or even triplanar normal maps. A future solution to this will be to add support for delivering the normal to the Master Node in world space.

It is worth pointing out that the HDRP built-in shaders do not have this problem and are fully compliant with MikkTSpace.

What’s new in 2019.3

You can now visually author shaders in Shader Graph and use them in Visual Effect Graph to create custom looks and rendering behaviors. We have added Shader Keywords, which can create static branches in your graph; use this for things like building your own shader LOD system. We have also added support for vertex skinning for DOTS animation, which allows you to author better water and foliage. In addition, sticky notes improve your workflow by letting you leave comments and explanations for whoever is working on the project. Finally, procedural pattern subgraph samples show how math can be used to create procedural shapes and patterns.

Please share your feedback in the Shader Graph forum!
