Bringing DirectX 11 features to mobile in Unity 5.1

May 26, 2015 in Technology | 11 min. read

One of the new features in Unity 5.1 is a new unified OpenGL rendering backend.

... A unified what now?

Until now, we had one renderer for OpenGL ES 2.0, another for OpenGL ES 3.0 (which shared a good deal, but not all, of its code with the ES 2.0 one), and a completely separate one for desktop OpenGL (which was stuck at the OpenGL 2.1 feature set). This, of course, meant a lot of duplicate work to get new features in, bugs that might appear on some renderer versions but not others, and so on.

So, in order to bring some sense into this, and to make it easier to add features in the future, we created a unified GL renderer. It can operate at various feature levels, depending on the available hardware:

  • OpenGL ES 2.0
  • OpenGL ES 3.0
  • OpenGL ES 3.1 (+ Android Extension Pack)
  • desktop OpenGL: all versions from 2.1 to 4.5 (desktop OpenGL support is experimental in 5.1)

All the differences between these API versions are baked into a capabilities structure based on the detected OpenGL version and the available extensions. This has multiple benefits (a rough sketch of the caps idea follows the list), such as:

  • When an extension from desktop GL land (such as Direct State Access) is brought to mobile and we already support it on desktop, it is automatically detected and put to use on mobile as well.
  • We can artificially clamp the caps to match whichever target level (and extension set) we wish, for emulation purposes.
  • Provided that the necessary compatibility extensions are present on desktop, we can run GL ES 2.0 and 3.x shaders directly in the editor (again, still experimental in 5.1).
  • We can use all the desktop graphics profiling and debugging tools against our OpenGL code already on the desktop, and catch most rendering issues there.
  • We no longer maintain separate, diverging codebases: bugs only need to be fixed once, and every optimization we make benefits all platforms simultaneously.
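
To make the caps idea a bit more concrete, here is a rough conceptual sketch (in C#, purely for illustration; the real structure is internal C++ engine code, and the field and extension names here are made up for the example). The point is simply that the renderer inspects the GL version and extension string once, and everything else only consults the resulting flags:

    // Conceptual sketch only: not the actual engine code. The renderer fills a
    // structure like this once at startup and the rest of the code only reads flags.
    class GraphicsCaps
    {
        public bool hasComputeShaders;
        public bool hasTessellation;
        public bool hasAdvancedBlend;

        public void Detect(int major, int minor, string extensions, bool isES)
        {
            // Compute shaders: core in ES 3.1 / desktop GL 4.3, or via extension.
            hasComputeShaders = isES
                ? (major > 3 || (major == 3 && minor >= 1))
                : (major > 4 || (major == 4 && minor >= 3)
                   || extensions.Contains("GL_ARB_compute_shader"));

            // Tessellation: part of the Android Extension Pack on ES, core in desktop GL 4.0+.
            hasTessellation = isES
                ? extensions.Contains("GL_EXT_tessellation_shader")
                : (major >= 4 || extensions.Contains("GL_ARB_tessellation_shader"));

            // Advanced blend modes come purely from an extension on both ES and desktop.
            hasAdvancedBlend = extensions.Contains("GL_KHR_blend_equation_advanced");
        }
    }

Clamping the caps for emulation then simply means overwriting these flags with the values of the level being emulated instead of what the driver reports.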

Compute shaders

Cocuy 2D fluid simulation package from the Unity Asset Store (https://assetstore.unity.com/packages/tools/particles-effects/cocuy-the-fluid-simulator-33564) running on OpenGL ES 3.1. No modifications needed.

One of the first new features we brought to the new OpenGL renderer is compute shaders, together with image loads/stores (UAVs in DX11 parlance). And again, because we have a unified codebase, it is (more or less) automatically supported on all GL versions that support compute shaders (desktop OpenGL 4.3 onwards and OpenGL ES 3.1 onwards). The compute shaders are written in HLSL just as you'd do on DX11 in previous versions of Unity, and they get translated to GLSL. You'll use the same Graphics.SetRandomWriteTarget scripting API to bind the UAVs and the same Dispatch API to launch the compute work. The UAVs are also available in other shader stages if the hardware supports it (do note that some, usually mobile, GPUs have limitations here; for example, the Mali T-604 in the Nexus 10 only supports image loads/stores in compute shaders, not in pixel or vertex shaders).
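
As a minimal sketch of what this looks like from script (the compute asset, its CSMain kernel and the _Result texture name are made up for the example; the calls themselves are the same ones you would use on DX11):

    using UnityEngine;

    public class ComputeExample : MonoBehaviour
    {
        public ComputeShader fillTexture;   // assign a .compute asset (HLSL) in the inspector
        RenderTexture result;

        void Start()
        {
            if (!SystemInfo.supportsComputeShaders)
            {
                Debug.Log("Compute shaders not supported on this device / graphics level.");
                enabled = false;
                return;
            }

            // A UAV-writable texture (image load/store on the GL side).
            result = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBFloat);
            result.enableRandomWrite = true;
            result.Create();

            int kernel = fillTexture.FindKernel("CSMain");
            fillTexture.SetTexture(kernel, "_Result", result);

            // 256 / 8 = 32 thread groups per axis, assuming [numthreads(8,8,1)] in the kernel.
            fillTexture.Dispatch(kernel, 32, 32, 1);
        }

        void OnDestroy()
        {
            if (result != null)
                result.Release();
        }
    }

For binding a UAV to a pixel shader instead, Graphics.SetRandomWriteTarget works the same way as on DX11, subject to the hardware limitations mentioned above.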

Tessellation and Geometry shaders

GPU Tessellation running on OpenGL ES 3.1

Both tessellation and geometry shaders from the DX11 side should work directly on Android devices that support the Android Extension Pack. The shaders are written as usual, with either #pragma target 50 or #pragma target es31aep (see below for the new shader targets), and it'll "just work" (if it doesn't, please file a bug).

Other goodies

Here's a short list of other things that work mostly the same as on DX11:

  • DrawIndirect using the results of a compute shader via append/consume buffers. The scripting API is the same one the DX11 path already uses (see the sketch after this list).
  • Advanced blend modes (dodge, burn, darken, lighten, etc.) are exposed whenever the KHR_blend_equation_advanced extension is supported by the GPU. The extension is part of the Android Extension Pack, and can be found on most semi-recent desktop GPUs as well as the high-end mobile ones (Adreno 4xx, Mali T-7xx, NVIDIA Tegra K1 and later). DirectX 11 does not support these blend modes. They can be set both from the scripting API and from ShaderLab shaders; the new blend mode enums can be found in the UnityEngine.Rendering.BlendOp documentation.
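
Here's a hedged sketch of the indirect draw path from the first bullet (the compute asset, its Cull kernel, the _Points buffer name and the point material are made up for the example; the scripting calls are the existing DX11-era ones):

    using UnityEngine;

    public class IndirectDrawExample : MonoBehaviour
    {
        public ComputeShader cullShader;   // appends visible points into an AppendStructuredBuffer<float3>
        public Material pointMaterial;     // its vertex shader reads the StructuredBuffer "_Points"

        ComputeBuffer points;
        ComputeBuffer drawArgs;

        void Start()
        {
            // Append buffer; its hidden counter is assumed to start at zero here
            // (reset it if you re-dispatch every frame).
            points = new ComputeBuffer(65536, sizeof(float) * 3, ComputeBufferType.Append);

            // Arguments for DrawProceduralIndirect:
            // vertex count, instance count, start vertex, start instance.
            drawArgs = new ComputeBuffer(1, sizeof(int) * 4, ComputeBufferType.IndirectArguments);
            drawArgs.SetData(new int[] { 0, 1, 0, 0 });

            int kernel = cullShader.FindKernel("Cull");
            cullShader.SetBuffer(kernel, "_Points", points);
            cullShader.Dispatch(kernel, 64, 1, 1);

            // Copy the append buffer's counter into the vertex count slot,
            // so the GPU decides how much to draw without a CPU readback.
            ComputeBuffer.CopyCount(points, drawArgs, 0);
        }

        void OnRenderObject()
        {
            pointMaterial.SetBuffer("_Points", points);
            pointMaterial.SetPass(0);
            Graphics.DrawProceduralIndirect(MeshTopology.Points, drawArgs, 0);
        }

        void OnDestroy()
        {
            points.Release();
            drawArgs.Release();
        }
    }

The advanced blend ops from the second bullet can be picked from script, too: if a shader declares BlendOp [_BlendOp] with a matching float property, then material.SetInt("_BlendOp", (int)UnityEngine.Rendering.BlendOp.ColorDodge) switches it at runtime.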

Differences from DX11

Apart from the things discussed above, there are some differences from the feature set available on DX11:

  • The mobile GPUs have a fairly limited list of supported UAV formats: 16- and 32-bit floating point RGBA, RGBA Int32, 8-bit RGBA, and single-channel 32-bit int and floating point formats. Notably, the 2-channel RG formats are not supported for any data type. These formats are available on desktop GL rendering, though.
  • GL ES 3.1 does not support any HLSL shader interpolation qualifier other than 'centroid'; all other qualifiers are ignored in ES shaders.
  • GL ES 3.1 still does not mandate floating-point render targets, although most GPUs do support them through extensions.
  • The memory layout for structured compute buffers has some minor differences between DX11 and OpenGL, so make sure your data layouts match on both renderers (see the sketch after this list). We're working on minimizing the impact of this, though.
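
As an illustration of the last point, the simplest way to sidestep most of the layout differences is to keep each struct a multiple of 16 bytes and order the members identically on the HLSL and C# sides (the Particle struct here is a made-up example):

    using System.Runtime.InteropServices;
    using UnityEngine;

    // Matching HLSL side (made-up example):
    //   struct Particle { float3 position; float age; float3 velocity; float padding; };
    //   StructuredBuffer<Particle> _Particles;
    [StructLayout(LayoutKind.Sequential)]
    struct Particle
    {
        public Vector3 position;  // 12 bytes
        public float age;         //  4 bytes -> 16-byte boundary
        public Vector3 velocity;  // 12 bytes
        public float padding;     //  4 bytes -> 32 bytes total
    }

    public class ParticleBufferExample : MonoBehaviour
    {
        ComputeBuffer particles;

        void Start()
        {
            int stride = Marshal.SizeOf(typeof(Particle));   // 32, same on both renderers
            particles = new ComputeBuffer(1024, stride);
        }

        void OnDestroy()
        {
            particles.Release();
        }
    }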

Shader pipeline

The shader compilation process for ES 2.0 and the old desktop GL renderer (and, until now, for ES 3.0 as well) goes from the HLSL source through an HLSL-to-GLSL translation step and a GLSL optimizer pass to the final GLSL shader.

The problem with this is that neither of those modules supports anything later than Shader Model 3.0 shaders, effectively limiting the shaders to the DX9 feature set. In order to compile HLSL shaders that use DX11 / Shader Model 5.0 features, we use the following pipeline for GL ES 3.0 and above, and for all desktop GL versions running on the unified GL backend: the HLSL source is first compiled with the D3D compiler, and the resulting DX11 bytecode is then translated to GLSL.

The new shader pipeline seems to be working fairly well for us, and it allows us to use the Shader Model 5.0 features. It also benefits from the optimizations the D3D compiler performs (but also inherits all the drawbacks of a bytecode that treats everything as vec4's, always). As a downside, we now have a dependency on the D3D compiler and the language syntax it accepts, so we have to jump through some hoops to get our Unity-specific language features through (such as sampler2D_float for sampling depth textures).

Existing OpenGL ES 3.0 (and of course, OpenGL ES 2.0) shaders should continue to work as they did previously. If they do not, please file a bug.

So, how can I use it?

For the Unity 5.1 release, we are not yet deprecating the legacy OpenGL renderer, so it will still be used on OS X and on Windows when using the -force-opengl flag. The desktop GL renderer is still considered very experimental at this point, but it can be activated with the following command line arguments for both the editor and the standalone player (currently Windows only; OS X and Linux are on our TODO list):

  • "-force-glcore" Force best available OpenGL mode
  • "-force-glcoreXY" Force OpenGL Core X.Y mode
  • "-force-gles20" Force OpenGL ES 2.0 mode, requires ARB_ES2_compatibility extension on desktop
  • "-force-gles30" Force OpenGL ES 3.0 mode, requires ARB_ES3_compatibility
  • "-force-gles31" Force OpenGL ES 3.1 mode, requires ARB_ES3_1_compatibility
  • "-force-gles31aep" Force OpenGL ES 3.1 mode + Android Extension Pack feature level, requires ARB_ES_3_1_compatibility and the extensions contained in the AEP (if used by the application)

Remember to include the corresponding shaders in the Standalone Player Settings dialog (uncheck the "Automatic Graphics API" checkbox and you'll be able to manually select the shader languages that will be included).

Note that these flags (including the ES flags) can also be used when launching the editor, so the user will see the rendering results of the actual ES shaders that will be used on the target. Also note that these features are to be considered experimental on desktop at this stage, so experiment with them at your own risk. In 5.1, you can also use the standalone player to emulate GL ES targets: in Player Settings, just make sure you include GL ES 2/3 shaders in the graphics API selection and start the executable with one of the -force-glesXX flags above. We're also working on getting this to function on Linux.

There are some known issues with running ES shaders on the desktop: desktop and mobile use different encodings for normal maps and lightmaps, so the ES shaders expect the data to be in a different encoding than what's being packaged alongside the standalone player build. The OpenGL Core shader target should work as expected.

On iOS, the only change is that the ES 3.0 shaders will be compiled using the new shader pipeline. Please report any breakage. ES 2.0 and Metal rendering should work exactly as before. Again, please report any breakage.

On Android, if the “Automatic Graphics API” checkbox is cleared, you can select which shaders to include in your build, and also set manifest requirements for OpenGL ES 3.1 and OpenGL ES 3.1 + Android Extension Pack (remember to set your required API level to Android 5.0 or later as well). The default setting is that the highest available graphics level will always be used.

AN IMPORTANT NOTE:

Apart from some fairly rare circumstances, there should never be any need to change the target graphics level from Automatic. ES 3.1 and ES 3.0 should work just as reliably as ES 2.0, and if this isn't the case, please file a bug. (Of course, it is possible to write a shader using #pragma only_renderers etc. that breaks on ES3 but not on ES2, but you get the idea.) The same applies to the desktop GL levels once we get them ready. The Standard shader is currently configured to use a simpler version of the BRDF on ES 2.0 (and also cuts some other corners here and there for performance reasons), so you can expect OpenGL ES 3.0 builds to have both more accurate rendering results and slightly lower performance figures compared to ES 2.0. Similarly, directional realtime lightmaps require more texture units than ES 2.0 guarantees to be available, so they are disabled there.

When writing ShaderLab shaders, the following new #pragma target values are recognized:

  • #pragma target es3.0  // Requires OpenGL ES 3.0, desktop OpenGL 3.x or DX Shader Model 4.0; sets the SHADER_TARGET define to 35
  • #pragma target es3.1  // Requires OpenGL ES 3.1, desktop OpenGL 4.x (with compute shaders) or DX Shader Model 5.0; sets the SHADER_TARGET define to 45

When using the existing #pragma targets, they map to the following GL levels:

  • #pragma target 40 // Requires OpenGL ES 3.1 or desktop OpenGL 3.x or DX Shader Model 4.0
  • #pragma target 50 // Requires OpenGL ES 3.1 + Android Extension Pack, desktop OpenGL >= 4.2 or DX Shader Model 5.0

For including and excluding shader platforms for a specific shader, the following #pragma only_renderers / exclude_renderers targets can be used:

  • #pragma only_renderers gles  // As before: Only compile this shader for GL ES 2.0. NOTE: ES 3.0 and later versions will not be able to load this shader at all!
  • #pragma only_renderers gles3  // Only compile for OpenGL ES 3.x. NOTE: All ES levels starting from OpenGL ES 3.0 will use the same shader target. Shaders using AEP features, for example, will simply be marked as unsupported on OpenGL ES 3.0 hardware
  • #pragma only_renderers glcore // Only compile for the desktop GL. Like the ES 3 target, this also scales up to contain all desktop GL versions, where basic shaders will support GL 2.x while shaders requiring SM5.0 features require OpenGL 4.2+.

Future development

As described above, a common GL codebase allows us to finally bring more features to the OpenGL / ES renderer. Here are some things we'll be working on next (no promises, schedule-wise or otherwise; your mileage may vary, please talk with your physician before use, and all the other usual disclaimers apply):

  • Finalise desktop GL, deprecate the legacy GL renderer and use this as the new default.
  • Deprecate the old "GL ES 2.0 graphics emulation" mode in the editor (it basically just clamps the DX renderer to Shader Model 2.0) and replace it with actually using the ES shaders and rendering backend.
  • More accurate target device emulation: Once we can run the ES shaders in the editor directly, we can finally do more accurate target device emulation. Using the caps system, we'd generate a database of GL caps for lots of Android/iOS devices, containing each supported GL extension, supported texture formats etc and apply them to the editor renderer directly. This way the developer could see (approximately) what the scene should look like on any given device (apart from differences in GPU-specific bugs, shader precisions etc).