
Unity 3 launched recently, and with Surface Shaders we made it much easier to handle the lighting, shadowing and rendering-path parts of shaders. Unity runs on many platforms, some with very different shader languages, and yet you can write a shader once and somehow it «just works» on all of them. How do we do that? I’m going to talk about some of the behind-the-scenes technology involved in compiling shaders.

Warning: technical post ahead!

Shading Languages

There are two widely used shader languages out there: HLSL (or Cg, which is the same for practical purposes) and GLSL. Cg/HLSL is used by Direct3D, Xbox 360 and PS3. GLSL is used by mobile platforms (OpenGL ES 2.0), Mac OS X (OpenGL) and upcoming WebGL.

Traditionally, shaders in Unity are written in Cg/HLSL (there’s an option to write GLSL if you want to), so we had to find a way to take a Cg shader and produce a GLSL shader out of it. So you could write a shader like:
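For instance, a minimal surface shader in Unity’s ShaderLab syntax might look like this (an illustrative sketch; the shader name and body are just an example):

```
Shader "Example/SimpleDiffuse" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // Surface shader: Unity generates the lighting code around this.
        #pragma surface surf Lambert
        sampler2D _MainTex;
        struct Input {
            float2 uv_MainTex;
        };
        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```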

…and somehow it would also work on mobile platforms on OpenGL ES 2.0. In short, you can do that in Unity 3. But if you want to know how that works, read on.

HLSL to GLSL Translator

The two shading languages are similar in principle, but there are lots of subtle differences that make a regexp-based converter not quite work (regular-expression-based parsers are rarely a good idea, by the way).
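To give a flavor of those differences, here is a pair of roughly equivalent lines (an illustrative sketch; `_MainTex` and `_Object2World` stand in for whatever uniforms a real shader declares). Types, texture sampling intrinsics and matrix multiplication are all spelled differently, which is exactly the kind of thing a naive text substitution gets wrong:

```
// Cg/HLSL:
float4 color = tex2D(_MainTex, uv) * 0.5;
float3 n = mul((float3x3)_Object2World, v.normal);

// Equivalent GLSL:
vec4 color = texture2D(_MainTex, uv) * 0.5;
vec3 n = mat3(_Object2World) * v.normal;
```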

So we took HLSL2GLSL, an existing open source project that ATI made about four years ago and then seemingly abandoned. It turned out to need some massaging to get it to work, as well as a ton of bug fixes, missing features and general code cleanup. But hey, it seems to work now!

Our fork of this translator, named «hlsl2glslfork», is here:
(you can probably tell that I’m not too good at picking fancy names)

So now we can translate Cg/HLSL shaders into GLSL, and we’re done? Turns out, not quite so.

GLSL Optimizer

The shaders produced by the translator work on mobile platforms (iOS/Android). However, some of them were running very slowly. The OpenGL ES 2.0 drivers on mobile platforms aren’t very good at optimizing shaders! Or rather, some of them are seriously bad at it.

Suffice it to say, something extremely simple like a particle shader was running six times slower than it should have. Six. Times. Ouch!

Almost by accident, I discovered that the Mesa 3D guys are working on a new GLSL compiler. I looked at the code and I liked it a lot; very hackable and a “no bullshit” approach. So I took Mesa’s GLSL compiler and made it output GLSL back after it has done all its optimizations (dead code removal, algebraic simplifications, constant propagation, constant folding, inlining and a bunch of other cryptic things that don’t mean anything to a normal person).
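Here’s a sketch of the kinds of transformations involved (a hand-made illustration, not actual optimizer output):

```
// Before: the naive GLSL a driver might receive.
uniform sampler2D tex;
varying vec2 uv;
void main() {
    vec4 unused = texture2D(tex, uv.yx); // dead code: never read
    float scale = 2.0 * 0.5;             // constant folding: 1.0
    gl_FragColor = texture2D(tex, uv) * scale;
}

// After: dead code removed, constants folded,
// multiplication by 1.0 simplified away.
uniform sampler2D tex;
varying vec2 uv;
void main() {
    gl_FragColor = texture2D(tex, uv);
}
```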

Here it is, named «GLSL Optimizer»:
(here’s my fancy-name-choosing in action again)

The good news is that it solves the performance problems on iOS/Android platforms. Yay!

Further work

For mobile platforms, using appropriate precision qualifiers is very important for performance. If some value has a low range and does not need high precision, using the «lowp» (low precision) qualifier can make the shader run much faster. In Unity 3.0 we do some automagic to pick appropriate precision, but we could be better at it. I’m experimenting with proper precision support in the HLSL to GLSL translator and the GLSL Optimizer, so that fixed/half/float types in Cg/HLSL would come out as lowp/mediump/highp in the resulting GLSL… we’ll see how that turns out.
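The mapping I have in mind would look roughly like this (a sketch; the variable names are just examples):

```
// Cg/HLSL source:
fixed4 tint;    // low-range color value
half2  texUV;   // texture coordinates
float  depth;   // needs full range and precision

// Desired GLSL ES output:
lowp    vec4  tint;
mediump vec2  texUV;
highp   float depth;
```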

Maybe at some point in the future it would make sense to have the opposite translation tool? So one could write GLSL shaders, and they would be converted into Cg/HLSL for the other platforms…

Oh, and next week Unity’s mobile tech lead Renaldas «ReJ» Zioma is going to blog about something technology-related as well. Hear me, ReJ? Now you have to do it!

6 replies on “Shader Compilation for Multiple Platforms”

I tried to build glslopt (in /contrib/glslopt) under VC2008 to see how it can optimize my generic shaders emulating GE (the PSP GPU), but I couldn’t: the project is lacking a lot of files. It seems the project and the solution may need to be updated. My concern is more about whether glslopt may «rewrite» a shader in a way that is more portable across video cards than optimized for performance, because I get very different results with NVIDIA and AMD.

«I looked at the code and I liked it a lot; very hackable and “no bullshit” approach.»

Haha, I love it! :D
