
In Unity you can write your own custom shaders, but it’s no secret that writing them is hard, especially when you need shaders that interact with per-pixel lights & shadows. In Unity 3, that would be even harder because in addition to all the old stuff, your shaders would have to support the new Deferred Lighting renderer. We decided it’s time to make shaders somewhat easier to write.

Warning: a technical post ahead with almost no pictures!

Over a year ago I had a thought that «Shaders must die» (part 1, part 2, part 3). And what do you know – turns out we’re doing this in Unity 3. We call it Surface Shaders, because I have a suspicion «shaders must die» as a feature name wouldn’t have flown very far.


The main idea is that 90% of the time I just want to declare surface properties. This is what I want to say:

Hey, albedo comes from this texture mixed with this texture, and normal comes from this normal map. Use Blinn-Phong lighting model please, and don’t bother me again!

With the above, I don’t have to care whether this will be used in a forward or deferred renderer, or how various light types will be handled, or how many lights per pass will be done in a forward renderer, or how some indirect illumination SH probes will come in, etc. I’m not interested in all that! Those dirty bits are the job of rendering programmers, just make it work dammit!

This is not a new idea. Most graphical shader editors that make sense do not have «pixel color» as the final output node; instead they have some node that basically describes surface parameters (diffuse, specularity, normal, …), and all the lighting code is usually not expressed in the shader graph itself. OpenShadingLanguage is a similar idea as well (but because it’s targeted at offline rendering for movies, it’s much richer & more complex).


Here’s a simple – but full & complete – Unity 3.0 shader that does diffuse lighting with a texture & a normal map.

[Image: Surface Shader – Diffuse Normalmapped]

Given a pretty model & textures, it can produce pretty pictures! How cool is that?
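The shader in the image is essentially this (a sketch reconstructed from the description above; property names like _MainTex and _BumpMap follow the usual Unity conventions and may differ slightly from the screenshot):

```hlsl
Shader "Example/Diffuse Bump" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _BumpMap ("Bumpmap", 2D) = "bump" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // surface shader with main function "surf" and Lambert lighting
        #pragma surface surf Lambert
        struct Input {
            float2 uv_MainTex;
            float2 uv_BumpMap;
        };
        sampler2D _MainTex;
        sampler2D _BumpMap;
        void surf (Input IN, inout SurfaceOutput o) {
            // albedo from the texture, normal from the normal map
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
            o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```

Everything outside the CGPROGRAM…ENDCG block is the «not really interesting» grayed-out part: serialized properties, their UI names, and the fallback.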

I grayed out bits that are not really interesting (declaration of serialized shader properties & their UI names, shader fallback for older machines etc.). What’s left is Cg/HLSL code, which is then augmented by tons of auto-generated code that deals with lighting & whatnot.

This surface shader dissected into pieces:

  • #pragma surface surf Lambert: this is a surface shader with main function «surf», and a Lambert lighting model. Lambert is one of predefined lighting models, but you can write your own.
  • struct Input: input data for the surface shader. This can have various predefined inputs that will be computed per-vertex & passed into your surface function per-pixel. In this case, it’s two texture coordinates.
  • surf function: actual surface shader code. It takes Input, and writes into SurfaceOutput (a predefined structure). It is possible to write into custom structures, provided you use lighting models that operate on those structures. The actual code just writes Albedo and Normal to the output.

What is generated

Unity’s «surface shader code generator» would take this, generate actual vertex & pixel shaders, and compile them to various target platforms. With default settings in Unity 3.0, it would make this shader support:

  • Forward renderer and Deferred Lighting (Light Pre-Pass) renderer.
  • Objects with precomputed lightmaps and without.
  • Directional, Point and Spot lights; with projected light cookies or without; with shadowmaps or without. Well ok, this is only for forward renderer because in Deferred Lighting the lighting happens elsewhere.
  • For Forward renderer, it would compile in support for lights computed per-vertex and spherical harmonics lights computed per-object. It would also generate extra additive blended pass if needed for the case when additional per-pixel lights have to be rendered in separate passes.
  • For Deferred Lighting, it would generate base pass that outputs normals & specular power; and a final pass that combines albedo with lighting, adds in any lightmaps or emissive lighting etc.
  • It can optionally generate a shadow caster rendering pass (needed if custom vertex position modifiers are used for vertex shader based animation; or some complex alpha-test effects are done).

For example, here’s the code that would be compiled for a forward-rendered base pass with one directional light, 4 per-vertex point lights, 3rd order SH lights and optional lightmaps (I suggest just scrolling down):

[Image: ~90 lines of generated forward base pass code]

Of those 90 lines of code, 10 are your original surface shader code; the remaining 80 would pretty much have to be written by hand in Unity 2.x days (well ok, less code would have to be written because 2.x had fewer rendering features). But wait, that was only the base pass of the forward renderer! It also generates code for the additive pass, the deferred base pass, the deferred final pass, optionally the shadow caster pass and so on.

So this should make lit shaders easier to write (it does for me at least). I hope it will also increase the number of Unity users who can write shaders at least threefold (i.e. up from 10 to 30!). It should also be more future proof, to accommodate the changes to the lighting pipeline we’ll make in Unity next.

Predefined Input values

The Input structure can contain texture coordinates and some predefined values, for example the view direction, world space position, world space reflection vector and so on. Code to compute them is only generated if they are actually used. For example, if you use the world space reflection vector to do some cubemap reflections (as an emissive term) in your surface shader, then in the Deferred Lighting base pass the reflection vector will not be computed (that pass does not output emission, so by extension it does not need the reflection vector).

[Image: Surface Shader – Rim Lighting]

As a small example, here’s the shader above extended to do simple rim lighting:
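The interesting part of that extension might look roughly like this (a sketch: viewDir is one of the predefined Input values mentioned above, and _RimColor/_RimPower are hypothetical property names):

```hlsl
struct Input {
    float2 uv_MainTex;
    float2 uv_BumpMap;
    float3 viewDir;  // predefined value; only computed because we use it
};
sampler2D _MainTex;
sampler2D _BumpMap;
float4 _RimColor;
float _RimPower;
void surf (Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
    o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
    // rim term: brightest where the surface faces away from the viewer
    half rim = 1.0 - saturate (dot (normalize (IN.viewDir), o.Normal));
    o.Emission = _RimColor.rgb * pow (rim, _RimPower);
}
```

Note how the only change is declaring viewDir in Input and writing an Emission term; all the lighting plumbing stays generated.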

Vertex shader modifiers

[Image: Surface Shader – Normal Extrusion]

It is possible to specify a custom «vertex modifier» function that will be called at the start of the generated vertex shader, to modify (or generate) per-vertex data. You know, vertex shader based tree wind animation, grass billboard extrusion and so on. It can also fill in any non-predefined values in the Input structure.

My favorite vertex modifier? Moving vertices along their normals.
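That favorite could be sketched like this (assuming the vertex:vert directive and the appdata_full vertex structure; _Amount is a hypothetical property controlling the extrusion distance):

```hlsl
#pragma surface surf Lambert vertex:vert
float _Amount;
// called at the start of the generated vertex shader
void vert (inout appdata_full v) {
    v.vertex.xyz += v.normal * _Amount;  // push vertices out along their normals
}
```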

Custom Lighting Models

There are a couple of simple lighting models built in, but it’s possible to specify your own. A lighting model is nothing more than a function that will be called with the filled SurfaceOutput structure and per-light parameters (direction, attenuation and so on). Different functions have to be called in the forward & deferred rendering cases, and naturally the deferred one has much less flexibility. So for any fancy effects, it is possible to say «do not compile this shader for deferred», in which case it will be rendered via forward rendering.

[Image: Surface Shader – Wrapped Lambert lighting]

An example of a wrapped-Lambert lighting model:
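Sketched out, such a model is just a Lighting&lt;Name&gt; function referenced from the #pragma line (following the convention described above; I’m assuming the built-in _LightColor0 variable holds the current light’s color):

```hlsl
#pragma surface surf WrapLambert

half4 LightingWrapLambert (SurfaceOutput s, half3 lightDir, half atten) {
    // shift N·L from [-1,1] into [0,1] so light "wraps around" the object
    half NdotL = dot (s.Normal, lightDir);
    half diff = NdotL * 0.5 + 0.5;
    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
    c.a = s.Alpha;
    return c;
}

struct Input {
    float2 uv_MainTex;
};
sampler2D _MainTex;
void surf (Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}
```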

Behind the scenes

We’re using the HLSL parser from Ryan Gordon’s mojoshader to parse the original surface shader code and infer some things from the abstract syntax tree mojoshader produces. This way we can figure out what members are in what structures, go over function prototypes and so on. At this stage some error checking is done to tell the user that his surface function has the wrong prototype, or that his structures are missing required members – which is much better than failing with dozens of compile errors in the generated code later.

To figure out which surface shader inputs are actually used in the various lighting passes, we generate small dummy pixel shaders, compile them with Cg and use Cg’s API to query the used inputs & outputs. This way we can figure out, for example, that neither a normal map nor its texture coordinate is actually used in the Deferred Lighting final pass, and save some vertex shader instructions & a texcoord interpolator.

The code that is ultimately generated is compiled with various shader compilers depending on the target platform (Cg for Windows/Mac, XDK HLSL for Xbox 360, PS3 Cg for PS3, and our own fork of HLSL2GLSL for iPhone, Android and upcoming NativeClient port of Unity).

So yeah, that’s it. We’ll see where this goes next, and what happens when Unity 3 is released. I hope more folks will try to write shaders!

39 replies on “Unity 3 technology – Surface Shaders”

@unisip: if you do not want to do any lighting, you should not use a surface shader. The reason for surface shaders is «taking care of all lighting details». If you do not need that, just write a vertex + fragment shader pair.
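For context, a minimal unlit vertex + fragment pair might look like this (a sketch in Unity 3-era syntax; no lighting code is involved or generated at all):

```hlsl
Shader "Example/Unlit Texture" {
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            sampler2D _MainTex;
            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };
            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                o.uv = v.texcoord.xy;
                return o;
            }
            half4 frag (v2f i) : COLOR {
                return tex2D (_MainTex, i.uv);  // just sample the texture
            }
            ENDCG
        }
    }
}
```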

Hey Aras,

This surface shader thing is totally awesome!!!

When do we get a more comprehensive documentation about all this?

Right now I’m a bit frustrated as I feel somewhat limited to trial and error and copy/paste from other shaders to make it all work.

There are simple things that are probably obvious to you guys but that are not to newcomers. For instance, I couldn’t find a way to disable lighting altogether other than creating a custom lighting model that simply returns 0 (there’s gotta be a more efficient way to do that, right ;-) ).

The result is that I probably get shaders that are much longer than needed. Simple example: I couldn’t get a basic glass shader (transparent fresnel bump reflective cubemap, no lighting) to compile to shader model 2.0 (it’s plain and easy to write the whole thing without surface shaders, but I definitely want to stick to your pipeline as much as I can).


I mean, maybe on a huge project the cost may justify the effort, but usually, when you just have something as common as a decent character, you shouldn’t have to learn a whole language and syntax just to write a couple of shaders. It’s too expensive, and it’s an area in which Unity could stand out by offering a new approach to materials design.

Definitely shaders must die. Even version 3.0 ones.
I would like a first level of abstraction language to deal with real world parameters. I want to make stone, skin, hair materials. I don’t care what the hardware needs to do this.
Then the current level can be here for the freaks of the effects and performance.
Moreover, I think there should be predefined shaders for these things often necessary as stone, skin or hair.
Unity is supposed to be friendly and easy to use…

Hi Aras, I see that vertex and frag shaders still work, um… not sure what you mean by handling lighting in deferred rendering and forward rendering, but I do see that I can’t produce anything with the shaders because they don’t take in any sort of light. What do I have to do to set up my own lighting model and enable lighting inside pixel and vertex shaders?

How would this actually be compatible with DX11? Sure, omitting the vertex shader is fine, but what about the other shader stages coming out, such as the compute shader and so on – are those going to be predefined too? Not sure if this is a good idea later on down the road when Unity has to upgrade to DX11, losing many more complex functions that a programmer could have written.

Thanks for your post and the very clear exposition of the aspects of shading.
Long ago, in the 90s, I did some shading and rendering with RenderMan.
Can you suggest some reading (books or papers) related to your current topic so that I can get back into the field?
My company just joined Unity and we are doing serious games and simulation for businesses.
Thanks again

Ok, so summer will be coming to a close very soon. You haven’t blogged in a while and we (the ones who cannot afford to preorder at this moment) are wondering: when is Unity 3 coming??

The guys from Unity should also release a video that demonstrates the «Umbra Occlusion Culling» technology.

Looks cool but I just hope it won’t make it even harder to write custom FX shaders beyond tex+bumpmap+3lights+(maybe alphablended). If «#pragma surface surf Lambert» is going to save me X hours, it would be great if X > 0. What ARB_fog_exp2 is and how it augments my code took me a while to figure out… All I could find was ‘here is the fog shader’ and «better don’t mess with it». I don’t say that documentation by example is wrong. Far from it. It’s required if you want to learn and copy&paste stuff. But if you are serious about writing shaders and your own lighting models, what you need is a Reference with a list of all the magic values, secret functions, _UnderscorePredefinedVariables and DONT_TOUCH_THIS semantics.

I’m really looking forward to this! Tons of stuff can be done already with shaders in Unity, but simplifying the process will really increase how much actually is done with shaders.

Indeed this is fantastico Aras! Awesome work!

Hope the «couple simple lighting models built-in» are also built into the documentation :)

Indeed, the shader horse died quickly. Wait… what!? :P


If you are not rolling any visual editor, I strongly suggest you put up a dozen or more tutorials on how to learn to edit shaders in Unity 3. I have limited knowledge on that side; what I learned came from the excellent book Shaders for Game Programmers and Artists. Even after that, complicated terms and medium-to-hard shaders seem beyond reach. Do you have any book suggestions on your side that we could benefit from to learn this complex area of 3D programming?

This looks fantastic Aras. I recall reading your «shaders must die» blog posts a while ago and always hoped you’d get time to implement such a progressive shading language in Unity some day.

I dabbled in shader programming for Unity 2.x but never really managed to get my training wheels off, so to speak. I just never found enough time to become intimate with a shader language during development. However, this looks to make shader programming a much more worthwhile investment of production time for small indie teams, which is where Unity really shines.

Something very strange is going on.
Unity 3 now has Illuminate Labs Beast technology and Autodesk has just acquired Illuminate Labs.
I wonder what these guys are cooking up.
What do you think ?

I think this is really extremely awesome and certainly development time well spent, because it will save many of us a *lot* of time. While I agree that a visual editor for shaders would be nice to have, I think UT’s time is better spent doing these kinds of improvements and letting the community build their own shader editors on top of that. Guess I’ll have to play around with shaders again ;-)


– «when you declare a texture property (_MainTex), the UV sets are generated»… that was exactly my first concern. If each time you declare a texture property (_MainTex) a UV stream is generated, then I have no way to have 2 textures share the same UV set without wasting a whole UV stream (memory concerns here), as the second one will be generated automatically by the underlying engine.
Unless some clever work is done by the compiler, detecting inside the vertex shader that the second UV set is never really referenced by any instruction.
…Or unless UV streams get generated only if explicit UV sets are exported (in the same channel order?)

– About syntax highlighting: I just thought that Unity 3 would integrate its own shader editor (without having to use an external one), like what the Virtools guys did.
This isn’t a big concern though, but since Aras introduced this «surface shader» formalism inside Unity it’d have been nice to see the «irrelevant part» really grayed out in the editor (just like in this blog), so you don’t get confused.
If there’s no integrated shader editor planned in Unity 3 then my question makes no sense, obviously :)

This is ridiculously wonderful. One of the best additions to Unity so far.
Unity is not only getting better with v3; v3 will also improve your workflow and reduce your dev time by miles. I remember working on and tweaking simple HLSL shaders in the past, wondering when this kind of thing would pop up, and here it is. :D
We can now focus on the fun part of a shader: the visuals and the look of your material’s surface. No more need to worry! Omg, amazing! :)
Keep up the good work guys.



-UVs: when you declare a texture property (_MainTex), the UV sets are generated. If you want access to these generated UVs, you must pass them as input to your surface shader (uv_MainTex – the name is important!). Then tex2D takes in a sampler2D for the texture and the UV set that you want to use as coordinates.

-Syntax highlighting? The bits that Aras greyed out are uninteresting, as he clearly said: «grayed out bits that are not really interesting.» The syntax highlighting is really something specific to your chosen editor. If your editor highlights functions and structs in Unity 2.x Cg shader code, then it will be the same in Unity 3.x, as the Input is just a struct, and surface and lighting functions are just functions.

Thanks for the reply, and happy to hear that you achieved such a good compilation time!

– Ok, it seems I’m missing something with UV sets. I thought the float2 UV streams were generated (filled) by the engine automatically if a texture was declared in the shader code, but considering what you’re saying this isn’t the case… I just have to figure out how to declare a new UV set (RTFM :)

– About syntax highlighting, my concern was: is the part you wrote in grey in your example supposed to be greyed out in the shader editor too, or are there some nice special colors to make it more readable, or is there no special treatment at all, in which case it is interpreted as standard HLSL while in fact it isn’t really (like the CGPROGRAM, Tags or Properties words, which have special meanings).

Of course, with Unity’s editor extensibility no one’s gonna stop you from writing a visual shader editor. The surface shader approach will only make it even easier to do so :)

@Casten… I totally agree… I was waiting for it in 3.0, and a lot of other people I know were waiting for it too. Considering how easy it is to do so many other cool things in Unity compared to other game engines, I would have thought a visual shader editor was a no-brainer.

This is easily the most exciting feature of Unity 3 I’ve seen yet! Why wasn’t this bragged about sooner? I wondered what you meant when you mentioned to me at GDC that you were trying to simplify the shader writing process…

Great job Aras !

Some questions though:

– What if I need only one UV stream and two textures… is the other stream still being allocated?

– What is the typical compilation time (provided you may or may not need skinning, shadows, deferred) ?

– What does the syntax highlighting look like?

I still think that Unity should also have a visual shader editor like Unreal Engine/UDK has.

This is the type of feature that may not dazzle the crowds (not much glitter in a job well done, is there?), but it is far more important than pretty cloth or baked textures because, unlike those eye candies, it addresses the very core of what Unity stands for – workflow – and so far it is the biggest improvement in that area.

I’ll start learning this stuff in about 20 days. I’ll start with all the low-level things like programming OpenGL/D3D and shader programming with Cg. Hopefully by fall I’ll be able to write some shaders that are useful for me and the community.
