
Unity 3 technology – Surface Shaders

July 17, 2010 in Technology, Unity Products and Services by Aras Pranckevičius

In Unity you can write your own custom shaders, but it’s no secret that writing them is hard, especially when you need shaders that interact with per-pixel lights & shadows. In Unity 3, that would be even harder because in addition to all the old stuff, your shaders would have to support the new Deferred Lighting renderer. We decided it’s time to make shaders somewhat easier to write.

Warning: a technical post ahead with almost no pictures!

Over a year ago I had a thought that “Shaders must die” (part 1, part 2, part 3). And what do you know – turns out we’re doing this in Unity 3. We call this Surface Shaders, because I have a suspicion “shaders must die” as a feature name wouldn’t have flown very far.

Idea

The main idea is that 90% of the time I just want to declare surface properties. This is what I want to say:

Hey, albedo comes from this texture mixed with this texture, and normal comes from this normal map. Use Blinn-Phong lighting model please, and don’t bother me again!

With the above, I don’t have to care whether this will be used in a forward or deferred renderer, how various light types will be handled, how many lights per pass will be done in a forward renderer, or how some indirect illumination SH probes will come in, etc. I’m not interested in all that! These dirty bits are the job of rendering programmers; just make it work, dammit!

This is not a new idea. Most graphical shader editors that make sense do not have “pixel color” as the final output node; instead they have some node that basically describes surface parameters (diffuse, specularity, normal, …), and all the lighting code is usually not expressed in the shader graph itself. OpenShadingLanguage is a similar idea (but because it’s targeted at offline rendering for movies, it’s much richer & more complex).

Example

Here’s a simple but complete Unity 3.0 shader that does diffuse lighting with a texture & a normal map.

  Shader "Example/Diffuse Bump" {
    Properties {
      _MainTex ("Texture", 2D) = "white" {}
      _BumpMap ("Bumpmap", 2D) = "bump" {}
    }
    SubShader {
      Tags { "RenderType" = "Opaque" }
      CGPROGRAM
      #pragma surface surf Lambert
      struct Input {
        float2 uv_MainTex;
        float2 uv_BumpMap;
      };
      sampler2D _MainTex;
      sampler2D _BumpMap;
      void surf (Input IN, inout SurfaceOutput o) {
        o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
        o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
      }
      ENDCG
    } 
    Fallback "Diffuse"
  }

[Image: Surface Shader – Diffuse Normalmapped]

Given a pretty model & textures, it can produce pretty pictures! How cool is that?

I grayed out bits that are not really interesting (declaration of serialized shader properties & their UI names, shader fallback for older machines etc.). What’s left is Cg/HLSL code, which is then augmented by tons of auto-generated code that deals with lighting & whatnot.

This surface shader, dissected into pieces:

  • #pragma surface surf Lambert: this is a surface shader with the main function “surf” and a Lambert lighting model. Lambert is one of the predefined lighting models, but you can write your own.
  • struct Input: input data for the surface shader. This can have various predefined inputs that will be computed per-vertex & passed into your surface function per-pixel. In this case, it’s two texture coordinates.
  • surf function: the actual surface shader code. It takes Input and writes into SurfaceOutput (a predefined structure, shown below). It is possible to write into custom structures, provided you use lighting models that operate on those structures. The code here just writes Albedo and Normal to the output.
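
For reference, this is roughly what the predefined SurfaceOutput structure looks like (a sketch matching the fields the generated code below initializes; the exact declaration lives in Unity’s shader include files):

  struct SurfaceOutput {
      half3 Albedo;    // diffuse color
      half3 Normal;    // tangent space normal, if written to
      half3 Emission;  // emissive color
      half Specular;   // specular power
      half Gloss;      // specular intensity
      half Alpha;      // alpha for transparencies
  };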

What is generated

Unity’s “surface shader code generator” would take this, generate actual vertex & pixel shaders, and compile them to various target platforms. With default settings in Unity 3.0, it would make this shader support:

  • Forward renderer and Deferred Lighting (Light Pre-Pass) renderer.
  • Objects with precomputed lightmaps and without.
  • Directional, Point and Spot lights; with projected light cookies or without; with shadowmaps or without. Well ok, this is only for forward renderer because in Deferred Lighting the lighting happens elsewhere.
  • For Forward renderer, it would compile in support for lights computed per-vertex and spherical harmonics lights computed per-object. It would also generate extra additive blended pass if needed for the case when additional per-pixel lights have to be rendered in separate passes.
  • For Deferred Lighting, it would generate base pass that outputs normals & specular power; and a final pass that combines albedo with lighting, adds in any lightmaps or emissive lighting etc.
  • It can optionally generate a shadow caster rendering pass (needed if custom vertex position modifiers are used for vertex shader based animation; or some complex alpha-test effects are done).

For example, here’s the code that would be compiled for a forward-rendered base pass, with one directional light, 4 per-vertex point lights, 3rd-order SH lights, and optional lightmaps (I suggest just scrolling down):

#pragma vertex vert_surf
#pragma fragment frag_surf
#pragma fragmentoption ARB_fog_exp2
#pragma fragmentoption ARB_precision_hint_fastest
#pragma multi_compile_fwdbase
#include "HLSLSupport.cginc"
#include "UnityCG.cginc"
#include "Lighting.cginc"
#include "AutoLight.cginc"
struct Input {
	float2 uv_MainTex : TEXCOORD0;
};
sampler2D _MainTex;
sampler2D _BumpMap;
void surf (Input IN, inout SurfaceOutput o)
{
	o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
	o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_MainTex));
}
struct v2f_surf {
  V2F_POS_FOG;
  float2 hip_pack0 : TEXCOORD0;
  #ifndef LIGHTMAP_OFF
  float2 hip_lmap : TEXCOORD1;
  #else
  float3 lightDir : TEXCOORD1;
  float3 vlight : TEXCOORD2;
  #endif
  LIGHTING_COORDS(3,4)
};
#ifndef LIGHTMAP_OFF
float4 unity_LightmapST;
#endif
float4 _MainTex_ST;
v2f_surf vert_surf (appdata_full v) {
  v2f_surf o;
  PositionFog( v.vertex, o.pos, o.fog );
  o.hip_pack0.xy = TRANSFORM_TEX(v.texcoord, _MainTex);
  #ifndef LIGHTMAP_OFF
  o.hip_lmap.xy = v.texcoord1.xy * unity_LightmapST.xy + unity_LightmapST.zw;
  #endif
  float3 worldN = mul((float3x3)_Object2World, SCALED_NORMAL);
  TANGENT_SPACE_ROTATION;
  #ifdef LIGHTMAP_OFF
  o.lightDir = mul (rotation, ObjSpaceLightDir(v.vertex));
  #endif
  #ifdef LIGHTMAP_OFF
  float3 shlight = ShadeSH9 (float4(worldN,1.0));
  o.vlight = shlight;
  #ifdef VERTEXLIGHT_ON
  float3 worldPos = mul(_Object2World, v.vertex).xyz;
  o.vlight += Shade4PointLights (
    unity_4LightPosX0, unity_4LightPosY0, unity_4LightPosZ0,
    unity_LightColor0, unity_LightColor1, unity_LightColor2, unity_LightColor3,
    unity_4LightAtten0, worldPos, worldN );
  #endif // VERTEXLIGHT_ON
  #endif // LIGHTMAP_OFF
  TRANSFER_VERTEX_TO_FRAGMENT(o);
  return o;
}
#ifndef LIGHTMAP_OFF
sampler2D unity_Lightmap;
#endif
half4 frag_surf (v2f_surf IN) : COLOR {
  Input surfIN;
  surfIN.uv_MainTex = IN.hip_pack0.xy;
  SurfaceOutput o;
  o.Albedo = 0.0;
  o.Emission = 0.0;
  o.Specular = 0.0;
  o.Alpha = 0.0;
  o.Gloss = 0.0;
  surf (surfIN, o);
  half atten = LIGHT_ATTENUATION(IN);
  half4 c;
  #ifdef LIGHTMAP_OFF
  c = LightingLambert (o, IN.lightDir, atten);
  c.rgb += o.Albedo * IN.vlight;
  #else // LIGHTMAP_OFF
  half3 lmFull = DecodeLightmap (tex2D(unity_Lightmap, IN.hip_lmap.xy));
  #ifdef SHADOWS_SCREEN
  c.rgb = o.Albedo * min(lmFull, atten*2);
  #else
  c.rgb = o.Albedo * lmFull;
  #endif
  c.a = o.Alpha;
  #endif // LIGHTMAP_OFF
  return c;
}

Of those 90 lines of code, 10 are your original surface shader code; the remaining 80 would have had to be written pretty much by hand in Unity 2.x days (well ok, less code would have to be written because 2.x had fewer rendering features). But wait, that was only the base pass of the forward renderer! It also generates code for the additive pass, the deferred base pass, the deferred final pass, optionally the shadow caster pass and so on.

So this should make lit shaders easier to write (it does for me at least). I hope this will also increase the number of Unity users who can write shaders at least threefold (i.e. up from 10 to 30!). It should also be more future-proof, accommodating the changes to the lighting pipeline we’ll make in Unity next.

Predefined Input values

The Input structure can contain texture coordinates and some predefined values, for example the view direction, world space position, world space reflection vector and so on. Code to compute them is only generated if they are actually used. For example, if you use the world space reflection vector to do some cubemap reflections (as an emissive term) in your surface shader, then in the Deferred Lighting base pass the reflection vector will not be computed (that pass does not output emission, so by extension it does not need the reflection vector).
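
For instance, here’s a sketch of that cubemap reflection case (the _Cube sampler and its property are made up for illustration):

  #pragma surface surf Lambert
  struct Input {
      float2 uv_MainTex;
      float3 worldRefl; // world space reflection vector, only computed when used
  };
  sampler2D _MainTex;
  samplerCUBE _Cube;
  void surf (Input IN, inout SurfaceOutput o) {
      o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
      o.Emission = texCUBE (_Cube, IN.worldRefl).rgb; // reflection as emissive term
  }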

[Image: Surface Shader – Rim Lighting]

As a small example, here is the shader above extended to do simple rim lighting:

  #pragma surface surf Lambert
  struct Input {
      float2 uv_MainTex;
      float2 uv_BumpMap;
      float3 viewDir;
  };
  sampler2D _MainTex;
  sampler2D _BumpMap;
  float4 _RimColor;
  float _RimPower;
  void surf (Input IN, inout SurfaceOutput o) {
      o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
      o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
      half rim =
          1.0 - saturate(dot (normalize(IN.viewDir), o.Normal));
      o.Emission = _RimColor.rgb * pow (rim, _RimPower);
  }

Vertex shader modifiers

[Image: Surface Shader – Normal Extrusion]

It is possible to specify a custom “vertex modifier” function that will be called at the start of the generated vertex shader, to modify (or generate) per-vertex data. You know, vertex shader based tree wind animation, grass billboard extrusion and so on. It can also fill in any non-predefined values in the Input structure.

My favorite vertex modifier? Moving vertices along their normals.
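
As a sketch, with the extrusion amount below (_Amount) being a made-up property; vertex:vert in the pragma is what names the modifier function:

  #pragma surface surf Lambert vertex:vert
  struct Input {
      float2 uv_MainTex;
  };
  float _Amount;
  void vert (inout appdata_full v) {
      v.vertex.xyz += v.normal * _Amount; // push each vertex out along its normal
  }
  sampler2D _MainTex;
  void surf (Input IN, inout SurfaceOutput o) {
      o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
  }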

Custom Lighting Models

There are a couple of simple lighting models built in, but it’s possible to specify your own. A lighting model is nothing more than a function that will be called with the filled SurfaceOutput structure and per-light parameters (direction, attenuation and so on). Different functions have to be called in the forward & deferred rendering cases, and naturally the deferred one has much less flexibility. So for any fancy effects, it is possible to say “do not compile this shader for deferred”, in which case it will be rendered via forward rendering.

[Image: Surface Shader – Wrapped Lambert lighting]

Example of a wrapped-Lambert lighting model:

  #pragma surface surf WrapLambert
  half4 LightingWrapLambert (SurfaceOutput s, half3 dir, half atten) {
      dir = normalize(dir);
      half NdotL = dot (s.Normal, dir);
      half diff = NdotL * 0.5 + 0.5;
      half4 c;
      c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
      c.a = s.Alpha;
      return c;
  }
  struct Input {
      float2 uv_MainTex;
  };
  sampler2D _MainTex;
  void surf (Input IN, inout SurfaceOutput o) {
      o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
  }
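
In the deferred (light pre-pass) case the lighting function has a different shape: it receives the accumulated light buffer instead of per-light parameters, so a wrapped falloff like the above can’t be expressed there. A sketch of what the _PrePass variant looks like:

  half4 LightingWrapLambert_PrePass (SurfaceOutput s, half4 light) {
      half4 c;
      c.rgb = s.Albedo * light.rgb; // combine albedo with accumulated lighting
      c.a = s.Alpha;
      return c;
  }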

Behind the scenes

We’re using the HLSL parser from Ryan Gordon’s mojoshader to parse the original surface shader code and infer some things from the abstract syntax tree mojoshader produces. This way we can figure out which members are in which structures, go over function prototypes and so on. At this stage some error checking is done to tell the user that their surface function has the wrong prototype, or that their structures are missing required members – which is much better than failing with dozens of compile errors in the generated code later.

To figure out which surface shader inputs are actually used in the various lighting passes, we generate small dummy pixel shaders, compile them with Cg, and use Cg’s API to query the used inputs & outputs. This way we can figure out, for example, that neither a normal map nor its texture coordinate is actually used in the Deferred Lighting final pass, and save some vertex shader instructions & a texcoord interpolator.

The code that is ultimately generated is compiled with various shader compilers depending on the target platform (Cg for Windows/Mac, XDK HLSL for Xbox 360, PS3 Cg for PS3, and our own fork of HLSL2GLSL for iPhone, Android and upcoming NativeClient port of Unity).

So yeah, that’s it. We’ll see where this goes next, and what happens when Unity 3 is released. I hope more folks will try to write shaders!


Comments (39)


17 Jul 2010, 5:12 pm

cool!

17 Jul 2010, 5:35 pm

@Aras
I’ll start learning this stuff in about 20 days. I’ll start with all the low-level things like OpenGL/D3D programming and shader programming with Cg. Hopefully by fall I’ll be able to write some shaders that are useful for me and the community.

17 Jul 2010, 8:03 pm

Wow, awesome Aras. Good job!

laurent
17 Jul 2010, 10:12 pm

Shader 3.0 is the type of feature which may not dazzle the crowds (not much glitter in a job well done, is there?), but it is far more important than pretty cloth or baked textures because, unlike those eye candies, it addresses the very core of what Unity stands for: workflow. And so far it is the biggest improvement in that area.

Jason Amstrad
18 Jul 2010, 4:18 am

I still think that Unity should also have a visual shader editor like Unreal Engine/UDK has.

18 Jul 2010, 7:59 am

Great job Aras !

Some questions though:

- What if I need only one UV stream and two textures… is the other stream still being allocated?

- What is the typical compilation time (given you may or may not need skinning, shadows, deferred)?

- What does the syntax highlighting look like?

18 Jul 2010, 8:10 am

@Jason: I think everyone agrees. Maybe someday…

@Araya:
- you can declare just one UV set in your Input structure and use it for sampling any number of textures.
- it mostly depends on the number of target platforms. E.g. compiling a shader for D3D9, OpenGL, OpenGL ES 2.0, Xbox 360 & PS3 takes about 1.5 seconds on my PC. Without 360 and PS3 it’s 0.5 seconds or so. We multithread the compilation where we can, but some compilers are not thread safe (e.g. Cg or HLSL2GLSL).
- since it’s just Cg/HLSL code, it has Cg/HLSL syntax. It’s not a new language; a “surface shader” is just some Cg/HLSL functions, with all the non-interesting code generated on top of that.

Carsten
18 Jul 2010, 9:09 am

It’s nice to see that the Unity team is simplifying shader creation for us.

But in my eyes the better way would have been to create a visual shader editor as Jason suggested. Look at this thread:
http://forum.unity3d.com/viewtopic.php?t=36679&highlight=kurt

People are really waiting for it!

18 Jul 2010, 8:45 pm

This is easily the most exciting feature of Unity 3 I’ve seen yet! Why wasn’t this bragged about sooner? I wondered what you meant when you mentioned to me at GDC that you were trying to simplify the shader writing process…

PuckerFactor
18 Jul 2010, 9:25 pm

@Carsten…I totally agree….I was waiting for it in 3.0….and a lot of other people I know were waiting for it. Considering how easy it is to do so many other cool things in Unity…compared to other game engines…I would have thought a visual shader editor was a no-brainer.

bizziboi
19 Jul 2010, 9:13 am

Of course, with Unity’s editor extensibility no one’s gonna stop you from writing a visual shader editor. The surface shader approach will only make it even easier to do so :)

19 Jul 2010, 10:13 am

@Aras:
Thanks for the reply, and happy to hear that you achieved such a good compilation time!

- Ok, it seems I’m missing something with UV sets. I thought the float2 UV streams were generated (filled) by the engine automatically if a texture was declared in the shader code, but considering what you’re saying this isn’t the case… I just have to figure out how to declare a new UV set (RTFM :)

- About syntax highlighting, my concern was: is the part you wrote in grey in your example supposed to be greyed out in the shader editor too, or are there some nice special colors to make it more readable, or is there no special treatment at all, in which case it is interpreted as standard HLSL while in fact it’s not really (like the CGPROGRAM, Tags or Properties words, which have special meanings)?

skovacs1
19 Jul 2010, 5:29 pm

@Araya

-UVs: when you declare a texture property (_MainTex), the UV sets are generated. If you want access to these generated UVs, you must pass them as input to your surface function (uv_MainTex; the name is important!). Then tex2D takes a sampler2D for the texture and the UV set that you want to use as coordinates.

-Syntax highlighting? The code that Aras greyed out is uninteresting, as he clearly said he “grayed out bits that are not really interesting.” Syntax highlighting is really something specific to your chosen editor. If your editor highlights functions and structs in Unity 2.x Cg shader code, then it will be the same in Unity 3.x, as the Input is just a struct, and surface and lighting functions are just functions.

harmless
19 Jul 2010, 9:31 pm

I wonder if this will finally allow GPU skinning? Fingers crossed!

21 Jul 2010, 8:20 am

This is ridiculously wonderful. One of the best additions to Unity so far.
Unity is not only getting better with v3; v3 will also improve your workflow and reduce your dev time by miles. I remember working and tweaking simple HLSL shaders in the past, wondering when this kind of thing would pop up, and here it is. :D
We can now focus on the fun part of a shader: the visuals and the look of your material’s surface. No more need to worry! Omg, amazing! :)
Keep up the good work, guys.
Cheers,

21 Jul 2010, 10:43 am

@skovacs1

- “when you declare a texture property (_MainTex), the UV sets are generated”… that was exactly my first concern. If each time you declare a texture property (_MainTex) a UV stream is generated, then I have no way to have 2 textures share the same UV set without wasting a whole UV stream (memory concerns here), as the second one will be generated automatically by the underlying engine.
Unless some clever work is done by the compiler, detecting inside the vertex shader that the second UV set is never actually referenced by any instruction.
…Or unless UV streams only get generated if explicit UV sets are exported (in the same channel order?)

- About syntax highlighting: I just thought that Unity 3 would integrate its own shader editor (without having to use an external one), like what the Virtools guys did.
This isn’t a big concern though, but since Aras introduced this formalism of “surface shaders” inside Unity, it’d have been nice to see the “irrelevant part” really grayed out in the editor (just like in this blog), so you don’t get confused.
If there’s no integrated shader editor planned for Unity 3 then my question makes no sense, obviously :)

21 Jul 2010, 12:43 pm

I think this is really extremely awesome and certainly development time well spent, because it will save many of us a *lot* of time. While I agree that a visual editor for shaders would be nice to have, I think UT’s time is better spent doing these kinds of improvements and letting the community build their own shader editors on top of that. Guess I’ll have to play around with shaders again ;-)

Jason Amstrad
21 Jul 2010, 3:18 pm

Hmmm,
Something very strange is going on.
Unity 3 now has Illuminate Labs Beast technology and Autodesk has just acquired Illuminate Labs.
I wonder what these guys are cooking up.
What do you think?

Cameron
21 Jul 2010, 11:32 pm

This looks fantastic Aras. I recall reading your “shaders must die” blog posts a while ago and always hoped you’d get time to implement such a progressive shading language in Unity some day.

I dabbled in shader programming for Unity 2.x but never really managed to take my training wheels off, so to speak. I just never found enough time to become intimate with a shader language during development. However, this looks to make shader programming a much more worthwhile investment of production time for small indie teams, which is where Unity really shines.

22 Jul 2010, 2:06 am

@Aras:

If you are not rolling any visual editor, I strongly suggest you put up a dozen or more tutorials on how to learn to write shaders in Unity 3. I have limited knowledge on that side; what I learned came from the excellent book Shaders for Game Programmers and Artists. Even after that, complicated terms and medium-to-hard shaders seem beyond reach. Do you have any book suggestions on your side that we could benefit from to learn this complex area of 3D programming?

22 Jul 2010, 4:56 am

The shader horse dies fast (着色器马死的快)

22 Jul 2010, 12:58 pm

Indeed this is fantastico Aras! Awesome work!

Hope the “couple simple lighting models built-in” are also built into the documentation :)

@Minevr
Indeed, the shader horse died quickly. Wait… what!? :P

23 Jul 2010, 11:54 pm

I’m really looking forward to this! Tons of stuff can be done already with shaders in Unity, but simplifying the process will really increase how much actually is done with shaders.

26 Jul 2010, 9:44 pm

Sounds a bit like mental images’ MetaSL, doesn’t it?

26 Jul 2010, 9:58 pm

@Araya: about multiple UV sets: when you declare uv_MainTex (for example) in your Input structure, this will take the UVs of the mesh and transform them by the “_MainTex” texture’s tiling/offset values. You can use this uv_MainTex in the surface function to sample other textures as well, if having the same tiling/offset for all of them is fine for you. If you need different tiling/offsets for your textures, you need to declare multiple uv_Foo members.
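
A tiny sketch of sharing one UV set between two textures (_DetailTex is a made-up second texture; note there’s no uv_DetailTex declared, so no second UV interpolator is needed):

  struct Input {
      float2 uv_MainTex; // one UV set, transformed by _MainTex tiling/offset
  };
  sampler2D _MainTex;
  sampler2D _DetailTex;
  void surf (Input IN, inout SurfaceOutput o) {
      half3 c = tex2D (_MainTex, IN.uv_MainTex).rgb;
      half3 d = tex2D (_DetailTex, IN.uv_MainTex).rgb; // reuses the same coordinates
      o.Albedo = c * d;
  }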

@harmless: this is actually totally unrelated to GPU skinning.

@FXCarl: in a sense, yes. It’s similar to OpenShadingLanguage as well. Why didn’t we go with either of those? Because they are both primarily targeted at offline / non-realtime rendering; and also because efficient & open compilers from them into actual GPU shaders don’t exist yet. With “surface shaders”, we aren’t inventing a new programming language at all; it’s just Cg/HLSL, so we can use existing compilers on the different platforms. We’re just generating “the boring code” instead of forcing you to type it.

27 Jul 2010, 5:43 pm

Looks cool, but I just hope it won’t make it even harder to write custom FX shaders beyond tex+bumpmap+3lights+(maybe alphablended). If “#pragma surface surf Lambert” is going to save me X hours, it would be great if X > 0. What ARB_fog_exp2 is and how it augments my code took me a while to figure out… All I could find was “here is the fog shader” and “better don’t mess with it”. I’m not saying that documentation by example is wrong. Far from it. It’s required if you want to learn and copy & paste stuff. But if you are serious about writing shaders and your own lighting models, what you need is a reference with a list of all the magic values, secret functions, _UnderscorePredefinedVariables and DONT_TOUCH_THIS semantics.

Jason Amstrad
27 Jul 2010, 11:07 pm

The guys from Unity should also release a video that demonstrates the “Umbra Occlusion Culling” technology.

Koblavi
28 Jul 2010, 5:02 pm

Ok, so summer will be coming to a close very soon. You’ve not blogged in a while, and we (the ones who cannot afford to preorder at this moment) are wondering: when is Unity 3 coming??

Nathalie Abbortini
28 Jul 2010, 9:13 pm

I am wondering the same thing.

29 Aug 2010, 8:30 am

Thanks for your post and the very clear exposition of the aspects of shading.
Long ago, in the ’90s, I did some shading and rendering with RenderMan.
Can you suggest some reading (books or papers) related to your current topic, so that I can get back into the field?
My company just joined Unity and we are doing serious games and simulation for businesses.
Thanks again
yves

shan
1 Sep 2010, 8:09 pm

How would this actually be compatible with DX11? Sure, omitting the vertex shader is fine, but what about the other shader stages coming out, such as the compute shader and so on; are those going to be predefined too? Not sure if this is a good idea later on down the road, when Unity has to upgrade to DX11, losing many more complex functions that a programmer could’ve written.

1 Sep 2010, 10:32 pm

@shan: not sure I agree with this. This does not omit the vertex shader; in surface shaders the vertex shader is mostly generated for you, and so is a large part of the pixel shader.

It seems to me that geometry shaders and compute shaders are not very often used for “regular scene objects”, but more for special-purpose cases. And when we have DX10/11, you’ll be able to write your own, just like in Unity 3.0 you can write your own vertex & pixel shaders if that’s your thing. For the 99% of cases where you want to write a shader for a lit surface, this code generation approach (surface shaders) seems much easier.

Tessellation in DX11 also seems like something that a shader author wouldn’t want to deal with in 99% of the cases when writing shaders for regular objects. Unless I want to exploit tessellation for some funky purpose, all I’d want to do is check a “use tessellation” checkbox and have everything else “just happen” for me.

shan
20 Sep 2010, 11:36 am

So wait, I can still write my own vertex and pixel shaders?

20 Sep 2010, 12:23 pm

@shan: yes, of course. “Surface shaders” in Unity 3.0 are nothing more than a code generator; it takes your code and generates more code around it to handle lighting, forward vs. deferred rendering etc. If you don’t need lighting, or want to handle all the complexity of the different lighting features (light types, shadow mapping, lightmapping, forward vs. deferred rendering etc.) yourself – sure, go ahead; just write plain vertex/pixel shaders.
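
For reference, a minimal sketch of such a plain unlit vertex/pixel shader (names are made up; no surface shader machinery involved):

  Shader "Example/Plain Unlit" {
    Properties {
      _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
      Pass {
        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        #include "UnityCG.cginc"
        sampler2D _MainTex;
        struct v2f {
          float4 pos : POSITION;
          float2 uv : TEXCOORD0;
        };
        v2f vert (appdata_base v) {
          v2f o;
          o.pos = mul (UNITY_MATRIX_MVP, v.vertex); // object to clip space
          o.uv = v.texcoord.xy;
          return o;
        }
        half4 frag (v2f i) : COLOR {
          return tex2D (_MainTex, i.uv); // just the texture, no lighting
        }
        ENDCG
      }
    }
  }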

shan
22 Sep 2010, 2:35 pm

Hi Aras, I see that vertex and frag shaders still work, um… not sure what you mean by handling lighting in deferred rendering and forward rendering, but I do see that I can’t produce anything with the shaders because they don’t take in any sort of light. What do I have to do to set up my own lighting model and enable lighting inside pixel and vertex shaders?

Juanjo
22 Oct 2010, 4:53 am

Definitely shaders must die. Even the version 3.0 ones.
I would like a first level of abstraction, a language to deal with real-world parameters. I want to make stone, skin, and hair materials; I don’t care what the hardware needs to do this.
Then the current level can be there for the effects and performance freaks.
Moreover, I think there should be predefined shaders for the often-needed things such as stone, skin or hair.
Unity is supposed to be friendly and easy to use…

Juanjo
22 Oct 2010, 5:02 am

I mean, maybe if you’re doing a huge project the cost may justify the effort, but usually, even if you have something as common as a decent character, you have to learn the whole language and syntax just to write a couple of shaders. It’s too expensive, and it’s an area in which Unity could stand out by offering a new approach to material design.

unisip
1 Dec 2010, 1:12 pm

Hey Aras,

This surface shader thing is totally awesome!!!

When do we get more comprehensive documentation about all this?

Right now I’m a bit frustrated as I feel somewhat limited to trial and error and copy/paste from other shaders to make it all work.

There are simple things that are probably obvious to you guys but that are not to newcomers. For instance, I couldn’t find a way to disable lighting altogether other than creating a custom lighting model that simply returns 0 (there’s gotta be a more efficient way to do that, right ;-) ).

The result is that I probably get shaders that are much longer than needed. Simple example: I couldn’t get a basic glass shader (transparent fresnel bump reflective cubemap, no lighting) to compile to shader model 2.0 (it’s plain and easy writing the whole thing without surface shaders, but I definitely want to stick to your pipeline as much as I can).

Tks!!

3 Dec 2010, 2:51 am

@unisip: if you do not want to do any lighting, you should not use a surface shader. The reason for surface shaders is “taking care of all lighting details”. If you do not need that, just write a vertex + fragment shader pair.
