
In Unity 5 we’ve been adding many user-visible graphics features (new Standard shader, realtime global illumination, reflection probes, new lightmapping workflow and so on), but we’ve also worked on rendering internals. Besides typical things like “optimizing it” (e.g. multithreaded light culling) and “making it more consistent” (e.g. more consistency between Linear & Gamma color spaces), we’ve also looked at how to make it more extensible.

Internally and within the beta testing group we’ve discussed various approaches. A lot of ideas were thrown around: more script callbacks, assembling small “here’s a list of things to do” buffers, the ability to create complete rendering pipelines from scratch, some sort of visual tree/graph rendering pipeline construction tools and so on. For Unity 5, we settled on the ability to create “list of things to do” buffers, which we dubbed “Command Buffers”.

A command buffer in graphics is a low-level list of commands to execute. For example, 3D rendering APIs like Direct3D or OpenGL typically end up constructing a command buffer that is then executed by the GPU. Unity’s multi-threaded renderer also constructs a command buffer between a calling thread and the “worker thread” that submits commands to the rendering API.

In our case the idea is very similar, but the “commands” are somewhat higher level. Instead of containing things like “set internal GPU register X to value Y”, the commands are “draw this mesh with that material” and so on.

From your scripts, you can create command buffers and add rendering commands to them (“set render target, draw mesh, …”). Then these command buffers can be set to execute at various points during camera rendering.
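In script code, the basic shape looks something like this minimal sketch (the CloudsAfterSkybox class and its cloudMesh/cloudMaterial fields are hypothetical names for illustration; CommandBuffer, CameraEvent, and Camera.AddCommandBuffer are the actual scripting API):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical example: draw an extra mesh right after the skybox.
[RequireComponent(typeof(Camera))]
public class CloudsAfterSkybox : MonoBehaviour
{
    public Mesh cloudMesh;         // assigned in the inspector
    public Material cloudMaterial;
    CommandBuffer buf;

    void OnEnable()
    {
        buf = new CommandBuffer();
        buf.name = "Render clouds";
        // "Draw this mesh with that material"
        buf.DrawMesh(cloudMesh, Matrix4x4.identity, cloudMaterial);
        // Execute the buffer right after the skybox is rendered
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterSkybox, buf);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterSkybox, buf);
    }
}
```

Removing the buffer in OnDisable keeps toggling the component well-behaved; otherwise the buffer would keep executing on the camera.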

For example, you could render some additional objects into the deferred shading G-buffer after all regular objects are done. Or render some clouds immediately after the skybox is drawn, but before anything else. Or render custom lights (volume lights, negative lights etc.) into the deferred shading light buffer after all regular lights are done. And so on; we think there are a lot of interesting ways to use them.

Take a look at the CommandBuffer and CameraEvent pages in the scripting API documentation.

Pictures or it did not happen!

Ok, ok.

For example, we could do blurry refractions:


After opaque objects and the skybox are rendered, the current image is copied into a temporary render target, blurred and set up as a global shader property. The shader on the glass object then samples that blurred image, with UV coordinates offset based on a normal map to simulate refraction. This is similar to what the GrabPass shader feature does, except you can do more custom things (in this case, blurring).
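A command buffer along these lines could be sketched as follows (the class, material and shader property names are made up; the actual sample project differs in details such as blur iterations):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch: copy the screen after opaque+skybox rendering,
// blur a downsampled copy, and expose it as a global shader texture.
[RequireComponent(typeof(Camera))]
public class GrabBlurredScreen : MonoBehaviour
{
    public Material blurMaterial; // assumed to implement the blur
    CommandBuffer buf;

    void OnEnable()
    {
        buf = new CommandBuffer();
        buf.name = "Grab screen and blur";

        // Copy the current screen contents into a temporary RT
        // (negative sizes mean "camera pixel size divided by this")
        int screenCopyID = Shader.PropertyToID("_ScreenCopyTexture");
        buf.GetTemporaryRT(screenCopyID, -1, -1, 0, FilterMode.Bilinear);
        buf.Blit(BuiltinRenderTextureType.CurrentActive, screenCopyID);

        // Downsample to half resolution while blurring
        int blurredID = Shader.PropertyToID("_BlurredScreenTexture");
        buf.GetTemporaryRT(blurredID, -2, -2, 0, FilterMode.Bilinear);
        buf.Blit(screenCopyID, blurredID, blurMaterial);
        buf.ReleaseTemporaryRT(screenCopyID);

        // Glass shaders sample this global texture for refraction
        buf.SetGlobalTexture("_GrabBlurTexture", blurredID);

        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterSkybox, buf);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterSkybox, buf);
    }
}
```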

Another example use case: custom deferred lights. Here are sphere-shaped and tube-shaped lights:


After the regular deferred shading light pass is done, a sphere is drawn for each custom light, with a shader that computes illumination and adds it to the lighting buffer.
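A hedged sketch of that idea (the class and field names are hypothetical; the light shader is assumed to read Unity’s G-buffer textures and blend additively into the light buffer):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch of a custom "sphere light" in deferred shading.
[RequireComponent(typeof(Camera))]
public class CustomSphereLight : MonoBehaviour
{
    public Mesh sphereMesh;        // unit sphere, assigned in the inspector
    public Material lightMaterial; // additive shader computing the light's contribution
    public Vector3 lightPosition;
    public float lightRange = 5.0f;
    CommandBuffer buf;

    void OnEnable()
    {
        buf = new CommandBuffer();
        buf.name = "Custom sphere light";
        // Scale a unit sphere to cover the light's range and draw it
        // after Unity's own deferred lights are done.
        var trs = Matrix4x4.TRS(lightPosition, Quaternion.identity,
                                Vector3.one * lightRange);
        buf.DrawMesh(sphereMesh, trs, lightMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterLighting, buf);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterLighting, buf);
    }
}
```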

Yet another example: deferred decals.


The idea is: after the G-buffer is done, draw each “shape” of the decal (a box) and modify the G-buffer contents. This is very similar to how lights are done in deferred shading, except instead of accumulating the lighting we modify the G-buffer textures.

Each decal is implemented as a box here, and affects any geometry inside the box volume.
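The core of such a renderer could look like this sketch (class and field names are hypothetical; the decal material is assumed to read the depth texture, reconstruct the surface position inside the box, and write the decal’s albedo):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch of a deferred decal renderer: after the G-buffer
// is filled, draw a box per decal with a shader that modifies it.
[RequireComponent(typeof(Camera))]
public class SimpleDecalRenderer : MonoBehaviour
{
    public Mesh cubeMesh;          // unit cube, assigned in the inspector
    public Material decalMaterial; // writes the decal's albedo
    public Transform[] decals;     // each transform defines one decal box
    CommandBuffer buf;

    void OnEnable()
    {
        buf = new CommandBuffer();
        buf.name = "Deferred decals";
        // Target the diffuse G-buffer, keeping the camera's depth buffer
        buf.SetRenderTarget(BuiltinRenderTextureType.GBuffer0,
                            BuiltinRenderTextureType.CameraTarget);
        foreach (var d in decals)
            buf.DrawMesh(cubeMesh, d.localToWorldMatrix, decalMaterial);
        // Run after the G-buffer is done, before the lighting pass
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeLighting, buf);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeLighting, buf);
    }
}
```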

Actually, here’s a small Unity (5.0 beta 22) project folder that demonstrates everything above:

You can see that all the cases above aren’t even complex to implement – the scripts are about a hundred lines of code.

I think this is exciting. Can’t wait to see what you all do with it!


  1. Hi Aras, when I switch the project to the iOS platform, I get a “Tiled GPU perf. warning”. I can’t find any “DiscardContents” API for the CommandBuffer’s temporary RT. Any idea?
    BTW, the first “blurry refractions” scene gets 12fps on the iPhone 5. Is that normal?

  2. Hello, I’m digging into this direction for a decal system, but I’m facing an issue:
    In your current demo project, if you lower your light’s shadow strength to less than 1.0f,
    let’s say 0.6 for the sun, and 0.4 for the other one,

    Toggling the master script “DeferredDecalRenderer” reveals that shadows are lighter when it’s enabled than when it’s disabled.
    Do you know why? Is there a way to maintain the shadow intensity with deferred decals?

    Thanks a lot for this great opportunity to learn CommandBuffers by the way !

  3. I tried the Area Light demo and it looks really nice. Much to my regret, these lights do not contribute to Enlighten GI. Is there any way to do it? Years ago I read the Enlighten paper where they say that direct lighting input is handled by the CPU. Is this still true? Is there any API support for this? I would like to have dynamic area lights that contribute to Enlighten.

  4. Guys please, quit asking for this tiny feature or that feature thingy. What we need is a customizable and flexible rendering pipeline and that’s what Unity is doing.
    Great post Aras, keep making unity the most flexible engine ever! :D

  5. Hi there, we’ve read that article with great interest! CommandBuffers will certainly simplify a couple of things when rendering custom lights in a deferred fashion. The ability to pass down parameters and define custom proxy geometry is great!

    However, right now we don’t see how a similar approach could also be applied to forward rendering. In Unity 5.0, is there any way to define custom parameters for specific lights? Is there something we are missing?

    Thanks a lot!

    1. Aras Pranckevičius

      February 11, 2015 at 5:43 pm

      The ability to add custom per-light parameters on any light is high on my “list of things to do”, but not today. I’m thinking about perhaps the ability to add a MaterialPropertyBlock per light, which could then be consumed by any shaders that deal with said light (either in forward or deferred).

      1. Sounds perfect. Thx for the Info!

  6. Hi all,

    how do I download or get Unity 5 Pro?

  7. Thanks Aras and team. Fantastic addition to Unity’s functionality. Having just implemented a limited deferred decal system in 4.6 I can really appreciate the additional scope this will bring to 5. Brings a happy tear to my eye :)

    1. Sorry about the double post, Multitasking sometimes gets the better of me.

  8. Thank you Aras and team! Having just implemented a limited deferred decal system in 4.6 I can really appreciate this level of access to the render pipeline. So much scope has just been unveiled. Hugs :)

  9. Awesome demo! Would there be an easy way to blit the result of the decal projection onto the surface texture as an optimisation?

  10. Hi,

    We threw in deferred decals. Love them. But could you give some clues on how best to optimise them when they are not seen? Is it worth optimising them when they aren’t in view? Any suggestions for optimising? (We have an open world game and fill rate can potentially be an issue on console.)

    So any suggestions on how we can approach it with existing example will be super helpful, thanks!

    1. Aras Pranckevičius

      February 9, 2015 at 3:04 pm

      Hey, the code example is not a production-ready decal system, and it tries really hard to say so in the readme :) This is an example of one possible use of command buffers, with a small amount of code.

      wrt optimizing deferred decals – probably a good idea to cull them against the camera frustum at least; or alternatively cull spatially close “chunks” of decals against the frustum. Could also add some invisible fake renderer objects and tie decal visibility to them (that way they could benefit from occlusion culling too).
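      The frustum check mentioned here could be sketched like this (hypothetical helper, using Unity’s built-in GeometryUtility):

      ```csharp
      using UnityEngine;

      // Hypothetical sketch: cull a decal's bounding box against the camera frustum.
      public static class DecalCulling
      {
          public static bool IsVisible(Camera cam, Bounds decalBounds)
          {
              // Six world-space planes of the camera frustum
              Plane[] planes = GeometryUtility.CalculateFrustumPlanes(cam);
              return GeometryUtility.TestPlanesAABB(planes, decalBounds);
          }
      }
      ```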

  11. Deferred decals? Finally! Because I’m sick of using third-party solutions and my own scripts using projectors. I’m looking forward to the list of features available for the Indie and Pro versions. It would be cool to include decals in the Free version. Thanks for your hard work, Unity T.!

  12. What are the system requirements for this? Only for DX11, GLES 3.1?

    1. Aras Pranckevičius

      February 9, 2015 at 1:03 pm

      Depends on what you mean by “this”. Command buffers don’t have any particular hardware requirements and should work everywhere.

      What you do in your command buffers can have some requirements; and that depends on what exactly you do there. For example, in the sample project I’ve added, blurry refraction should work everywhere, while the other two are targeted at deferred shading (so their hardware requirements are the same as deferred shading’s: shader model 3+ & multiple render targets; in practice that means DX9, DX11, OpenGL, and some high-end mobiles).

  13. Yesterday I was thinking about a way to have decals in deferred rendering mode and wondered whether Unity would let us modify the deferred rendering process. Now I am happy!
    But I should have read this earlier to save some time brainstorming about stuff which you already implemented :)

    1. Aras Pranckevičius

      February 8, 2015 at 6:24 pm

      Does tweeting count? Tweeted about it four months ago!

      1. I was talking about my fault not to look up the news regularly. Like I said this post was already there when I thought about it :)

        Sadly the implemented examples don’t solve my issue entirely. I have to get into how it works to see what can be done. Basically I have a similar issue to the one you faced with the Terrain Standard AddPass shader.

        I have to get to know how the decals work in detail and probably I will post something in the forums.

  14. Awesome post Aras. The example scenes will surely be helpful.

  15. Love the blur effect. Tried to hack it onto a new UI panel to make an iOS-style blurring layer but it’s not blurring other UI objects. Anyone here wanna help me? :D

    1. No one wants to help me?

  16. “more script callbacks”

    How about you start using .NET events? Like… Everywhere. It’s 2015, and Unity still doesn’t have an “On Selection Changed” event!

    1. Aras Pranckevičius

      February 7, 2015 at 8:23 pm

      I just called them “callbacks”. Could have been events, delegates, whatever — it does not change the fact that having arbitrary user script code executing in the middle of frame rendering makes it very, very hard to optimize.

      1. Was about to point that out too lol. Not sure what the issue would be with what syntax you use to describe the process :-P

  17. Awesome work, really. I strongly believe that decal tools and area lights should be built in for both forward and deferred rendering. Thanks Unity team.

    1. Totally agree with you. It should really be implemented for forward rendering. Otherwise, support MSAA on deferred.

    2. You may have noticed that these decals are literally called “deferred decals”, so they don’t work in forward rendering, by definition. Their big benefit is that they’re really really cheap (just like lights are cheap in deferred).

      If you were to implement decals for forward rendering, you would have to do it completely differently and they wouldn’t be so cheap that you could just use them like you can use deferred decals.

  18. *Mind Blown* Holy crap. I heard command buffers were awesome but HOLY CRAP! I wonder if you could do temporal reprojection using this? Save the last several frames and their velocity buffers and reproject them back into a higher-res image? Obviously you would need a velocity buffer, which would be the hard part. But still, those exist. *drool*. So much potential… I wonder if these exist on shadowing lights? Hmmm… this sounds so awesome. Must go cuddle the documentation. A lot!

    keep up the awesome guys!

    1. Aras Pranckevičius

      February 7, 2015 at 8:20 am

      I think you can do temporal reprojection without command buffers too. Have an image effect where you copy away the previous input image, and use it in the next frame. Haven’t actually tried, but I don’t see why it should not work.
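      That could be sketched as an image effect like this (hypothetical class and shader property names; the blend material is an assumption):

      ```csharp
      using UnityEngine;

      // Hypothetical sketch: keep a copy of the previous frame, feed it to
      // a blend material, then remember the current frame for next time.
      [RequireComponent(typeof(Camera))]
      public class PreviousFrameEffect : MonoBehaviour
      {
          public Material blendMaterial; // assumed to sample _PrevFrame
          RenderTexture prevFrame;

          void OnRenderImage(RenderTexture src, RenderTexture dst)
          {
              if (prevFrame == null)
                  prevFrame = new RenderTexture(src.width, src.height, 0);

              blendMaterial.SetTexture("_PrevFrame", prevFrame);
              Graphics.Blit(src, dst, blendMaterial); // blend current with previous
              Graphics.Blit(src, prevFrame);          // store current frame
          }
      }
      ```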

      1. true, true.

  19. This is awesome! I’m especially glad you included the sample project. I can already think of a ton of things I could use this for…

  20. Very interesting, but only for Pro. Anyway I’m happy that Unity evolves its feature set.

    Just curious: you mentioned that there were several ideas thrown around during the discussion, all rejected except one, but is there at least one other idea saved for “someday” (like the ability to write a custom rendering pipeline from scratch)?

    1. Aras Pranckevičius

      February 6, 2015 at 8:19 pm

      I don’t think we ever go and decide “no, we’ll never ever do this”. Or if we do, that’s extremely rare.

      The idea of “let’s add more scripting callbacks” would be the most flexible one (and very easy for us to implement), but for example it does prevent certain kinds of optimizations we want to do (more multithreading in the rendering loops etc. – whenever there’s a script callback that can potentially do anything imaginable, things get tricky).

      Command buffers were well received in the beta group and by customers we talked with; they were quite easy to implement and are quite powerful.

      A completely arbitrarily-configurable rendering pipeline would have been significantly harder to implement, and frankly we weren’t totally sure it wouldn’t be overkill. Maybe someday we’ll do it, but I think we’ll first start looking at smaller pieces of possible extensibility (random example: the ability to set up custom shader parameters on lights, for the cases when you do want to attach custom data to them, etc.).

      1. Thanks for the reply, I understand. I think I expressed myself badly; I should have asked about the nearer future (a 3-year time frame) rather than the never-ever thing :P And thanks for the explanation of what you’re thinking for the near future, that’s what I asked about.

      2. Re: the ability to write a pipeline from scratch: I think this is very worth considering for the longer term, due to the fact that customers sometimes have very specific requirements and tech evolves very fast. For example, screen resolutions might jump insanely fast in the relatively near future, which may render deferred rendering useless due to performance issues at higher resolutions, and Forward+ (or something else, no matter) will be needed. In cases where you’re too slow to catch the trend, the customer could implement it themselves. This opens room for asset developers too, and they could do something interesting (check Freestyle rendering in Blender, for example).

  21. The deferred decals technique is exactly how we did the 2D on 3D stuff in Sideway : NewYork! Of course that was in Gamebryo and we had to write our own deferred rendering pipeline to do it. But it made the decals quite cheap to render!

  22. Hi,

    Great work, but I need to know a very simple but important piece of information. Has the management of dynamic meshes been optimised? Currently, modifying vertex buffers and index buffers in real time is so slow (because of the Unity implementation) that we have been forced to find a different solution, uploading textures in real time and using shaders instead. I am still sure that, in our case, using dynamic index buffers properly would be a better solution.

  23. Okay, continuing our conversation from Twitter about using command buffers for forward rendered area lights. Unfortunately I just realized that it ultimately wouldn’t work because it sounds like CameraEvent.AfterForwardOpaque and CameraEvent.AfterForwardAlpha both happen after all the objects finish rendering. And while I could probably make it work for the opaque passes, it would be totally broken once alpha stuff enters the picture.

    Not to mention, I forgot about all that other nasty stuff needed to make good lighting, like lightmapping integration, shadow options, etc. So this idea probably isn’t going anywhere. :(

  24. Oh man, those samples are awesome :). Any plans to make all of those become native solutions in a future Unity 5 release?

    1. Aras Pranckevičius

      February 6, 2015 at 3:47 pm

      Having area lights & decals as built-in features would be good, yes. Someday! However, the point of this post is that we’re also trying to make the rendering pipeline extensible, so that even if there’s some random feature that Unity doesn’t do, it’s not too hard to implement it yourself. I know, it’s a double-edged sword.

      1. Well, those two should become native at least :D Come on, how long has Unity been without a proper decal system? :)

      2. Oh btw @Aras, I understand that this is just a simple sample, so I’m mostly curious. In the decal sample, when the camera gets close inside the decal bounds, the decals disappear even though the camera near clip is “0”.
        What’s actually causing that?

        1. Aras Pranckevičius

          February 6, 2015 at 7:08 pm

          When the camera gets inside the decal’s box, it does not render properly. Production code should switch to z-fail rendering of the decal shape when that’s the case.

  25. Northern Vision Studio

    February 6, 2015 at 3:34 pm

    Most excellent. I hope when U5 is released that these example projects will remain somewhere easy to find.

    1. Aras Pranckevičius

      February 6, 2015 at 3:44 pm

      This example will be in the documentation.

      1. Excited to try this and see the documentation! Great stuff!

  26. Does this mean Unity5 can be used as a rendering backend for C# applications by bypassing its scene structure? That’s pretty huge.

    1. Aras Pranckevičius

      February 6, 2015 at 2:27 pm

      “Maybe”. You still can’t do “totally arbitrary things” from command buffers. But this, coupled with Graphics.DrawMesh (which has existed since forever), does open up quite many interesting opportunities indeed.

  27. This pleases me greatly. Hurray for extensibility that lets you casually add several killer in-demand features as examples. Leverage indeed!