Pollen VR: Developing high-end visuals with Unity 5

February 16, 2015 in Community | 7 min. read

More stories from the adventures of an EMEA field engineer! Today, I wanted to share with you the development of Pollen VR. Mindfield Games are using Unity 5’s new graphics features, and I have been able to follow their progress very closely.

[Embedded video]

I spoke with Ville Kivistö, CEO and Co-Founder of Mindfield Games, who is also their technical coder.

What Unity 5 features are you utilising for Pollen VR to achieve such high visual quality?

The new physically based shading and realtime lighting are the definitive features of Unity 5, and we are using them to the fullest. We have material maps for basically every surface, and due to design restrictions the environments need to be dynamically lit. With the new global illumination everything looks gorgeous, and combined with reflection probes the surfaces really come alive. Having a 64-bit editor is also crucial, as developing a high-end PC game can consume a huge amount of memory. We used to have tons of out-of-memory crashes with Unity 4, but those days are now long gone.

[Image: pollen_interaction2]

What techniques did you use for the foliage?

Foliage is a rather simple GPU particle effect. What makes it interesting, though, is how we got the lighting to work with it properly. As Graphics.DrawProcedural doesn't integrate into Unity's lighting passes and the new CommandBuffer API doesn't support compute buffers, we had to come up with a somewhat funky solution.

We have a cube the size of the foliage bounds, so we know that whenever the cube is visible, the foliage needs to be rendered as well. Whenever the cube's OnWillRenderObject() is called, we render the compute buffer particles to two render targets in a single pass with MRT, using the settings of the currently rendering camera. One texture holds diffuse and roughness data, and the other holds normal and depth. When we get to rendering the actual cube, the cube shader just takes those buffers as parameters and outputs the corresponding data. Depth is written manually, so we get perfectly Z-clipped output. The lighting of the leaves is affected by all lights and shadows, so they look the way we want them to.

And because all the leaves are GPU particles, animating them is really cheap. So they aren't just a static mesh, but can react to the environment realistically (naturally with some limitations).
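To make the described flow a little more concrete, here is a minimal C# sketch of a proxy cube that renders compute-buffer particles into two MRT targets from OnWillRenderObject(). It is not Mindfield's actual code: the buffer layout, property names and render-texture formats are assumptions, and the particle shader (which would need two colour outputs and a manual depth write) is not shown.

```csharp
// Hypothetical sketch only - names, formats and the particle shader are assumptions.
using UnityEngine;

[RequireComponent(typeof(Renderer))]
public class FoliageProxy : MonoBehaviour
{
    public Material particleMaterial;    // draws the compute-buffer particles into both targets
    public ComputeBuffer particleBuffer; // filled elsewhere by a compute shader
    public int particleCount;

    RenderTexture albedoRoughRT;  // MRT target 0: diffuse + roughness
    RenderTexture normalDepthRT;  // MRT target 1: normal + depth

    // Called for every camera that is about to render the proxy cube.
    void OnWillRenderObject()
    {
        Camera cam = Camera.current;
        EnsureTargets(cam.pixelWidth, cam.pixelHeight);

        RenderTexture previous = RenderTexture.active;

        // Bind both colour targets at once (MRT) and render the particles in a single pass.
        RenderBuffer[] colorTargets = { albedoRoughRT.colorBuffer, normalDepthRT.colorBuffer };
        Graphics.SetRenderTarget(colorTargets, albedoRoughRT.depthBuffer);
        GL.Clear(true, true, Color.clear);

        particleMaterial.SetBuffer("_Particles", particleBuffer);
        particleMaterial.SetPass(0);
        Graphics.DrawProcedural(MeshTopology.Points, particleCount);

        RenderTexture.active = previous;

        // The cube's own shader samples these textures and writes the stored depth
        // manually, so the foliage clips correctly against the rest of the scene.
        Material cubeMaterial = GetComponent<Renderer>().sharedMaterial;
        cubeMaterial.SetTexture("_FoliageAlbedoRough", albedoRoughRT);
        cubeMaterial.SetTexture("_FoliageNormalDepth", normalDepthRT);
    }

    void EnsureTargets(int width, int height)
    {
        // (A real implementation would release the old targets when resizing.)
        if (albedoRoughRT != null && albedoRoughRT.width == width && albedoRoughRT.height == height)
            return;
        albedoRoughRT = new RenderTexture(width, height, 24, RenderTextureFormat.ARGBHalf);
        normalDepthRT = new RenderTexture(width, height, 0, RenderTextureFormat.ARGBHalf);
    }
}
```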

[Image: pollen_hydro_l]

Any techniques or workflows you can share for Global Illumination?

For us it worked very well straight out of the box. The default values provide a good balance between bake times and quality. For day-to-day use we keep quite conservative values to enable quick baking, but our automatic build system scales the values up to provide even better quality GI for our nightly build.
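As an illustration of that kind of workflow (not Mindfield's actual build script), a nightly job could raise the GI quality from an editor script before kicking off the precompute. The resolution property and values below are assumptions and should be checked against your Unity version.

```csharp
// Hypothetical editor script - the resolution property and values are assumptions,
// not Mindfield's actual settings.
using UnityEditor;

public static class NightlyGIBake
{
    [MenuItem("Build/Bake High Quality GI")]
    public static void BakeHighQualityGI()
    {
        // Day-to-day the project keeps conservative values for quick iteration;
        // the nightly build raises them before baking (value is illustrative).
        LightmapEditorSettings.realtimeResolution = 2.0f;

        // Bake synchronously so the build script can continue once GI is done.
        Lightmapping.Bake();
    }
}
```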

Are you using a mixture of realtime and baked lighting?

Our game design requires that all lighting is realtime. We don't have a single static lightmap in the game.

In Unity 4, we used to work around that restriction by placing dozens of ambient point lights around the scene, trying to fake global illumination that way. In the end it was horrible, because maintaining and designing them was very tedious and time-consuming. It was also pretty much impossible to keep all the lights from leaking through the walls.

The number of draw calls and the fillrate requirements caused by so many lights were also getting quite high, which in turn hampered the framerate. When we got our hands on the Unity 5 beta, we just enabled Enlighten and removed all the fake GI point lights: everything looked better, ran faster, and GI worked perfectly in realtime when disabling or animating lights.

We had also built a custom cubemap reflection system, with box projection and everything. It worked rather well, but using it required dozens of custom shaders and the editor side was constantly missing features. We are very happy that the new reflection probes have basically replaced our own system. They require less maintenance and the workflow is much simpler.

[Image: pollen_lightshafts]

How did you achieve the glow effect in the corridors?

We have a few different volumetric effects that we use throughout the game. For spotlights we like to use Robert Cupisz' implementation, which gives very nice volumetrics that perform very well. As spotlights are very versatile, it's easy to use them for height-based fog as well.

In some parts of the game you might see some fluid volumetrics, for which we use Fluidity from the Asset Store. We use it at all scales, all the way from a lighter's flame to filling a room with gases. It looks awesome, and physics simulations always look yummy.

For outdoors, we use our own custom post-process volumetric fog solution, as we want the player to feel the density of Titan's atmosphere.
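The details of that fog effect aren't covered in the interview, but a basic depth-based post-process fog in Unity follows the pattern below. The shader name and parameters are placeholders, and a real volumetric solution would do considerably more work (ray marching, scattering and so on).

```csharp
// Minimal depth-fog image effect sketch - shader name and parameters are placeholders.
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class PostProcessFog : MonoBehaviour
{
    public Shader fogShader;  // e.g. a shader that samples _CameraDepthTexture
    public Color fogColor = new Color(0.9f, 0.7f, 0.4f);
    public float density = 0.02f;

    Material fogMaterial;

    void OnEnable()
    {
        // The fog shader needs access to scene depth.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
        fogMaterial = new Material(fogShader);
    }

    // Runs after the camera has rendered; blends fog over the image based on depth.
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        fogMaterial.SetColor("_FogColor", fogColor);
        fogMaterial.SetFloat("_Density", density);
        Graphics.Blit(source, destination, fogMaterial);
    }
}
```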

[Image: pollen_hydro_d]

What tools do you use to generate your PBR Textures?

Adobe Photoshop and Quixel Suite along with some reference materials. We have our own "playground" scene where artists can inspect models in various lighting conditions.

Have you made modifications to Unity’s new Standard Shader? If so, any tips’n’tricks?

As our base is not tied to a single point in time and space, we need to be able to render it differently. As we're a small indie studio, we don't have the resources to start producing multiple versions of assets that look different based on how old they are. Our solution was to make an extended version of the new Standard Shader that adds "grittiness" to materials. Best of all, it works for all objects, and with the new material pipeline we can have custom grittiness based on material type. With just one shader and a few lines of code, we can change the look and feel completely.

Currently Unity doesn't support code injection into the Standard Shader. Our solution is to just copy the built-in shader and keep our code in include files, so we only have to write a couple of #includes in strategic places whenever the Standard Shader changes.

[Image: pollen_grunge_on]

How did you deal with anti-aliasing?

As MSAA doesn't cope that well with deferred rendering, we have to rely on post-processing solutions. The one we have chosen is SMAA, as it provides a nice and clean resolve with good performance. Even though it lacks subpixel and temporal anti-aliasing, the final result is good enough even for the Oculus Rift.

Unity 5’s new graphics features are high-end; how has performance been?

Unity 5's new MRT-based deferred path increased the framerate by about 30% compared to the legacy two-pass deferred path. Our draw calls can climb quite high, and skipping one pass helps a lot to keep those calls down. As virtual reality is very important to us (we recommend playing Pollen with the Oculus Rift), it's crucial that the framerate stays as high as possible. Therefore, we provide as many options as possible for tweaking the visuals to match players' hardware and framerate requirements. With maximum settings Pollen can really turn on those GPU fans, but you can scale the options down a bit and play on an older GPU with the Oculus Rift if you like.
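The interview doesn't list which options Pollen exposes, but Unity's QualitySettings API is the usual place to hang that kind of scaling. The presets below are purely illustrative.

```csharp
// Illustrative only - the preset values are assumptions, not Pollen's actual options.
using UnityEngine;

public static class VisualOptions
{
    // Apply a rough "low / medium / high" visual preset at runtime (level = 0, 1, 2).
    public static void Apply(int level)
    {
        // Switch the project quality level and apply expensive changes immediately.
        QualitySettings.SetQualityLevel(level, true);

        // Scale a few of the more expensive knobs with the chosen level.
        QualitySettings.pixelLightCount = 2 + level * 2;
        QualitySettings.shadowDistance  = 40f + level * 40f;
        QualitySettings.lodBias         = 1f + level * 0.5f;
        QualitySettings.vSyncCount      = 0;   // let the HMD control frame pacing
    }
}
```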

[Image: pollen_rifteffect]

Any advice on optimisation for such a large scene?

Unity's occlusion culling is very efficient and it handles most of the things we want. However, occlusion culling deals only with rendering, and only in play mode.

As we have spent a lot of time making everything behave physically correctly, be it books, a basketball, a microwave or a punching bag, we have a huge number of physics objects in the game. Unfortunately, Umbra doesn't help us much with physics, so we had to write our own custom portal/culling system. Because the rooms of our moon station are efficiently separated by safety doors, we were able to write a simple portal system based on those doors. We simply disable all the rooms the player can’t see. This helps with physics and even with Umbra, as it has less culling work to do. In the editor, we can also easily activate just a single room to keep draw calls and the poly count low, and the editor responsive.
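As a rough illustration of the idea (not Mindfield's actual system), a door-based room toggle in Unity can be as simple as the sketch below; the room/door structure and names are assumptions.

```csharp
// Hypothetical sketch - room/door structure and names are assumptions.
using System.Collections.Generic;
using UnityEngine;

public class SafetyDoorPortal : MonoBehaviour
{
    public GameObject roomA;   // root object of the room on one side of the door
    public GameObject roomB;   // root object of the room on the other side
    public bool isOpen;

    static readonly List<SafetyDoorPortal> allDoors = new List<SafetyDoorPortal>();

    void OnEnable()  { allDoors.Add(this); }
    void OnDisable() { allDoors.Remove(this); }

    // Keep only the player's room and rooms reachable through open doors active;
    // everything else (renderers, physics, scripts) is disabled with the room root.
    public static void UpdateActiveRooms(GameObject playerRoom)
    {
        var visible = new HashSet<GameObject> { playerRoom };
        foreach (var door in allDoors)
        {
            if (!door.isOpen) continue;
            if (door.roomA == playerRoom) visible.Add(door.roomB);
            if (door.roomB == playerRoom) visible.Add(door.roomA);
        }

        foreach (var door in allDoors)
        {
            door.roomA.SetActive(visible.Contains(door.roomA));
            door.roomB.SetActive(visible.Contains(door.roomB));
        }
    }
}
```

Calling UpdateActiveRooms whenever the player crosses a door, or a door opens or closes, keeps the inactive rooms out of both rendering and physics.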

Out of the box, Unity 5 handled everything else for us; we didn't need to do any additional optimisation for our large scenes, everything just worked!

Thanks to Ville for talking to me. I can’t wait to see this game released!
