
We’d like to share with you a project that was built during the R&D period of the Physically Based Shader and Reflection Probes.

This benchmark project is one of several that helped us identify which functional improvements were necessary from an artist’s production perspective.

We compared offline and realtime rendering methods and output, aiming both to increase visual quality and to give artists a smoother, more streamlined production workflow, one that opens up playful possibilities for graphics to extend beyond realism into stylization.

The demo uses the Standard PBR shader and displays a range of shiny and rough metallic, plastic and ceramic materials, which naturally use the new native cubemap reflections (or HDR reflection probes). The material output in the movie is at a prototype stage and the shader is still evolving.

The textures changed constantly as the shader evolved. In total, the demo is composed of around 30 texture sets, both manually authored and procedurally generated; at this point, scanned textures were not used at all. A typical texture set consists of albedo, specular, gloss, occlusion and normal maps, with sizes ranging from 256 px to 4K. Background surfaces demanded less surface detail and fewer textures. In some cases, we casually created materials by pushing sliders to adjust color and float values until they matched the references. The secondary (detail-map) slots add a layer of dust, cracks and crevices to the surfaces, which can be spotted in the close-up camera shots.

The heated-up revolving core is achieved simply by animating emissive values and combining the result with HDR bloom to give a glowing-hot impression.
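The animated emissive value is essentially a scalar curve driven over time. As an illustration only (the function name and the values here are hypothetical; the actual demo animates a material property inside Unity), the shape of such a pulse could be sketched in Python as:

```python
import math

def emissive_intensity(t, base=0.0, peak=8.0, period=4.0):
    """Sine pulse remapped to [base, peak] over time t (seconds).

    HDR emissive values well above 1.0 are what let a bloom pass
    read the core as glowing hot. All parameters are hypothetical.
    """
    s = 0.5 * (1.0 + math.sin(2.0 * math.pi * t / period))
    return base + (peak - base) * s

print(emissive_intensity(1.0))  # 8.0: peak of the pulse
print(emissive_intensity(3.0))  # 0.0: trough of the pulse
```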

The cave is a large-scale environment, and the 100-meter-tall machine itself was intentionally used to challenge performance and serve as a lighting benchmark. This called for a variety of convolved HDR reflection probes/cubemaps placed along its body, able to adapt as the light gradually diminishes towards the bottom of the cave and when the heated core lights up. Certain elements use real-time reflections, while many are kept to static reflections. The application of the HDR reflection probes stays true to Unity’s philosophy of keeping workflows simple: they are nearly effortless to apply and use.

The background scene uses directional lightmaps, while the machine is composed partly of skinned and partly of dynamic meshes that are hooked up to light probes and use Image-Based Lighting and a variety of light sources.

To see the output of the shader during production, it is crucial to have HDR rendering represented in the Scene view.

We are very excited to share this short film with you and can’t wait to see what our talented community will produce with the new set of tools that is coming. We look forward to seeing artists amaze us with their limitless creativity.

58 replies on “Teleporter demo”

Will this be available for Wii U? Just like the current free Unity 4 license or in paid format?


BTW, what do they mean by image-based lighting? I am confused. Is it lit like a light cookie, or GI, or something completely different from those two?

@Sean @Peter I think he is referring to the part from around 0:38 onwards where fast lens flares appear, but from what I see those were spark effects that were meant to be there in the first place. Though yeah, it has nothing to do with temporal AA.

Is Enlighten GI an offline-baked approach? Can we expect Enlighten to work dynamically at runtime, interacting with objects?

Will the Unity 4 camera jittering be fixed in version 5? I get fed up with people arguing it doesn’t happen when it does!

And a Mono that doesn’t crash as much, get slow at times, or simply refuse to open because it’s still hanging around as a service.

Will the new GUI be released once Unity knows it’s bug-free, or is the community expected to find the bugs?

@Sean “The lack of temporal anti-aliasing causes lens flares to occur for 1-2 frames @ 40, 42, 45, 51 seconds – The list goes on but those are some of the most obvious. This is due to single pixels popping in or “shimmering” which have such a high specular they invoke lens flares that really jar as you’re watching the demo.”

What? Single pixels popping in with super-high specular and causing lens flares? You’ve lost me. Temporal aliasing is due to the speed of objects. I’m not sure how that causes lens flares unless the object popping in is reflecting directly into the camera, much like a mirror used to blind a sniper. How does a single pixel have a specular value able to cause the whole screen to white out?

[…] Explanation page: Teleporter demo – Unity Technologies Blog […]

This looks amazing. I have a few questions in general, so forgive the ignorance here if my understanding is not 100% accurate.
1) What kind of performance hit are we talking for real time GI, is it possible that it will work on current gen mobiles and tablets?
2) If this is not possible, will there be a seamless way to bake lighting from enlighten to traditional directional/forward rendering lightmaps that will allow us to create multiple publishes without having to recreate the lighting setup?
3) Will the demo above be released (or a demo like it) for us to pull apart when unity 5 is released?

Hope you can answer these,

@Alan: “I know Unity 5 is getting a shiny new MRT deferred renderer. How much control are we going to get over the deferred shading?”
The goal is to make it fairly flexible. E.g. you can already provide a custom lighting pass shader; and we’re looking into how it would be possible to setup custom MRT layout/formats.

@ROCKY: That’s not Unity’s fault. It’s just the limits of floating point number precision. You just can’t go beyond 10km without losing millimeter precision.
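A quick NumPy sketch (illustrative only) makes that precision cliff concrete. float32 coordinates, which is what GPU transforms typically use, have a spacing between representable values that grows with magnitude, so a sub-millimeter offset simply vanishes 20 km from the origin:

```python
import numpy as np

# float32 spacing (distance to the next representable value) grows
# with magnitude: fine near the origin, coarse 20 km out.
near = np.float32(2000.0)     # default-ish terrain size, in meters
far = np.float32(20000.0)     # enlarged terrain size
half_mm = np.float32(0.0005)  # a 0.5 mm offset

print(np.spacing(near))        # ~0.000122 m: sub-millimeter steps
print(np.spacing(far))         # ~0.00195 m: roughly 2 mm steps
print(near + half_mm == near)  # False: the offset survives
print(far + half_mm == far)    # True: the offset is rounded away
```

That rounded-away offset is exactly the shaky-camera symptom: positions snap between coarse representable values instead of moving smoothly.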

Developing Unity games at large scales is very hard!
Change the terrain size from 2000 to 20000 or higher and
the objects become shaky, the camera becomes shaky…
Mountain and environment rendering is weak.

After reading the title I was expecting a “real” demo, not a video. Is it possible to release an alpha/beta web player so we can see the demo for real?

I have another (slightly unrelated) question.
I know Unity 5 is getting a shiny new MRT deferred renderer.
How much control are we going to get over the deferred shading? For example would it be possible to modify the deferred renderer to add CryEngine-style screen space subsurface scattering?

So Robert (or Kuba), what about large open worlds in Unity 5 and lighting workflows? Having to bake an unknown number of terrain kilometers is out of the question. How does lightmapping/lighting work in that case? What do you guys mean by fully dynamic? I still don’t understand.

Nice job!!! … I only wish my first car, purchased in college, had as many cylinders as this monstrous device!

[…] Teleporter demo is a tech demo that shows the unparalleled graphics capabilities of the next version of the engine (Unity 5). From the perspective of an artist rendering, brightness, texture and precision make a leap forward. […]

Wow that looks really good. Good to see a new Unity 5 PBR demo. Please keep it coming!

@Alan: I don’t recall any presentations from Battlefield that would mention volume textures used in conjunction with the light probes.

What you might be referring to is Far Cry 3 where volume textures following the camera were used to store irradiance. It allowed for cheap interpolation and for evaluating probes per pixel (which gave varied lighting across large objects).

As Robert said, the light probes (and lightmaps and reflection probes) update dynamically. So a similar technique to the one just described can even be implemented in user code on top of Unity 5. :)
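For illustration, the per-pixel probe lookup described above boils down to trilinearly interpolating a camera-following 3D grid of irradiance values. A minimal Python sketch (not Unity API; the toy grid and function are hypothetical stand-ins for a volume texture sample):

```python
import numpy as np

def sample_irradiance(grid, x, y, z):
    """Trilinearly interpolate a 3D grid of per-voxel irradiance
    values at a continuous position given in voxel coordinates."""
    x0, y0, z0 = int(x), int(y), int(z)
    x1 = min(x0 + 1, grid.shape[0] - 1)
    y1 = min(y0 + 1, grid.shape[1] - 1)
    z1 = min(z0 + 1, grid.shape[2] - 1)
    fx, fy, fz = x - x0, y - y0, z - z0
    # Interpolate along x on the four edges of the voxel cell...
    c00 = grid[x0, y0, z0] * (1 - fx) + grid[x1, y0, z0] * fx
    c10 = grid[x0, y1, z0] * (1 - fx) + grid[x1, y1, z0] * fx
    c01 = grid[x0, y0, z1] * (1 - fx) + grid[x1, y0, z1] * fx
    c11 = grid[x0, y1, z1] * (1 - fx) + grid[x1, y1, z1] * fx
    # ...then along y, then along z.
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz

# A tiny 2x2x2 volume: dark on one side, lit on the other.
grid = np.zeros((2, 2, 2))
grid[1, :, :] = 1.0
print(sample_irradiance(grid, 0.5, 0.0, 0.0))  # 0.5, halfway between
```

On the GPU this interpolation comes for free from hardware-filtered volume texture sampling, which is what makes per-pixel probe lighting so cheap.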

Forget game production this could be used for cinematics. The quality in realtime is where rendered graphics were just a few years ago. Seriously nice. Great work guys.

hi robert,
thanks for the info.
that sounds great and i am looking forward to read more about it.

@Lars: We’ll post more details about the reflection probes later on. For now, a bunch of random facts:
– They’re either rendered from the scene or from Enlighten’s representation of the scene, can update real-time, also at a frequency lower than the frame rate.
– In the current alpha build they’re only convolved offline.
– They will blend between two, which means Kinda Seamlessly(tm). Guaranteed seamless (like with light probes) is quite expensive, as it means always sampling 4.
– Day/night cycle is possible out of the box, because lighting, reflection probes and light probes update dynamically. We don’t have an analytic sky model atm, though.

That last point should also answer Alan’s question. : )

[…] time until pre-order. If this topic interests you, I’ll point you straight to the Unity blog, which […]

I have a question: In BF3 (or was it BF4?) they inject Enlighten GI data into a 3D texture and use it as a light probe set (where each voxel is a light probe) so that characters and props receive GI.
Is there going to be support for injecting GI data into a light probe set? That would be very useful indeed :)

Looks very nice.
And as PBS isn’t rocket science any more, I would like to know much more about the environment probes.
– How are they captured?
– How fast are they convolved?
– Do the shaders blend between them seamlessly (according to position)?
– How do you blend between them over time and under changing lighting conditions (pre-baked or real-time…)?
– Does Unity 5 come with any tools to create a day/night cycle that also handles the environment probes?


Still a bit sad that the camera movement isn’t smooth. You can still see that the motion is jittery at times.
Is this due to the garbage collector or the transform update order for the camera?
It totally breaks the immersion. I was hoping Unity 5 would finally get smoother frame rates.

Great job, guys… As the rendering pipeline has changed, what happened to the Terrain System?

[…] can read more about the video at the Unity website here. In the meantime, here’s a brief description of the cave environment seen in the […]

Hmmm, I did see some blue dots in the last of the scenes, when the rocket thrusters or whatever went off…

…but my question is PBR shaders, how to use them? Will I be able to say things like use a vanilla ice cream shader here and a chocolate cake shader here and oh, a spoon and fork shader here? Except this person, who steals silver, they get white plastic spoon and fork.

…because to me that’s what saying Physically Based Shader says…

Release a webplayer demo…..!!!!
Come on guys, you really need to update the realtime showcase page with this….

I do know what I’m talking about and I got a decent 1080p stream. The lack of temporal anti-aliasing causes lens flares to occur for 1-2 frames @ 40, 42, 45, 51 seconds – The list goes on but those are some of the most obvious. This is due to single pixels popping in or “shimmering” which have such a high specular they invoke lens flares that really jar as you’re watching the demo.

>and this is all real-time?
>No seriously!?

Yes this runs realtime. Even on my MacBook Pro. No monster rig is required.

@Adam: Unity 5 with Enlighten bakes directional lightmaps differently and uses that information in the shader differently too. Judging by our tests it looks like we got rid of a bunch of artifacts and are able to better represent the light, which was required by the new standard shader.

@Olivier: All our demos are real-time. :)

Now that is impressive!
It is like I am looking at Blender’s Cycles render engine.
And this is all real-time?
No, seriously!?

Looks very promising guys.,

How do the Directional lightmaps compare to the existing Beast lightmaps? Is it the same rendering code with simply new baking? Or something altogether new? And the actual setup and baking, what can we expect with regards to Lighting – do we get all new photometric lights, or is it the same existing ones?

(Short version: Have you solved the various key issues with artifacts with Directional Lightmap baking / rendering under Beast?)

It’s a great demo – but we would love to see more dynamic content – perhaps involving characters.

@Joel temporal aliasing is the effect where objects move too fast for the frame rate and so appear to pop from one place to the next. The best-known example is in movies, where cars in a high-speed chase appear to have their wheel spokes moving backwards. All temporal anti-aliasing would do is smooth that effect somewhat, not eliminate it. To fully eliminate it, the frame rate would need to be around twice as fast as the fastest repeating motion in the scene.

As for Sean’s comment, either he’s misunderstood what temporal aliasing is or he’s somehow got a bad video stream. Temporal aliasing only seems to be present in the scene where the teleporter spins up to full speed.
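The wagon-wheel arithmetic is easy to sketch. In this illustrative Python snippet (a hypothetical helper, nothing from the demo), the apparent rotation is the true per-frame rotation aliased into the wheel’s spoke symmetry:

```python
def apparent_spoke_step(rotation_per_frame_deg, spoke_count):
    """Smallest-magnitude per-frame rotation that is visually
    indistinguishable from the true one, given the wheel's
    n-fold spoke symmetry (the classic wagon-wheel effect)."""
    period = 360.0 / spoke_count        # spokes repeat every 'period' degrees
    step = rotation_per_frame_deg % period
    if step > period / 2:
        step -= period                  # aliases to the nearest match: backwards
    return step

# An 8-spoke wheel turning 40 degrees per frame looks like it is
# rotating 5 degrees per frame *backwards*.
print(apparent_spoke_step(40.0, 8))  # -5.0
print(apparent_spoke_step(20.0, 8))  # 20.0
```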

Beautiful, really amazing.

Please release a webplayer so we can take a look in real time. Just for fun!

Sean, what timestamp do you see that in? I watched the video at 1080p and couldn’t see what you describe.

The lack of temporal AA really shows in this demo – I can’t take the constant popping of high specular pixels blooming out into large blobs of white.

Comments are closed.