This week Unity stormed SIGGRAPH 2017 — the world’s largest and most influential computer graphics conference — giving attendees an in-depth look at the latest breakthroughs for artists and creatives in the world of interactive entertainment.
Those who visited our booth and Unity Central were treated to daily demos and speaker sessions on topics including advances in real-time rendering and cinematic storytelling, as well as first-look deep dives into Unity’s photogrammetry workflow and de-lighting tool. The amount of knowledge shared over five days was inspiring.
Timeline and Cinemachine
Head of Cinematics Adam Myhill wowed creators with the speed and flexibility of Unity’s powerful new real-time cinematic tools. Artists can use the Timeline sequencer to easily blend and tweak animations without writing any additional code. Our intuitive smart camera system, Cinemachine, streamlines the artist workflow and leaves room for experimentation by eliminating hours of hand animation, camera programming, and revision. These cinematic tools let you retain control over creative decisions through the end of the creation process, empowering you to create more content in less time.
Unity’s photogrammetry workflow and de-lighting tool
Our Field Engineer Mathieu Muller and Technical Artist Cyril Jover gave attendees an engaging, in-depth look at Unity’s photogrammetry workflow, which makes it easy to transform high-resolution photographs into photorealistic 3D objects and textures. Using this advanced workflow and Unity’s de-lighting tool, you can now create reusable, high-quality digital assets efficiently, saving time and money.
OctaneRender in Unity technical preview
Unity and Otoy also showcased a preview release of OctaneRender, a physically based renderer that will work directly inside Unity for beautiful, hyper-realistic game cutscenes, 360 videos, and VR films.
Octane will complement Unity’s real-time processing by rendering assets offline at speeds up to 50x faster than CPU-based engines. Learn more about Octane here.
Research and VR
Aside from all of the Unity activity in Unity Central and at our booth, we were very excited to sponsor this year’s VR Village, a space in the expo hall where attendees could discover the potential of real-world applications demonstrating new ways to communicate and interact with virtual and augmented realities. Located right next to the VR Village was the VR Theater, where cinematic experiences like Baobab’s Rainbow Crow and Scatter’s Zero Days were given a bit of the red-carpet treatment, complete with a lit-up marquee and spotlights!
In addition to holding sessions on VR topics spanning tools, ethics, and rendering, a few of the Unity Labs researchers shared two technical papers, which you can read in-depth at the links below:
- A Spherical Cap Preserving Parameterization for Spherical Distributions
- A Practical Extension to Microfacet Theory for the Modeling of Varying Iridescence
To learn more about Unity’s experimental projects, research, and explorations into the future of game design, VR, AR, and development, check out the Unity Labs articles.
It was an inspiring week exploring the potential of our ever-expanding creation engine for gaming and real-time entertainment. Take a look at the Unity SIGGRAPH 2017 landing page for a full list of talks. We plan to share some of these talks with you on YouTube, so keep an eye out for those in the coming weeks!