Unity Unveils 2018 Roadmap at GDC
We’re going bigger than ever at Game Developers Conference 2018. We kicked everything off with our Unity at GDC Keynote, which brought everyone who joined us in person and online the latest announcements, previews and technical demos.
The opening keynote has just concluded. With an event that’s bigger than ever, we’ve got a lot to go through. If you want to watch the recording, you can find it below. If you would prefer to read the highlights, keep scrolling.
Unity 2018: Next Level Rendering, Machine Learning, and Performance
Brett Bibby, VP of Engineering, brought us a glimpse of the Unity 2018 roadmap. It’s shaping up to be an incredible year for Unity creators, with a focus on Next Level Rendering, Machine Learning, and major performance improvements made possible by the Entity Component System, C# Job System, and Burst compiler. You can get early access to these features now.
Unity 2018.1 will be released in April. If you can’t wait and want to get your hands on the latest version, then join our Beta program here. And oh yeah, Nested Prefabs will also be coming later this year in 2018.3.
Next Level Rendering
Silvia Rasheva, Producer of Unity’s Demo team, joined us on stage to introduce the team behind Unity’s flagship demos such as The Blacksmith (2015), Adam (2016), Neon (2017), and Book of the Dead (2018). The Demo team drives advanced usage of the Unity engine and works closely with Unity’s R&D team to push the envelope of what is possible to achieve with our technology.
Natalya Tatarchuk, Director of Graphics, and Lucas Meijer, Technical Director, followed Silvia on stage to demo Unity’s new Scriptable Render Pipeline (SRP). The SRP is configurable, lean, and user-centric, and it comes with two ready-made options: the High-Definition Render Pipeline (HD RP) prioritizes stunning, high-fidelity visuals with performant results on GPU-compute-capable consoles and PC hardware, while the Lightweight Render Pipeline (LW RP) is optimized for high performance in demanding applications like XR and on platforms such as mobile.
Natalya and Lucas dove into how the small team behind Book of the Dead used the HD RP to achieve a high-quality production.
The demo takes advantage of photogrammetry assets – some created by the Unity Demo team and some coming from Quixel. We are proud to announce that the Quixel Megascans Library is coming to the Unity Asset Store.
2018.1 will also introduce Templates, which are project starters with default settings already tuned, including a sample scene. Here are some samples of the HD RP and LW RP templates:
Responding to the community’s incredulous reactions to the quality demonstrated in Book of the Dead, the team showed the project running on PlayStation 4 Pro.
Natalya and Lucas gave us a sneak peek at Unity’s upcoming GPU-based progressive lightmapper, which gives instant feedback to the artist during the process of tuning lights and baking at ~10x the speed (thanks to AMD for the help).
Next, Mike Wuetherick, Producer on the Made with Unity Team, and Adam Myhill, Head of Cinematics, took us through the improved artist creation workflow. With Unity 2018.1 we’re adding Cinemachine Storyboard, a new feature designed to help you rough out your shots with Cinemachine and Timeline, letting you quickly establish the feeling you’re going for with grey-box levels and scenarios. In conjunction with our announcement earlier this month of bringing ProBuilder into Unity, the worldbuilding process is faster than ever. We also showed how our close collaboration with Autodesk on the FBX format is improving round-tripping between Unity and digital content creation tools like 3ds Max and Maya, letting creators easily import and export models, lights, cameras, animations, and more.
Here is a quick summary of the improved artist workflow we showed:
Next, Danny Lange, our VP of AI and Machine Learning, took to the stage to highlight our commitment to democratizing machine learning. We’re committed to lowering the barriers to entry so that you can make machine learning an integral part of your game development. You no longer need to program every solution, every NPC, and every permutation of how a person may interact with your game – you can focus on making systems learn.
Danny introduced our latest release, ML-Agents 0.3, which brings many new features, including Imitation Learning. Imitation Learning lets your system learn from real people playing your games, and can be trained to adjust to your players. The agent, in this case the NPC being trained, does not play perfectly, like a robot, but rather imperfectly, like a player – and all of this training happens in real-time.
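To make the idea concrete, here is a toy sketch of imitation learning (behavioral cloning) in plain Python – this is a generic illustration, not the ML-Agents API: a “player” demonstrates actions, and an agent learns a policy purely by copying the most similar demonstration.

```python
def player_policy(state):
    """The human demonstrator: move toward the goal at x = 10."""
    return 1 if state < 10 else -1

# 1. Record demonstrations as (state, action) pairs from real play.
demonstrations = [(s, player_policy(s)) for s in range(21)]

def imitation_policy(state, demos):
    """2. The simplest possible learner: copy the action taken in the
    most similar demonstrated state (a 1-nearest-neighbor policy)."""
    nearest_state, action = min(demos, key=lambda d: abs(d[0] - state))
    return action

# 3. The agent now imitates the player on states it was never shown,
#    behaving imperfectly like a person rather than perfectly like a robot.
print(imitation_policy(3, demonstrations))   # moves right, like the player
print(imitation_policy(15, demonstrations))  # moves left, like the player
```

In a real ML-Agents setup the learner is a neural network trained on recorded gameplay rather than a lookup, but the principle is the same: the policy comes from demonstrations, not hand-written rules.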
Machine Learning insights allow us to build tools that optimize your games for retention and engagement. App performance is critical to these factors – more than 50% of 1-star reviews in the Google Play store mention performance, making it one of the most important problems we can solve. We want your games to be accessible to all devices without you having to sacrifice graphics or effects – which is why we built LiveTune. LiveTune tailors your game for every device in real-time. It adjusts assets, effects and rendering on each phone model, thus providing the best possible experience for any player on any device. Sign up for the LiveTune beta now.
We want to help you reach not just cohorts, but individual players, with the content and in the context that is most relevant for them. The first step we’re taking towards this is IAP Promo. IAP Promo surfaces the best possible in-app promotion to each player based on their game behavior and likelihood to engage. For more information on IAP Promo, and to get started with it, take a peek at this blog.
Danny closed by recognizing that each person has a combination of hardware, software, skills, and interests that create millions of options. We want to give you the tools that make your game accessible to everyone and deeply engaging to every person who plays.
As your Engine team, we’re bringing together the best talent with the goal of making Unity 2018 the best choice for all creators. We’ve hired engineers and artists from renowned studios such as Insomniac, Bungie, and Naughty Dog, and they are building the foundation alongside a team of more than 800 engineers who are making their breakthroughs available to developers around the world – working alongside you, our community, as a platform team. We want to help improve your quality of life, and we’re always thinking about your pain points. Moving into 2018, Joachim Ante, our CTO, discussed how we are pushing “performance by default.”
“Performance by default” means we’ve been building a new high-performance multithreaded system that lets games fully utilize today’s multicore processors without the usual headaches. It’s made possible by the new C# Job System, a way to write performant code by default on top of our Entity Component System, coupled with Burst, a new math-aware backend compiler. Burst takes your C# jobs and produces highly optimized native code for any platform you’re building for.
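The core idea of jobs can be illustrated with a rough analogy. The sketch below is plain Python with a thread pool, not Unity’s C# Job System API (where jobs are structs implementing IJob that you schedule); it only shows the underlying pattern of handing independent chunks of work to a scheduler instead of managing threads by hand.

```python
from concurrent.futures import ThreadPoolExecutor

def job(chunk):
    """One independent 'job': it transforms only its own slice of the data,
    so jobs never touch each other's memory and can run in parallel."""
    return [x * x for x in chunk]

data = list(range(1000))
# Split the workload into independent chunks, one per job.
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# The executor plays the role of the job scheduler, spreading
# the jobs across the available worker threads.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(job, chunks))

# Stitch the per-job results back together.
squared = [x for chunk in results for x in chunk]
assert squared == [x * x for x in data]
```

In Unity itself, Burst then compiles each job’s code into tightly optimized native code for the target platform, which is where the large speedups come from.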
We provide access to the greatest number of build targets, and we’re always adding support for the most desired and relevant platforms. When important new devices enter the market we want to ensure we have day-one support for you. Here is what we announced.
For all you pioneers looking to explore development in spatial computing, we have partnered with Magic Leap to integrate their platform with Unity. We’re thrilled that today the Technical Preview build for Magic Leap is available, and that our creators can grab the Lumin SDK at the Magic Leap Creator Portal.
Unity developers now have the ability to build directly for the Oculus Go using the same workflow you are used to with Gear VR. If you haven’t already, get your Oculus Go content ready for launch later this spring.
Unity developers can start building for Google’s Daydream Standalone directly from Unity with newly added six-degrees-of-freedom (6DoF) support for the Daydream platform, letting you easily add 6DoF to your existing Daydream apps or build brand-new ones for this exciting new device.
Google Play Instant
Mobile games can take a long time to download, and the longer it takes, the more players will walk away without ever playing. Google Play Instant engages players immediately, delivering content on the spot so they can try a game before installing it. Unity is working closely with Google to ensure our developers can get the most out of Google Play Instant. This is one of several ways we are working with Google to help our developers convert more players.
Universal GameDev Challenge
We also announced the Universal GameDev Challenge. What’s your challenge? To reimagine five iconic worlds from Universal. From the classic Back to the Future™ films to the current animated series and popular reboot Voltron Legendary Defender, the Universal GameDev Challenge celebrates the creativity of game developers building with Unity.
Phase One starts today. We’re challenging the entire developer community to share their vision in a Game Design Doc, and pitch it in a recorded video. Six entries will be selected by our panel of expert judges and move on to Phase Two, where they will be invited to a VIP Mentorship Summit to work with visionaries, engineers and spokespeople from Universal, Microsoft, Intel, and Unity to help them make the most of their project.
The finalists will be tasked with creating a vertical slice of their vision using Unity and submitting it for the ultimate goal: a consultancy agreement with Universal and the chance to make their game a reality, plus $150,000 in cash. Each runner-up team will get $20,000.
Will Wright’s Proxi Challenge
Will Wright, world-renowned game designer and creator of The Sims, SimCity, and Spore, has announced he is teaming up with game development studio Gallium Artists to create his newest mobile game, Proxi.
During our keynote, we announced that Will Wright and Gallium Artists have partnered with Unity Connect to launch the Proxi Art Challenge! Artists will submit their creations for a chance to win one of two grand prizes and be flown to California to interview with Will Wright and his team. After the interview, one of the winners may be selected as the 3D artist hired to help Will Wright’s team bring Proxi to life! Do you have what it takes to land the gig?
Building smaller, lighter, faster experiences with Unity
To wrap up our keynote, Ralph Hauwert, Head of Platforms, spoke about building small experiences with Unity. To reach the world on the next billion devices, entry-level mobile phones, wearables, IoT, or the web, you need your apps and experiences to be light and fast. We want to give you the ability to do that – with the tools and extensibility of the Unity Editor.
To do this, we created a brand-new, highly modularized architecture and a new set of specifically designed components that you can create using the Unity Editor. This results in a smaller, portable runtime that can run natively on lightweight devices or even on the web. For web-based deployment, the file size for our compressed core runtime is 73KB.
This new core runtime, combined with asset optimization and the data-driven architecture of our code, leads to small file sizes and fast delivery and start-up times. On stage, Ralph announced that while there are many uses for such a runtime, we’re starting with Playable Ads (an ad format that lets players try a game instantly) and games in messaging apps.
Currently, we are working with a number of developers in closed alphas. We will be bringing you this technology in 2018.
Thank you to everyone who tuned in online, joined us in person, or took the time to read our recap. The keynote may be over, but stay tuned to our social channels as we continue into GDC Week with Unity at GDC. Online, you’ll be able to view some of the great sessions we’re bringing to our community on our YouTube channel.