Unite Copenhagen is officially here, and we’re thrilled that the action is taking place just down the street from where the first lines of Unity were written 15 years ago. We kicked it off with a keynote full of game-changing announcements and breathtaking demos.
Table of contents
Shape the world
The revolutionary ways games are powering visual content continue to inspire us. New consoles, streaming distribution, tools, and ideas are changing the landscape of interactivity, graphical power, and social connectivity. Simply put, game technology and game makers are shaping the world.
Real-time 3D opens up possibilities for every creator and is unlocking innovations across industries in brand-new ways. We’re already seeing it used in the automotive and transportation industry, as well as architecture and construction. Soon, we expect to see it used to practice and perform microscopic surgery; engage with artificial intelligence and train robots; and even influence the design of factory floor layouts in manufacturing. It is an incredible time to be a creator – and we’re so proud to be a part of this journey.
Level up your performance with ease
Over the last few years, we’ve shown you the astonishing scale and performance you can achieve with our Data-Oriented Tech Stack (DOTS). Our Megacity demo demonstrates that it’s possible to use DOTS to create a project with more than 4.5 million mesh renderers and hundreds of thousands of game objects and audio sources.
While performance and scalability enable immersive game experiences for players, the improved iteration speed benefits game developers, too. With our new conversion flow, you can use the same intuitive content authoring you already know, while taking advantage of the massive performance upgrades that come with DOTS and data entities. In short, writing DOTS code is now much more convenient.
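To make the idea concrete, here is a minimal sketch of the data-oriented layout DOTS is built around – written in plain Python with hypothetical names rather than Unity’s actual C# API. Components live in contiguous arrays, and a “system” iterates over just the data it needs:

```python
# Illustrative sketch only – plain Python with hypothetical names,
# not Unity's C# API. Each component is stored in a contiguous array.

class TransformData:
    """Struct-of-arrays layout: positions and speeds in parallel lists."""
    def __init__(self):
        self.x = []
        self.y = []
        self.speed = []

    def spawn(self, x, y, speed):
        self.x.append(x)
        self.y.append(y)
        self.speed.append(speed)

def move_system(data, dt):
    """A 'system' that touches only the data it needs, entity by entity."""
    for i in range(len(data.x)):
        data.x[i] += data.speed[i] * dt

entities = TransformData()
entities.spawn(0.0, 0.0, 2.0)
entities.spawn(1.0, 5.0, 0.5)
move_system(entities, dt=1.0)
print(entities.x)  # → [2.0, 1.5]
```

Because the loop walks tightly packed data instead of scattered objects, it is cache-friendly and easy to parallelize – the core reason this style of code scales to scenes like Megacity.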
That’s why we’re excited to release a DOTS sample project later this year. The sample project will showcase how all the DOTS-powered components – including physics, animation, netcode, and our content conversion flow – work together. It’s a simple environment that you can use as a starting point for your own games.
Test, train, and validate with Unity Simulation
Unity Simulation is our brand-new, cloud-based simulation product for running multiple instances of a Unity project in parallel, at scale, on Google Cloud. Unity Simulation can be used to accelerate applications across industries like gaming, automotive, and robotics. You can now test, train, and validate any project, prototype, or concept in a simulated world without being bound by the challenges of reality – like insufficient data, inadequate resources, or safety concerns.
For example, the new tech can help game developers address challenges that are critical yet cumbersome – such as testing before soft-launching a game. Getting a high-quality game ready for launch is an immense task, given resource constraints and tight deadlines. With Unity Simulation, you can set up multiple test runs to address different aspects of a game throughout the development process. This includes testing for game balancing and player experience, with a quick turnaround of data to analyze issues.
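Conceptually, such a test sweep amounts to running many parameterized instances of the same build in parallel and collecting their results. A rough sketch of that idea – with hypothetical names and a stand-in for the actual headless game run; this is not the Unity Simulation API:

```python
# Conceptual sketch with hypothetical names – not the Unity Simulation API.
from concurrent.futures import ThreadPoolExecutor

def run_instance(params):
    """Stand-in for one headless run of a game build with given settings."""
    difficulty, coin_bonus = params
    # Pretend the simulated player clears more levels on easier settings.
    levels_cleared = max(0, 10 - 3 * difficulty) + coin_bonus
    return {"params": params, "levels_cleared": levels_cleared}

# A small parameter sweep for game-balance testing.
sweep = [(d, c) for d in (1, 2, 3) for c in (0, 1)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_instance, sweep))

best = max(results, key=lambda r: r["levels_cleared"])
print(best["params"])  # → (1, 1): the settings with the most levels cleared
```

Unity Simulation applies this pattern at cloud scale: instead of six local runs, thousands of instances execute on Google Cloud and stream their results back for analysis.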
Unity Simulation can also be used to train self-driving cars in millions of hypothetical scenarios and edge cases – all in a simulated 3D world. Autonomous vehicles need to be tested in risky conditions that would be unsafe or even impossible to reproduce in real life. A synthetic environment lets developers train machine learning models in the cloud, then transfer that knowledge to cars driving in the real world – all in real time and in parallel.
Unity Simulation launched in closed beta today. We plan to use the beta period to get feedback and fine-tune requirements. Get started with our free beta program and use Unity Simulation to solve your biggest testing, training, and validation challenges.
Next-level graphics tools for unprecedented visuals
The Unity Graphics team lives to create best-in-class graphics technology – performant, state-of-the-art visuals, along with intuitive artistic workflows. We are investing in our render technology with the Scriptable Render Pipelines, giving you direct control over what you need. You can take advantage of our two new render pipelines out of the box, use them as a starting point for your own solution, or customize them to meet your needs.
The High Definition Render Pipeline in game-ready action
The first of these is the High Definition Render Pipeline (HDRP), which lets you push graphics as far as you can on high-end hardware, delivering powerful, performant, high-fidelity visuals.
It’s a fully featured offering in 2019.3 – now out of preview – bringing stunning graphics and photorealism at game-ready frame rates. Your HDRP assets will scale in fidelity on high-end platforms, taking advantage of the available hardware resources for the best visual quality. And you only have to author your content once with the full suite of production-ready artist tools – such as VFX Graph, Shader Graph, and the Progressive Lightmapper.
To give a taste of how studios are already seeing great results with HDRP, Multiverse Founder and CEO Freeman Fan shared an early look at their newest project, Earth from Another Sun.
Their team at Multiverse envisioned a graphically stunning game with alien, yet lifelike, environments. To achieve this, they employed a variety of HDRP effects, including exposure adjustment, tonemapping, subsurface scattering, volumetric lighting, color grading, and lens distortion. The result is a beautiful and immersive world to explore – and performance never drops below a smooth 60 frames per second, even during hectic encounters with hordes of enemies.
We’re excited to see more of Earth from Another Sun as they head toward release in 2020.
Universal Render Pipeline: beautiful, performant graphics on all Unity target platforms
The second new pipeline is the Universal Render Pipeline, previously known as the Lightweight Render Pipeline, and it’s the best choice if you want full Unity platform reach. It’s a powerful solution that delivers a combination of beauty and performance, right out of the box. Best of all, it scales to all the same platforms as Unity – whether you’re building for 2D, 3D, or XR.
With the Universal Render Pipeline, you can use all of the new artist tools and workflows, including VFX Graph, Shader Graph, the new post-processing, and render passes. You can upgrade your projects from Unity’s built-in renderer, and they will look better, run better, and scale better than ever before. The pipeline uses an improved single-pass rendering technique that delivers fantastic performance improvements.
You can author once and deploy everywhere with great performance and best-in-class visuals. To demonstrate this, we showed the Boat Attack project running on a PlayStation 4, Xbox One, Nintendo Switch™, and three phones with different processing power and capabilities. If you want gorgeous graphics while scaling up your content, you’re going to love the production-ready Universal Render Pipeline.
You can start taking advantage of all the production-ready features and performance benefits today. Upgrade your projects using the upgrade tooling or start a new project using our Universal template from the Unity Hub.
We believe it’s important that you have the right tool for the job, and with the Scriptable Render Pipelines – the High Definition Render Pipeline and the Universal Render Pipeline – you now have a range of options to build from with confidence, now and in the future.
The Heretic: High Definition real-time graphics in Unity
At GDC 2019, we unveiled a preview of The Heretic, our real-time cinematic short film, which runs at 30 fps at 1440p on a consumer-grade desktop PC. The short film uses the latest developments in graphics, leaning heavily on the High Definition Render Pipeline, which now comes integrated with the newest version of the post-processing effects.
After much anticipation, at Unite Copenhagen 2019, we revealed the full cinematic film – which introduces a new character, Morgan, created through the use of the VFX Graph. By creating the simulation with GPU particles, an artist can change the shape, gender, appearance, and behavior of the character. The artist can immediately see what the adjustments look like in the final frame, as the particles conform to the artist’s actions in real time.
HDRP now ships with built-in Shader Graph master nodes for hair, eye, and fabric. The eye master node enables realistic human eyes, with caustics and refraction. We used the hair master node for the stubble and eyelashes of the protagonist character, Gawain.
Both VFX Graph and the HDRP are coming out of preview and will be production-ready in 2019.3.
The evolution of 2D tools
From RPGs to match-3 games, some of today’s most successful games are 2D, including 75 of the top-100 mobile games.
The new 2D tools help you create gorgeous experiences with more efficient workflows. We created the Lost Crypt demo using the complete suite of tools working together in one project. The lively scene features animation, light effects, organic terrain, shaders, and post-processing, all made natively in 2D. Artists and designers can use the new 2D tools directly in the Unity Editor. Teams and projects of all sizes, targeting any platform, can now get more engaging and beautiful results faster.
The suite of 2D tools will be ready for production in Unity 2019.3.
Build seamless multiplayer experiences
Our Multiplayer Services team has been hard at work building tools and services to help you build the best connected games. To do this, we’ve been working alongside studios, like MADFINGER Games, to push our tech to the next level.
At Unite, we shared how MADFINGER used our Multiplayer Services for their upcoming release, Shadowgun War Games. War Games is a tournament-style, first-person shooter for mobile, with an emphasis on fast-paced, dynamic gameplay.
MADFINGER was focused on mitigating some of the challenges they faced when operating their last hit game, Shadowgun Legends. Specifically, they wanted to enable super low-latency gameplay, prevent cheating, and create fun matches. We helped them do this with our Unity Transport Package, Multiplay Game Server Hosting and Matchmaking, and Vivox Game Communications. Each of these helped MADFINGER build the best game possible.
The Unity Transport Package is available in preview, Multiplay Matchmaking will be available in beta in October for all Multiplay users, and Multiplay Game Server Hosting and Vivox Game Communications are available today.
Revolutionize your workflows
Create deeply interactive augmented reality experiences
We build tools for creators who want to construct powerful, deeply interactive augmented reality (AR) experiences that interact intelligently with the real world.
Sitting on top of AR Foundation, our Mixed and Augmented Reality Studio (MARS) is our specialty AR work environment built to make AR authoring more straightforward. The workflow gives you the ability to quickly prototype, test, and ship contextually aware, truly interactive experiences that seem to live in and react to the real world.
Our AR Foundation framework for multi-platform development allows you to get your AR experience into as many hands as possible. It provides a unified workflow, so you don’t have to rebuild your app for each platform: build it once, and it works everywhere. What’s more, AR Foundation now extends to wearable AR devices, meaning that for the first time you can build an app once and have it work across ARKit, ARCore, HoloLens, and Magic Leap devices.
We’re also giving you the ability to add interactivity to your AR or VR experiences with our brand-new XR Interaction Toolkit. Rather than coding object interactions from scratch, you can simply add components to your scene.
We know that rebuilding your app in order to add AR functionality is time-consuming and painful, so we’re officially supporting the ability to insert AR directly into your native mobile app. This unlocks the ability to take the full power of Unity and our AR offerings and embed them into the hundreds of thousands of native apps that already exist today.
Real-time building information modeling (BIM)
In June we announced Unity Reflect, our new product for the architecture, engineering, and construction (AEC) industry that transfers multiple BIM models into real-time 3D in one click. Since then, we’ve provided a select group of AEC firms early access to the product as part of our beta program.
The first AEC project to feature Unity Reflect, a skyscraper that will become the tallest structure in Brooklyn, comes from award-winning architecture firm, SHoP Architects. The firm is using Unity Reflect to prepare and convert its 3D designs into real-time 3D in seconds instead of weeks. By spending less time on data preparation and optimization, SHoP has more time to create AR and VR applications that improve the design and construction planning processes for this landmark project.
By better connecting design and construction, Unity Reflect makes it easier to catch design flaws early, solve problems faster, and reduce the time it takes to build. Sign up for our mailing list to get updates as we approach the launch of Unity Reflect this fall.
Deliver the best gameplay experience with GameTune
To make your game profitable, you need to continually improve revenue and retention. This often means providing a steady flow of optimizations, events, content, features, and more. Each launch can take weeks to analyze, and some updates come with the risk of negatively impacting your player base. So, how can you test faster while mitigating risk?
With GameTune, we give every game studio – small and large – the power of machine learning to help them learn and act more quickly. Simply choose an optimization target and provide GameTune with the different variables you want to test. Almost anything in your game can be made dynamic, such as different in-app purchase bundles, tutorial lengths, or game level difficulties. GameTune automatically tunes your game to the top-performing answer. Unlike A/B testing, the top-performing answer is picked for each individual player, and the system continuously learns and iterates. The time you save on experimentation can go toward developing and optimizing new features – revolutionizing your workflow.
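The “continuously learns and iterates” behavior resembles a multi-armed bandit. As a rough illustration only – GameTune’s actual algorithm and API are not public, and every name below is hypothetical – here is a minimal epsilon-greedy sketch in Python: mostly serve the best-known variant, but keep exploring a little so the system can adapt.

```python
# Illustrative epsilon-greedy bandit – NOT GameTune's actual algorithm or API.
import random

def choose_variant(stats, epsilon=0.1, rng=random):
    """stats maps variant -> (successes, trials). Returns the variant to serve."""
    if rng.random() < epsilon or all(t == 0 for _, t in stats.values()):
        return rng.choice(list(stats))  # explore: try a random variant
    # exploit: serve the variant with the best observed success rate
    return max(stats, key=lambda v: stats[v][0] / max(stats[v][1], 1))

def record(stats, variant, retained):
    successes, trials = stats[variant]
    stats[variant] = (successes + int(retained), trials + 1)

stats = {"short_tutorial": (0, 0), "long_tutorial": (0, 0)}
rng = random.Random(42)
for _ in range(1000):
    v = choose_variant(stats, rng=rng)
    # Simulated ground truth: the short tutorial retains 60% of players,
    # the long one only 40%.
    record(stats, v, retained=rng.random() < (0.6 if v == "short_tutorial" else 0.4))

best = max(stats, key=lambda v: stats[v][0] / max(stats[v][1], 1))
print(best)  # typically "short_tutorial", the variant with better retention
```

This only shows the general technique; a production service layers machine learning on top and, as described above, makes the choice per player rather than per game.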
Sign up for the beta today.
Take a look at our Unite Copenhagen page for a full rundown of all the week’s activities.
Until next time – let’s get ready to shape the world.