Connected games, facial mocap, New Prefabs workflow preview, and more news from Unite Berlin
We kicked off Unite Berlin today! We couldn’t be happier to finally share all of the exciting news that we’ve been working on, from our strategic alliance with Google Cloud to help you make connected games, to a preview build of our improved prefab system. Here’s your handy overview of all the keynote announcements.
The show will go on tomorrow and Thursday with loads of great talks on everything from ECS & C# Job System, to how we optimized Book of the Dead to run on consoles. We’ll be streaming selected talks; please join us on YouTube via these links: Day 2, Day 3. Check the video descriptions for the live streaming schedule.
Table of Contents
New Prefab workflows
Connected games with Google Cloud
Unity for small things
Book of the Dead: Environment project
AR: Project MARS
AR: Facial mocap performances
New Prefab workflows
We did a lot of research into how you all work with Prefabs at the moment and what you need. We’re improving the entire system, with a focus on reusability, control, and safety. New additions include Prefab Mode, Prefab Variants, and of course… nested Prefabs. During the keynote we announced that you can now access a preview build – we can’t wait to hear what you think!
Prefab Mode brings you an isolation mode for editing Prefabs. This new mode creates a faster, more efficient, and safer place for you to edit your prefabs, without adding them to your scene.
We’ve developed the new Prefab Variants and nested Prefabs features so that you can be more productive when working with Prefabs. Variants let you edit a base Prefab and have the changes propagate to every variant derived from it. Our new nested Prefabs workflow is designed to make your team more flexible: team members can work on different parts of a Prefab and have them all come together in the final asset.
Connected games with Google Cloud
Today, we announced that we are migrating our infrastructure to Google Cloud and a strategic alliance with the company to help creators make connected games.
From fast-paced multiplayer games to social, single-player experiences on your mobile device, the most successful and influential games out there are connected and enabled by the cloud. These experiences require expertise, time, and infrastructure that not all creators have access to.
Through this collaboration, Unity and Google Cloud will be building a suite of native Unity features and tools that will help you create and run connected experiences, and scale them to meet the needs of your players. We will make it possible for creators of all sizes to harness the power of Google Cloud without having to be a cloud expert. To learn more visit unity3d.com/connectedgames.
Unity for small things
Smaller. Lighter. Faster.
With each new platform, you have a huge opportunity to reach more players. Emerging platforms like messaging apps require quick load times so that players get a seamless, instant, social experience. We have built a new small runtime with a modularized architecture and compression technology for light and fast delivery. This enables you to build games for messaging apps in the Unity Editor you know, with the assets you already have.
While most messaging games today are best delivered in 2D HTML, we’re going beyond that with native deployment and 3D. This will also help you build to IoT devices, watches, car dashboards, AR glasses and many other platforms of the future.
Personalized Placements: Improve player engagement
There is a perceived trade-off between building a business and player joy – but what if you don’t have to compromise? What if you can have both?
Personalized Placements allow you to tailor each game experience for each player. This prediction engine determines what to show to each player based on what will drive the highest engagement and lifetime value, whether it’s an ad, an IAP promotion, a notification of a new feature, or a cross-promotion. The beauty of machine learning is that the predictions will get even better over time.
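To make the idea concrete, here is a minimal sketch of the kind of decision such a prediction engine makes: for each player, score every candidate piece of content and show the one with the highest predicted value. The function names, profiles, and numbers below are purely illustrative assumptions, not Unity’s actual API or model.

```python
def choose_placement(player_profile, candidates, predict_value):
    """Return the candidate content with the highest predicted value for this player."""
    return max(candidates, key=lambda content: predict_value(player_profile, content))

# Toy value table standing in for a trained model; a real system would use
# a machine-learning model whose predictions improve over time.
toy_model = {
    ("new_player", "ad"): 0.2,
    ("new_player", "iap_promo"): 0.1,
    ("new_player", "feature_notice"): 0.7,
    ("veteran", "ad"): 0.4,
    ("veteran", "iap_promo"): 0.6,
    ("veteran", "feature_notice"): 0.3,
}

def predict_value(profile, content):
    return toy_model[(profile, content)]

candidates = ["ad", "iap_promo", "feature_notice"]
print(choose_placement("new_player", candidates, predict_value))  # feature_notice
print(choose_placement("veteran", candidates, predict_value))     # iap_promo
```

The key point is that the same placement slot serves different content to different players, driven entirely by the predicted engagement and lifetime value.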
We empower you to deliver the most impactful content to tailor your player’s journey because there is no monetization without player engagement.
Book of the Dead: Environment project is now available
Earlier this year at GDC 2018 we debuted Book of the Dead, a beautiful award-winning interactive narrative experience created by the Unity Demo Team. All of the natural environment assets in the demo are photogrammetry-scanned real-world objects and textures with the majority coming from Quixel Megascans, a publicly available library of high-quality scanned assets.
The project is optimized for modern console performance and showcases the power of Unity 2018’s new HD Render Pipeline for achieving stunning visuals. Demo projects like this one let us push the envelope on what Unity can do today and keep pushing it further.
Over the past few weeks, we’ve shared making-of blog content covering topics such as concept art, character design, and environment art in Book of the Dead. To fuel your creativity and learning, we’re excited to announce that you can now download a slice of the Book of the Dead environment today on the Asset Store.
Reality is our build target
We have always focused on how creators will make the content of the future. This is why we’re excited about Project MARS – Mixed and Augmented Reality Studio, a Unity extension that delivers on the promise of AR by giving creators the power to build applications that intelligently interact with any real-world environment, with little-to-no custom coding.
The motto for this project is “reality is our build target.” We don’t want you to think about building to a console or phone – we want you to start thinking about how you can create experiences that truly take advantage of the real world. MARS is a robust toolset that gives you the power to build AR apps that are context-aware, customizable, flexible, and work in any location, with any kind of data.
MARS is flexible enough to work with any data provider – we’re adding an abstraction layer for useful data sources like object recognition, location, and map data. We’re also providing sets of templates with simulated rooms, allowing you to test against different environments inside the Editor. Use our new AR-specific gizmos to easily define spatial conditions like plane size, elevation, and proximity without requiring code or precise measurements.

AR differs fundamentally from traditional game design: instead of having a level designer build your entire world from scratch, your level design is the real world. MARS will help you solve this challenge, enabling you to easily create AR experiences, from face masks, to avatars, to entire rooms of digital art. We cannot wait to see what you create.
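As a rough illustration of what a “spatial condition” amounts to, the sketch below matches detected real-world planes against constraints on size, elevation, and proximity – the same kinds of constraints the MARS gizmos let you author visually. The data model and names here are assumptions for illustration, not the actual MARS API.

```python
from dataclasses import dataclass

@dataclass
class Plane:
    width: float      # metres
    depth: float      # metres
    elevation: float  # metres above the floor
    distance: float   # metres from the user

def matches(plane, min_width=0.5, min_depth=0.5,
            elevation_range=(0.6, 1.2), max_distance=3.0):
    """Does this detected plane satisfy the spatial condition?"""
    lo, hi = elevation_range
    return (plane.width >= min_width
            and plane.depth >= min_depth
            and lo <= plane.elevation <= hi
            and plane.distance <= max_distance)

# e.g. find a table-height surface near the user to anchor content on
planes = [
    Plane(width=2.0, depth=1.5, elevation=0.0, distance=1.0),   # floor: too low
    Plane(width=0.8, depth=0.6, elevation=0.75, distance=1.5),  # table: matches
]
table = next(p for p in planes if matches(p))
```

Because the condition is declarative, the same content can be placed in any room where a qualifying surface exists – which is exactly why no precise measurements are needed up front.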
Project MARS will be coming to Unity as an experimental package later this year. Stay in the loop on Unity Labs updates.
Facial mocap performances
Over the last couple of years, VR and AR have been increasingly powering how films are made. Advancements in graphics technology have enabled more detailed environments, and we find ourselves more invested in the characters. Virtual cinematography, one of the most remarkable use cases for XR in film, connects a physical camera on set to a digital camera in Unity. This allows you to shoot CG content with your hands, just like you would with live action.
With the new Facial AR Remote component, you can perform, capture, and animate characters, and finish your shot, using AR technology as the tool. This could save your team days, maybe even weeks of work.
Soon, anyone will be able to take advantage of this workflow and create cinematic content fast enough to keep up with their ideas.
Machine Learning improves animation
The Unity Labs AI group has been hard at work researching how to use machine learning to improve animation. Traditional animation workflows require you to explicitly define every possible transition, which can become quite complex. The team set out to create an animation system without any superimposed structure, like graphs or blend trees.
Our upcoming animation system, Kinematica, generates smooth, natural-looking movements by applying machine learning to any data source. Kinematica maintains all clips in a single library and then decides in real time how to combine tiny fragments from the library into a sequence that matches the controller input, the environment content, as well as gameplay requests.
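The core idea – often called motion matching – can be sketched in a few lines: keep every clip as short fragments in one library, and at runtime pick the fragment whose features best fit the current pose and the trajectory the player is requesting. The feature vectors and distance metric below are simplified assumptions, not Kinematica’s actual implementation.

```python
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_fragment(library, current_pose, desired_trajectory, trajectory_weight=2.0):
    """Pick the fragment minimizing pose distance plus weighted trajectory distance."""
    return min(
        library,
        key=lambda frag: distance(frag["pose"], current_pose)
        + trajectory_weight * distance(frag["trajectory"], desired_trajectory),
    )

# Tiny illustrative library: each fragment carries a pose feature and the
# short-term trajectory it produces.
library = [
    {"name": "walk_forward", "pose": (0.0, 0.0), "trajectory": (0.0, 1.0)},
    {"name": "turn_left",    "pose": (0.0, 0.0), "trajectory": (-1.0, 0.5)},
    {"name": "idle",         "pose": (0.0, 0.0), "trajectory": (0.0, 0.0)},
]

# Controller is pushing forward, so the forward-walking fragment wins.
frag = best_fragment(library, current_pose=(0.0, 0.0), desired_trajectory=(0.0, 1.0))
```

Because selection happens every frame against real data rather than a hand-authored graph, new clips added to the library become usable immediately – which is where the “no manual animation graphs” benefit comes from.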
Kinematica brings three major benefits: 1) a higher-quality polished look, 2) increased versatility (numerous variations determined from the same data set), and 3) no longer having to manually map out animation graphs.
Kinematica will be coming to Unity as an experimental package later this year.
Our main focus at Unity is not making games, it’s our creators. We not only strive to solve hard problems that stand in the way of your success, but to enable innovation and technical achievement within our community. Today, we were proud to have a few creators from this incredibly talented community showcase their projects on stage at Unite Berlin.
Lorne Lanning, creator of the Oddworld franchise, is using Unity 2018 for the latest installment, pushing graphics to a whole new level where you can’t tell the difference between a game, a film, or TV. They are building in the cloud and hoping to ship in 2019. On several occasions while developing the adventure game, Lorne and his team have marveled at how robust the engine has become, asking “why isn’t it crashing right now?!”
Harold Halibut is an adventure game that looks like a stop-motion film. Slow Bros. achieved this unique look by physically building all assets in the real world using polymer clay, wood, and metal before bringing them to life with Unity using photogrammetry. They also use Unity 2018 with HDRP which allowed them to achieve a more realistic look with colors and lighting behaving as they would in the real world! It solved a huge number of issues for them, including transparency sorting and volumetric lighting.
Flipping Death is a fantastic platformer that mixes 2D and 3D, coming to Nintendo Switch this summer.
Zoink! Games let us use their assets to show you two upcoming features for 2D development. Sprite Shape is a new world-building tool that helps you sculpt and bend your world with a smart visual layout system. Our new 2D Animation system is a set of tools that makes it easy to set up skeletal animation and animate characters using a single sprite. You can easily add and edit bones, paint weights, and add mesh tessellation. Both are now in preview – please learn more and try out all the 2D packages! We can’t wait to hear from you on the 2D Experimental Preview Forum.
Although GTFO looks like it was made by a team of 80, it was really brought to life by a small team of 8. Since its trailer debuted at The Game Awards 2017, the game hasn’t stopped getting noticed: fans and media outlets alike have mentioned GTFO as one of this year’s most anticipated FPS games. Ulf Andersson, the game’s creator and former designer of Payday, is excited by the graphical results he’s already achieved using Unity.
Madfinger Games has been using Unity for close to 10 years, and they are experts at high-fidelity graphics for touchscreens. Their latest title, Shadowgun Legends, has been critically acclaimed by countless media outlets. The CEO mentioned they’re making the game fully co-op online and converting hundreds of missions so that friends can play together. They also announced that they are porting to Nintendo Switch later this year. We’ve always enjoyed working with the studio, and they’ve given us countless notes on how to improve Unity.
Huge thank you to all the Unity creators who’ve helped us make this a great Unite keynote!
The show goes on – we’ll be live streaming select sessions on YouTube, Twitch, Facebook and Periscope tomorrow and Thursday. Join us via your favorite channel!