How AR Foundation and MARS work together to enable interactive, multiplatform AR experiences
Unity makes tools that give creators the power to build deeply interactive augmented reality (AR) experiences that intelligently adapt to any environment and work across devices. Read on to learn how AR Foundation and MARS work together to let creators make AR experiences that blur the line between the digital and physical worlds.
Fundamental concepts for mobile AR development
To make augmented reality (AR) content feel like it’s truly a part of the real world, you need to understand three fundamental concepts:
- Motion tracking gives devices six degrees of freedom: your mobile device is tracked both positionally and rotationally as you move through the space.
- Environmental understanding lets devices detect planes on horizontal and vertical surfaces, so an app can make sense of the real world and place AR objects and content on those surfaces.
- Light estimation lets developers read the current lighting conditions of the real-world environment and pull that information into the AR experience. AR content can then be brightened or shaded to more accurately match the real world’s lighting.
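In AR Foundation, the second and third concepts surface as managers you attach to your AR session. The sketch below (a minimal example assuming an AR Foundation 4.x scene with an `ARPlaneManager` and `ARCameraManager` already set up; the class name is illustrative) subscribes to plane detection and per-frame light estimation; motion tracking itself is handled automatically by the AR camera’s tracked pose.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: reacts to detected planes (environmental understanding)
// and to light-estimation data (light estimation).
public class ArFundamentalsListener : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;   // detects horizontal/vertical planes
    [SerializeField] ARCameraManager cameraManager; // delivers camera frames + light data

    void OnEnable()
    {
        planeManager.planesChanged += OnPlanesChanged;
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
        cameraManager.frameReceived -= OnFrameReceived;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Each ARPlane is a tracked real-world surface you can place content on.
        foreach (var plane in args.added)
            Debug.Log($"New plane: {plane.alignment}, size {plane.size}");
    }

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Light-estimation values are nullable: not every platform
        // provides every value on every frame.
        if (args.lightEstimation.averageBrightness.HasValue)
            Debug.Log($"Scene brightness: {args.lightEstimation.averageBrightness.Value}");
    }
}
```

A typical next step is to drive a directional light’s intensity from the estimated brightness so virtual objects match the room’s lighting.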
Platform providers integrate these fundamentals into their native SDKs to make it easier for developers to build rich AR experiences on their platforms.
Unity’s AR Foundation
We built AR Foundation to make it easier for you to deploy across multiple mobile and wearable AR platforms. AR Foundation is our core AR framework specifically designed for enabling multiplatform AR experiences.
Data is fed into AR Foundation through provider packages built on top of each platform’s native SDK. So no matter which device you target, an app built on AR Foundation can take advantage of all the features each platform supports.
Even if a feature is available on one platform but not yet on another, AR Foundation exposes hooks for it behind the scenes so your app is ready when support arrives. Once the feature becomes available on the new platform, you can integrate it simply by updating your packages – you don’t have to rebuild your app from scratch.
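This works because each manager exposes a subsystem descriptor that reports, at runtime, which optional capabilities the current platform’s provider implements. The sketch below (assuming the AR Foundation 4.x API; the class name is illustrative) queries plane-detection capabilities, and the calling code stays identical whether the provider is ARKit, ARCore, or something else.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: query the active provider's capabilities through
// the plane subsystem's descriptor.
public class FeatureCheck : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void Start()
    {
        var descriptor = planeManager.descriptor;
        if (descriptor == null)
        {
            Debug.Log("Plane detection is not available on this platform.");
            return;
        }

        // These flags differ per provider; your app logic can branch
        // on them instead of hard-coding per-platform behavior.
        Debug.Log($"Horizontal planes: {descriptor.supportsHorizontalPlaneDetection}");
        Debug.Log($"Vertical planes:   {descriptor.supportsVerticalPlaneDetection}");
        Debug.Log($"Classification:    {descriptor.supportsClassification}");
    }
}
```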
Unity’s Mixed and Augmented Reality Studio (MARS)
MARS is our suite of design and simulation tools that makes it faster to create flexible, customizable AR experiences that work in any location with any kind of data.
MARS was designed to solve the most difficult challenges for AR developers today:
Authoring complex, data-oriented apps visually
It’s impossible to know exactly where someone will use your AR app or what physical objects will be in their environment. Even if you’re building for a specific, controlled environment – a museum, for example – floor plans and installation locations can change, and people will view the AR experience from different angles and at various heights. The possible variables and their combinations are nearly infinite, and defining them manually when creating your app is impractical: you could spend a lifetime coding dimensions by hand and still not account for every case.
Testing experiences in a painless and efficient way
It’s highly likely that the physical space you’re developing your app in won’t match the user’s space when they run it. You can’t teleport yourself to Tokyo if you’re building a location-based environment (LBE) experience for the airport there. For geo-location games, it’s simply not feasible to test the app in every outdoor environment around the world where it will be used. Testing is a common pain point across development cycles when you’re building a mobile experience: it’s time-consuming to wait for a build and then test it on every device you want it to work on.
Delivering apps with runtime logic that adapt responsively to the real world and that work across platforms
At runtime, AR content must react to the objects in the user’s real-world environment, which is difficult to achieve. If the experience falls short, developers risk losing users, who may never return to it.
MARS alleviates these key pain points by bringing environment and sensor data into the AR authoring workflow, ultimately enabling developers to build more complex and robust AR applications.
How AR Foundation and MARS work together
MARS is built on top of the Unity Editor and works in coordination with AR Foundation. It exists as an additional layer that takes advantage of the data from AR Foundation or other custom data providers to allow creators to build for multiple platforms in a more streamlined and intuitive way.
Fundamentally, MARS provides the tools to efficiently author, test, and deliver AR apps, and AR Foundation is what makes it possible to have your AR app work across various platforms. AR Foundation and MARS work together to make it easier for AR developers to quickly create and deploy more interactive apps that intelligently interact with the real-world environment.
For more information on MARS and AR Foundation, check out our popular Unite Now session “How to Create Captivating, Deeply Interactive Mobile AR Games in Unity” to learn how a game studio is using both to create their newest mobile AR game.
If you’re craving more AR developments, make sure to join us online at AWE USA 2020, the next installment of the Augmented World Expo. Use our code UNITY2020 to get a 50% discount on AWE tickets. We can’t wait to share what we’ve been working on.