
Three reasons to consider Unity MARS for your next AR project

April 23, 2021

Unity MARS is a tool to help kickstart and further support AR development. Read on to explore three specific scenarios and use cases that can make Unity MARS a win for your team’s next AR project.

You have a diverse team with various skill sets working in the Unity Editor

Unity MARS brings AR creation right into the Unity Editor. In addition to providing a unique set of samples, the Simulation view enables drag-and-drop functionality to produce and place AR content, as well as the ability to visualize markers for image-tracking applications.

If you prefer to write scripts, Unity MARS is built on top of AR Foundation, so you can access all the unique platform features and functionalities through provider interfaces. You can also hook directly into MARS components and the MARS query and data API via scripts.

Let’s look at two ways to trigger an animation when a MARS object finds a match for its conditions: The first demonstrates how to use the MARS Component Action, whereas the second involves a custom script.

To trigger an animation on match, first add the Match Action MARS component to any proxy object. From there, you can add events for Match Acquired, Updated, Lost, and Timed Out. In our example, we're linking into the Match Acquired event to store a reference to an animator and set the trigger for Grow.

To accomplish the same thing in code, store a reference to the proxy component and subscribe to its MatchChanged event. In the callback, check whether the query result is null; if it's not, a match has been found. From there, you can call SetTrigger on the animator, passing in the trigger name.
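Putting the steps above together, a minimal sketch of such a script might look like the following. This assumes the MARS Proxy component exposes a MatchChanged event whose callback receives the query result (null when no match has been found), as described above; exact type names, namespaces, and the callback signature may differ between Unity MARS versions, so treat this as illustrative rather than copy-paste ready.

```csharp
using Unity.MARS;
using Unity.MARS.Query;
using UnityEngine;

// Attach to the same GameObject as the MARS Proxy component.
public class TriggerAnimationOnMatch : MonoBehaviour
{
    [SerializeField]
    Animator m_Animator;                 // Animator containing the trigger below

    [SerializeField]
    string m_TriggerName = "Grow";       // Trigger to fire when a match is found

    Proxy m_Proxy;

    void OnEnable()
    {
        // Store a reference to the proxy component and subscribe to match changes
        m_Proxy = GetComponent<Proxy>();
        m_Proxy.MatchChanged += OnMatchChanged;
    }

    void OnDisable()
    {
        // Always unsubscribe to avoid dangling callbacks
        m_Proxy.MatchChanged -= OnMatchChanged;
    }

    void OnMatchChanged(QueryResult queryResult)
    {
        // A non-null query result means the proxy's conditions found a match
        if (queryResult != null)
            m_Animator.SetTrigger(m_TriggerName);
    }
}
```

In the Editor, you would assign the Animator reference in the Inspector, exactly as shown in the screenshot below.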

Here’s a look at the script hooked up in the Editor.

This example highlights the flexibility that Unity MARS offers developers and non-developers alike. Both kinds of users can approach the same task differently, whether by writing scripts or by relying solely on the MARS interface. These workflows empower the creators on your team to work directly in the Unity Editor with Unity MARS.

Your AR experience is based on transforming and interacting with the user’s environment

Many AR applications prioritize the placement of digital content in the real world. With Unity MARS, you can do even more thanks to the proxy-based and rules workflows, which allow you to configure the conditions that determine how and where your content appears.

In other words, not only can an app place content in the real world, it can transform your environment into a richer, more interactive experience. Unity MARS also handles integration with core Unity systems like NavMesh and physics within an AR context.


This video shows how different environments, models and textures are spawned based on rules. The models used are from Synty Studios on the Unity Asset Store.

With Unity MARS, you can quickly create content that procedurally spawns in your world based on the different surfaces you’ve scanned. You can utilize core Unity features in AR more easily with MARS extensions for features like NavMesh, which enables characters to pathfind and move on different surfaces in the real world more fluidly.

You have limited access to the space where your AR app will run

Over the last year, remote work has reached an all-time high. It’s now more common than ever to collaborate with teams across the globe, from different locations and time zones. While AR has the capacity to enhance a space, it needs to work seamlessly to feel believable, or like a natural extension of the real world. If you’re working on an app that is predicated on a specific location, or you have limited access to certain spaces, the Unity MARS Companion app (beta) is a great solution for bringing captured AR sessions back into the Unity Editor.

The companion app’s AR capture and data recording features let you scan any environment and record surface data, camera paths and videos that can later be imported into Unity MARS and used in Simulation view. The Simulation view in the Editor then allows you to iterate and adjust parameters to better control how and when your content appears. 


See how a capture is imported from the Unity MARS Companion app into Unity Editor. You can then adjust the parameters on the proxy objects and track their changes based on the capture in Simulation view.

Testing and iterating in the location where your app or experience will be used is crucial for creating compelling AR content. With the Unity MARS Companion app, you can record several AR sessions from any location, at any time, save them to the cloud, and import them directly back into the Unity MARS Simulation view, so you don't have to be onsite to see how your content and updates will run.

Get started with Unity MARS

To get started with Unity MARS, try our 45-day trial at no cost. After you start your free trial, be sure to check out “First Steps in Unity MARS,” a step-by-step course on Unity Learn that unpacks the foundation of the product so you can create an AR application.


Start your free trial


4 replies on “Three reasons to consider Unity MARS for your next AR project”

Very cool. But overkill for my needs. I just need the standard Unity Remote for iOS to support basic AR Foundation features. At the moment we have to rely on a solution from Asset Store.
