Augment your spaces with Vuforia Engine
Vuforia 9.0 is here. The latest release lets you easily create immersive augmented reality (AR) experiences for large environments with Area Targets.
The spaces where we live, work, and shop hold an abundance of valuable information – but it can be difficult to access relevant data in context. Current object-based AR requires the tracked object to stay in view and is not designed for tracking large spaces.
To help enable the creation of large-scale experiences, Vuforia Engine 9.0 introduces Area Targets. Area Targets let you author large, continuous AR experiences on a scanned 3D model of a space. This new technology continuously identifies and tracks features in your chosen area to deliver persistent augmentations, and the resulting experiences can be consumed on a wide variety of phones, tablets, and AR headsets.
Creating an Area Target experience
The process of creating and using an Area Target can be summed up as scan, author, and view. This process breaks down into five high-level steps.
Step 1 – Scan your space: Area Targets are created from a 3D scanned digital model of a physical space. In Vuforia Engine’s initial release, you can capture a scan with a Matterport Pro2 camera.
Step 2 – Create an Area Target: Once you have your 3D model, bring it into the Vuforia Engine Area Target Generator, which produces a Vuforia dataset: a Unity package containing the dataset files and a textured mesh of the scanned space, ready to be imported into a Unity project.
Step 3 – Import your Area Target: Within Unity, once Vuforia Engine is set up, you can add your Area Targets from the GameObject menu. To configure your Area Target in Unity, review the detailed guide for Area Targets in Unity.
Step 4 – Author: Once the model is imported, developers have a view of their entire scanned space in precise detail. They can place 3D augmentations within this digital environment and test the experience with Vuforia’s Simulation Play Mode, which lets them “walk through” the 3D model and preview the final AR experience from a computer. This is useful for testing applications when access to the physical space is limited.
Step 5 – View: Once your Area Target application is deployed, a user can start the augmentations from anywhere within the area – the experience is simple and intuitive. Tracking is robust and persistent wherever the user moves within the space, and all 3D augmentations are correctly occluded by the real objects captured in the scan.
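For developers authoring in Unity, reacting to an Area Target being found or lost typically means attaching a trackable event handler script, similar to the default handler Vuforia ships with its Unity samples. The C# sketch below illustrates the idea; the class name `AreaTargetHandler` and the log message are illustrative, and event signatures can differ between Engine versions, so treat this as a sketch rather than a drop-in script.

```csharp
using UnityEngine;
using Vuforia;

// Illustrative sketch: attach to the Area Target GameObject to react to tracking changes.
public class AreaTargetHandler : MonoBehaviour, ITrackableEventHandler
{
    TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            // The Area Target is being tracked: enable your augmentation content here.
            Debug.Log("Area Target tracked: " + mTrackableBehaviour.TrackableName);
        }
        else
        {
            // Tracking lost: optionally hide augmentations until it is reacquired.
        }
    }
}
```

A script like this lets an application enable or disable augmentation content as tracking of the space is gained or lost, rather than leaving content visible when the user is outside the scanned area.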
Connecting people to places with immersive AR
Area Targets are versatile and powerful. Malls, hotels, and factories can all find value in large, persistent AR experiences.
Using an entire store as a canvas for AR enables businesses to easily display and change out promotional experiences, sale announcements, or navigation. Hospitality professionals can offer information in context, so guests can, for example, learn how to use appliances. Experiences can readily be made available in different languages to further aid guests.
In an industrial setting, this technology can give frontline workers real-time, immersive experiences to make their jobs safer and easier. Navigation data connected to specific tasks or machines can be placed within the facility, guiding workers to their jobs. This can save time and money when training new employees, guiding field service technicians, or helping workers who may jump between multiple procedures and machines.
Vuforia Engine 9.0 was released on March 18th and includes further advancements across the Vuforia Engine AR platform. You can learn more about the latest version on the developer portal or via the Unity Asset Store.
If you’d like to start building with Vuforia Engine Area Targets, check out How to Create Area Targets.