
Unity and Google are committed to making augmented reality (AR) mainstream. As we continue to work together towards that goal, let’s take a moment to celebrate this year’s advancements and where handheld AR is going in the next 12 months.

Throughout 2018, Google and Unity collaborated to expand the possibilities of ARCore development on Unity. Google shipped near-monthly ARCore updates to enhance its feature set, helping pave the way for standout games like Ghostbusters World and Jurassic World™ Alive. Unity focused its efforts on the introduction of the AR Foundation framework, a common API that makes it even easier to create AR experiences. While you may be familiar with some of the enhancements, let’s take a quick look at the evolution of ARCore on Unity:

  • February 2018 – ARCore 1.0 came out of developer preview.
  • March 2018 – ARCore 1.1 included Instant Preview to enable testing of apps instantly on your phone. The release also added the ability to get color correction information and capture an image from a camera frame to run your own algorithms.
  • May 2018 – Unity 2018.1 released with full support and improvements for ARCore. ARCore 1.2 released later in the month, adding the ability to share AR experiences using Cloud Anchors along with support for Augmented Images.
  • June 2018 – Unity introduced AR Foundation, which included default support for ARCore. ARCore 1.3 released with the functionality to access camera texture and image intrinsics.
  • August 2018 – ARCore 1.4’s release included passthrough camera auto-focus, provided even faster plane detection, and added support for a variety of devices.
  • September 2018 – ARCore 1.5 added unique IDs to point clouds and further expanded its list of supported devices.
  • December 2018 – Unity 2018.3 will soon bring LWRP support to AR Foundation as well as improvements to camera texture APIs. ARCore 1.6’s release included refinements and a new Cloud Anchors example that uses Unity’s Multiplayer Services.

What’s ahead

Looking forward to the next 12 months, ARCore development on Unity will continue to evolve in exciting new ways. While many of the details around specific features and timing are still hush-hush, here are a few things you can look forward to next year:

AR Foundation

After adding ARCore support and shipping its first features in 2018, you can expect Unity’s AR Foundation to mature in 2019. The framework will expand beyond handheld AR development to include wearable AR development, making AR creation easier across industries and use cases. We’ll continue to share more details on these developments over the course of the year.
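To give a feel for the common API, here is a minimal sketch of reacting to plane detection through AR Foundation. It assumes a scene already set up with an AR session and an ARPlaneManager component; the class name PlaneLogger is illustrative, and exact event names have changed between AR Foundation versions, so check the docs for the version you install:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch (not from this post): logs each plane AR Foundation
// detects, regardless of whether ARCore or another provider is underneath.
// Attach to the same GameObject as the ARPlaneManager component.
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
    }

    void OnEnable()
    {
        // Subscribe to plane lifecycle changes reported by the provider.
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} at {plane.transform.position}");
    }
}
```

Because the subscription goes through AR Foundation rather than the ARCore SDK directly, the same script works unchanged on any platform the framework supports.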

Scaling up device support

Making sure ARCore is widely available is key to maintaining a healthy ecosystem that supports your handheld AR projects. Expect Google to roll out support for even more devices next year, building on the 250 million devices already supported globally.

Connected experiences

If 2018’s release of ARCore Cloud Anchors and the announcement of Unity’s partnership with Google Cloud on Connected Games aren’t telling enough, we’re all-in on helping creators make connected experiences. Next year, Unity will open up access to mobile Connected Games services including matchmaking, game hosting, and much more.

Get started creating handheld AR experiences

If you have yet to experiment with ARCore development, it’s a great time to begin. To get started (and hone your skills for what’s to come in 2019), below you’ll find a list of helpful resources including the Unite LA session on developing experiences for ARCore.

Leave a reply


  1. I really enjoy using ARFoundation. There might be some native support for using it with ECS. For now I have to do all the communication between ARFoundation MonoBehaviours and my ECS Systems myself

  2. I’m waiting for Instant Game platform on Unity. How about roadmap for this?

  3. I heard mention a few months ago that Google was working on support for Cardboard with ARCore so that the user could have an immersive experience and see the real world camera properly aligned with the VR viewport. I understand the many complications around this, but wonder where it’s at. Holding the device in your hand is a barring factor.