
When Apple announced at WWDC that the ARKit framework would be part of iOS 11, there was great excitement in the development community, as most people saw an amazing opportunity to create compelling AR experiences that could reach a large audience.

Unity had worked with Apple during the previous months to create the Unity ARKit Plugin, which was released on that first day of WWDC, and allowed any developer to use Unity as a content creation platform for ARKit apps. The demo videos of ARKit apps using that plugin started streaming out over the Internet.

Today, we announce a new feature of our Unity ARKit Plugin: Unity ARKit Remote. It lets developers prototype their experiences far more quickly, drastically shortening iteration time. Previously, whenever a developer wanted to iterate on scripts or edit scene objects, they had to build out to an iOS device to test their changes. With Unity ARKit Remote, you run a special app on the iOS device that feeds ARKit data back to the Unity Editor, so you can react to that data in real time in the Editor.
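To make this concrete, here is a minimal sketch of how a script can react to the ARKit data the Remote streams into the Editor. It assumes the plugin's `UnityARSessionNativeInterface` event API (`ARFrameUpdatedEvent` delivering a `UnityARCamera` each frame); the class name `ARFrameLogger` is just an illustration, not part of the plugin.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit Plugin

// Illustrative sketch: logs ARKit frame data, whether it arrives from a
// device build or from the ARKit Remote app while running in the Editor.
public class ARFrameLogger : MonoBehaviour
{
    void Start()
    {
        // Subscribe to per-frame updates from the native ARKit session.
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrameUpdate;
    }

    void OnFrameUpdate(UnityARCamera camera)
    {
        // The camera struct carries the device pose, projection matrix,
        // tracking state, and point cloud for the current frame.
        Debug.Log("ARKit frame received, tracking state: " + camera.trackingState);
    }

    void OnDestroy()
    {
        // Unsubscribe to avoid callbacks on a destroyed object.
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrameUpdate;
    }
}
```

Because the same event fires in both cases, the script needs no special handling for the Remote: you attach it to a GameObject, press Play in the Editor with the Remote app connected, and iterate on the callback without rebuilding.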

Here’s a video of how it works:

For more details, see this forum post.

Unity ARKit Remote is available as part of the Unity ARKit Plugin.

18 replies on “Introducing the Unity ARKit Remote”

This is an amazing development tool!
Unfortunately it doesn’t seem to recognize touch events on the phone. I tried to set up the UnityARHitTestExample to register mouse click events so that I could use it for testing, but I don’t seem to be getting any results back for:
List&lt;ARHitTestResult&gt; hitResults = UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(point, resultTypes);
// where point is a UnityEngine.XR.iOS.ARPoint and resultTypes is ARHitTestResultType.ARHitTestResultTypeHorizontalPlane
When I run this using the remote and mouse clicks, hitResults always comes back empty, but if I run it on my phone without the remote using touch, I get a list back. Is there a way to get this to work?
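For context, here is a sketch of the full hit-test pattern the comment describes, wired up to a mouse click so it can be triggered from the Editor. It assumes the plugin's `ARPoint`, `ARHitTestResult`, and `ARHitTestResultType` types; the class name `HitTestSketch` is hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit Plugin

// Illustrative sketch: run an ARKit hit test from a mouse click.
public class HitTestSketch : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0)) // left mouse click (or touch on device)
        {
            // ARKit expects a normalized (0..1) screen point.
            Vector3 viewportPos = Camera.main.ScreenToViewportPoint(Input.mousePosition);
            ARPoint point = new ARPoint { x = viewportPos.x, y = viewportPos.y };

            List<ARHitTestResult> hitResults =
                UnityARSessionNativeInterface.GetARSessionNativeInterface()
                    .HitTest(point, ARHitTestResultType.ARHitTestResultTypeHorizontalPlane);

            foreach (ARHitTestResult result in hitResults)
            {
                // worldTransform holds the pose of the hit on the plane.
                Debug.Log("Hit horizontal plane at: " + result.worldTransform);
            }
        }
    }
}
```

Note that a horizontal-plane hit test can only return results once ARKit has actually detected a plane, so an empty list may simply mean no planes have been found yet in the streamed session data.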

Beautiful. Thanks guys. I’m always impressed at how good you are at putting yourselves in _our shoes_. I did a lot of Adobe stuff in the past, and they never had any idea how much pain was involved in using the tools. The resources you throw at making demo apps, and the filmmaking work you do internally, really pay off for us in the pit, imho: you develop tools and processes that actually fit the work we do. Keep it up!!! Brilliant stuff.

Hi Joe,
It’s possible this isn’t working for you for some reason, but otherwise the broad consensus seems to be that ARKit tracking is reliable and pretty stable under most conditions. What observations led you to conclude otherwise?

That is an awesome step toward agile development. Can we have something similar for regular iOS games?


IMPORTANT: in the video at 0:45 there is a square with black dots.

What is it?

I have this same issue on Windows. It’s a square Unidentified Software Object (U.S.O. for short) with random square dots inside, about 100px × 100px, that usually gets drawn on top of the screen. It happens randomly. It reminds me of a Commodore 64 virus.

Any idea?

It looks like the game is running on the device while the Editor only updates state. Why not make it run in the Editor and only receive ARKit sensor data, so that you can actually set breakpoints? That would also let me update code on the fly, without needing to build the whole app (Unity + Xcode).
