The Making of The Robot Factory

February 5, 2016 in Community | 8 min. read

We talked to Tinybop's Rob Blackwood, lead iOS engineer, Jessie Sattler, production designer, and Cameron Erdogan, iOS engineer, about their experience using Unity to build The Robot Factory.

The Robot Factory was the 2015 iPad App of the Year on the App Store. It is the sixth app (of eight total, now) that the studio launched and the first app in a new series of creative building apps for kids. As the first app in that series, it was the first they built with Unity. It will also be the first app Tinybop made available for Apple TV.

Moving to Unity enabled the development and design teams to work together more quickly and efficiently. Rob Blackwood, lead iOS engineer, and Jessie Sattler, production designer, walk through how they work together to bring the app to life. Cameron Erdogan, junior iOS engineer, chimes in about preparing The Robot Factory for tvOS with Unity. Every app Tinybop builds is a learning process that helps refine and improve their process on the next app.

Building tools for development & design

Rob: As software engineers, it’s our duty to architect solutions for all the concepts that make up the app. We need to identify the app’s systems and rules and implement them through code. For example, a system we created in The Robot Factory determines the way a robot moves. In an app about plants, we created systems to represent the different seasons of the deciduous forest. There are also many cases where we must create tools for tweaking these systems and rules. These tools can then be used by a production designer to create the right look and feel for the app.

Jessie: As a production designer at Tinybop, I’m in charge of putting together the visual elements that live inside the app (to put it simply). We commission a different illustrator for each app, which gives us a range of styles and techniques. It’s my job to translate all the artwork into interactive, moving parts. I build scenes in Unity, animate characters and objects, and make sure everything runs smoothly between the art/design and the programming/development of our apps.

Rob: When we began our first app, we were a very small team using a development environment that required pure programming to develop for. We developed a tool for layout and physics simulation, but it was not very sophisticated. As our team grew, we realized we had a bottleneck on the engineering side, since almost everything had to be programmed and then built manually before it could be tested. Not having immediate visual feedback when developing also meant a lot more iteration on the code, a time-consuming task. Not having an automated build system, like Unity Cloud Build, meant an engineer had to sink time into manually delivering a build to devices or sending it up to TestFlight.

Jessie: Our previous editor lacked a friendly interface for someone who wasn’t primarily working in the code. I relied heavily on the engineers to perform simple tasks that were not accessible to me. Unity has alleviated the engineers of menial production tasks, and at the same time enabled me to perfect things to the slightest detail. We also couldn’t see the result of what we were making until we built to device, whereas in Unity I can live preview the project as I work.

Rob: The most important thing Unity has done for us is allow us to easily separate engineering from production design. Unity is a very graphics-driven environment which means production can do much of the visual layout before ever having to code a single line. This also allows us to continually integrate and iterate as the engineers develop more and more systems. The production team can get immediate feedback as they design because Unity lets you play and stop the app at any moment you like. We also use Unity Cloud Build which lets us push new builds out to actual iOS devices as frequently as every 20 minutes. So, everyone can test and give feedback on the current state.

Jessie: Using Unity has made collaboration with engineers a dream! I can specify what visual effects I want to achieve. Then, we work together to build tools and scripts for me to use directly in the editor. The visual nature of Unity makes it much easier for me to have the control I need as an artist to get the projects to look the way we want them to. It also facilitates our iterative process. I can go back and forth with our engineers to find solutions that meet both our aesthetic and technical requirements.


In The Robot Factory, giving robots locomotion based on the parts they were created with was a big challenge. Using pure physics to move the robots made it too difficult to control, and having pre-planned walk cycles was boring and predictable. I worked with the engineers to create tools to draw the path of motion for each robot part, within a set of constraints, as well as each part's gravity and rotational limits. We were able to maintain enough physics-based movement to get unique locomotion, but users still had enough control and part reliability to navigate their robots through a world.
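A per-part constraint tool like the one described above could be sketched in Unity as a small component that samples a designer-authored motion path and clamps rotation, letting physics add character without breaking the silhouette. This is an illustrative sketch, not Tinybop's actual code; the names (`RobotPartMover`, `pathCurve`, and so on) are hypothetical, and the drawn path is approximated here with an `AnimationCurve`:

```csharp
using UnityEngine;

// Hypothetical sketch: constrains one robot part to an authored motion
// path, with per-part gravity and a rotational limit, all tunable by a
// production designer in the Inspector.
[RequireComponent(typeof(Rigidbody2D))]
public class RobotPartMover : MonoBehaviour
{
    [Tooltip("Vertical offset of the part over one walk cycle, authored by the designer.")]
    public AnimationCurve pathCurve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    [Range(0f, 180f)]
    public float maxRotationDegrees = 45f; // rotational limit for this part

    public float gravityScale = 1f;        // per-part gravity multiplier
    public float cycleDuration = 1f;       // seconds per walk cycle

    Rigidbody2D body;
    Vector3 restPosition;

    void Start()
    {
        body = GetComponent<Rigidbody2D>();
        body.gravityScale = gravityScale;
        restPosition = transform.localPosition;
    }

    void FixedUpdate()
    {
        // Sample the authored path over the cycle...
        float t = Mathf.Repeat(Time.time / cycleDuration, 1f);
        transform.localPosition = restPosition + Vector3.up * pathCurve.Evaluate(t);

        // ...then clamp the physics-driven rotation within the limit.
        float z = Mathf.DeltaAngle(0f, transform.localEulerAngles.z);
        z = Mathf.Clamp(z, -maxRotationDegrees, maxRotationDegrees);
        transform.localRotation = Quaternion.Euler(0f, 0f, z);
    }
}
```

Exposing the curve and limits as serialized fields is what makes this a designer-facing tool rather than engineer-only code: values can be tweaked live in Play mode.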

Adapting for different apps & artwork

Rob: We've always given priority to the art and we try to not funnel the artist toward too many particulars, style-wise. This can sometimes be difficult from a technical standpoint because it means our strategies for creating the feel of animations and interactions often need to change. The Robot Factory artwork has a lot of solid-colored shapes and hard edges. We were able to identify a fairly small set of reusable elements that could be combined to create almost every robot part—each one comprising as many as 50 small pieces—that could then be animated independently. (This was important because real robots have a ton of moving parts, as everyone knows!) This contrasted sharply with our most recent app, The Monsters, where we wanted the monsters kids created to appear more organic and even paintable. In this instance, we created actual skeletons and attached something akin to skin so that the monsters could bend naturally and be colored and textured dynamically when a child interacts with them. So while there are many challenges to adapting to different artistic styles, the benefit is that we are much closer to the artist's vision, which is always more interesting.


Jessie: Each illustrator brings a different style, thus a different set of challenges for every app. On the production side, we have to decide what aspects of the art are integral to keep intact, and what can be translated through simulations and programmatically generated art. A lot comes down to balancing three needs: widely scoped content, efficient development, and beautiful visuals. It’s a big challenge to create reusable techniques that we can carry over app to app. Many instances call for unique solutions. Where we would rig, bone, and animate meshed skeletons in one case, another app might need large, hi-res sprites, or small repeatable vector shapes and SVGs. Having disparate techniques means longer production time, but because we deem the quality of art and design in our apps so important, it is a necessary step in the process.

Moving on over to tvOS with Unity

Cameron: I didn’t have to change much code to get The Robot Factory up and running on Apple TV. After downloading the alpha build of Unity with tvOS, I made a branch off of our original app's repository. After a day or two, I was able to get the app to compile onto the Apple TV. I had to remove a few external, unsupported-on-Apple TV libraries to get it to work, but the majority of the work was done by Unity: I merely switched the build target from iOS to tvOS. Pretty much all of the classes and frameworks that work with Unity on iOS work on tvOS, too.

After I got it to compile, I had to alter the controls and UI to make sense on TV. To do that, I used Unity's Canvas UI system, which played surprisingly nicely with the Apple TV Remote. The last main thing I did was add cloud storage, since Apple TV has no local storage. To do that, I wrote a native iOS plug-in, which again was integrated easily with Unity.
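A native plug-in like the one Cameron mentions is typically bridged from Unity via `DllImport("__Internal")`, with the Objective-C side exposing plain C functions. The sketch below is an assumption about how such a bridge might look, not Tinybop's actual plug-in; the function names (`_SaveString`, `_LoadString`) and the `PlayerPrefs` fallback are hypothetical, and on the native side the C functions could wrap an iCloud store such as `NSUbiquitousKeyValueStore`:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical C# bridge to a native tvOS cloud-storage plug-in.
// The matching Objective-C file would implement _SaveString/_LoadString
// as extern "C" functions wrapping an iCloud-backed store.
public static class CloudSave
{
#if UNITY_TVOS && !UNITY_EDITOR
    [DllImport("__Internal")]
    static extern void _SaveString(string key, string value);

    [DllImport("__Internal")]
    static extern string _LoadString(string key);
#endif

    public static void Save(string key, string value)
    {
#if UNITY_TVOS && !UNITY_EDITOR
        _SaveString(key, value);   // persist via the native iCloud wrapper
#else
        PlayerPrefs.SetString(key, value); // editor/iOS fallback for testing
#endif
    }

    public static string Load(string key)
    {
#if UNITY_TVOS && !UNITY_EDITOR
        return _LoadString(key);
#else
        return PlayerPrefs.GetString(key, "");
#endif
    }
}
```

The `#if UNITY_TVOS` guards keep the same project compiling for the editor, iOS, and tvOS from one branch, which matches the single-codebase workflow described above.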


Looking ahead

Jessie: We currently build apps with 2D assets in 3D space. This allows us to create certain dimensional illusions that help bring life to our apps. I’ve been experimenting with using more 3D shapes in our apps and working with new 3D particle controls in Unity 5.3. I’m excited about tastefully enhancing 2D worlds with 3D magic.

Rob: As we look to the future, we'd like to expand our apps to even more platforms. Unity attempts to make this step as seamless as possible by exporting to multiple platforms with just a little bit of additional engineering on our end. Like our experience moving to the tvOS platform, we hope Unity will do much of the heavy lifting for us. And by the way, we’re hiring senior Unity engineers right now. If you love Unity and building advanced simulations, look us up at http://www.tinybop.com/jobs.

The Robot Factory is available for iOS and Apple TV on the App Store.

Congratulations to Tinybop and thanks for sharing your story.
