
Art and VR at Sotheby's

July 7, 2017 in Community | 8 min. read


This is a guest post from our VR Artist in Residence, Isaac Cohen, also known as Cabbibo. For more background, you can check out our initial announcement post here.

When you know you’ll have to make an art piece in one day, you need to be prepared. For a painter this might mean making sure you have enough brushes, for a sculptor it may mean checking your clay isn't dry -- for me, it's ensuring I have code that works, and is flexible enough that if I need to make last-minute changes it won't all come crumbling down.

As Unity’s Artist in Residence, I had the opportunity to feature a piece for an “Art in VR” event at Sotheby's -- the renowned institution in New York City. The event was put on by the VR Society, of which Unity is a member. I wanted to make something special and unique for the site: I decided to show how art can be experienced when regular, everyday objects become reactive to the touch. But doing an installation unique to Sotheby’s meant I had to be extremely prepared. I wouldn't have much time to write new code, so I spent the weeks leading up to the event writing a group of modules that I could mix and match while there -- with final inspiration coming from what I found upon arrival.

1 - THE CONCEPT

The first time I did this type of project was with a piece called myRoom (which you can download for Vive here) where I scanned my actual bedroom at home and then aligned the digital representation with the physical, so you could pull and play with the furniture that was actually there in real life, while in VR.

I wasn’t sure what furniture we’d have available at Sotheby’s to use, so Unity ordered a couch for me -- a love-seat, to be exact. (I’d asked that whatever we secured, it had to be squishy. It makes the experience more fun.) It was a dark brown, plush, heavy piece of furniture that we dragged out from the piles of crates in the mail room at Sotheby’s the day before the event was to start. We put it in the booth area where my installation would be -- and then I got to work. I scanned it with my iPhone, then pulled out my laptop, sat on the floor, and began creating the couch asset for VR -- which would later get a psychedelic twist.

The transition from this physical, soft, brown couch to digital iridescent goo is described below. I’ve made a GitHub repo with all the code (it’s my first attempt at making code that is usable by others, so it’s still a little messy. You can ask me questions on Twitter @cabbibo and I’ll do my best to help!).

2 - SCANNING

To get the couch into Unity, I started by taking around 250 pictures and feeding them into ‘Reality Capture’, which I chose because its algorithms run on the GPU, so it processes models much faster (very helpful when you’re trying to make an art piece in a single day). In about an hour, I had an extremely high-res mesh (around 10,000,000 vertices). Obviously that won’t do for real-time graphics, especially when I wanted to give the couch physical properties. So I brought the mesh into Blender, cleaned it up, and removed the extraneous geometry that got captured around the sides. That left me with a simple .fbx to import into Unity, which is where the magic begins.

I’ve always found much of computing to be very inorganic. We have box colliders and static models, baked lighting and canned animations. For a medium as physical as VR, I have found that the more organic I can make something, the more playful it can become. Making a model reactive can breathe life into it. A regular object suddenly can bring joy. Because of this, I want to make everything squishy, gooey, visceral, and physically palpable. To do this, I need the GPU.

3 - COMPUTE SHADERS

If you have never delved into compute shaders before, I would recommend this tutorial by Kyle Halladay. After I first saw it, I made everything squishy: donuts and cloth, gooey space whales, hairy walking aliens. But every time I started a new project, I would rewrite custom code, starting from scratch again and again and again. I wouldn’t have that luxury for this project, so I had to write code that was actually reusable. I am still a beginner in many ways (this was the first time I ever tried my hand at C# inheritance), so I can’t say it is the best, cleanest, or even most complete architecture, but hopefully, with a bit of background knowledge and patience for my strange mind, you can find a use for it as well.
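If the C# side of compute shaders is new to you, here is roughly what a single dispatch looks like in Unity. This is not code from my repo; it is a minimal sketch, assuming you have a compute shader asset with a kernel named Wobble that reads and writes a buffer of positions (all names here are made up for illustration).

```csharp
using UnityEngine;

// Minimal sketch: push an array of positions to the GPU, run one compute
// kernel over them, and read the results back. Assumes a ComputeShader
// asset with a kernel named "Wobble" that operates on a
// RWStructuredBuffer<float3> called "_Positions". Names are illustrative.
public class SimpleDispatch : MonoBehaviour
{
    public ComputeShader shader;       // assign the compute shader asset in the Inspector
    ComputeBuffer positions;
    Vector3[] data = new Vector3[1024];

    void Start()
    {
        positions = new ComputeBuffer(data.Length, sizeof(float) * 3);
        positions.SetData(data);

        int kernel = shader.FindKernel("Wobble");
        shader.SetBuffer(kernel, "_Positions", positions);
        shader.SetFloat("_Time", Time.time);

        // 1024 points / 64 threads per group = 16 thread groups
        shader.Dispatch(kernel, data.Length / 64, 1, 1);

        positions.GetData(data);       // read back to the CPU if you need it there
    }

    void OnDestroy()
    {
        positions.Release();           // always release GPU buffers
    }
}
```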

There were a few things I knew this system needed to handle. The first was turning any mesh into a compute buffer. This would let me pass that information into other compute shaders, and it would also give me a base buffer I could begin to do physics on. I wanted to architect a system where different buffers could be used in different compute steps: for example, I could make a mesh gooey by doing strange physics on its vertices, then pass that gooey object into a different shader that does hair physics, so we could see the hairs bouncing and swaying on top of a liquid mesh. The update step needed to be flexible enough that I could bind different information to different compute shader dispatches at will. Lastly, I wanted to be able to get information back out of the compute shader for use in CPU land, for things like audio and collisions. With all these thoughts in mind, I made a first pass at GooHairGrass -- my part in helping to make the world just a little more squishy.
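To make that chaining idea a bit more concrete, here is a rough sketch of the flow in Unity C#: a mesh’s vertices become a compute buffer, one kernel does the “goo” physics on it, and a second kernel simulates hair on top of the result. The kernel and buffer names are made up for this example; the actual repo structures things differently (and more flexibly).

```csharp
using UnityEngine;

// Sketch of the buffer-chaining idea: mesh -> base buffer -> goo pass -> hair pass.
// Kernel names ("Goo", "Hair") and buffer names are hypothetical, not the repo's.
public class GooThenHair : MonoBehaviour
{
    public ComputeShader gooShader;
    public ComputeShader hairShader;

    ComputeBuffer baseVerts;   // original mesh vertices, never changed
    ComputeBuffer gooVerts;    // vertices after goo physics
    ComputeBuffer hairTips;    // one hair tip per vertex, simulated on top of the goo

    int gooKernel, hairKernel, count;

    void Start()
    {
        // 1. Turn the mesh into compute buffers.
        Vector3[] verts = GetComponent<MeshFilter>().mesh.vertices;
        count = verts.Length;

        baseVerts = new ComputeBuffer(count, sizeof(float) * 3);
        gooVerts  = new ComputeBuffer(count, sizeof(float) * 3);
        hairTips  = new ComputeBuffer(count, sizeof(float) * 3);

        baseVerts.SetData(verts);
        gooVerts.SetData(verts);
        hairTips.SetData(verts);

        gooKernel  = gooShader.FindKernel("Goo");
        hairKernel = hairShader.FindKernel("Hair");
    }

    void Update()
    {
        // 2. Goo pass: does strange physics on the vertices, using the base mesh as a rest shape.
        gooShader.SetBuffer(gooKernel, "_Base", baseVerts);
        gooShader.SetBuffer(gooKernel, "_Positions", gooVerts);
        gooShader.SetFloat("_DeltaTime", Time.deltaTime);
        gooShader.Dispatch(gooKernel, Mathf.CeilToInt(count / 64f), 1, 1);

        // 3. Hair pass: reads the gooey surface and simulates hair tips rooted to it.
        hairShader.SetBuffer(hairKernel, "_Roots", gooVerts);
        hairShader.SetBuffer(hairKernel, "_Tips", hairTips);
        hairShader.SetFloat("_DeltaTime", Time.deltaTime);
        hairShader.Dispatch(hairKernel, Mathf.CeilToInt(count / 64f), 1, 1);
    }

    void OnDestroy()
    {
        baseVerts.Release();
        gooVerts.Release();
        hairTips.Release();
    }
}
```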

4 - SIMULACRA

I decided to call the piece ‘Simulacra,’ referring to the imitation of what is real. A representation. VR discussions tend to be mainly about what’s not there, but I think there’s something powerful when all realities collide and you can see your own world around you differently.

As guest after guest put the headset on and walked toward the couch, they didn’t have to feel their way blindly, as some VR experiences demand. In Simulacra, you know the couch is there: you sit on it, bounce off it, feel it beneath you. The glowing strands that look like grass and hair lean away from you if your hands get too close, then suck toward you if you squeeze the trigger. Let go and they slowly snap back into place. And being thoughtful, in what feels like a dark void of space, sitting on a psychedelic couch that feels real because it IS real, is exactly what Simulacra is all about.
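Under the hood, that behavior boils down to a handful of simple per-strand forces. The real version lives in a compute shader, but as a rough CPU-side sketch (with made-up names and tuning constants, not the repo’s code), each strand tip does something like this every frame:

```csharp
using UnityEngine;

// Rough per-strand force sketch, illustrative only; the real thing runs on the GPU.
// "rest" is where the strand wants to sit, "tip" is where it currently is,
// "hand" is the controller position.
public static class StrandForces
{
    public static Vector3 Step(Vector3 tip, Vector3 rest, Vector3 hand,
                               bool triggerHeld, float dt)
    {
        Vector3 force = Vector3.zero;
        Vector3 toHand = hand - tip;
        float dist = toHand.magnitude;

        if (triggerHeld)
            force += toHand.normalized * 4f;                    // suck toward the hand
        else if (dist < 0.3f)
            force -= toHand.normalized * (0.3f - dist) * 10f;   // lean away when the hand gets close

        force += (rest - tip) * 8f;                             // spring back toward the rest pose

        return tip + force * dt;                                // crude per-frame step, no damping
    }
}
```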

Actress Maria Bello, who was at the event to give a talk on her 360 video documentary series, also stopped by our installation to see what I had created. Tony Parisi, Head of VR/AR Strategy at Unity, gave a few talks on the power of real-time technology for artists, designers, filmmakers, and other creators. I was on a panel about artists myself. It was a great learning experience, and very humbling -- especially when I first saw that famous Sotheby’s podium.

5 - WHAT’S NEXT

There is still a lot of work to do on the repo, and I’m hoping that as I continue to clean it up, the structure will become easier to use and explain -- but anyone is free to dig in now. I plan on doing some live streams explaining and building out objects using the code, so keep an eye on my Twitter (@cabbibo) for announcements on when that is happening. Although there is still so much to do, I hope this code can begin to help others imbue their creations with gooiness and life, helping to make virtual reality a bit more physical.

I would like to give a special thanks to Morten Mikkelsen from Unity for helping me understand compute shaders, to Yağmur Uyanık for creating magnificent sounds (which you’ll find in the repo, with some extra demos to come), and to George Michael Bower for sending through some models to use in the demos.
