
Since the introduction of the new Video Player component in Unity 5.6, we’ve had a lot of questions about how to integrate 360 video into a Unity project. Over the past few months we have been refining our suggested workflow and working on a high-quality shader to make it happen. Today, we are pleased to share our work in the form of a beta shader that can be used in any Unity 5.6 or later project.

The concept is simple and straightforward. Take any supported video file (like an .mp4) containing 360 or 180 equirectangular or cubemap content, import it as an asset, and play it through a Video Player component. The key is to target the Video Player at a Render Texture with the same dimensions as the video. Then connect that texture to a Material that uses the new Skybox/PanoramicBeta shader, and assign that Material as your Scene’s Skybox Material.
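If you prefer to wire this up from script rather than in the Inspector, here is a minimal sketch of the same setup. It assumes the video has already been imported as a VideoClip, a Render Texture of matching dimensions exists, and a Material using the Skybox/PanoramicBeta shader has been created; the class and field names are illustrative, and the shader’s texture property is assumed to be _Tex.

using UnityEngine;
using UnityEngine.Video;

public class PanoramicSkyboxSetup : MonoBehaviour
{
    public VideoClip clip;              // the imported 360/180 video asset
    public RenderTexture targetTexture; // same dimensions as the video
    public Material skyboxMaterial;     // uses the Skybox/PanoramicBeta shader

    void Start()
    {
        // Play the clip into the Render Texture.
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = targetTexture;
        player.isLooping = true;
        player.Play();

        // Feed the Render Texture into the shader's texture slot ("_Tex")
        // and use the Material as the Scene's Skybox.
        skyboxMaterial.SetTexture("_Tex", targetTexture);
        RenderSettings.skybox = skyboxMaterial;
    }
}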


Voila! You now have a Skybox being driven by your panoramic video! Turn on the Virtual Reality Supported Player Setting, put on a VR headset, and you’ll immediately be surrounded by your video in full 360.

If you have 3D 360 content, you can take things a step further for the ultimate immersive experience by using the Skybox Panoramic shader’s 3D settings.
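For reference, here is a hedged sketch of switching the 3D layout from script. It assumes the beta shader exposes a _Layout property like the Skybox/Panoramic shader that later shipped with Unity (0 = none, 1 = side by side, 2 = over under); check the shader source in the GitHub project for the exact property name and values.

using UnityEngine;

public class SkyboxStereoLayout : MonoBehaviour
{
    public Material skyboxMaterial; // the Skybox/PanoramicBeta material

    void Start()
    {
        // Assumption: the shader exposes a "_Layout" float for the 3D layout
        // (0 = none, 1 = side by side, 2 = over under). Verify the property
        // name in the shader source before relying on this.
        skyboxMaterial.SetFloat("_Layout", 1f);
    }
}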

Full project details and documentation are available on our GitHub project page.


23 replies on “How to integrate 360 video with Unity”

Can the Video Player play content stored on Google Drive, Dropbox or any other cloud storage service? If not, could this feature be added in an upcoming release?

Tolouse: Congrats! Your explanation and info have been VERY helpful! I got a sample project working in my Oculus Rift in minutes, with audio!

I have a quick question:

I need to control the playback of the movie projected on the skybox from the keyboard, with code similar to this:

using UnityEngine;
using UnityEngine.Video;

public class SkyboxPlaybackControls : MonoBehaviour
{
    public VideoPlayer movie; // assign the Video Player in the Inspector

    void Update()
    {
        // Toggle playback with the P key.
        if (Input.GetKeyDown(KeyCode.P) && movie.isPlaying)
        {
            movie.Pause();
        }
        else if (Input.GetKeyDown(KeyCode.P) && !movie.isPlaying)
        {
            movie.Play();
        }
    }
}

But I have been trying for days to attach a script to your example, with no luck.

Can you please give me some pointers on how to do it?

THANK YOU

We’re using this on a new project and quite like it! Thank you for doing the necessary work to pin down a recommended workflow for 360 video.

The one issue we’re having is with 6K video. Our 4K video works fine, and we’ve gotten 6K to work without audio, but every attempt we’ve made with 6K and audio stalls after a second or two of playback. Does anybody have a workflow for creating 6K or 8K video with audio that works correctly in Unity?

Hi Sean.

Unfortunately, most graphics cards are only rated to decode 4K H264 video in realtime. Going up to 6K or 8K often results in decoding problems. You could check the maximum video decoding specs for your graphics card, or you could try transcoding to VP8, which is our software decoding solution. VP8 decoding doesn’t impose a resolution limit, but it is limited by the processing power of your CPU.

For specific support questions in the future, please follow up in our official VR forum here:
https://forum.unity3d.com/threads/how-to-integrate-360-video-with-unity.485405/

Hi! I followed this tutorial to play 360 videos. I edited and built my project on a PC and it works well on PC and Android. However, when I copy my project to a Mac, the console shows “Metal: Fragment shader missing texture binding at index 0 (_Tex / Skybox/PanoramicBeta)”. When I run the game, the sky is green with some distortion. It would be great if anyone could tell me how to fix this >.< Thank you!

Hi! Will the Skybox Panoramic shader’s 3D settings work with 180-degree stereoscopic video in Unity? VR180 is becoming popular, so it would be nice to get it working in Unity.

There are no special requirements for audio in this method. If your audio works when using the stock Video Player (say when targeting one of the camera planes), then it should continue to work fine when you target a Render Texture and pass it through this shader and onto the Skybox.
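As an illustration of that point, here is a minimal sketch of routing the video’s audio through an AudioSource while the video itself targets a Render Texture; the component references are assumed to be assigned elsewhere (e.g. in the Inspector), and Direct audio output also works without any of this.

using UnityEngine;
using UnityEngine.Video;

public class SkyboxVideoAudio : MonoBehaviour
{
    public VideoPlayer player;      // the Video Player targeting the Render Texture
    public AudioSource audioSource; // where the video's audio should play

    void Start()
    {
        // Route the video's first audio track through an AudioSource.
        // (Direct output also works; this is just the explicit variant.)
        player.audioOutputMode = VideoAudioOutputMode.AudioSource;
        player.EnableAudioTrack(0, true);
        player.SetTargetAudioSource(0, audioSource);
        player.Play();
    }
}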

This is really great. Thank you!

One question: is there any reason why you named the texture _Tex and not _MainTex? IMHO it would be more consistent as _MainTex, so we could use the material.mainTexture property to set it rather than material.SetTexture("_Tex", texture);

Just wondering, is there any performance gain from doing the above vs. rendering the video on the inside of a sphere, with the Video Player rendering directly to the sphere’s material? That’s the approach I have used, and it seems to work pretty well, even on mobile.

I’m more curious about the best approach for maintaining high-quality video while also balancing file size and memory on mobile devices. Is this something you should dynamically load from Resources or Asset Bundles?

Is there a way to integrate live streams such as HLS into the same Skybox, instead of having to store the videos locally?

Comments are closed.