
Building and running massive online interactive live events

March 24, 2021 in Games | 7 min. read


We’ve partnered with Genvid, creators of a toolset for building richly interactive livestreams that turn viewers into participants with influence over the content being streamed.

On December 2nd, 2020, Genvid launched Rival Peak - the world’s first dedicated Massive Interactive Live Event, or MILE. With Genvid’s unique SDK and comprehensive back-end support, entire game-like experiences can live in the cloud, accessible across all manner of connected devices, without any download or installation for users. Genvid worked in close partnership with Pipeworks Studios, DJ2 Entertainment, and Facebook to craft something truly unique: an interactive quasi-reality show, produced in Unity, that streamed live, 24/7, for 12 weeks on Facebook. It was accompanied by a weekly wrap-up show hosted by Wil Wheaton, titled Rival Speak.

Rival Peak ran as a live service with almost zero downtime, and the Facebook audience’s interactions had a material impact on the script, the plot, and the live-filmed segments.


With all of the pressures of a live-production environment and a Game as a Service (GaaS) in place, the team behind Rival Peak knew they had to get it right. Over the 12-week run, the livestreams, featuring multiple streamers participating each day, reached a peak of nearly 50k concurrent viewers, and the Rival Speak companion series averaged over 10M views per episode.

Genvid’s Unity development partner Pipeworks Studios used Genvid’s toolset to reach a massive audience with very low GPU overhead and a highly efficient server structure. Pipeworks engineer Joshua Lee, who has worked with games and graphics hardware and software for over 40 years, explains how:

“The core of Rival Peak was partly based on a prototype single-player game we had called Galapagos, which all ran in a single instance. We wanted the AI from that game, but instead of having one viewpoint at a time, we needed multiple viewpoints - cameras to follow each of the contestants, plus the home stream. That meant we had to change from a game running on a single CPU to a more complex networked experience, like a LAN game. We had an authoritative server, along with render clients for each AI “contestant” and the home stream, which was a huge architectural shift for us. We chose to implement these as a single executable, called a GameNode, that was configured at runtime for different roles: Authority, Render Client, or (for test and development) a hybrid “single instance” that functioned as a combined Authority/Client.”
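To make that concrete, a minimal sketch of the GameNode pattern Lee describes might look like the following C#. The three roles come straight from his description; the command-line flag, method names, and structure are our own illustration, not Pipeworks’ actual code.

```csharp
// A hypothetical sketch of the GameNode pattern: one executable,
// configured at launch as Authority, Render Client, or a combined
// single-instance mode for test and development.
using System;

enum NodeRole { Authority, RenderClient, SingleInstance }

static class GameNode
{
    static void Main(string[] args)
    {
        // e.g. GameNode --role RenderClient   (flag name is invented)
        NodeRole role = ParseRole(args);

        switch (role)
        {
            case NodeRole.Authority:
                RunAuthority();          // owns the AI and authoritative game state
                break;
            case NodeRole.RenderClient:
                RunRenderClient();       // renders one contestant's stream
                break;
            case NodeRole.SingleInstance:
                RunAuthority();          // hybrid mode: authority and client
                RunRenderClient();       // together in a single process
                break;
        }
    }

    static NodeRole ParseRole(string[] args)
    {
        for (int i = 0; i < args.Length - 1; i++)
            if (args[i] == "--role" && Enum.TryParse(args[i + 1], true, out NodeRole role))
                return role;
        return NodeRole.SingleInstance;  // default to hybrid for local development
    }

    static void RunAuthority() { /* tick the simulation, replicate state to clients */ }
    static void RunRenderClient() { /* receive state, render, publish the stream */ }
}
```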

“The Client nodes functioned chiefly as rendering engines. They needed a lot of capability because, along with rendering the characters and environment, they were computing dynamic weather, lighting, and cinematic cameras. From the Unity perspective, the Authority was doling out character-centric streams to each render client. That produced a complementary problem from the outside world’s perspective: configuring this array of nodes and letting the Viewer Clients (i.e., the app on Facebook) know ‘here’s where you find the stream of this particular character, and that’s going to get refreshed periodically, around every 8 hours.’ That concept was embodied in something we called the conductor. Everything had to be data driven, and we needed a way of authoring how many nodes were in the game - which was however many contestants were currently in the game.”
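As a rough illustration of that data-driven setup, the conductor’s node configuration could be modeled along these lines. The JSON schema, class names, and endpoints are invented for the sketch; only the ideas - one entry per contestant stream plus the home stream, with a periodic refresh - come from Lee’s description.

```csharp
// A hypothetical sketch of the data-driven "conductor" configuration:
// one entry per node (each contestant camera plus the home stream), so
// the array of nodes can be re-authored as the cast changes, and viewer
// clients can be told where each stream lives and how often it refreshes.
using System;
using System.Collections.Generic;
using System.Text.Json;

class NodeConfig
{
    public string NodeId { get; set; }         // e.g. "contestant-01" or "homestream"
    public string StreamEndpoint { get; set; } // where viewer clients find this stream
    public int RefreshHours { get; set; }      // endpoints rotate periodically (~8h)
}

class ConductorConfig
{
    public List<NodeConfig> Nodes { get; set; }

    public static ConductorConfig Load(string json) =>
        JsonSerializer.Deserialize<ConductorConfig>(json);
}

class Demo
{
    static void Main()
    {
        string json = @"{
            ""Nodes"": [
                { ""NodeId"": ""contestant-01"", ""StreamEndpoint"": ""https://example.invalid/s/01"",   ""RefreshHours"": 8 },
                { ""NodeId"": ""homestream"",    ""StreamEndpoint"": ""https://example.invalid/s/home"", ""RefreshHours"": 8 }
            ]
        }";
        var config = ConductorConfig.Load(json);
        Console.WriteLine($"{config.Nodes.Count} nodes configured");
    }
}
```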

“Then there was the other authored data: how is the simulation going to play out? So there were two separate realms - how things execute internally, and how this vast array of clusters is configured and broadcast to the world. That’s where you get this huge fanning out of data: one client node produces a video stream plus the data associated with that specific stream, which goes to Genvid clusters that have to scale to vast numbers of Facebook users. Those Genvid clusters operate closer to Facebook, and every viewer there has their own instance of the Viewer Client that runs as the Instant Game.”
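The fan-out itself can be pictured as a simple publish/subscribe step: one render client emits a video frame together with the game data tied to that frame, and every viewer receives the same pair. The toy in-process model below only shows the shape of the data flow; in the real system, that replication happens across Genvid clusters, not in a single process.

```csharp
// A toy in-process model of the fan-out: a render client publishes one
// packet per tick (encoded frame + the game data tied to that frame),
// and every subscribed viewer receives the same pair.
using System;
using System.Collections.Generic;

record FramePacket(long Tick, byte[] EncodedFrame, string GameDataJson);

class StreamPublisher
{
    private readonly List<Action<FramePacket>> _viewers = new();

    public void Subscribe(Action<FramePacket> viewer) => _viewers.Add(viewer);

    // One packet in, N copies out - the "huge fanning out of data".
    public void Publish(FramePacket packet)
    {
        foreach (var viewer in _viewers)
            viewer(packet);
    }
}

class Demo
{
    static void Main()
    {
        var publisher = new StreamPublisher();
        publisher.Subscribe(p => Console.WriteLine($"viewer A got tick {p.Tick}"));
        publisher.Subscribe(p => Console.WriteLine($"viewer B got tick {p.Tick}"));
        publisher.Publish(new FramePacket(1, Array.Empty<byte>(), @"{""weather"":""rain""}"));
    }
}
```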

“The Viewer Clients tap into the streaming sources that the Genvid clusters provide. On top of that, viewers send inputs in the form of what Pipeworks called APEs. The Genvid clusters consolidate those massive-scale viewer inputs into periodic reports received by the Render Clients.”
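A minimal sketch of that consolidation step might look like this, assuming inputs are simply tallied per reporting window. The article doesn’t spell out the APE format, so the action IDs and report shape here are hypothetical.

```csharp
// A hypothetical sketch of input consolidation: individual viewer actions
// are tallied over a reporting window, and only the compact aggregate is
// delivered to the render clients - so per-node load stays flat no matter
// how many viewers participate.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

class InputConsolidator
{
    private readonly ConcurrentDictionary<string, int> _tally = new();

    // Called once per viewer action, at massive scale.
    public void RecordInput(string actionId) =>
        _tally.AddOrUpdate(actionId, 1, (_, count) => count + 1);

    // Called once per reporting interval to produce the periodic report.
    // (Snapshot-then-clear isn't atomic; a real system would swap buffers.)
    public IReadOnlyDictionary<string, int> FlushReport()
    {
        var report = new Dictionary<string, int>(_tally);
        _tally.Clear();
        return report;
    }
}

class Demo
{
    static void Main()
    {
        var consolidator = new InputConsolidator();
        for (int i = 0; i < 1000; i++) consolidator.RecordInput("cheer");
        consolidator.RecordInput("boo");
        foreach (var (action, count) in consolidator.FlushReport())
            Console.WriteLine($"{action}: {count}");  // cheer: 1000, boo: 1
    }
}
```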

“The reports are forwarded to the Authority, where the gameplay decisions happen. The Authority pushes the outcome of what’s happened to the Render Client for rendering. The Render Client, associated with a particular “contestant”, produces the audio/video stream along with sideband game data. All of that data is then streamed to a Genvid cluster, which knows how to encode it and send it out to however many Instant Game instances request it.”
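Putting those pieces together, the report-to-render handoff might be sketched like this. The plurality-vote decision rule and the class names are illustrative only; the actual gameplay logic lived in Pipeworks’ Authority.

```csharp
// A hypothetical sketch of the handoff: the Authority receives the
// consolidated viewer report, makes the gameplay decision, and pushes
// the result to the Render Client that owns that contestant's stream.
using System;
using System.Collections.Generic;
using System.Linq;

class Authority
{
    // Illustrative decision rule: take the action with the most votes.
    public string Decide(IReadOnlyDictionary<string, int> report) =>
        report.Count == 0
            ? "no-op"
            : report.OrderByDescending(kv => kv.Value).First().Key;

    public void Tick(IReadOnlyDictionary<string, int> report, RenderClient client) =>
        client.ApplyState(Decide(report));
}

class RenderClient
{
    // Stand-in for rendering the outcome and emitting A/V plus sideband data.
    public void ApplyState(string decision) =>
        Console.WriteLine($"Rendering outcome: {decision}");
}

class Demo
{
    static void Main()
    {
        var report = new Dictionary<string, int> { ["open-door"] = 412, ["wait"] = 88 };
        new Authority().Tick(report, new RenderClient());  // Rendering outcome: open-door
    }
}
```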

“What’s impressive is that, in the production environment, we only ran 13 instances of the rendering cluster: one for each contestant and one for the home stream. That’s 13 rendering clusters and one authority cluster. Each rendering cluster used just one CPU, with a varying number of machines in the Genvid clusters. Incredibly, 13 GPUs served around a million users - a unique advantage of using Genvid’s solution.”


If you’re interested in learning more about Rival Peak, you can get details on Genvid’s website. You can also grab the Genvid SDK from the Unity Asset Store.
