The future of VR and AR involves Unity, and it starts now
The Short Term
At Unity, VR and AR have both short- and long-term visions. The short-term effort, over the next 6-12 months, is aimed directly at making it easier for our developers to create high-quality VR and AR content. It consists of several key projects:
- VR/AR Platforms
Support all main VR/AR platforms with the best possible VR/AR-optimized rendering pipeline.
- VR/AR Performance/Quality
Drastically increase our rendering performance and quality across VR/AR devices.
- VR editing extensions to editor
Make it easier for our current developers to develop VR content via in-VR scene editing tools.
- AR interaction tools in engine and editor
Provide generic API layers for Unity-made games to interact with the physical world.
- VR/AR extensions to Director
Add support for cinematic VR and storytelling, 360 immersive (stereoscopic and non-stereoscopic) video, VR editing, and high-end rendering quality for cinematic experiences.
- Unity Support for VR/AR Artists
Deeper integration of artistic tools with our editor and VR pipeline.
The Long Term
The longer term effort is being managed by Unity Labs, a newly-created entity inside Unity in charge of exploring the future of game and VR/AR related technologies, and examining what repercussions those technologies will have for our developers and our mission of democratizing game development.
This long-term vision is based on an effort to imagine how VR and AR, taken together as a single technology, could end up completely transforming the way we create and interact with content within the next decade. With a merged, coherent, and powerful AR/VR technology, millions of users would create, play, and learn in virtual worlds built by Unity developers.
But this is also about imagining a future where game authoring is made easier by creating and developing within the VR worlds themselves. A future where creating triple-A games is within the reach of ever-smaller creative teams. A future where assets understand each other: the glass knows it is on the table and can fall; the wheel knows it can connect to the car and propel it forward. Assets have metadata. Assets are smart, and thus contribute to a far simpler authoring experience that produces far more sophisticated games and content. Assets are part of a semantic ontology that provides the foundation for AI-assisted authoring. Worlds can be procedurally generated with neuroevolution algorithms: a mixture of genetic algorithms and neural-network machine learning. The definitions of developer and gamer are blending. The possibilities are enormous. Custom-designed economic policies based on generic blockchain technologies? What if we gave world creators the ability to define their own economies and connect them together securely? These are only a few examples of the technological components that will most probably be required to build such virtual worlds (or metaverses, if we keep in trend with the literature!).
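As a toy illustration of the "smart asset" idea above, a minimal sketch might attach semantic tags and affordances to assets so they can reason about how they relate to one another. Everything here (the `SmartAsset` class, its fields, the affordance names) is hypothetical, invented for illustration; it is not a Unity API.

```python
# Hypothetical sketch of "smart assets" carrying semantic metadata.
# None of these names are real Unity APIs; this only illustrates the idea.

class SmartAsset:
    def __init__(self, name, tags, affordances):
        self.name = name
        self.tags = set(tags)                # semantic labels, e.g. "fragile"
        self.affordances = set(affordances)  # interaction points the asset exposes

    def can_connect(self, other):
        # Two assets "understand" each other when they share an affordance.
        return bool(self.affordances & other.affordances)

glass = SmartAsset("glass", ["fragile", "container"], ["rests_on_surface"])
table = SmartAsset("table", ["furniture"], ["rests_on_surface", "supports"])
wheel = SmartAsset("wheel", ["part"], ["axle_mount"])
car   = SmartAsset("car", ["vehicle"], ["axle_mount"])

print(glass.can_connect(table))  # the glass knows it can sit on the table
print(wheel.can_connect(car))    # the wheel knows it can attach to the car
print(glass.can_connect(wheel))  # no shared affordance, no connection
```

In a real engine the metadata would live alongside the asset and be queried by authoring tools, but even this tiny sketch shows how shared semantics let objects negotiate relationships without hand-written glue code.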
What is interesting about metaverses is not that we will see an exact replica of the Oasis created in the next year, but rather that they provide a powerful metaphor for thinking about the future of game authoring technologies.
Creating and managing such virtual worlds presents very complex technological and authoring challenges. Addressing them will help us frame our vision for providing our developers with the best possible tools and features in the not-too-distant future.
Unity Labs is already contemplating multiple research projects in the most important areas:
- Interoperation and connectivity: by providing a lingua franca for active objects, game content, prefabs, and avatars we can have these entities understand each other, connect with each other, work together, and move smoothly between worlds.
- Ontology: metadata and semantics for world objects are needed to enable searches like “I need a spiked bloodstained viking shield for my game,” and thus enable easier forms of authoring UX.
- Future of game authoring UX: extensive research into re-imagining what game authoring will become within the next decade both from a Unity editor perspective as well as from an in-VR authoring perspective. We are already at work on early prototypes of creating Unity experiences in VR.
- Advanced character animation: the future of character animation and Mecanim. Advanced user interfaces and visual authoring to make it easier to animate characters; multiple colliders and rigid bodies per character.
- High performance computing: accelerated Unity scripting via sophisticated compiler-optimization technologies. This will allow high-performance computing such as neural nets, genetic algorithms, and real-time image analysis to be performed inside Unity scripts.
- Commerce, economy, monetization: the creation of a trustworthy – possibly blockchain based – virtual item economy supporting in-game purchases, asset-store purchases, game/world/application purchases, and more.
- Smart world objects: Scanning and tagging the world around us and making it available to our game developers as smart assets.
- AI-driven authoring: We believe that, with an AI assistant, a 5-year-old should be able to build a VR world using only gestural and verbal commands. This would include intent-based modeling, layout, and assembly.
- Scalability and quality: world computation and world data to be stored and executed seamlessly between client and cloud depending on the client’s capability; complex calculations like lighting, cloud based film rendering or AI/simulation should scale dynamically for real-time updates on client or server as appropriate.
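To make the ontology bullet above concrete, here is a minimal sketch of how tagged metadata could back a query like the “spiked bloodstained viking shield” search. The catalog data, the `search` function, and all the tag names are hypothetical, invented for illustration; this is not a real Unity or Asset Store API.

```python
# Hypothetical sketch: searching assets by semantic tags.
# Names and data are illustrative only, not a real Unity API.

catalog = [
    {"name": "round_shield_01",  "tags": {"shield", "viking", "wooden"}},
    {"name": "battle_shield_03", "tags": {"shield", "viking", "spiked", "bloodstained"}},
    {"name": "kite_shield_02",   "tags": {"shield", "medieval", "spiked"}},
]

def search(catalog, required_tags):
    """Return the names of assets whose tag set contains every required tag."""
    required = set(required_tags)
    return [a["name"] for a in catalog if required <= a["tags"]]

# "I need a spiked bloodstained viking shield for my game."
print(search(catalog, {"shield", "viking", "spiked", "bloodstained"}))
# -> ['battle_shield_03']
```

A production system would of course use a richer ontology (synonyms, hierarchies, learned embeddings) rather than exact tag matching, but the principle is the same: semantic metadata turns natural-language intent into a structured query over assets.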
These are just some of the areas we are committed to and focused on. Unity is dedicated to making VR/AR development as frictionless as possible for our developers and as rich as possible for their users.
Image Credit: NASA/JAXA.