Showing off VR experiences on a storefront has proven to be one of the toughest hurdles for VR developers and streamers, but Owlchemy Labs, creators of Job Simulator and Rick and Morty Simulator: Virtual Rick-ality, might just have a solution.
Informally dubbed mixed reality, the process merges VR gameplay with green-screen footage in-engine, producing video with the player literally in the game and no video editing required. Its official name is depth-based realtime in-app mixed reality compositing, which isn’t as easy on the tongue as mixed reality but does a better job of explaining the work that goes on behind the scenes: dropping a player into the virtual world requires a stereo depth camera, a green screen, and custom shaders and plug-ins.
This could offer a more appetising approach to VR footage. At the moment, watching VR on a Twitch channel or in a game trailer usually means a shaky first-person perspective. In-engine mixed reality could offer a more immersive viewing experience for people looking to get a sense of what a virtual reality game is all about, and as a result could be a strong marketing tool for VR experiences.
The process, for those interested, uses a stereo depth camera to record video and depth data of a user against a green screen. That data is then fed through a custom Unity plugin and shader, which composites the user into the game’s environment in real time. The end result is footage that drops the player into the game in place of their in-game avatar.
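Owlchemy hasn’t published its shader code, but the core per-pixel logic of depth-based green-screen compositing can be sketched along these lines: key out the green backdrop, then show the camera pixel only where the player is actually closer than the game geometry, so virtual objects can pass in front of them. The snippet below is a minimal illustrative sketch in Python/NumPy (not Owlchemy’s actual Unity shader); the threshold value and function names are assumptions for the example.

```python
import numpy as np

def player_mask(rgb, green_threshold=0.4):
    """True where a camera pixel shows the player, i.e. is NOT green screen.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    A pixel counts as green screen when its green channel dominates
    both other channels by more than the (assumed) threshold.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    is_green = (g - np.maximum(r, b)) > green_threshold
    return ~is_green

def composite(cam_rgb, cam_depth, game_rgb, game_depth):
    """Per pixel: use the camera feed where the player is visible AND
    closer to the camera than the rendered game geometry; otherwise
    use the game render. This is what lets in-game objects occlude
    (or be occluded by) the real player.
    """
    visible = player_mask(cam_rgb)
    in_front = cam_depth < game_depth
    use_camera = visible & in_front
    # Broadcast the per-pixel choice across the RGB channels.
    return np.where(use_camera[..., None], cam_rgb, game_rgb)
```

In a real-time engine the same comparison would run on the GPU in a fragment shader against the depth buffer, but the decision per pixel is the same three-way test: green screen, occluded player, or visible player.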
It isn’t out yet, but Owlchemy Labs is still working hard on mixed reality to make sure it’s ready, and hopes to share the technology in the near future.