What do System Shock 3, Oddworld: Soulstorm and Harold Halibut have in common? Well, all of these incredible-looking upcoming titles are built on Unity, and more specifically will utilise the engine’s new High Definition Render Pipeline, or HDRP for short.
HDRP is the core graphics tech that’s bringing Unity out of the mobile and indie space and into the arena of high-end PC and console releases, opening up huge potential for the innumerable studios and developers with Unity experience around the world.
That process is being led by Natalya Tatarchuk, VP graphics at Unity, whose previous experience includes an eight-year stint at Bungie, during which she designed and led Destiny’s renderer architecture, and time spent at ATI/AMD as a graphics software architect.
As with any major new feature, HDRP has been circulating for a while, most notably in Unity-created demos such as Book of the Dead, with its stunningly lifelike forest environment. And at GDC this year, Unity’s demo team wowed us with The Heretic, which combined a lifelike human with a dynamic environment, incredible VFX and lighting effects.
“This pipeline offers a huge leap in graphical realism and ease of use and will be production ready for all developers in 2019,” enthuses Tatarchuk. “It is designed to benefit games shipping on current generation consoles and modern PC hardware, and we see ambitious high-end titles adopting it as it offers tremendous visual fidelity and power of performance. It benefits games today, and games in the future that will use it.”
HERETIC OR PROPHET?
Tatarchuk is rightly bullish on what Unity has achieved with its new graphics workflow. She demonstrates The Heretic to me in the editor, where she’s able to edit and iterate at the final render quality, even with all the effects that are packed into the demo.
“For us it’s so ingrained that we could just do this in the editor, we forget to emphasise it enough. Putting my triple-A developer hat on, we were never able to do final frame quality operations in the editor.”
And that power translates directly into better final results: “[The demo team] is iterating on that high-fidelity content with all the final assets in editor, they’re seeing exactly what the final frame will be while the editor’s running. They’re moving everything around, that’s why they could get to this result… They don’t have to spend hours baking.”
That means they don’t have to spend time pre-rendering any of these effects before they can tweak them.
“HDRP is allowing game creation teams to do some incredible work that would have been much harder – or even impossible – before. HD lights are physically-based, with real-world units, which lets artists use their real-world experience to guide light setup. You know light bulbs, the sun, and you don’t need to translate exposure into arcane units. It just works!
“So they’re able to work directly on the final triple-A quality,” she pauses and adds: “Triple-A plus plus in fact – because nobody in triple-A ships at this quality yet. I’ve worked in that and they’re not doing it yet. And that’s why this team can do these types of things.”
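The real-world photometric units Tatarchuk mentions come down to straightforward physics. As an illustration only (this is not Unity code, just the arithmetic behind units like lumens, candela and lux), here is a minimal sketch of how a familiar light bulb’s output translates into illuminance at a surface:

```python
import math

def point_light_candela(lumens: float) -> float:
    """Luminous intensity (candela) of an ideal point light
    emitting its flux evenly over a full sphere (4*pi steradians)."""
    return lumens / (4 * math.pi)

def illuminance_lux(candela: float, distance_m: float) -> float:
    """Illuminance (lux) at a surface, using inverse-square falloff."""
    return candela / (distance_m ** 2)

# An ordinary 800 lm bulb (roughly a 60 W incandescent):
intensity = point_light_candela(800)              # ~63.7 cd
surface = illuminance_lux(intensity, 2.0)         # ~15.9 lx at 2 m
print(round(intensity, 1), round(surface, 1))
```

Because artists can type in values like these directly, a scene lit with an “800 lumen bulb” behaves the way intuition from the real world says it should.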
The workflows are undoubtedly there, then, and the results speak for themselves, but demos are one thing and actual games are another. So it’s heartening to see the very first titles built on HDRP emerging from studios.
BUILD IT AND THEY WILL COME
HDRP is still in development and will not be considered fully production-ready until the 2019.3 release of Unity later this year. Despite that, some studios are already in production using the pipeline.
“We are getting to the point of maturity with HD pipeline that you can be producing actual shipping projects on it. That was the core point of ensuring that performance is tiptop, ensuring the content creation flow is natural, intuitive and convenient, that it’s efficient, that we’re able to give a gamut of rich features to developers.”
Tatarchuk won’t tell us just how many such projects there are, only saying: “We’re certainly working with a number of creators to ship games on HDRP. And we are supporting them in full force to prioritise and ensure that those titles are in a happy place.” One likely such Unity-based title is Campo Santo’s In the Valley of Gods.
Tatarchuk agrees with our assertion that we should see finished titles based on the technology over the next year or two, and that they’re doing their utmost to ensure the quality of that first wave: “It’s like a good console launch, right? You make sure to invest in developers who put their trust in you.”
The GDC keynote showcased two such titles, both with huge fanbases. Otherside Entertainment’s System Shock 3 sees the return of the franchise after twenty years (though numerous spiritual successors have helped to bridge that gulf), while original protagonist Abe returns in Oddworld: Soulstorm from developer Oddworld Inhabitants.
The visuals on both titles were impressive, and those results were doubly impactful for Unity itself, with team members on stage showing their work in progress live in the editor. More impressive still was how the teams are already finding creative ways to utilise the toolset to generate specific effects for their titles.
“I am forever rejoicing in the variety of ways that creators are using our features,” Tatarchuk enthuses. “As someone who used to write algorithms for a living, that was the most fulfilling element, where somebody would take an algorithm or a feature we put together and then they come up with an outrageously different way to think about it. We’re not attached to specific uses, we’re creating a tool to give a baseline for people to do whatever they wish. And personally for me that is the most fulfilling thing to see. It’s all about the end result replicating whatever is in their head, their vision, their world.”
Of course, not everything comes straight out of the tools Unity provides, with developers building on top of what Unity ships both in C#, Unity’s chosen programming language going forward, and by writing custom shaders.
“Moon Studios, on Ori and the Will of the Wisps, is an example where they’re taking the C# and shaders, and creating a drastically different look. They’re doing 60fps rendering with a completely rethought approach. They’re getting tremendous performance,” Tatarchuk tells us of the platformer scheduled for next year.
Then there are more unusual projects, such as the stop-motion animation of Harold Halibut by Slow Bros. “Harold Halibut is another one, those guys [are using] the new post-processing stack. It really gives a state of the art, Pixar look, like cinematic frames. And that’s a good example where they took HDRP and made it their own, made it really special,” she says.
Between what we’ve seen from Unity’s own demo team, and the early efforts of a handful of developers, things are looking very bright for HDRP, and all at a time when we’re about to see another huge leap forward in graphics horsepower for the industry.
LIGHT YEARS AHEAD
Google’s Stadia has already announced that it’s wielding a massive ten teraflops of graphical muscle per server blade – a big jump up from the current top dog, the six teraflop Xbox One X – and it seems almost certain that Sony and Microsoft’s next consoles will be in the same ballpark in terms of performance. So what kind of advances might we see in games with such power on offer?
“What I’m excited about is you can really do dynamic worlds at that compute, you can get a lot more interactivity,” Tatarchuk says. “I think ten teraflops can let you be a lot more creative in your usage. That’s a really exciting point for developers, because then you can combine it, for example, with all of the tools, like the Visual Effect Graph [the new GPU-powered particle system launched with Unity 2018.3].”
Referring back to The Heretic demo, she tells us: “This is the example of what people are going to be able to accomplish – that’s exactly it. You get the scope of the richness of the scene. And then you can really really amp up, you saw everything was moving. There’s lots of things, lots of wires. The world is becoming dynamic. That’s a good way to spend teraflops, from my point of view.”
We ask whether we’ll also see real-time, ray-traced lighting, which has been much-hyped by both Unity and Unreal over the last year, on this kind of next-generation hardware: “That’s up to the creator ultimately,” she answers. “Certainly you could do some really amazing ray tracing.”
She goes on to explain that you could take Unity’s ray tracing tech and drop it into The Heretic today, because it all runs on HDRP: “And then you [can] put in some really amazing refraction and transparency,” she adds. Though at ten teraflops she also notes that “you would still have to be mindful about the complexity of the scene.”
“What I’m excited about is you can really do dynamic worlds at that compute, you can get a lot more interactivity.”
Making the landmark moment of real-time ray tracing a reality is as much a testament to the flexibility of modern hardware and software as it is to their sheer power.
“Fundamentally the reason we have reached this landmark is because there is now performant hardware under the hood that allows us to have really fine-grained execution change between rasterisation and ray tracing. One of the challenges for ray tracing was you were locked into one or another. And that made it really difficult to actually take performant advantage of it.”
So it’s the ability to switch between high-speed and high-quality solutions that will allow ray tracing to be used, where needed, to improve the appearance of the scene. But control over those decisions must then be easily available in the editor.
“That’s actually one of the key things that we build in our solution. Say I want to render this super complicated transparent glass with refraction and water,” she says, pointing at the glass of water on the table between us. “I will send that through to ray tracing. Now this object is opaque [picks up a phone speaker], so I don’t care, I’m going to send it to rasterisation, but its shadow will go to ray tracing. Fantastic! Best quality shadows, really high resolution, super detail and best performance,” she smiles.
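The per-object routing Tatarchuk describes can be modelled as a simple decision function. The sketch below is a hypothetical illustration of that hybrid logic, not Unity’s actual API: transparent, refractive surfaces go to the ray tracer, while opaque objects are rasterised but can still receive ray-traced shadows.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    transparent: bool    # refractive glass, water, etc.
    casts_shadow: bool

def route(obj: SceneObject) -> dict:
    """Route each object's surface and shadow to the cheapest path
    that still achieves the desired quality (illustrative only)."""
    return {
        "surface": "ray tracing" if obj.transparent else "rasterisation",
        "shadow": "ray tracing" if obj.casts_shadow else None,
    }

glass = SceneObject("water glass", transparent=True, casts_shadow=True)
speaker = SceneObject("phone speaker", transparent=False, casts_shadow=True)
print(route(glass))    # surface and shadow both ray traced
print(route(speaker))  # surface rasterised, shadow ray traced
```

The point of this fine-grained split is exactly what she describes: each object pays only for the quality it needs, rather than the whole frame being locked to one technique.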
“That’s one of the things that we focused on, the content creation story for enabling ray tracing… And that’s what’s gonna take it out from the cute demo, where I have to spend 50 grand on a rig, to a practical thing that people can actually ship. And that’s always our focus, how we enable people to ship.”
ENGINE ROOM
Rebuilding the graphics engine at the heart of Unity over the last few years is only the beginning for the team, though: “We’ve certainly come to the maturation of this stage. But we’re very far from thinking that we’re at the end of the process. Our creative juices and imagination have just started flowing… it’s just the beginning of our real evolution,” Tatarchuk says.
“With my team, there’s a roadmap for the next five years worth of effort. [At the lower end] there’s a tremendous amount of work we’re doing to get even more performance both on GPU and CPU so that we can continue to expand the complexity of the worlds that one can render.”
And the team is working with Unity Labs, the in-house research team, to bring new shader technologies to Unity. “It’s a higher fidelity material representation. So you can do much more rich surfaces with that. Current consoles would have a bit of a challenging time with performance, but upcoming generations will be able to take advantage of it.”
With graphics about to take centre stage again as a new generation of hardware is launched over the next year or so, Unity has timed the launch of HDRP well. By bringing such high-end console graphics to the platform, it can empower legions of Unity developers.