Unity on finding the ‘magic bit in the middle’ of movies and games

Can you tell me a bit about your background both within and outside Unity and how you got where you are today?

I worked in video games for a long time: I was at EA for seven years and worked on FIFA, SSX, NBA… My dad's a photographer, so I grew up in a photo studio. And I got into games because I was really passionate about the cinematography and the cameras and how to shoot them.

Video game cameras – interactive cameras – quickly became this really interesting problem because it's not a movie. You know, you can control it, but you're shooting a variable scenario. So I started working on procedural camera systems and game camera systems, and I made one, and then another for Frostbite.

I left the game industry, worked in film for four years, shot four movies and then was introduced to Unity. When I saw it, I got it. I was like: ‘Oh my god this is democratising creativity in a lot of ways’.

I started a company called Cinemachine Imagery where I made this thing called Cinemachine, which is a procedural camera tool. I talked about it at Unite Boston 2015 and someone from Unity saw it and said that it should be built right into Unity. So Unity acquired the technology and that included me. And it was great because I don't have to figure out if the toilet has soap in it anymore, and other things that happen when you have your own company [laughs]. So Cinemachine is myself and lead engineer Gregory Labute, and we spend our days solving camera problems.
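For a rough idea of what a procedural camera tool looks like in practice, here is a minimal sketch against the Cinemachine 2.x scripting API as it ships through Unity's Package Manager; the 'hero' target, the field of view and the class name are placeholder assumptions for illustration, not anything from the interview.

```csharp
using Cinemachine;
using UnityEngine;

public class HeroCameraSetup : MonoBehaviour
{
    // The subject to follow -- a hypothetical "hero" transform assigned in the Inspector.
    public Transform hero;

    void Start()
    {
        // Cinemachine needs a CinemachineBrain on the output camera; it picks the
        // live virtual camera each frame and blends between shots.
        if (Camera.main != null && Camera.main.GetComponent<CinemachineBrain>() == null)
        {
            Camera.main.gameObject.AddComponent<CinemachineBrain>();
        }

        // A virtual camera is a description of a shot rather than a real camera.
        var vcam = new GameObject("CM Hero VCam").AddComponent<CinemachineVirtualCamera>();
        vcam.Follow = hero;                                    // drives position
        vcam.LookAt = hero;                                    // drives aim and framing
        vcam.m_Lens.FieldOfView = 40f;                         // the "lens choice"

        // Procedural behaviours for the Body and Aim stages.
        vcam.AddCinemachineComponent<CinemachineTransposer>(); // follow at an offset
        vcam.AddCinemachineComponent<CinemachineComposer>();   // keep the subject framed
    }
}
```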

What are the benefits of real time for film production?

Numerous. Let’s define film production for a second. There are a few different ways to do it. There’s virtual film production: let’s take Blade Runner (pictured top), Lion King, Jungle Book; they used Unity for pre-visualisation. They had a camera on a motion capture floor and it’s a real camera, a real thing you hold; it has a screen on it and you record into that. It gets beamed to Unity, which renders the shot and puts it back on the screen.

So you can be on an empty, dusty mo-cap floor, but when you look around it’s lions and tigers or sunsets. And directors are using this to craft their stories – move the sun over there, move the camera faster – and make all the mistakes there. And we’re hearing directors who worked on these amazing projects say things like: ‘I get to put my hands on the lens again’. When you’re in CG, the director has storyboards and a lot of hand waving, and then you have teams of artists who are doing the shots. But it’s kind of far apart. Now they can hold the camera but still be in CG.

The other side of that is fully animated stuff. So not a hybrid of live action, but completely CG. What Unity does is allow for the most open window of creative opportunity. What I mean by that is traditionally in a CG pipeline you have storyboards, you get a pre-visualisation, then you do layout and get your cameras down and you start to do animations and lighting and it’s very sequential.

If you are well into the animation stage and you say, ‘I want to change the camera’, it’s a big choice because you’ve got to go back and render everything out. You can spend a hundred hours on a frame, so creative decisions carry incredible time penalties. You can only take certain types of creative input at certain stages, otherwise you can be looking at weeks or months.

Because it’s all rendered in real time, you have every department at your fingertips: lighting, camera, animation, everything you want. You can be on that frame, move the camera a little bit, get a little bit closer and so on. And because there’s no post-production, no compositing, no rendering, when you’re done creating, you’re done.

There are great examples of that; Veselin [Efremov, writer and director] on [sci-fi short made in Unity] Adam changed a lighting configuration on the last day before shipping. You couldn’t do that before because when you change your light, you’ve got to render it out and it’s two weeks before you’re seeing it.

Neill Blomkamp on Adam: Episode 2, three days before they were done, was like: ‘Let’s make this afternoon-like instead of noon’. What happens is you move the light, move to another scene, and you don’t have to render anything – it’s done. So that giant creative window – being able to shape and tune and iterate your project all the way through – is a revolution in how you create these things.

Isn’t the visual fidelity found in most CG films a lot better than what you can get out of real time graphics? 

Are we as good as a basement full of computers rendering at a hundred hours a frame? The answer is no. But is that changing? Yes, an incredible amount. And depending on the show – episodic CG content, stuff that you see go straight to TV – the quality of real time is equal to that, if not superior, so those guys are switching over.

Is an entire blockbuster film going to be rendered in real time? No, that’s not going to happen for a little while. I mean, there are some notable exceptions of characters rendered with real time engines and dropped into live action. But the blockbuster films are still using real time for pre-production, project design and cinematography. And then the other guys are doing real time because you can’t afford not to.

If you look at the graphics, movies are getting better: remember the first CG and how horrible it looked. It’s getting better and it’s basically photoreal now. But that graph, if you were to draw it, is on an incline. The real time graph is closing in fast.

What are the benefits of Unity being like film production software for games? 

There’s an approach that you take when you’re going to shoot something on film. And it’s very heavily steeped in the storytelling language of cinema: camera choice, lens choice, framing. And we’ve built tools to do that – some of the best tools that are available in that regard.

That language, when you can do it well, transfers over to games. Now, games are different in that you have agency – unless it’s a totally canned cutscene, you have control. So how do you make something look cinematic when you still have variability in the performance, plus the dimensions of user input?
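One common way to handle that variability is to describe shots procedurally and let the system arbitrate between them. Here is a rough sketch against the same Cinemachine 2.x scripting API, where the two virtual camera references and the class name are assumptions for illustration: raising a cinematic camera's priority makes the CinemachineBrain blend to it while the shot keeps tracking whatever the player is doing.

```csharp
using Cinemachine;
using UnityEngine;

// Assumes two virtual cameras already set up in the scene: a regular gameplay
// camera and a closer, more composed "cinematic" shot of the same subject.
public class CinematicMoment : MonoBehaviour
{
    public CinemachineVirtualCamera gameplayVcam;   // the everyday gameplay shot
    public CinemachineVirtualCamera cinematicVcam;  // a tighter, more cinematic framing

    // Raise the cinematic camera's priority; the CinemachineBrain notices and
    // blends to it, but the framing still adapts to what the player does.
    public void EnterCinematic()
    {
        cinematicVcam.m_Priority = gameplayVcam.m_Priority + 10;
    }

    // Drop it back down and the brain blends back to the gameplay camera.
    public void ExitCinematic()
    {
        cinematicVcam.m_Priority = gameplayVcam.m_Priority - 10;
    }
}
```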

When games try to be too much like movies they can sometimes fail because the controls are heavy or you don’t have awareness. But when you’re too ‘video gamey’ you miss out on conveying the story and some of the lens language or cinematic stuff.

But I think that it’s great to give the filmmakers some of the games stuff. And it’s great to give the game makers some of the film stuff, because there’s this magic bit in the middle. It’s an axis, you know, from Pac-Man to a Steven Spielberg movie. And I’m really excited about that hybrid space in the middle. My job is to make these tools work in both domains.

Should devs make trailers themselves rather than shipping them out to trailer companies? 

There are some fantastic trailer companies out there, and I think it makes sense for some studios to consider going to a trailer company because these guys specialise in it. But having the skills in-house, or at least the ability to do it in-house, has its benefits too. For example: ‘We’ve changed our main character – oh no, now we have to do our trailer again!’

The tools are getting so fluid now. It’s like going to film school and putting a Panavision camera on every kid’s desk. It’s not about the camera anymore, it’s not about the tech anymore – it’s ubiquitous.

It’s instead about how you tell your story and how you craft these memorable things. And that goes back to the fundamentals, because there’s always going to be a better computer along in 20 minutes. Everything is changing, but colour and composition, emotion and story – that’s forever.

Additional reporting by Jem Alexander

