Friday, May 11th, 2012 Posted by Jim Thacker

FMX 2012: We could remake 300 in real time now



In the next of our reports from FMX 2012, we report on one of Thursday’s standout sessions: Zoic Studios VFX Supervisor and co-founder Loni Peristere’s session on real-time virtual production for episodic television.

In the one-hour talk, Peristere discussed the birth of ZEUS, Zoic’s revolutionary production system, how he hopes his tools will convert Chris Nolan to working digitally – and why we could now remake 300 in real time.

The text below is an abridged version of Peristere’s presentation.

Like feature-film clients, broadcast clients expect the very best you can do, at the very highest level of technology. When you have J.J. Abrams come to you and say, “We’re going to make a new pilot for NBC, and it’s about a post-apocalyptic world, and it’s going to have no real backgrounds,” he doesn’t diminish his expectations for television. He says “I want photoreal visual effects, for a tenth of the money [of film work] and a tenth of the time.”

So at Zoic Studios, it’s our job to come up with solutions that allow that to happen. We want episodic television to tie for first place with the feature community. But the exciting thing for us is that it’s not just pushing the boundaries of what we do on TV: it’s pushing the boundaries of what we do as VFX artists.


The noodle bar scene in Blade Runner: a “world so completely realised you never doubt its authenticity”. The scene is a touchstone for Inception director Chris Nolan – and a technical challenge for Zoic’s Loni Peristere.

While I was on the plane, I read an interview in the [Directors Guild of America] magazine with Chris Nolan. The interviewer asked him why his favourite movie was Blade Runner. His answer was that, after seeing Blade Runner and Alien, not knowing that Ridley Scott directed them both, he felt they were connected.

The connection was that Ridley Scott had created worlds that you felt existed beyond the lens. When you were looking at Harrison Ford sitting at the noodle shop, you knew that to the right of the camera there were spinners flying around; that above him, there were blimps.

These are worlds that are so completely rendered, you never question their authenticity.

We’re here to do that in a bigger, better and faster way: a process that places script and finished story at the heart of the creative process. We’re going to make restrictions disappear, until the tools ultimately make the entire creative process about storytelling, in its purest and most beautiful form.

The birth of a pipeline
My partner Andrew Orloff and my CTO Mike Romey have been obsessed with the concept of developing a pipeline of tools that will allow us to build photorealistic visual effects in real time. This has led us to something that’s very special to us, and which we call ZEUS [Zoic Environmental Unification System].

ZEUS, the ruler of Olympus: the world creator. [It came out of] an obsession with creating real-time environments, for real-time production, for real-time work, to get the most creative potential from a slim schedule.

Four and a half years ago, a guy named Joshua Kolden, who had just finished working on Avatar, brought into our studio the InterSense camera he had used to give Jim Cameron a view of the planet Pandora. We immediately started asking how we might get this real-time process into a live-action environment.

The first people we reached out to were our friends at Epic Games. Could you do live-action greenscreen with a real-time camera and render things as sexy as Gears of War into the background of the live-action plate? Actually, why couldn’t we do the movie 300 in real time using the Unreal Engine?

Things didn’t move as quickly as we would have liked. The Epic guys were trying to finish Gears of War, while we wanted to go out there and make a series. We had two greenlit projects, at ABC and Warner Bros, that were interested in real-time production because of the huge potential savings.

But the game industry said, ‘Oh, we do videogames. We don’t have the time to develop the tech to make TV shows.’ And the TV guys were like, ‘Well, we don’t want to spend the money to do it either’. We had an impasse.


ABC’s remake of V proved the catalyst Zoic needed to move into real-time production. With no other way to produce the visual effects required for the series on schedule, studio executives approved the new process.

In life, whenever you reach obstacles, things come along that help you overcome them. For us, that was a TV series called V. The studio wanted to take a very risky approach and create all of the spaceship interiors as visual effects. For a pilot, that seemed okay. So we produced it traditionally. There were something like 800 effects shots and we turned it around in about 12 weeks.

Everybody thought it was great, but then Warner Bros turned around and said to us: ‘Holy crap. How are we going to do this for the series? We can’t possibly afford [to spend 12 weeks on each episode].’ So I said: ‘Virtual production. Let’s go talk to the games guys again.’

At this point, ZEUS launched a bolt of lightning into our conference room in the form of Rick [Balabuck] and Eliot [Mack] from Lightcraft Technology. They had a very simple game engine that did real-time production for previz, and they had designed it in such a way that you could see your backgrounds in a very realistic way through prebaked lighting and rendering.

Eliot worked with our team to take the real-time rendering from the set and integrate it into shot setup at Zoic Studios. Lightcraft [technology] was used for previz, but ZEUS was created for final. In 2009, we began the world’s first prime-time virtual production on a network scale.


How the magic works: an overview of Zoic’s ZEUS real-time production system.

ZEUS takes concept art from Maya into Lightcraft, back to Maya, and into Nuke, and finishes it, all in one real-time toolset. It takes shot finishing times from days to hours.

We’ve fully integrated ZEUS into our Shotgun pipeline, which allows us to manage the real-time data. ZEUS generates 3D, 2D and tracking templates in real time. So the real-time part isn’t simply rendering the images on set: it’s actually creating the scenes in Shotgun.
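To make that concrete, here is a minimal sketch of what registering a shot through Shotgun’s Python client, shotgun_api3, might look like as scenes come off the stage. The server URL, script credentials, project ID and field values are hypothetical stand-ins, not Zoic’s actual setup.

```python
# Minimal sketch: registering a shot captured on the virtual stage in
# Shotgun via its Python client, shotgun_api3. The server URL, script
# credentials, project id and field values below are all hypothetical.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://example-studio.shotgunstudio.com",  # hypothetical server
    script_name="zeus_ingest",                   # hypothetical script user
    api_key="0123456789abcdef",                  # hypothetical key
)

# Data a real-time system might capture per take on set.
shot = sg.create("Shot", {
    "project": {"type": "Project", "id": 123},   # hypothetical project id
    "code": "ep101_sc042_0010",
    "description": "Virtual set: mothership interior, day lighting",
    "sg_status_list": "ip",                      # mark the shot in progress
})

print("Created shot %s (id %d)" % (shot["code"], shot["id"]))
```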

Now, we have three prime-time series on the air that are using ZEUS. Once Upon A Time averages over 300 shots an episode; Pan Am is a 100% photorealistic series; and on Steven Spielberg’s Falling Skies, we are not only creating virtual sets, but virtual characters.

In the month of April, we rendered two million frames. That’s about average for a busy month.

Where next for ZEUS?
In the past three and a half years, advances in technology have expanded the opportunities for ZEUS in virtual production. We’ve created a Unity app for iOS devices that gives directors a tool they can use directly with ZEUS.

When the director walks on set, they are given an iPad with all the virtual sets stored on it. You can change the light, you can change the time of day, you can rotate the set. Then you show up on the day, and ZEUS imports those changes into the production process and sets up your work day based on what you want to see.
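Peristere didn’t describe the data format the app uses, but the workflow implies something like the following: the director’s choices are serialised on the device, then replayed into shot setup. This is a purely illustrative Python sketch; the payload fields and apply_set_changes() are hypothetical, not Zoic’s actual protocol.

```python
# Illustrative only: replaying a director's iPad session into shot setup.
# The JSON payload format and apply_set_changes() are hypothetical.
import json

payload = json.loads("""
{
  "set": "ep101_mothership_bridge",
  "time_of_day": "dusk",
  "set_rotation_degrees": 90,
  "lights": [
    {"name": "key_light", "intensity": 0.8, "color": [1.0, 0.9, 0.8]}
  ]
}
""")

def apply_set_changes(changes):
    """Push the director's choices into the day's shot setup."""
    print("Loading virtual set:", changes["set"])
    print("Time of day:", changes["time_of_day"])
    print("Rotating set by %d degrees" % changes["set_rotation_degrees"])
    for light in changes["lights"]:
        print("Light %s -> intensity %.2f" % (light["name"], light["intensity"]))

apply_set_changes(payload)
```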

So what’s next? It’s time for us – and I mean the collaborative ‘we’, everyone at this show, everyone interested in the magic of movies and television – to do something really special, to create tools and technologies that will bring photorealistic real-time production to life.

This coming season we’re going to demand some very lofty things. We intend to drive lights on set with real-time scenes on iOS devices. We intend to bring characters to the stage and output them on location using mocap techniques. But what we really want to do is finish final shots on set.

If we get through this phase, it will provide a workflow so innovative it will be remembered just like the clips we just saw. [Peristere had previously screened footage of pioneering uses of sound and colour in cinema.] Not because of the novelty of doing it in real time, but because of the creative freedom it’s going to give all of us as film makers. It’s going to give us a platform to think bigger.

It’s going to bring VFX and animation to the front end of production. Visual effects will no longer be segregated into phases: production will be managed as a single process. The entire production team will work together to make the most incredible stories you’ve ever seen, and it’s going to cost a lot less. As a film-maker, it makes me dizzy.

Because the big thing that’s going to come out of this is virtual production for features. We’re looking at using virtual production to make, potentially, what would have been a $120 million movie for $30 million. I used the example of 300 earlier, and I think the answer is that we could now do 300 in real time, for less money.


Ripe for revival: Zack Snyder’s 300 cost $65 million to make, back in 2006. Today, it could be remade in real time for less money, claims Loni Peristere.

And that brings me back to that Christopher Nolan article we talked about at the beginning of this speech. Christopher Nolan’s approach makes him a little old-fashioned. He uses a single camera, is averse to digital cinema, and doesn’t use digital visual effects all that much: he uses them, but he tries to get everything in camera. He builds his world from a singular point of view.

I believe the innovations we’re going to see in virtual production in the next six months are going to change Chris Nolan’s mind. As in Blade Runner, he’s going to be seeing [a complete] world around him – everywhere he turns his camera, not just when it’s on the greenscreen. Now we’re using Zeiss’s new tool to do auto matte generation to get rid of that terrible green light.

We’re going to be out on location with a stereo lens doing real-time matte extraction, using real-time rendering tools brought to us by games companies like Crytek and Epic. All of these great tools are going to give Chris the ability to build whatever he wants.
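Peristere doesn’t go into how the matte generation works, but a toy green-difference key gives a feel for what real-time matte extraction involves. This is a textbook illustration in NumPy, not Zoic’s or Zeiss’s actual method; production keyers are far more sophisticated.

```python
# Toy illustration of chroma-key matte extraction: a green-difference key.
import numpy as np

def green_difference_matte(rgb: np.ndarray, gain: float = 2.0) -> np.ndarray:
    """Return a 0-1 matte (0 = green screen, 1 = foreground).

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # How much greener each pixel is than its other channels.
    spill = g - np.maximum(r, b)
    # Strong green dominance -> background (matte 0); else foreground.
    return np.clip(1.0 - gain * spill, 0.0, 1.0)

# Tiny example: one green-screen pixel and one skin-tone pixel.
frame = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]]])
print(green_difference_matte(frame))  # ~[[0.0, 1.0]]
```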

That’s super-exciting for Chris Nolan. It’s super-exciting for fans of cinema. And it’s super-exciting for me, because we’re going to change the world with this.

Visit Zoic Studios online

Visit the FMX 2012 website