CGC: What’s your company position?
KN: I filled a few roles there. I worked as the Modeling Supervisor and as the Effects Supervisor. That’s because most of the effects in this case were very entangled with the modeling, so they kind of went hand in hand anyways.
CGC: What role did you play in the making of X2?
KN: Our responsibility for X2 was to create the Mystique transformation effects for the second movie, but at the same time we wanted to improve upon the production process and create some transformations that were a new challenge from the first film. Shots such as the tent sequence where she morphs from Jean, to Mystique, to Storm, to Rogue were a whole new idea. In the first movie we only dealt with one transition per shot. This added a whole new level of difficulty to the work that we all really enjoyed.
CGC: Please tell us some of the problems you conquered on the project.
KN: I think one of the biggest hurdles I had to overcome was creating polygon models from the scan data of the actors that could all morph from one to another. They basically all share the same topology, so we could even morph different parts of Mystique into different characters at the same time. In the first movie we used NURBS patches to get the effect. In that case, you need to have all the patches grouped in the same order on each character, and every patch has to have the same number of spans in U and spans in V. The UV directions also have to be the same. In X2 we had to go through all this work as well, but then had to convert the model from NURBS into a low poly SubD model. These became dense poly meshes once they were sent out to RenderMan. The difficulty here was that every model had over 16,000 vertices that had to be numbered in exactly the same order for the morph to work. Eric Sanford, who's a great modeler in LA, helped us brainstorm some good methods for controlling this. I ended up writing a bunch of custom modeling tools that would control the numbering, the conversion to polygons, and the naming and grouping of all the objects. Going with polygons instead of NURBS gave us more flexibility later in the pipeline for binding, texturing, custom shaders, and archiving of geometry.
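The vertex-ordering requirement can be sketched in a few lines. This is a minimal stand-in, assuming meshes stored as plain vertex and face lists (illustrative structures, not KN's actual Maya tools): a morph is only safe once a topology check passes, and is then just a per-vertex linear blend.

```python
# Illustrative sketch only: real production meshes live in Maya/RenderMan,
# but the core constraint is the same. Two meshes can morph only if every
# vertex index lines up, so we compare counts and per-face index lists.

def topology_matches(mesh_a, mesh_b):
    """Meshes are dicts: {'verts': [(x, y, z), ...], 'faces': [(i, j, k), ...]}."""
    if len(mesh_a["verts"]) != len(mesh_b["verts"]):
        return False
    return mesh_a["faces"] == mesh_b["faces"]  # same indices, same order

def morph_verts(verts_a, verts_b, t):
    """Linear blend of two identically ordered vertex lists; t in [0, 1]."""
    return [
        tuple(a + t * (b - a) for a, b in zip(va, vb))
        for va, vb in zip(verts_a, verts_b)
    ]

# A single shared triangle, morphed halfway between the two shapes.
mesh_a = {"verts": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
          "faces": [(0, 1, 2)]}
mesh_b = {"verts": [(0.0, 0.0, 1.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0)],
          "faces": [(0, 1, 2)]}
assert topology_matches(mesh_a, mesh_b)
halfway = morph_verts(mesh_a["verts"], mesh_b["verts"], 0.5)
```

With 16,000-plus vertices per character, a single swapped index pair would silently scramble the morph, which is why tooling enforced the ordering rather than trusting it.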
CGC: Please cover a bit of the model and texturing you performed for X2.
KN: The other big task from the modeling end of things was placing scales. There are close to 10,000 scales on Mystique that had to be hand-crafted and placed, because they form such an explicit pattern on her, with each scale a specific shape. Dave Kitner and Mike Comly were the two guys who handled this for us, among a lot of other things. They have a lot of patience!
For texturing we took advantage of camera projections whenever we could, and had to develop a complex process for making the morphs work out. When we got back plates that were filmed for the morphs, each of the actors Mystique would transform into had acted out their part, matching Mystique's performance as closely as possible. We were usually facing shots where three or four, or in one shot nine, different actors had tried to match the same acting performance. In all cases, though, it's impossible to line them up perfectly. They all have different shapes and sizes, their timing would be slightly off, and so on.
To align the characters, we had to roto each character to the back plate, facial animation and all. We'd then use the render farm, not to render final frames, but to bake out all the textures for the characters per frame, projecting the back plates onto the models to capture the texture. This resulted in thousands and thousands of textures, but once it was done we could reanimate the characters' performances so that they would all line up accurately. All of this had to happen before we could even start the morphing process. I think one of the biggest challenges with X2 was that every shot had to go through such a lengthy and complex process.
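The per-frame bake can be sketched roughly like this, assuming a simple pinhole camera (the function names and camera model are illustrative, not the actual production setup): each surface point is projected into the plate camera, and the plate pixel it lands on becomes that point's baked texture sample.

```python
def project_to_plate(point, focal_length, plate_w, plate_h):
    """Project a camera-space point onto the back plate; None if off-plate."""
    x, y, z = point
    if z <= 0.0:
        return None  # behind the camera: nothing to sample
    u = focal_length * x / z  # perspective divide onto the image plane
    v = focal_length * y / z
    px = int((u + 0.5) * plate_w)  # normalized plane -> pixel column
    py = int((0.5 - v) * plate_h)  # flip v: image rows grow downward
    if 0 <= px < plate_w and 0 <= py < plate_h:
        return px, py
    return None

def bake_frame(surface_points, plate, focal_length):
    """Sample the plate for every surface point (None where off-plate)."""
    h, w = len(plate), len(plate[0])
    samples = []
    for p in surface_points:
        hit = project_to_plate(p, focal_length, w, h)
        samples.append(plate[hit[1]][hit[0]] if hit else None)
    return samples

# Tiny 2x2 "plate": a point on the optical axis lands at the image center,
# while a point behind the camera yields no sample.
plate = [["r", "g"], ["b", "y"]]
samples = bake_frame([(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)], plate, 1.0)
```

In production this step ran per frame per character on the farm, which is where the "thousands and thousands of textures" came from.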
CGC: Did the project require a lot of custom tool development?
KN: Oh, geez. I don't think there was any single part of the production process that DIDN'T require us to write custom tools. We had Daniel Roizman and his company (Kolektiv) writing us tools to handle the complex motion of the scales and the map-driven blend shapes… Ben Anderson, a 19-year-old whiz that we plucked out of college without letting him graduate, was writing a lot of the tools and shaders we needed for RenderMan, as well as databases and web pages that let us track all of the data we needed. I was writing modeling tools, UV layout tools, plug-ins for Maya to control the maps that drove Daniel's tools, and so on.
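A map-driven blend shape of the kind mentioned above can be sketched as follows. This is a minimal stand-in, assuming painted weight maps reduced to one weight per vertex per target; none of these names reflect Kolektiv's actual code. Because each target carries its own per-vertex weights, different regions of the body can morph toward different characters at once, as in the tent sequence.

```python
def map_driven_blend(base_verts, targets, weight_maps):
    """base_verts:  [(x, y, z), ...]
    targets:     {name: vertex list sharing base_verts' ordering}
    weight_maps: {name: per-vertex weights in [0, 1], painted as a map}"""
    out = []
    for i, base in enumerate(base_verts):
        p = list(base)
        for name, tgt in targets.items():
            w = weight_maps[name][i]  # this vertex's painted weight for this target
            for axis in range(3):
                p[axis] += w * (tgt[i][axis] - base[axis])
        out.append(tuple(p))
    return out

# One vertex pulled fully toward "storm" in x and halfway toward "rogue" in y,
# showing two targets influencing the same mesh simultaneously.
blended = map_driven_blend(
    [(0.0, 0.0, 0.0)],
    {"storm": [(1.0, 0.0, 0.0)], "rogue": [(0.0, 1.0, 0.0)]},
    {"storm": [1.0], "rogue": [0.5]},
)
```

Animating the weight maps over time, rather than a single global blend value, is what lets a morph sweep across the body instead of happening everywhere at once.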
I have never worked on a production that didn't require a lot of R&D, or at least didn't benefit profusely from it when it wasn't strictly needed. Every project is custom work, and usually requires custom tools. X2 was definitely an extreme case of this.