Tight Ship

Oct 1, 2005 12:00 PM, By Michael Goldman

A Clever DP, Efficient Effects, and a Subtle DI Help Serenity Stay on Budget.


Joss Whedon chuckles at the question, “How did the filmmakers behind Universal's Serenity produce a high-end, science-fiction epic for the big screen without access to Star Wars-type money?”

DP Jack Green’s camera crew films actress Summer Glau during a key escape sequence, which was extensively previsualized at Green’s request.

“We did have Star Wars money — the money they had in the 1970s for the original Star Wars, that is,” he laughs. “Actually, it's true — this is a low-budget film, relatively speaking, but we had the advantage of using fresh actors who did not command half our budget for salaries; a DP in Jack Green who works unbelievably fast and efficiently; a very clear understanding of what we were trying to do; and the advantage of a digital intermediate [at FotoKem, Burbank, Calif.], which helped us tremendously. That, in essence, helped us turn our small budget into a bigger one — or made it look like we did, anyway.”

Serenity tells the story of a band of shady characters eking out a living on a beat-up spaceship cruising the galaxy, only to be forced into a confrontation with a morally bankrupt galactic government over the fate of a mysterious member of their crew. The fact that the movie made it to the big screen at all is impressive enough, given that it's based on Whedon's quickly canceled TV series, Firefly (see the October 2002 issue of Millimeter).

Whedon, also the creator of TV's Buffy the Vampire Slayer and Angel, says getting a studio greenlight was one thing. Pulling the whole thing off to his satisfaction on limited time and money was quite another. Besides Green and the DI process, he gives particular credit to the visual effects team at Zoic Studios, Culver City, Calif., under the leadership of visual effects supervisors Loni Peristere and Randy Goux, for making his vision a reality within the project's constraints.

“Zoic did all sorts of digital previsualization for us that helped throughout — particularly with the big space battle, which was very complicated — and they also had lots of solutions to take our limited number of computer models and make it look like two armadas were battling,” Whedon explains. “They worked very closely with Jack Green to find ways to get things done.”

The budget for the ventilator shaft scene provided for only five shots, which the crew created with careful choreography, according to director Joss Whedon.

Production

At the outset, a decision was made about how much homage the filmmakers should pay to the extreme handheld, documentary camera style of the television series. Green says Universal's decision to make Serenity a widescreen 2.35:1 anamorphic release basically limited that style to key sequences. He therefore opted to shoot the film on Super 35 and then have FotoKem perform a digital squeeze on the negative during the DI.

“We knew this would be a widescreen release, and that meant we couldn't get too carried away with the documentary style — although we did employ it to some degree,” says Green. “But we couldn't employ it as much as they did in the television show; it might make the audience seasick. We also agreed early on to do a DI in order to avoid losing generations to an optical squeeze. I knew we could maintain the blacks from the original negative and reduce graininess this way, but this approach meant we had to have very close coordination between the set photography and the special effects photography. Otherwise, if we moved the camera around too much, it wouldn't look coherent because it would be difficult to match the visual effects photography to the plates.”

In other words, Whedon says, “We had to adjust the lenses and back off a bit because, with the Super 35 lens, you see and feel so much. Handheld was part of our vernacular for the TV series. But for the movie, we couldn't do the extreme handheld stuff, although we did it in some less extreme situations. Also, this is a motion picture, meant to be more epic and grand, so we wanted a movie aesthetic. The notion was to absorb the TV aesthetic into a larger aesthetic for the movie.”

Green shot the entire movie with Panavision PFX-P Platinum Panaflex and Millennium XL Panaflex cameras and Primo and C Series lenses. He relied entirely on Kodak Vision2 500T Color Negative Film 5218 (rated 500 ASA).

A major battle scene begins. To work with a limited effects budget, the crew used a variety of visual effects tricks and repurposed many elements.

Green's background in visual effects photography meshed well with Zoic's approach to the visual effects. There are, after all, about 400 visual effects shots in the movie, with Zoic having created about 220 of them. (The rest of the digital effects shots were created by Rhythm and Hues, Los Angeles; Illusion Arts, Van Nuys, Calif.; and Perpetual Motion Pictures, Valencia, Calif. In addition, Grant McCune Design, Van Nuys, built a 15ft. practical model of the Serenity ship, which was used to film the ship's crash landing with entirely practical techniques.)

Peristere was the visual effects supervisor on the original TV show, and for the most part, he brought Firefly veterans onto the film. He also brought Goux, a Matrix veteran, to serve as Zoic's pipeline/networking guru on the project.

But Peristere insists that Green was a crucial player in helping the effects team maximize resources throughout production.

“Jack really understands technology and visual effects, and how visual effects and practical photography work together and are useful to one another for budgeting and aesthetics,” Peristere explains. “For an important escape sequence, for instance, he came to [the visual effects team] and said he was worried about seeing lights that would cause lens flares, and he asked, could we build the sequence in previz and let him take a look at the set in 3D? He was asking us to help him figure out his lighting. On the flip side, in another scene, we needed to create an effect of a light shining inside the cockpit of the ship, Serenity, and it was largely a CG sequence. Jack pointed out that, in the old days, he would just create the effect by moving flags across his light rigs — shining light across stretched foam. The effect is giant, skeletal shadows like the ones we were looking for. So he created broken light for us, in the context of the story. That's an example of the old and the new coming together to get this thing done.”

A motion control session with actor Chiwetel Ejiofor.

Effects

Meanwhile, Zoic ramped up its infrastructure for the project — a job that was made easier because the company had recently upgraded when it took on Battlestar Galactica, an HD project.

“We already had 10TB of storage,” says Peristere. “That was enough to get going at 2K for turntables, modeling, texture work, testing, and so forth. But, as we got closer to finishing, it became evident that turnarounds for preview screenings would mean we would need to push even more data through the pipeline than we ever envisioned. That took us to new heights in terms of network storage. So, when the project was awarded to us, we figured we would need 600 to 800 heavy NT processors to turn things around during heavy weeks. Randy Goux organized all that, with all the skill sets he developed coming from [ESC Visual Effects, Alameda, Calif.], where he worked on the Matrix movies. That gave him foresight about predicting pretty accurately what we needed. We ended up with a network that supports about 22TB.”
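
Those storage figures follow directly from the size of 2K frames. Here is a back-of-the-envelope sketch using common DPX frame dimensions and made-up shot counts — assumptions for illustration, not Zoic's actual numbers:

```python
# Rough storage arithmetic for 2K film frames (illustrative figures, not Zoic's).
# A 10-bit 2K DPX frame (2048 x 1556, RGB packed into 32 bits per pixel) is ~12.7MB.
BYTES_PER_FRAME = 2048 * 1556 * 4
FPS = 24

def shot_storage_gb(seconds, versions=1):
    """Storage for one shot, counting every saved version/iteration."""
    frames = seconds * FPS * versions
    return frames * BYTES_PER_FRAME / 1e9

# Say 220 shots averaging 6 seconds each, with 10 saved iterations apiece,
# and that is before counting multi-layer render elements:
total_tb = 220 * shot_storage_gb(6, versions=10) / 1000
print(f"~{total_tb:.1f} TB")    # roughly 4 TB for comps alone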

Although Zoic retained digital assets from the TV series, the company largely had to start over when building digital imagery for the movie.

“For most models, we had to start from scratch because of the difference in resolution,” Peristere explains. “We did use our original Serenity ship model, but we had to up-rez the whole thing to hold up under 2K scrutiny. That meant upgrading the texture count, shaders — all that stuff. It was a domino effect that led to, essentially, a new ship. The model built for TV was an HD model and could have survived scrutiny for a medium-range wide shot, but we upgraded the polygon count considerably. We basically set up a network to work flexibly between [NewTek] Lightwave and [Alias] Maya. We modeled it in [a specially customized version of] Lightwave, brought it into Maya [v. 6], UV'd the ship, upgraded textures, and, of course, gave it a new, digital paint job.”
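
The “domino effect” Peristere describes is largely arithmetic: every step of the up-rez multiplies the data. A rough illustration with assumed numbers (the subdivision levels and map sizes below are hypothetical, not the actual Serenity asset specs):

```python
# How model data grows during an up-rez (hypothetical numbers, not the real asset).

def subdivided_quads(base_quads, levels):
    # Each subdivision level roughly quadruples the quad count.
    return base_quads * 4 ** levels

def texture_mb(resolution, channels=4, bytes_per_channel=1):
    # Uncompressed size of one square texture map.
    return resolution * resolution * channels * bytes_per_channel / 1e6

tv_mesh   = subdivided_quads(50_000, 0)     # a mesh that held up at HD
film_mesh = subdivided_quads(50_000, 2)     # two more subdivision levels for 2K
print(film_mesh // tv_mesh)                 # 16x the polygons

print(texture_mb(1024), texture_mb(4096))   # ~4 MB per 1K map vs. ~67 MB per 4K map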

Peristere adds, “A typical effects budget for a picture this size would be in the neighborhood of about $15 million. Using some of these techniques, we ended up well under that — more like half that. We had to deal with our constraints, while at the same time, trying to be a lot bigger than our budget was allowing us.”

The so-called Mule Skiff chase scene is a typical example of how filmmakers worked around constraints. The sequence features a small hovercraft manned by Serenity crew members being chased over the surface of a planet by a larger ship manned by the evil, cannibalistic Reavers. In some respects, the sequence is reminiscent of the pod race in Star Wars: Episode I - The Phantom Menace, but “dirtier,” in Whedon's words.

DP Jack Green (right, with visual effects supervisor Loni Peristere) used Panavision PFX-P Platinum Panaflex and Millennium XL Panaflex cameras with Primo and C Series lenses to shoot the entire film, relying exclusively on Kodak Vision2 500T Color Negative Film 5218.

“I really wanted to shoot as much of the chase practically as we could so that it wouldn't have that airless feeling you see in movies, where people can fly just over the surface at 900mph and not hit anything, with hair blowing neatly in the wind,” says Whedon. “Therefore, we really wanted practical backgrounds for it. Jack Green shot it using these 30ft. Technocranes racing down a highway at 35mph to film the Mule rigs themselves, in which the actors sat. Dan Sudick, our special effects coordinator, designed five different Mule rigs for the heroes to ride in, and each one did different tricks we could film from different camera angles.”

The visual effects team at Zoic then removed the old highway where the sequence was shot and methodically replaced the road with wild terrain.

“Tracking was an issue because they shot the thing barreling up a mile-long highway, finding angles almost on the fly,” Goux explains. “Because of the schedule, it was therefore hard to prepare the road for a true tracking situation, measuring and marking it up. But, interestingly, it's an old highway, and those old black tar lines from filling in potholes on the road turned out to be ideal tracking markers. Creating the plates, we were looking for something to grab, and our software [2d3 Boujou 3] grabbed onto those tar lines.”
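
Boujou handles that pattern-grabbing automatically, but the underlying idea — lock onto high-contrast features like those tar lines and follow them frame to frame — can be sketched with generic tools. The snippet below uses OpenCV and a hypothetical plate filename, purely as an illustration of the principle, not the Boujou pipeline used on the film:

```python
# Generic 2D point tracking in the spirit of what a matchmove package does
# automatically. An OpenCV sketch, not the Boujou workflow used on the film.
import cv2

cap = cv2.VideoCapture("highway_plate.mov")      # hypothetical scanned plate
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# High-contrast details, like tar lines on old asphalt, register as strong corners.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Follow each feature into the next frame with pyramidal Lucas-Kanade flow.
    new_points, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)  # keep good tracks
    prev_gray = gray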

The big space battle between the military fleet of the so-called Alliance and the “death ships” of the Reavers proved to be an even bigger challenge.

“For the battle, we did manage to repurpose some things from the TV series — planets and ships,” Goux says. “These two giant armadas are in the scene, and we only had a budget to build about 12 of those ships. But we had to make it seem like there were hundreds of them, so we borrowed a lot from the TV series, took ships we built back then, and put them in the background to fill out our fleet. That helped us create the illusion of many more ships than we actually built. We also did a lot of kit bashing. It's an old-school model mentality: Build a bunch of widgets — like digital harpoons for the Reaver ships, panels, lights, things like that — and then take the 12 ships we had and keep repurposing them by putting these widgets on them over and over to make them appear to be different ships.”
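
The digital kit bashing Goux describes is easy to picture as data: a small pool of base hulls, a bin of widgets, and many unique-looking combinations. A minimal sketch of that idea, with made-up asset names standing in for the real models:

```python
# Digital kit bashing sketched as data: a few base hulls dressed with different
# widget loadouts read as many distinct ships in the background. Names are made up.
import itertools
import random

base_hulls = [f"reaver_hull_{i:02d}" for i in range(12)]   # the ~12 ships built
widgets = ["harpoon_rig", "hull_panels", "running_lights",
           "antenna_cluster", "engine_baffles", "debris_netting"]

def dress_ship(hull, seed, widget_count=3):
    """Return one 'new' ship: a base hull plus a deterministic random widget loadout."""
    rng = random.Random(f"{hull}-{seed}")
    return {"hull": hull, "widgets": rng.sample(widgets, widget_count)}

# Each hull/seed pairing yields a differently dressed background variant.
fleet = [dress_ship(hull, seed)
         for hull, seed in itertools.product(base_hulls, range(20))]
print(len(fleet))   # 240 background ships from 12 hulls and 6 widget types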

Peristere adds that the space battle also turned into a major compositing job, with some digital shots featuring between 200 and 300 layers of elements. He says, “[There were] hundreds of ships working in choreography, adding complicated effects, volumetric lighting, and different types of motion. All the things that make visual effects difficult were in those shots. … We assembled most shots in about 3 ½ days, and then took about six weeks to do the composites.”
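
Stacking hundreds of layers comes down to repeating one small operation, the premultiplied “over,” a few hundred times per shot. A bare-bones numpy sketch of that accumulation (not the compositing package Zoic actually used):

```python
# The core of layered compositing: fold each premultiplied RGBA element over the
# background, back to front. Two hundred layers is just this loop run 200 times.
import numpy as np

def over(fg, bg):
    """Porter-Duff 'over' for premultiplied RGBA float arrays of shape (H, W, 4)."""
    alpha = fg[..., 3:4]
    return fg + bg * (1.0 - alpha)

def composite(background, layers):
    out = background
    for layer in layers:        # layers ordered back to front
        out = over(layer, out)
    return out

# e.g. 200 hypothetical 2K elements (ships, engine glows, volumetric passes):
bg = np.zeros((1556, 2048, 4), dtype=np.float32)
elements = (np.zeros((1556, 2048, 4), dtype=np.float32) for _ in range(200))
frame = composite(bg, elements)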

Whedon, whose background is primarily in the television world, says the process of mapping out such sequences and deciding what to do in-camera, what to do in a computer, and what sequences to eliminate or shorten was “extremely educational.”

“Take the ventilation shaft sequence in the generator room,” he says. “The visual effects for that sequence were created by Rhythm and Hues. Basically, we were told we have money for five shots in that sequence, so I wrote it so that there are five times that we see down the gigantic shaft. On the day we choreographed the whole thing, I had to pick and choose specifically which five of my options I wanted to select, and we had to commit to that. Then, in post, you realize you need more than what you thought. But we had extra material because we thought about that in production. Part of the learning curve for me was about learning when to shoot extra footage for things that we might need, and when not to. At times when we didn't have the extra footage, we had to find another solution. That is the nature of filmmaking, but it is different from television. TV will throw you curves, but there is so much structure that you can kind of control it. You can't do that with a movie — everything is different every day.” (See the September 2005 issue of Millimeter for a breakdown on the ventilation shaft sequence.)

Peristere says the crew used the original Serenity ship model from the Firefly TV series, but up-rezzed it—upgrading the texture count and shaders and giving it a new digital paint job—to withstand 2K scrutiny. He says, "It was a domino effect that led to, essentially, a new ship."

The DI

The big equalizer in all this, the director and his colleagues insist, was the DI process. In particular, FotoKem used a digital process to create the widescreen image from the original Super 35 negative. Because Super 35 exposes a larger picture area, FotoKem essentially took that full negative area and applied a 2:1 anamorphic squeeze digitally to create the 2.35:1 release format.

“If you did this [squeeze] with a photochemical process, the optical squeeze would degrade the image and create another generation between the negative and the final print,” says Green. “FotoKem's approach eliminates that step, and that meant we retained a higher-quality image.”
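
The arithmetic behind that digital squeeze is straightforward. The worked numbers below use common 2K scan and delivery dimensions — typical industry values assumed for illustration, not figures quoted from FotoKem:

```python
# Worked numbers for a Super 35 to 2.35:1 anamorphic DI, using common 2K scan
# dimensions. These are typical industry values, not quoted from FotoKem.
FULL_APERTURE = (2048, 1556)    # 2K Super 35 full-aperture scan (width, height)
SCOPE_2K = (1828, 1556)         # 2K anamorphic ("scope") delivery frame (width, height)
TARGET_AR = 2.39                # the format commonly referred to as 2.35:1

# 1. Extract the widescreen picture area from the full-aperture scan.
extract_w = FULL_APERTURE[0]
extract_h = round(extract_w / TARGET_AR)        # about 857 lines
print(extract_w, extract_h)

# 2. Resample that extraction into the scope frame. Relative to width, the image is
#    stretched about 2x vertically, the digital equivalent of the anamorphic squeeze.
scale_x = SCOPE_2K[0] / extract_w               # ~0.89
scale_y = SCOPE_2K[1] / extract_h               # ~1.81
print(round(scale_y / scale_x, 2))              # ~2, the 2:1 squeeze

# 3. The projector's 2x expansion lens undoes the squeeze on screen:
print(round(2 * SCOPE_2K[0] / SCOPE_2K[1], 2))  # ~2.35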

Green worked closely with FotoKem artist Walter Vopato on the project. Vopato used a Quantel iQ system for both color correction and the digital conform. (FotoKem also used an Imagica XE Advanced scanner to bring the imagery into the iQ world, and eventually filmed the movie out on an Arrilaser recorder.)

Green also credits Vopato with developing a digital filter to allow filmmakers to realize Whedon's vision for a lengthy dream sequence in the film.

“Joss didn't think that the way we had [the dream sequence] originally made it clear that this was, in fact, a dream sequence,” Green explains. “So we asked Walter if there was anything he could do to make the dream sequence feel special, and he used a soft filter in [the iQ] to make it look more extreme — he created a direct visual hint for the viewer that this was a dream sequence. He made it real bright, kind of a heavenly effect, and it was very effective.”
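
Vopato's actual filter lives inside the iQ, but the general recipe for that bright, “heavenly” diffusion look — blur a copy of the frame, lift it, and blend it back over the original — can be sketched generically. The function below is an illustrative approximation of that recipe, not the iQ operation itself, and its parameter values are arbitrary:

```python
# A generic soft-diffusion pass: blur a copy of the frame, lift it, and mix it back
# over the original. Illustrative only; not the Quantel iQ filter Vopato used.
import numpy as np
from scipy.ndimage import gaussian_filter

def dream_glow(frame, blur_sigma=12.0, mix=0.5, lift=1.3):
    """frame: float RGB image in [0, 1]; returns a diffused, brightened version."""
    # Blur spatially only (not across the channel axis).
    soft = gaussian_filter(frame, sigma=(blur_sigma, blur_sigma, 0))
    out = (1.0 - mix) * frame + mix * lift * soft
    return np.clip(out, 0.0, 1.0)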

According to visual effects supervisor Randy Goux, the crew extensively repurposed 12 budgeted ships and others they had built for the TV series to create the illusion that hundreds of ships were fighting in this key battle scene.

The DP says that was the most extreme example of how the DI was used creatively to help tell Whedon's story, but he adds there were other, subtler areas where the process also made a big difference.

“There's a little shot where a Reaver's foot is shown large in the foreground of the frame,” he says. “We didn't shake the camera when we shot it, but in the DI, we inserted a little shake of the ground as the foot hits the earth. That's a real small example, but there was a lot of that — areas where we tweaked or finished off things to get it just right. There's another shot in the movie where a character gets cut with a sword in her back, and they apply an electronic, futuristic Band-Aid. The shot was a little dark, specifically where we showed the wound being treated, so we just lightened up that one little area, and left the rest of the shot alone. We could never have done that without a DI.”
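
That kind of localized fix is, in grading terms, a secondary correction: a soft matte isolates one region and a gain is applied only inside it. A rough numpy sketch of the idea — the mask shape and gain values are arbitrary examples, not the actual grade:

```python
# A secondary-style correction: gain applied only inside a soft elliptical matte,
# leaving the rest of the frame untouched. Values are arbitrary examples.
import numpy as np

def soft_ellipse_mask(h, w, center, radii, softness=0.25):
    """A 0..1 matte that falls off smoothly just outside an ellipse."""
    y, x = np.ogrid[:h, :w]
    d = ((x - center[0]) / radii[0]) ** 2 + ((y - center[1]) / radii[1]) ** 2
    return np.clip((1.0 + softness - d) / softness, 0.0, 1.0)

def brighten_region(frame, mask, gain=1.25):
    """frame: float RGB in [0, 1]; brighten only where the matte is solid."""
    graded = np.clip(frame * gain, 0.0, 1.0)
    return frame * (1.0 - mask[..., None]) + graded * mask[..., None]

plate = np.random.rand(1556, 2048, 3).astype(np.float32)   # stand-in for a scanned frame
matte = soft_ellipse_mask(1556, 2048, center=(1400, 800), radii=(220, 160))
result = brighten_region(plate, matte, gain=1.3)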

Whedon hopes that the primary end result is, of course, a fun movie that fans enjoy. Beyond that, though, he suggests that Serenity could be a template for how to create science fiction for the big screen on a modest budget.

“We'll see if there will be another [Serenity] movie — it depends on if people like it and if the studio would say yes,” he says. “But I do think this is a good model to follow. Our goal, like any movie, is that it becomes a big blockbuster, but our bottom line most certainly was not blockbuster. We combined techniques and styles and had some very smart people put the whole thing together, and I think it worked beautifully. Hopefully, we've learned how to do this sort of thing in a way that makes financial sense without shortchanging the audience.”
