Photoreality

Aug 1, 2005 12:00 PM, By Ellen Wolff

Will rendering’s next frontier revolutionize filmmaking? Twelve experts weigh in.


Modern rendering software is being pushed to the limit by the complexity of today's digital film effects—from the characters in Star Wars, The Polar Express, and The Chronicles of Narnia to the sprawling environments of Star Wars and Batman Begins.

Look at the backlighting on Yoda's ears in Star Wars: Revenge Of The Sith or the glints on Gotham City's skyscrapers in Batman Begins, and you'll see telltale signs of CG rendering today. The way light is rendered has always been crucial to making synthetic images appear convincing, but demands for greater photorealism are pushing the art of rendering into new terrain.

To meet these demands, software manufacturers are pushing hard to incorporate techniques for simulating the way light works in the real world. Sub-surface scattering puts that glow behind Yoda's ears. Ray tracing is responsible for light rays bouncing off countless airplanes. Even global illumination — the way all the lights in a scene affect everything else — is showing up onscreen with increasing frequency. More accurate depiction of light interactions is on the wish list for studios that create both visual effects and all-CG movies. This has boosted interest in Mental Images' Mental Ray renderer and Splutterfish's Brazil, and it has prompted industry leader Pixar to evolve RenderMan into PRMan — the “PR” signifying “Photo Realistic.”
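For readers keeping score on the jargon, the distinction can be sketched in a few lines of code. The toy example below is invented for illustration (its scene and constants are not drawn from Mental Ray, Brazil, or PRMan); it contrasts plain direct lighting with a single diffuse bounce, the simplest flavor of global illumination, in which a red wall bleeds color onto a white floor:

```python
"""Toy contrast of direct-only lighting vs. one 'global' bounce (illustrative;
not code from any renderer named in this article). A white floor point sits
beside a red wall under a single point light. Direct shading alone ignores
the wall; adding one diffuse bounce picks up red light reflected off it."""
import math
import random

LIGHT_POS = (0.0, 4.0, 0.0)      # single point light overhead
LIGHT_POWER = 60.0
WALL_X = 1.0                     # the red wall is the plane x = 1
WALL_ALBEDO = (0.7, 0.1, 0.1)    # reflects mostly red
FLOOR_ALBEDO = (0.8, 0.8, 0.8)   # neutral white floor


def direct_light(point, normal):
    """Lambertian term from the point light with inverse-square falloff."""
    to_light = [l - p for l, p in zip(LIGHT_POS, point)]
    dist2 = sum(c * c for c in to_light)
    dist = math.sqrt(dist2)
    cos_theta = max(0.0, sum(n * c / dist for n, c in zip(normal, to_light)))
    return LIGHT_POWER * cos_theta / dist2


def one_bounce(point, normal, samples=2000):
    """Crude Monte Carlo estimate of light arriving via one diffuse bounce off
    the wall (sampling and normalization simplified for illustration)."""
    rgb = [0.0, 0.0, 0.0]
    for _ in range(samples):
        d = [random.uniform(-1, 1), random.uniform(0, 1), random.uniform(-1, 1)]
        length = math.sqrt(sum(c * c for c in d)) or 1.0
        d = [c / length for c in d]
        if d[0] <= 1e-6:
            continue                      # this direction never reaches the wall
        t = (WALL_X - point[0]) / d[0]    # distance to the wall plane
        hit = [p + t * c for p, c in zip(point, d)]
        incoming = direct_light(hit, (-1.0, 0.0, 0.0))   # light the wall receives
        cos_floor = max(0.0, sum(n * c for n, c in zip(normal, d)))
        for i in range(3):
            rgb[i] += incoming * WALL_ALBEDO[i] * cos_floor
    return [c / samples for c in rgb]


floor_point, floor_normal = (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)
direct = direct_light(floor_point, floor_normal)
bounce = one_bounce(floor_point, floor_normal)
print("direct only:        ", [round(direct * a, 3) for a in FLOOR_ALBEDO])
print("with one red bounce:", [round((direct + b) * a, 3)
                               for a, b in zip(FLOOR_ALBEDO, bounce)])
```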

“Ray tracing was a big red item that we had to address,” says Pixar's Chris Ford, business director for RenderMan. “But the reality is that the range of problems that renderers have to address is very broad. You can have the world's fastest ray tracer, but if it doesn't do good displacement maps or motion blur, it doesn't get you there.”

The incentive to keep adding new functions is clearly there. “Growth in rendering is exceeding, probably by a factor of two, modeling and animation tools,” Ford says.

Yet, it has been a difficult market for smaller software developers to crack. In recent years, such rendering products as Entropy, Jig, and RenderDotC have tried to get on the radar screens of the major production studios, to little avail. “A renderer produces the final results that the customer sees,” Ford says. “It has to swallow everything, so it's really hard to stabilize. RenderMan has been around so long that it's been hammered by virtually every production scenario you can imagine. So it's very stable. I don't know of any renderer who's gotten there in one or two releases.”

At Splutterfish, which markets its Brazil renderer to the 3ds Max community, CEO Scott Kirvan agrees that it's tough to create a niche for a new rendering package. “I think the market is too competitive,” he says. “If you're just going to sell to the film industry, it's hard to get in the door. And it's easy to overestimate the demand.”

To render the fully CG photoreal lion in The Chronicles of Narnia, Rhythm & Hues used its proprietary renderer, Wren.

Home Brews

Commodity rendering companies aim to keep their film clients happy by making sure their products are as programmable as possible. Nonetheless, several production houses are continuing to write in-house rendering code to achieve lighting effects that they can't get with off-the-shelf tools. Blue Sky, Sony Imageworks, Rhythm & Hues, Double Negative, PDI/DreamWorks, and Digital Domain (DD) are among the studios that have invested time and talent writing proprietary renderers. It seems to be a part of the CG production process that's still a frontier of development.

“Rendering is definitely in flux right now in our industry,” says Darin Grant, DD technology director. “RenderMan has been the gold standard against which everything has been judged for 20 years. But we never were able to think about global illumination, blurry refractions, and glossy reflections. RenderMan wasn't built for those things.”

Although Grant applauds Pixar's recent efforts to retrofit RenderMan, he still has concerns. “Even though Pixar's product is called ‘Photo Realistic’ RenderMan,” he says, “it's really been ‘Faux Realistic.’”

Of course, there's a long tradition in moviemaking of building false fronts and forced perspective sets, so the term realistic has always been a relative one. There's no reason to expect that CG rendering techniques should be any more real than on-set lighting, and there are proven techniques for writing shaders that create faux lighting effects. The strategy of texture-mapping lighting effects onto objects has long been a cost-effective one. But with the huge gains in affordable hardware that studios have attained over the past few years, there has come an undeniable urge to more accurately simulate complex reflections and refractions that were once too compute-intensive to be practical for production.
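The trade-off is easy to see in miniature. In the hedged sketch below, with all data structures invented for illustration, a faux highlight is just a lookup into a painted map, cheap but stale the moment the camera moves, while a traced reflection pays for another shading call per ray:

```python
"""Sketch of faux vs. simulated lighting (all data structures invented for
illustration). A baked highlight is a cheap texture lookup that is only valid
for the setup it was painted for; a traced reflection re-shades whatever the
reflected ray hits, so it stays correct when the camera or scene moves."""

# A pre-painted "light map": a tiny grid of brightness values authored for one
# specific camera and light setup.
BAKED_LIGHTMAP = [
    [0.2, 0.4, 0.2],
    [0.4, 1.0, 0.4],   # the painted-in highlight lives here
    [0.2, 0.4, 0.2],
]

def faux_highlight(u, v):
    """Cheap path: look the answer up in the painted map (u, v in [0, 1))."""
    row = min(int(v * len(BAKED_LIGHTMAP)), len(BAKED_LIGHTMAP) - 1)
    col = min(int(u * len(BAKED_LIGHTMAP[0])), len(BAKED_LIGHTMAP[0]) - 1)
    return BAKED_LIGHTMAP[row][col]

def traced_highlight(view_dir, surface_normal, shade_hit):
    """Simulated path: reflect the view ray about the normal and shade whatever
    it hits. `shade_hit` stands in for a full (and costly) shading call at the
    reflected ray's hit point."""
    v_dot_n = sum(v * n for v, n in zip(view_dir, surface_normal))
    reflected = [v - 2.0 * v_dot_n * n for v, n in zip(view_dir, surface_normal)]
    return shade_hit(reflected)

# The faux version is a constant-time lookup...
print(faux_highlight(0.5, 0.5))
# ...while the traced version pays for another shading evaluation per ray.
print(traced_highlight((0.0, 0.0, -1.0), (0.0, 0.0, 1.0),
                       shade_hit=lambda refl_dir: 0.8))
```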

“There is a rebirth of rendering preoccupation,” says Christophe Hery, ILM's lead R&D engineer. “We have faster machines now, so we can think about things like global illumination. The first character to be up to the current level of rendering was the baby in Lemony Snicket, which used global illumination and sub-surface scattering. And Yoda definitely used the latest technologies.”

ILM's pipeline uses both PRMan and Mental Ray, and the studio employs specialists dedicated to writing the integration glue. It's a mix-and-match environment that Hery expects ILM to have for “a long time.”

But Hery, who won Sci-Tech Academy honors for ILM's sub-surface scattering techniques, also admits that his rendering suppliers are extremely responsive. “We've asked questions, and they've fixed problems seven hours later, though we might be in a special position.”

Other studios that render photoreal CG effects also use a mix-and-match approach to varying extents. At Rhythm & Hues, which is currently working on The Chronicles Of Narnia, technology vice president Mark Brown says, “We use Mantra from SideEffects, and we have RenderMan licenses. But we do about 90 percent of our rendering in our proprietary package called ‘Wren.’ The primary reason is flexibility. If we need to do a volumetric effect — for which there's not a good renderer out there — we don't have to think, ‘How do we kluge it?’ We're lucky we can afford to have a group of four people who can write modules for Wren. We don't wait until RenderMan or Mental Ray catches up.”

Andy Hendrickson, PDI/DreamWorks' head of production technology, agrees. This former ILMer completed Madagascar using code developed at PDI over two decades — code that enables the studio to render scenes with global illumination. “If you write your own renderer, it takes a certain level of effort. If you use a bunch of renderers and have to track each of their updates and bugs and write the glue to put them together, oddly enough it takes about the same level of effort, but a lot less time begging a vendor to make a change.”

Each Wookiee beyond the first two rows is a completely CG character. Since ILM had created variations in the appearances of so many digital Wookiees, rendering them presented a distinct challenge.

Trailblazing

If ever there was a case made for the virtues of proprietary rendering, it's Robots from Blue Sky Studios. Rendering the film's many metallic characters was made possible by the high level of refinement of Blue Sky's CGI Studio software. After nearly 20 years of pursuing algorithmic methods for ray tracing and global illumination (and amassing more than a million lines of code), Blue Sky produced dramatic lighting without relying on texture mapping.

“We said the heck with maps; let's see if we can come up with a procedural technique,” recalls R&D vice president Carl Ludwig. “We were able to make complex layered materials, and we made the process interactive, so if the director wanted to see a little more rust we could do it right away. With maps, when you make a change — like moving the camera — you have to repaint them. Some poor guy has to pound out a solution that usually requires lots of manual labor.”
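A hedged sketch of the idea Ludwig describes, using an invented noise function rather than anything from Blue Sky's CGI Studio, shows how a single rust-amount dial can drive a procedural material that never needs repainting:

```python
"""Hypothetical sketch of a procedural 'rust' layer (not CGI Studio code):
a noise function drives a rust mask, and a single `rust_amount` dial
re-evaluates everywhere in space without repainting any map."""
import math

def value_noise(x, y, z):
    """A tiny deterministic pseudo-noise: cheap, repeatable, resolution-free."""
    h = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return h - math.floor(h)          # fractional part, in [0, 1)

def rust_mask(p, rust_amount):
    """Blend several octaves of noise and threshold against the artist's dial."""
    n = sum(value_noise(p[0] * f, p[1] * f, p[2] * f) / f for f in (1, 2, 4, 8))
    n /= sum(1.0 / f for f in (1, 2, 4, 8))
    # A higher rust_amount lets more of the noise pass the threshold.
    return max(0.0, min(1.0, (n - (1.0 - rust_amount)) / max(rust_amount, 1e-6)))

def shade_metal(p, rust_amount):
    """Layered material: clean steel blended toward a rust color by the mask."""
    steel, rust = (0.75, 0.77, 0.80), (0.45, 0.22, 0.10)
    m = rust_mask(p, rust_amount)
    return tuple(s * (1 - m) + r * m for s, r in zip(steel, rust))

# "A little more rust" is one parameter change, evaluated at any point in space:
print(shade_metal((0.3, 1.2, -0.7), rust_amount=0.2))
print(shade_metal((0.3, 1.2, -0.7), rust_amount=0.6))
```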

The promise of reducing brute force solutions is a key factor in Ludwig's belief in a procedural approach to rendering. “The crew doesn't go nuts doing lots of manual intervention. An hour of a person's time is worth hours of a computer's time. These machines run 24 hours a day, especially now that they're cheaper and faster. It's a person's time that you have to value. That's how we keep our costs under control.”

An all-CG feature currently in production at Sony Imageworks is also following the path toward using more realistic lighting models, according to digital effects supervisor Rob Bredow. “On Monster House we're using a global illumination lighting model. We want this movie to look like it's shot on a half- or quarter-scale set, almost like a miniature. And in that world, the bounce lighting from wall to wall is a huge factor. When we were thinking about how to do Monster House, the team did tests with various renderers. They realized when they did full global illumination, it gave everyone artistically what they wanted in very short artist-time. There were tons of technical issues that were not trivial, and there's a lot of cost justification. But if you can answer the question ‘Is this what the movie is supposed to look like?’ then the technical issues can be solved.”

Bredow wasn't at liberty to name the rendering package (or packages) chosen for Monster House, but Imageworks has a history of developing proprietary renderers. For The Polar Express, it developed a way to render paint strokes with software called SPLAT (Sony Pictures Layered Art Technology), and the company is currently developing a proprietary renderer that will be used for Spider-Man 3 and the all-CG Surf's Up.

“It's a program called ‘Katana’ — named after a Japanese sword that's supposed to be light and quick,” says Bredow. “So you can tell what our intentions are! We picked one CG feature and one live-action film, both of which were big and could handle a long development cycle to get Katana up and running. The goal [was to show] that it would be useful for both live-action effects and CG features.”

Converging Worlds

Studios have long used the same software to render both stylized CG and digital elements that are composited with photography. But Bredow notes, “The two worlds are closer together than they probably were five years ago, just because an average visual effects film may have as much as 50 percent of its shots almost entirely CG.”

ILM's Hery concurs. “I might say they are two separate worlds, but when I look at Star Wars, it's 95 percent an animated movie. We have shots with fully digital humans and creatures in backgrounds that are CG. It's not so much that it's live-action versus animated films; it's the fact that a postproduction facility comes in after the decisions have been made most of the time. So we have to accommodate that.”

The challenge to digital effects providers to match live-action lighting can place special demands on rendering strategies. It's become standard practice to capture high dynamic range images on set for later use in efficient image-based rendering techniques. But the number of digital elements that must be rendered and composited with live-action footage keeps growing dramatically.
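The image-based idea itself is simple enough to sketch, as below (the environment data and function names are invented): an on-set HDR panorama is treated as a sphere of tiny light sources, so a CG element inherits the set's real illumination:

```python
"""Sketch of image-based lighting from an on-set HDR panorama (the data and
function names here are invented for illustration). A latitude-longitude HDR
image is treated as a sphere of incoming light: every pixel is a tiny light
source whose direction comes from its row and column."""
import math

def diffuse_from_hdr(normal, hdr_rows):
    """Cosine- and solid-angle-weighted average of environment radiance for a
    Lambertian surface (normalization simplified for illustration).
    `hdr_rows` is a list of rows of (r, g, b) radiance values, equirectangular."""
    height, width = len(hdr_rows), len(hdr_rows[0])
    total, weight_sum = [0.0, 0.0, 0.0], 0.0
    for row in range(height):
        theta = math.pi * (row + 0.5) / height         # latitude, 0 (up) to pi (down)
        sin_t = math.sin(theta)                        # solid-angle weight for this row
        for col in range(width):
            phi = 2.0 * math.pi * (col + 0.5) / width  # longitude, 0 to 2*pi
            d = (sin_t * math.cos(phi), math.cos(theta), sin_t * math.sin(phi))
            cos_term = sum(n * c for n, c in zip(normal, d))
            if cos_term <= 0.0:
                continue                               # light arriving from behind
            w = cos_term * sin_t
            for i in range(3):
                total[i] += hdr_rows[row][col][i] * w
            weight_sum += w
    return [t / weight_sum for t in total] if weight_sum else total

# A toy 4x8 "HDR": a very bright patch (the sun) in an otherwise dim blue sky.
sky = [[(0.2, 0.3, 0.6)] * 8 for _ in range(4)]
sky[1][2] = (50.0, 45.0, 40.0)
print(diffuse_from_hdr(normal=(0.0, 1.0, 0.0), hdr_rows=sky))
```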

“We can never render everything at once because we always have a background plate to match,” says DD's Grant. “So we've been building up the technique over the years of rendering out layers and using our compositor to put them together. Most recently, on Stealth, we took that very far. Our lighters were using our compositing software, Nuke, which handles a large number of channels, to do most of the lighting adjustments.”

Stealth also benefited from internally developed software called “Terragen,” which handles both modeling and rendering of terrain.

The ability to amass large numbers of rendered elements and manipulate them during compositing owes a lot to gains in storage capacity. “One of the most dramatic increases in the last few years has been the spectacular storage servers that have become available,” says Thad Beier, Hammerhead Productions' co-founder. “You really can store 20 layers of different elements and then composite them down the road. It gives the effects supervisor a lot more control over the shot. All the elements are there in the Inferno. On The Day After Tomorrow, Karen Goulekas had many houses render all their elements, and then she would sit down and build the shot.”

This “render elements” approach permits more flexibility and control, but doesn't that make it more difficult to achieve a unified, globally illuminated scene? ILM's Hery agrees. “It's a semi-constant battle within the company. You have the reality of production. There's a time within the life of the shot where someone will say, ‘We've done enough lighting. It's about compositing now.’ Most of the time when this happens, the compositors will request elements that can be as varied as an image-per-object, or an image-per-light, or a diffuse or specular component of lighting. Each time we do this it's very painful to me because the nice uniformity and semi-physically based approach is given flat — component by component — to a compositor who might not respect the physicality of it.”
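In code terms, the render-elements workflow boils down to summing passes back together in the compositor, and that is exactly where a per-pass tweak can break the consistency Hery describes. The sketch below is hypothetical; it is not Nuke or ILM code:

```python
"""Sketch of recombining rendered elements in comp (hypothetical; not Nuke or
ILM code). Each pass is an image the renderer wrote out separately: per light,
or the diffuse and specular components of the shading. For additive passes the
comp simply sums them back together, and a per-pass gain tweak at this stage is
precisely the kind of adjustment that can break physical consistency."""

def combine_passes(passes, gains=None):
    """Sum same-sized RGB images pixel by pixel, with an optional per-pass gain."""
    gains = gains or {}
    names = list(passes)
    height, width = len(passes[names[0]]), len(passes[names[0]][0])
    out = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
    for name in names:
        g = gains.get(name, 1.0)
        for y in range(height):
            for x in range(width):
                out[y][x] = tuple(o + g * p
                                  for o, p in zip(out[y][x], passes[name][y][x]))
    return out

# Two tiny 1x2 passes standing in for full-frame renders of the same object.
diffuse  = [[(0.30, 0.25, 0.20), (0.28, 0.24, 0.19)]]
specular = [[(0.05, 0.05, 0.05), (0.40, 0.40, 0.42)]]

beauty = combine_passes({"diffuse": diffuse, "specular": specular})
# Cranking one component in comp no longer matches what a unified render
# of the same scene would have produced.
tweaked = combine_passes({"diffuse": diffuse, "specular": specular},
                         gains={"specular": 2.5})
print(beauty[0][1], tweaked[0][1])
```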

ILM is widely credited in the effects industry for pioneering a partial simulation of global illumination called “ambient occlusion.” It provides an efficient way to simulate light and shadow effects from nearby objects without doing a full simulation. “This can be calculated in a reasonable amount of time. Most people store ambient occlusion information in texture maps, and you can apply those to objects in real time,” Beier says.

Although this is clearly a cheat compared to full simulation, Beier remarks, “An artist can try 100 tests in an hour. If you're using real simulation that's more computationally intensive, you may only be able to render four tests in an hour. We're taking just enough science to make the lighting look realistic and putting control in the hands of artists.”
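The technique fits in a few dozen lines. The sketch below is illustrative rather than ILM's implementation: short rays are fired over the hemisphere at each shading point, and the fraction that escape nearby geometry becomes a 0-to-1 openness value that can be baked into a texture map and reused:

```python
"""Minimal ambient-occlusion sketch (illustrative; not ILM's implementation).

At a shading point, sample directions over the upper hemisphere and ask how
many are blocked by nearby geometry within some radius. The fraction that
escape is a 0-to-1 'openness' value that can be baked into a texture map and
reused, instead of re-simulating full global illumination every frame."""
import math
import random

def occluded(origin, direction, spheres, max_dist):
    """True if the unit-length ray hits any occluder sphere within max_dist."""
    for center, radius in spheres:
        oc = [o - c for o, c in zip(origin, center)]
        b = sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - c
        if disc < 0.0:
            continue
        t = -b - math.sqrt(disc)
        if 1e-4 < t < max_dist:
            return True
    return False

def ambient_occlusion(point, normal, spheres, samples=256, max_dist=2.0):
    """Fraction of hemisphere directions that are NOT blocked (1 = fully open)."""
    open_count = 0
    for _ in range(samples):
        d = [random.gauss(0, 1) for _ in range(3)]
        length = math.sqrt(sum(c * c for c in d)) or 1.0
        d = [c / length for c in d]
        if sum(n * c for n, c in zip(normal, d)) < 0:
            d = [-c for c in d]                    # flip into the upper hemisphere
        if not occluded(point, d, spheres, max_dist):
            open_count += 1
    return open_count / samples

# A ground point with one nearby sphere crowding the sky above it.
print(ambient_occlusion(point=(0, 0, 0), normal=(0, 1, 0),
                        spheres=[((0.0, 1.0, 0.0), 0.6)]))
```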

London’s Double Negative developed a proprietary volumetric renderer to efficiently handle the vapor effects for Batman Begins.

The Challenge of Volumetric Effects

Although efficiency rules in digital effects rendering, studios are also being challenged to create increasingly complex naturalistic effects. For Batman Begins, Double Negative was faced with rendering large volumes of animated water vapor. Paul Franklin, who created DNeg's 3D pipeline, says the Batman challenge prompted them to write a volumetric renderer that could render the output of fluid simulations. Dubbed “DNB,” this software's specific requirement was to generate clouds that could be illuminated in a variety of ways.

“We also can fly the camera through them,” says Franklin, “and that's a fantastically useful thing in the world of visual effects. It's also really fast. With off-the-shelf solutions, rendering had been taking an hour and a half a frame. We got it down to two or three minutes.”
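The core of such a volumetric renderer is a ray march. The sketch below is illustrative, not DNB, but it shows the shape of the technique: step a camera ray through a density field, and at each step take a short second march toward the light to estimate the cloud's self-shadowing:

```python
"""Ray-marching sketch of volumetric cloud rendering (illustrative; not DNB).

March a camera ray through a density field in fixed steps. At each step, a
short second march toward the light estimates how much cloud sits between the
sample and the light (self-shadowing), and the density accumulated along the
camera ray controls how much of that light survives to the eye."""
import math

def density(p):
    """Stand-in cloud: a soft spherical puff of vapor around the origin."""
    r = math.sqrt(sum(c * c for c in p))
    return max(0.0, 1.0 - r)          # densest at the center, zero beyond radius 1

def march(origin, direction, light_dir, step=0.05, extinction=4.0):
    """Return the brightness seen along one camera ray through the volume."""
    transmittance, radiance = 1.0, 0.0
    t = 0.0
    while t < 4.0 and transmittance > 0.01:
        p = [o + t * d for o, d in zip(origin, direction)]
        rho = density(p)
        if rho > 0.0:
            # Short march toward the light: how shadowed is this sample?
            shadow = sum(density([p[i] + s * light_dir[i] for i in range(3)])
                         for s in (0.1, 0.2, 0.3, 0.4, 0.5)) * 0.1
            light_here = math.exp(-extinction * shadow)
            alpha = 1.0 - math.exp(-extinction * rho * step)
            radiance += transmittance * light_here * alpha
            transmittance *= 1.0 - alpha
        t += step
    return radiance

# One ray straight through the puff, lit from above.
print(march(origin=(0.0, 0.0, -2.0), direction=(0.0, 0.0, 1.0),
            light_dir=(0.0, 1.0, 0.0)))
```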

Having this proprietary tool is also helping DNeg render epic cloudscapes for Dean Devlin's upcoming Flyboys. Franklin notes, “As we fly through these clouds, we can change the lighting and make them interact with the aircraft.”

Being able to efficiently render volumes has clearly worked to DNeg's advantage. “Writing your own tools is a way of leveraging your skills,” Franklin says. “The explosion in visual effects over the last five years has meant a Darwinian selection pressure is applied to studios competing to come up with the latest greatest thing. If you can do effects [that] nobody else is doing at a certain price point, you get a reputation.”

Digital Domain has its own volumetric tool, the Sci-Tech-award-winning Storm, which is a plug-in for Houdini that DD used to create cloudscapes for Stealth. At Rhythm & Hues, Mark Brown's team is also writing a volumetric renderer. “We're getting up close and personal with CG clouds now, so they have to be dead-on,” says Brown. “Once you're inside, they need to have internal shadowing and light sources, and that puts big pressures on the renderer.”

Not surprisingly, at Blue Sky they're also pursuing ways of illuminating volumetric subtleties. “It's about volumes interacting with light, not just surfaces,” Ludwig says. “For rendering, that's the next challenge.”

To achieve the aerial acrobatics in Stealth, Digital Domain rendered volumetric CG clouds using its Academy-honored software, Storm.

Managing Complexity

This year's movies are brimming with hairy CG creatures, from realistic-looking digital Wookiees in Star Wars to the jungle critters of Madagascar. PDI/DreamWorks' Hendrickson points to what he calls “the sheer volume and complexity of rendering today.”

“When I was reading the Madagascar script, and it called for a raving crowd of lemurs, I thought, ‘Do they really want this? How about just five?’” Hendrickson says. “We couldn't put a couple of million hairs on every one of those creatures and expect them to render in our lifetime. So we had to take lots of shortcuts with distance and reducing the complexity. Managing the sheer volume of modern scenes is a huge problem right now.”
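A hedged sketch of that kind of distance-based shortcut (the numbers and falloff here are invented, not PDI/DreamWorks' actual scheme) shows why it helps: per-character hair counts fall off with distance from the camera, so the crowd's total stays renderable:

```python
"""Sketch of a distance-based level-of-detail shortcut for crowds of furry
characters (hypothetical numbers; not PDI/DreamWorks code). Hair counts fall
off with distance from the camera, so a background creature carries a tiny
fraction of a hero creature's hairs."""

def hairs_for_distance(distance, full_count=2_000_000, full_detail_within=5.0,
                       min_count=2_000):
    """Full hair count up close; inverse-square falloff beyond that, floored."""
    if distance <= full_detail_within:
        return full_count
    scale = (full_detail_within / distance) ** 2
    return max(min_count, int(full_count * scale))

# A raving crowd: one hero up front, the rest scattered into the distance.
crowd_distances = [2.0, 8.0, 20.0, 60.0, 150.0]
counts = [hairs_for_distance(d) for d in crowd_distances]
print(list(zip(crowd_distances, counts)))
print("total hairs:", sum(counts), "vs. full detail:", 2_000_000 * len(counts))
```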

Rendering specialists like Mental Images' CEO Rolf Herken believe that the ability to deal with increased complexity will be helped immensely by continuing improvements in hardware. “More important than the broad speed of the processors will be the availability of 64-bit computing at affordable prices,” he says. “You now have products with 64-bit processors, and you can get enough memory to accelerate rendering, often by a factor of 10 or more. The impact of 64-bit computing will be huge.”

Herken also believes the emergence of hardware rendering using GPUs (Graphics Processing Units) will have a notable impact on motion picture rendering. It's an opinion shared by many, including software architect Larry Gritz of Nvidia. Gritz has been a leading light in rendering for years, having worked on RenderMan and the BMRT and Entropy renderers. His current project for Nvidia is the Gelato renderer, aimed not at the realtime gaming community but at filmmakers. “We're trying to use graphics hardware to accelerate offline rendering. We don't want it to be any different than the way people use other renderers, except that it would be faster. To the best of my knowledge, it's the only offline, full-quality renderer that requires a GPU.”

The initial use of realtime rendering by filmmakers is likely to be for rendering preliminary versions of what a finished scene could look like. At DNeg, Franklin says, “We can use realtime tools to get a pretty fair approximation of a final render. Being able to present a director with something much closer to the final result is where hardware rendering will come into its own. It's also good for those ‘terrible temp’ screenings that we always face.”

Splutterfish's Kirvan thinks that widespread adoption of GPU rendering faces monetary barriers, however. “It's both the price of the cards and the configuration of studios' render farms,” he says. “Computers that hold those cards are more expensive, so those investment hurdles are in the way.”

Financial considerations aside, Gritz doesn't expect final film frames will ever be rendered in realtime, if only because it's hard to accelerate complex processes like ray tracing. “The ironic thing is that just as hardware began catching up to do a reasonable rendering job in realtime, everyone's expectations for what should be in final images went up!” he says.

Despite the increasing complexity of rendered scenes, studios will undoubtedly retain their current expectations about how long it should take to render complex images. DD's Grant cites Blinn's Law, a longstanding maxim credited to CG pioneer Jim Blinn. “That law,” Grant says, “is that any renderer, no matter how fast processors get, will always take a couple of hours, because that's the tolerance level of artists. Beyond that, it becomes like the line from Star Trek: ‘Captain, she can't take any more of this!’ Then, if there's headway, and a render takes only 15 minutes, they say, ‘Hey, I can add more to it!’”

To read about Siggraph announcements in rendering, visit siggraph.millimeter.com.
