

It Came from the Third Dimension

Apr 9, 2009 12:00 PM, By Michael Goldman

DreamWorks rethinks 3D production with Monsters vs. Aliens.



Among the project's greatest challenges, though, was figuring out ways to ensure viewers did not experience eye strain during sequences such as the San Francisco battle. The idea was to allow the film's editors, Joyce Arrastia and Eric Dapkewicz, to largely cut those scenes with a pace similar to how they would do it in 2D. The 3D technology team would then apply a blending process to transitions between those shots to reduce the eye-strain risk.

"[The blending process was] an extra process on top of other processes as we checked the work all along the way," Vernon says. "They came up with software that let us go shot by shot, blending a deep shot with a shallow shot to lessen that straining sensation. It was an example of how technology moved forward on this project so that we could have a solution for how 3D affected what we would normally do on a typical animated film."

McNally largely handled the stereo-blending process himself, using a new proprietary blending tool (also called a stereo perfection tool) created for the project. He applied it during what he calls the added-depth grade, or stereo-blending stage: essentially a quality-control pass to ensure action shots behave during transitions.

"You don't want your eyes jumping too much from a close-up of a robot to a monster on the bridge," McNally says. "I adjusted the stereo to blend the depth during the transition from close shot to distant shot. The distant shot might be set in stereo so that the robot is close for the first few frames, and then we gently ease it back into position to give your eyes an easier, softer transition even though it still feels like a hard cut. I graded those shots from a depth point of view, capturing shots that needed adjustment on a fundamental level and blending them together. The software reads all the shots in the sequence, but also gives me access to the camera curves that control the stereo, letting me manipulate them in realtime. This was an important step in eliminating the kind of eye strain that has been problematic with 3D in the past."
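The easing McNally describes can be sketched in a few lines. This is an illustrative assumption, not DreamWorks' actual tool: the function name, the smoothstep easing, and the frame-count parameter are all hypothetical, but they capture the idea of starting the incoming shot's stereo value near the outgoing shot's and gently returning it to its intended setting.

```python
# Hedged sketch of depth blending at a hard cut: ease a stereo
# parameter (e.g. interaxial distance or convergence) from the
# outgoing shot's value back to the incoming shot's intended value
# over the first few frames. Names and easing are assumptions.

def blend_depth(outgoing_value, target_value, blend_frames):
    """Per-frame stereo values for the first `blend_frames` frames
    of the incoming shot, easing from the outgoing shot's value."""
    values = []
    for f in range(blend_frames):
        t = f / (blend_frames - 1) if blend_frames > 1 else 1.0
        s = t * t * (3.0 - 2.0 * t)  # smoothstep: gentle ease in and out
        values.append(outgoing_value + (target_value - outgoing_value) * s)
    return values
```

The first frame matches the outgoing shot exactly, so the cut still reads as a hard cut while the depth settles over the blend window.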

Another innovation was the creation of a stereo camera rig in Maya to view native stereoscopic imagery, according to McNally. He says in-house stereoscopic software developer Paul Newell worked with Autodesk to develop a software camera that has a left-eye component and a right-eye component and automatically calculates stereo settings, which was helpful during daily animation reviews.

"It was developed from an artist's point of view," McNally says. "If you view a shot with one point being your nearest point in the scene and the other being the farthest, and you want them to appear at a certain place in relationship to the audience in the theater, you are really talking about pixel shifts. If you view a 3D image without glasses, those little separations along the double image's edge are a certain number of pixels apart. This way, we came up with a Maya stereo rig that has the ability to set the near plane, the far plane, and tells the camera this is the stereo space I want to work with. Then the artist puts in the shift values they want to achieve in the theater, and the software calculates the interaxial distance [distance between the two separated lenses] and the zero parallax setting [convergence point] based on what we requested. This way, we could say in dailies, ‘Give it five more pixels of depth,' or whatever the situation was. It created a common language of depth we could all discuss."
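The calculation McNally describes — artists specify on-screen pixel shifts at the nearest and farthest points, and the rig solves for interaxial distance and the zero-parallax distance — can be sketched as a small solver. The model below is a standard parallel-camera stereo approximation, not DreamWorks' actual formula; the function name and the constant `k` (focal length times pixels per sensor unit) are assumptions made for illustration.

```python
# Hedged sketch: under a parallel-camera model with horizontal image
# shift, on-screen disparity at depth z is d(z) = k * b * (1/z_c - 1/z),
# where b is the interaxial distance, z_c the zero-parallax
# (convergence) distance, and k a known pixels-per-unit constant.
# Given the artist's requested pixel shifts at the near and far planes,
# this is two linear equations in b and 1/z_c.

def solve_stereo(z_near, z_far, shift_near_px, shift_far_px, k):
    """Return (interaxial b, zero-parallax distance z_c) that produce
    the requested pixel shifts at z_near and z_far."""
    # Subtracting the two disparity equations isolates b:
    b = (shift_near_px - shift_far_px) / (k * (1.0 / z_far - 1.0 / z_near))
    # Back-substitute to recover the zero-parallax distance:
    inv_zc = 1.0 / z_near + shift_near_px / (k * b)
    return b, 1.0 / inv_zc
```

With this convention, a negative near shift puts the nearest object in front of the screen and a positive far shift pushes the background behind it, so a dailies note like "give it five more pixels of depth" becomes a direct change to one input.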

DreamWorks focused on giving artists the ability to see 3D depth easily while working throughout the project. Artists could check depth at various viewing stations (essentially high-end, consumer-level DLP monitors) and at their desktops, which were typically set up with CRT monitors and active stereoscopic viewing glasses. Currently, however, McNally says that DreamWorks is in the middle of transitioning away from CRT technology and toward passive, polarized Hyundai 3D LCD W220S monitors—particularly 24in. models that work well with RealD polarized 3D glasses. The studio expects to have them in place in time for production on the upcoming 2010 3D feature, How to Train Your Dragon.

