Taking 3D to the Next Level

After directing several 3D documentaries over the past decade, James Cameron makes "Avatar" his first full-length 3D feature.
HOLLYWOOD
It is no accident that this month's release of "Avatar" has achieved new heights in live-action and CG-generated 3D imagery. This milestone film is the result of many years of research into perfecting stereoscopic 3D by director James Cameron and his collaborator, cinematographer Vince Pace.

"We have discovered what we call the e = mc2 principle of stereography," Cameron told TV Technology shortly before the Dec. 18 release of his multi-hundred million dollar 3D sci-fi epic. "Just as Einstein found that the speed of light is a constant and in order to preserve its value everything else including time and mass has to change to accommodate it, we have learned that no matter what focal length lens you are using or the distance between subject and camera, the parallax difference in the background—what we call the divergence—is an absolute constant and everything else has to change to accommodate it. This divergence is a constant in the pixel array on the screen, whether you set it at 65 pixels or 70 pixels, and it doesn't matter how close or far away you are from the subject because everything else has to change to maintain that constant value."

MIMICKING HUMAN VISION

Applying this principle has meant that the Fusion Camera System Cameron used on "Avatar" (see "Behind the Post of 'Avatar,'" Focus on Editing, p. 31) mimics human vision more closely than any previous approach to stereoscopic 3D.

"Avatar" director James Cameron with Sam Worthington, the film's lead.

"Usually the brain compensates in real life by encouraging us not to look at near objects for too long when our eyes have to toe in at extreme angles to provide it with close-up left/right images," Cameron said. "Of course, that is no help for a filmmaker when you are trying to put a close-up shot up on the screen. So the solution we have found using the Fusion Camera is to adjust the interocular distance between the lenses and also their convergence angle to enable comfortable viewing at all distances and all focal lengths."
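The comfort problem he describes comes down to simple trigonometry: the closer the subject, the more steeply two lenses (or eyes) set a fixed distance apart must toe in to converge on it. A quick sketch, using nothing from the Fusion system itself, shows why narrowing the interocular distance keeps that angle gentle on close-ups.

```python
import math

def toe_in_angle_deg(interocular_mm, subject_distance_m):
    """Total convergence (toe-in) angle, in degrees, for two lenses set
    `interocular_mm` apart and converged on a subject `subject_distance_m` away."""
    half_angle = math.atan((interocular_mm / 1000.0) / 2.0 / subject_distance_m)
    return 2.0 * math.degrees(half_angle)

# Eyes roughly 65 mm apart looking at something 30 cm away toe in sharply;
# narrowing the interocular keeps the angle gentle for the same close-up.
print(toe_in_angle_deg(65, 0.3))   # ~12.4 degrees
print(toe_in_angle_deg(20, 0.3))   # ~3.8 degrees
```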

Once Cameron and Pace had worked out this universal 3D principle, it also greatly simplified the editing of "Avatar" during post.

"We cut it like it was a 2D movie without thinking about the 3D," Cameron said. "We could concentrate on making great editing decisions from shot to shot based on the drama of the moment and the on-screen performance. Convergence could always be fixed later by adjusting the post-convergence of the two left and right images in the DaVinci Resolve system from Blackmagic Design that we set up in the Robert Wise dubbing stage on the 20th Century Fox lot. Since we filmed in 16:9 format, but it is intended for a 2.35:1 'scope release,' with the help of digital intermediate colorist, Skip Kimball from Modern VideoFilm, we had considerable latitude to adjust either the stereo space convergence or the head room for each shot."

'VIRTUAL CAMERA'

"Avatar" audiences will also notice the realistic appearance of the 3D characters' performances which was greatly enhanced by Cameron's use of a special "virtual camera" he developed 3 1/2 years ago.

"As the name implies, it is not really an optical device at all," Cameron said. "The actors' performances are recorded in a large motion capture volume, using head rig cameras for image-based facial performance capture and hundreds of other mocap digital cameras focused on the registration dots on the outer layer of both people's bodies and on-screen objects. Once enough of this marker data has been recorded, as a director I can view it on the monitor of the virtual camera to see what a computer would render based on where I am pointing it. The images I see on the virtual camera's monitor are simultaneously ingested into the Avid NLE we used for streaming. Once this has been recorded, it is transferred to the cutting Avid where either editor Stephen Rivkin or John Refoua can begin to edit the footage at the most, five minutes behind me."

Vince Pace, CEO of Pace Camera, with the Fusion Camera System.

So in "Avatar," the live action was stereoscopically photographed using the Fusion Camera, the 3D CG imagery was directed through the virtual camera, and the two workflows were integrated through Cameron's Simulcam to create the 3D illusion of adventure on the distant moon Pandora.

"Instead of taking tracking data generated from a virtual camera as would have been done before, we put tracking markers on the live action camera," Cameron said. "Then, as I move the live action camera around on a lit live action set, the Simulcam lets me see virtual characters and virtual set pieces through a conventional HD eyepiece that aren't really there but have been composited in real time. And that has never been done before."

"Avatar" is not only demonstrating the current state of 3D stereographic production capabilities, it will also inspire filmmakers to its future potential in the hands of other creative artists.

"I just had Michael Mann in here, and he said he felt like an 8 year old kid," Cameron said. "It let him see the possibilities of what he could do if he got his hand on these tools. With 'Avatar' and the techniques we have introduced in its production, we are opening up a whole new set of tools for filmmakers to use."
