A Comet’s Only Cameraman

The crater left behind on the comet, as simulated in a digital rendering prior to the July 4th encounter.
Credit: Maas Digital for NASA/JPL

Digital effects artist Dan Maas has a vision that "small teams… create works of spectacular scope and great complexity", a vision he has now realized for the world both on Mars and at a comet.

His first complex and spectacular project was to preview the flights of the twin Mars Exploration Rovers, Spirit and Opportunity. In March 2004, Popular Science magazine dubbed Maas "NASA’s Martian Cameraman." In recognition of those Mars Rover graphics in the PBS/NOVA program "Mars Dead or Alive," Maas recently received an Emmy Award nomination.

His next big project was to simulate the dramatic impact of a bullet-like probe with an icy comet for the recent mission, Deep Impact.

Maas is a Cornell graduate in Digital Cinema who went to high school in Ithaca but left two years early on the strength of his perfect college board exam scores. He speaks, reads, and writes Mandarin Chinese, and has given interviews about computer graphics in Chinese. Astrobiology Magazine editors had the opportunity to talk with Maas about how he first found himself at this crossroads between digital special effects, software design, and spaceflight.

Astrobiology Magazine (AM): There was a close connection between Cornell and the Mars rovers which played a role in your renderings for that mission. How did the opportunity to do the Deep Impact mission come to your door?

Dan Maas (DM): I was hired to create a Deep Impact animation based on the strength of my Mars Rover video. I already had gained many contacts at JPL from working on the Rovers, so it was a natural progression to take on Deep Impact.

AM: Did you get a chance to talk over the ballistic physics of a projectile hitting both a hard and soft target with JPL or the University of Maryland groups?

DM: Yes, I spent quite some time going over the impact dynamics with scientists Mike A’Hearn and Lucy McFadden. I modeled the impact debris and crater based on the air-gun experiment videos (available on the Deep Impact web site) and Pat Rawlings’ painting of the encounter, which the scientists considered to be very accurate. I made my impact about as large and bright as the scientists hoped it would be – and of course the real thing exceeded those expectations!

Tempel 1 nucleus shortly before the crater-making impact. Scientists wondered whether the whitish material was icy rock or some other surface feature never before seen on comets because of limited image resolution.
Credit: U.Md/NASA/JPL

AM: The animation you did was once again stunningly realistic, even to the point that JPL’s Don Yeomans remarked during the actual impact that there was an eerie resemblance between the simulations and the actual events, even down to the camera angles and lighting. Were there many alternative ways considered for animating the collision, such as a "splat" or a "thud"?

DM: Credit goes to the Deep Impact scientists for insisting that I include the correct sun angles and spacecraft attitudes for all shots. At the time I felt it might be a waste of effort to go into such details, but the payoff came when we saw how closely the real impact resembled the animation.

There was some debate about whether a bright flash would be visible right as the impactor hits the nucleus. I took a little license putting a flash in there, which happily turned out to be a major feature of the real impact.

The most difficult part of creating the animation was setting up camera angles to give the correct sense of motion.

My usual animation style involves moving the camera around a lot, but on Deep Impact any camera movement tended to make the spacecraft look like it was going in the wrong direction – away from the comet or to the side, instead of directly toward it. I learned that more subtle camera work is necessary for deep-space shots. That’s probably because there are no landmarks or a horizon to give the audience a point of reference. During the impact shot, the camera is basically stationary relative to the sun, which was the only way to show the correct trajectories – the comet overtaking the impactor, rather than the impactor speeding towards the comet.
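The frame-of-reference point above can be made concrete with a toy calculation. The numbers below are purely illustrative (not the mission's actual velocities): in a sun-fixed frame both bodies move in roughly the same direction, with the comet the faster of the two, so a camera held stationary in that frame sees the comet overtake the impactor rather than the impactor speeding toward the comet.

```python
def position(p0, v, t):
    """Position under constant velocity along the 1-D line of approach."""
    return p0 + v * t

# Hypothetical values for illustration only (km and km/s).
comet_v, impactor_v = 30.0, 20.0      # comet is faster in the sun frame
comet_p0, impactor_p0 = -100.0, 0.0   # comet starts behind the impactor

# The closing speed is the velocity difference; time until they meet:
t_meet = (impactor_p0 - comet_p0) / (comet_v - impactor_v)
```

At `t_meet` the two positions coincide, and at every earlier time the gap is shrinking because the comet is gaining on the impactor from behind, which is exactly the "comet overtaking the impactor" view the stationary camera produces.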

AM: One thing that stands out from the physics in little or no gravity is the slow motion of the collision. It seems like the cratering took a long time to develop and produced much more powder than anticipated. Was this slow motion explicitly modeled?

Rover impact simulation video frame. Each rover could bounce twenty or more times before coming to rest, with an average impact comparable to a golf cart dropped from the roof of a five-story building.
Credit: Maas Digital for NASA/JPL

DM: The speed of the debris was based on the scientists’ recommendations. The weakness of the comet’s gravitational field made my job easier – I could model the debris as a simple cone rather than worrying about curved trajectories.
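A minimal sketch of the simplification Maas describes, with hypothetical parameters (cone angle, particle count, speed range are illustrative, not mission values): when gravity is negligible, each debris particle just travels in a straight line inside a cone, so there are no curved trajectories to integrate.

```python
import math
import random

def spawn_cone_debris(n, half_angle_deg, speed_range, seed=42):
    """Generate debris velocities leaving the impact point inside a cone
    around the +z axis, uniform in solid angle. With negligible gravity
    each particle moves in a straight line at constant velocity."""
    rng = random.Random(seed)
    half_angle = math.radians(half_angle_deg)
    particles = []
    for _ in range(n):
        # Uniform over the spherical cap: cos(theta) in (cos(half_angle), 1].
        cos_theta = 1.0 - rng.random() * (1.0 - math.cos(half_angle))
        sin_theta = math.sqrt(1.0 - cos_theta**2)
        phi = rng.random() * 2.0 * math.pi
        speed = rng.uniform(*speed_range)
        particles.append((speed * sin_theta * math.cos(phi),
                          speed * sin_theta * math.sin(phi),
                          speed * cos_theta))
    return particles

def position_at(velocity, t):
    # Straight-line motion: position = velocity * time (origin at impact).
    return tuple(v * t for v in velocity)
```

Checking membership in the cone is also trivial in this model: a particle's direction satisfies `vz / speed >= cos(half_angle)`, which is the kind of shortcut curved ballistic trajectories would not allow.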

The main thing my animation got wrong was the particle size distribution. The real impact was indeed more "powdery" (which is a shame, since I put a lot of effort into modeling large rocky debris particles!). At some point I hope to go back and correct the animation.

AM: Can you schematically highlight the big steps you take from conception of a scene to its finishing stages?

DM: I now have three other artists working for me, which is a big change from being on my own.

We do most of the 3D modeling and keyframe animation in LightWave on Windows. Occasionally we use Maya on Linux for elements where it has stronger capabilities than LightWave, such as particle systems and kinematics. For example, the Deep Impact debris cone was a Maya simulation. We do a lot of painting in Photoshop too.

The rest of our pipeline is based around a custom software package I have developed. The software allows us to integrate geometry and animation data from both LightWave and Maya in a seamless environment, apply lights, shaders, and procedural effects, and then pass all of the data to Pixar’s RenderMan, which does the actual number-crunching to generate an image (RenderMan is a popular program for high-end rendering, mostly because it is extremely fast, flexible, and can produce very high quality images).

This system is a unique advantage since we can use each animation package for what it is best at, and very easily add procedural effects that go beyond what is offered in off-the-shelf software.

The "render farm" currently consists of about 20 Linux machines which are managed by a custom distributed processing system.
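The interview doesn't describe how Maas Digital's custom distributed system works, but the core idea of a render farm can be sketched in a few lines. Everything here is hypothetical: `render_frame` stands in for invoking a renderer on one frame, and a thread pool stands in for the 20 separate Linux machines.

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame):
    # Stand-in for launching a renderer on one frame of the sequence;
    # returns a fake output filename to show the bookkeeping.
    return f"frame_{frame:04d}.tif"

def render_sequence(frames, workers=20):
    """Farm independent frames out to a pool of workers. Because frames
    don't depend on each other, the work parallelizes cleanly; a real
    farm dispatches to remote machines instead of local threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves frame order even though workers finish out of order.
        return list(pool.map(render_frame, frames))
```

The design point this illustrates is that frame rendering is embarrassingly parallel, which is why a farm of commodity Linux boxes with a job dispatcher is the standard architecture for this workload.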

AM: What is next for Maas Digital as far as projects on the horizon?

DM: We are currently very busy creating new graphics for an upcoming Disney IMAX film about the Mars Rover mission. A Phoenix mission animation is also in the works.

Related Web Pages

Digitally Directed, the Mars Missions
Unreal Film-maker of Martian Reality
Deep Impact
Blasting Cap On A Comet
Bombing the Comet