The summer blockbuster The Amazing Spider-Man is full of action, mayhem, and visual effects (VFX). I spoke with Jerome Chen, VFX supervisor for Sony Pictures Imageworks, about its 3D image technology (see “Spider-Man Swings Through A Virtual 3D World”). I also got a chance to talk with Tracy McSheery at PhaseSpace about some of the motion capture technology employed in making the film.
Motion capture and VFX are used in almost every action movie these days, and probably in most movies in general. Movies like Mars Needs Moms are all motion capture, with VFX delivering the final product. Actors don suits ringed with white hemispherical markers that are photographed by an array of cameras. Software analysis tools then convert these images into digital information that’s used to render animated characters with lifelike movement, because the motion is copied from real people.
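To make that first software step concrete, here is a minimal sketch of what marker extraction looks like: threshold a grayscale camera frame and reduce each bright blob to a 2D centroid. The frame size and brightness threshold here are arbitrary choices for illustration, not anything a particular vendor uses.

```c
/* Minimal sketch of the first step in optical motion capture:
 * threshold a grayscale frame and reduce each bright blob of
 * pixels (a marker image) to a single 2D centroid.
 * W, H, and THRESHOLD are arbitrary values for illustration. */
#include <stdio.h>

#define W 640
#define H 480
#define THRESHOLD 200          /* brightness that counts as "marker" */
#define MAX_STACK (W * H)

static unsigned char frame[H][W];   /* grayscale input image */
static unsigned char seen[H][W];    /* visited flags for flood fill */

/* Flood-fill one blob of bright pixels and print its centroid. */
static void trace_blob(int sx, int sy)
{
    static int stack[MAX_STACK][2];
    long sum_x = 0, sum_y = 0, count = 0;
    int top = 0;

    stack[top][0] = sx; stack[top][1] = sy; top++;
    seen[sy][sx] = 1;

    while (top > 0) {
        int x = stack[--top][0], y = stack[top][1];
        sum_x += x; sum_y += y; count++;

        /* Visit 4-connected neighbors that are bright and unseen. */
        const int dx[4] = { 1, -1, 0, 0 }, dy[4] = { 0, 0, 1, -1 };
        for (int i = 0; i < 4; i++) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx >= 0 && nx < W && ny >= 0 && ny < H &&
                !seen[ny][nx] && frame[ny][nx] >= THRESHOLD) {
                seen[ny][nx] = 1;
                stack[top][0] = nx; stack[top][1] = ny; top++;
            }
        }
    }
    printf("marker at (%.1f, %.1f), %ld px\n",
           (double)sum_x / count, (double)sum_y / count, count);
}

int main(void)
{
    /* In a real system the frame comes from the camera;
     * here it is blank, so nothing will be detected. */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (!seen[y][x] && frame[y][x] >= THRESHOLD)
                trace_blob(x, y);
    return 0;
}
```

A real pipeline then has to match centroids across multiple cameras to triangulate 3D positions and follow each marker from frame to frame, which is where the trouble starts.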
The problem is that automatically tracking those white dots isn’t easy, and a lot of manual labor is needed to make the system work. That’s one reason why big-budget films have big budgets. It isn’t always about paying the cast.
LED Motion Capture
PhaseSpace took a different approach to help track Spider-Man as he is thrown through a wall: make the dots smarter, or at least more recognizable. The company used LED markers and modulated their output, allowing each marker to be uniquely identified (Fig. 1).
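PhaseSpace’s exact modulation scheme isn’t spelled out here, so consider a hypothetical version of the idea: each marker repeatedly blinks out a start bit followed by its 8-bit ID, one bit per camera frame.

```c
/* Hypothetical LED-marker modulation: each marker repeatedly emits
 * a start bit followed by its 8-bit ID, one bit per camera frame.
 * This is an illustration of the concept, not PhaseSpace's scheme. */
#include <stdio.h>
#include <stdint.h>

#define ID_BITS 8
#define SLOT_BITS (ID_BITS + 1)    /* 1 start bit + 8 ID bits */

/* Returns 1 if the LED with the given ID should be lit this frame. */
static int led_state(uint8_t id, unsigned frame)
{
    unsigned slot = frame % SLOT_BITS;
    if (slot == 0)
        return 1;                     /* start bit: always on */
    return (id >> (slot - 1)) & 1;    /* then the ID, LSB first */
}

int main(void)
{
    /* Print the blink pattern for two markers over one full cycle. */
    for (uint8_t id = 5; id <= 6; id++) {
        printf("marker %u: ", id);
        for (unsigned f = 0; f < SLOT_BITS; f++)
            printf("%d", led_state(id, f));
        printf("\n");
    }
    return 0;
}
```

A production system would more likely vary brightness rather than switch the LED fully off, so the tracker never loses sight of the marker, but the principle is the same: a time-coded pattern makes each dot self-identifying.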
Now a person is no longer needed to figure out when a dot is occluded and which dot it is when it reappears. PhaseSpace even has a spandex suit with the wires embedded. The ability to identify the markers saves a lot of time, eliminating most of the manual cleanup the passive, white-dot approach requires.
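The camera-side counterpart is just as simple in principle. Sticking with the hypothetical scheme above, the tracker accumulates one brightness sample per frame for each blob and recovers the ID, after which re-binding a reappearing marker to the right joint is a table lookup rather than a manual cleanup pass.

```c
/* Camera-side counterpart to the hypothetical scheme above:
 * recover a marker's ID from the on/off sequence observed for a
 * tracked blob, one sample per frame starting at the start bit. */
#include <stdio.h>
#include <stdint.h>

#define ID_BITS 8

static uint8_t decode_id(const int samples[ID_BITS + 1])
{
    uint8_t id = 0;
    /* samples[0] is the start bit; ID bits follow, LSB first. */
    for (int i = 0; i < ID_BITS; i++)
        if (samples[i + 1])
            id |= (uint8_t)(1u << i);
    return id;
}

int main(void)
{
    /* Observed: start bit, then 1,0,1,0,0,0,0,0 -> marker ID 5. */
    const int observed[ID_BITS + 1] = { 1, 1, 0, 1, 0, 0, 0, 0, 0 };
    printf("decoded marker ID: %u\n", decode_id(observed));
    return 0;
}
```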
A couple of technologies have come together to make the active approach possible. First, camera resolution is in the megapixel range, and frame rates are a couple of orders of magnitude faster than the 24 frames per second most movies are shot at. Second, micros to drive the LEDs are dirt cheap and smaller than the LEDs themselves. PhaseSpace has one micro that drives 72 LEDs, although the LEDs need to be wired together using a two-wire bus (a firmware sketch for such a controller follows below). The company also has implementations with just one or two LEDs, a micro, and a battery.
And third, analysis software is now readily available, and multicore platforms to run it are cheap. One could even turn to the cloud for computing resources if a server farm isn’t handy.
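Putting the marker-side pieces together, here is a hypothetical firmware loop for that 72-LED controller. The set_clock(), set_data(), and wait_frame_sync() functions are stand-ins for whatever GPIO and timer hooks the actual part provides; nothing here reflects PhaseSpace’s real firmware.

```c
/* Hypothetical firmware loop for a single micro driving 72 LEDs
 * over a shared two-wire (clock + data) bus. The extern functions
 * are stand-ins for the real part's GPIO and timer facilities. */
#include <stdint.h>

#define NUM_LEDS 72

extern void set_clock(int level);     /* drive the bus clock line */
extern void set_data(int level);      /* drive the bus data line */
extern void wait_frame_sync(void);    /* block until the next camera frame */

/* One modulation bit per LED per frame, per the scheme sketched above. */
extern int led_state(uint8_t id, unsigned frame);

/* One unique ID per LED, assigned at setup (values illustrative). */
static const uint8_t led_ids[NUM_LEDS] = { 1, 2, 3 /* ... */ };

void run_marker_controller(void)
{
    unsigned frame = 0;
    for (;;) {
        wait_frame_sync();
        /* Shift out one on/off bit per LED, end of the chain first. */
        for (int i = NUM_LEDS - 1; i >= 0; i--) {
            set_data(led_state(led_ids[i], frame));
            set_clock(1);             /* latch the bit on the rising edge */
            set_clock(0);
        }
        frame++;
    }
}
```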
Combine this technology with video models and modeling tools from companies such as Daz Productions running on high-end workstations like those from Hewlett-Packard or Dell, and movie-quality motion capture moves to the masses, where a single person can create Hollywood-quality special effects. Companies like Autodesk provide free digital content creation software to students and teachers, so this area is ripe for game-changing content.
It Has Always Been Done That Way
So why isn’t everyone using active LED motion capture technology? Tracy will give you a long and amusing explanation if you ask. Suffice it to say that Hollywood tends to be set in its ways. Spending money is second nature, and “it has always been done that way.” It might take a young director creating a movie from scratch to change everyone’s mindset.
The reasons for doing something often go way back. For example, Morton Thiokol’s solid rocket boosters were sectional and a particular diameter because they were limited by the delivery vehicle—a train that had to traverse a number of tunnels. Train tracks were spaced 4 feet, 8.5 inches apart, which was the standard English tramway gauge.
English tramways were 4 feet, 8.5 inches wide because that was the standard spacing for wagon wheels. And this standard dated back to Imperial Rome, where war chariots carved ruts that were—you guessed it—4 feet, 8.5 inches apart, the width of a chariot drawn by two war horses. Eventually, the design of those rocket boosters had fatal results, though not everything can be blamed on the width of an ancient horse’s posterior.
Inexpensive motion capture and VFX make it practical to create a rough draft of a movie before it goes into production. Check out White Tiger Legend, which is another example of PhaseSpace technology in action.
Some of the motion capture work PhaseSpace completed for The Amazing Spider-Man wound up on the proverbial digital cutting-room floor, but some made it into the movie as well. Stay tuned for more motion capture news. I’m looking forward to seeing The Merchant Of Venice Prime—Shakespeare verbatim but with a futuristic setting and a virtually new set of bodies.