Adam Ash

Your daily entertainment scout. Whatever is happening out there, you'll find the best writing about it in here.

Thursday, October 19, 2006

One day we might make dramatic real-life movies without dramatic real-life actors

Cyberface: New Technology That Captures the Soul -- by SHARON WAXMAN

THERE’S nothing particularly remarkable about the near-empty offices of Image Metrics in downtown Santa Monica, loft-style cubicles with a dartboard at the end of the hallway. A few polite British executives tiptoe about, quietly demonstrating the company’s new technology.

What’s up on-screen in the conference room, however, immediately focuses the mind. In one corner of the monitor, an actress is projecting a series of emotions — ecstasy, confusion, relief, boredom, sadness — while in the center of the screen, a computer-drawn woman is mirroring those same emotions.

It’s not just that the virtual woman looks happy when the actress looks happy or relieved when the actress looks relieved. It’s that the virtual woman actually seems to have adopted the actress’s personality, resembling her in ways that go beyond pursed lips or knitted brow. The avatar seems to possess something more subtle, more ineffable, something that seems to go beneath the skin. And it’s more than a little bit creepy.

“I like to call it soul transference,” said Andy Wood, the chairman of Image Metrics, who is not shy about proclaiming his company’s potential. “The model has the actress’s soul. It shows through.”

You look and you wonder: Is it the eyes? Is it the wrinkles around the eyes? Or is it the tiny movements around the mouth? Something. Whatever it is, it could usher in radical change in the making of entertainment. A tool to reinvigorate the movies. Or the path to a Franken-movie monster.

The Image Metrics software lets a computer map an actor’s performance onto any character, virtual or human, living or dead.

Its creators say it goes way beyond standard hand-drawn computer graphics, which require staggering amounts of time and money. It even goes beyond “motion capture,” the technique that animated Tom Hanks’s 2004 film “The Polar Express,” which is strong on body movement but not on eyes, the inner part of the lips and the tongue, some of the most important messengers of human emotion.

“One of our principal tenets is to capture all the movements of the face,” Mr. Wood said. “You can’t put markers on eyes, and you can’t replicate the human eye accurately through hand-drawn animation. That’s pretty important.”

Ultimately, though, Image Metrics could even go beyond the need for Tom Hanks — or any other actor — altogether.

“We can reanimate footage from the past,” said Mr. Wood, a stolid man with a salesman’s smile. He was hired to introduce Hollywood to the technology, which the computer scientists who founded the company sometimes have difficulty articulating.

“We could put Marilyn Monroe alongside Jack Nicholson, or Jack Black, or Jack White,” he continued, seated in the conference room where the emoting actress and her avatar shared the screen. “If we want John Wayne to act alongside Angelina Jolie, we can do that. We can directly mimic the performance of a human being on a model. We can create new scenes for old films, or old scenes for new films. We can have one human being drive another human character.”

To prove the point Mr. Wood brought up on-screen an animated character that he showed at the Directors Guild of America this past summer. The character, a simple figure comprising just a few lines drawn in the computer, made the “I coulda been a contender” speech from “On the Waterfront,” in Marlon Brando’s voice. (Because Brando didn’t gesture much, the stick figure’s movements were based on those of a hired actor.) Then he pulled up a video of the musician Peter Gabriel singing a scat beat alongside a half-dozen animated figures who, one by one, joined him in precise concert. Finally he brought up a scene from a Marilyn Monroe movie in which animators replaced the original Marilyn with a computer-drawn version of her. The image isn’t perfect — or rather, it’s a bit too perfect for credulity — but it clearly shows the path that lies ahead.

The breakneck pace of technology combined with the epic ambitions of directors has, up to now, taken movies to places undreamed of in the past: the re-sinking of the “Titanic”; war in space between armies of droids; a love story between a dinosaur-sized ape and a human-sized woman. (Whoops, we had that one before.)

But if Image Metrics can do what it claims, the door may open wider still, to vast, uncharted territories. To some who make the movies, the possibilities may seem disturbing; to others, exciting: Why not bring back Sean Connery, circa 1971, as James Bond? Or let George Clooney star in a movie with his aunt, Rosemary; say, a repurposed “White Christmas” of 1954? Maybe we can have the actual Truman Capote on-screen, performed by an unseen actor, in the next movie version of his life.

Projects are already circulating around Hollywood that seek to revive dead actors, including one that envisions Bruce Lee starring in a new Bruce Lee picture.

Asked what he might do with the new technology, Taylor Hackford, the director of “Ray” and a dozen other movies, was at first dismissive. “It’s phenomenal, but its uses are in the area of commercials,” he said. (Image Metrics made a commercial last winter that revived Fred and Ethel Mertz of “I Love Lucy” discussing the merits of a Medicare package.) But after a moment’s reflection, he shifted his view. “If you’re working on ‘The Misfits,’ and Clark Gable died before the end of the film, you could have used it in that instance,” he reflected.

Or what if Warren Beatty, or Robert Redford, wanted to play a younger version of himself? “If you had Warren or Redford in a great role, and there was a flashback to a young character” — he mused — yes, that would be a reason to use it. Perhaps in “The Notebook,” he went on, in which Ryan Gosling played the young version of James Garner’s character? Mr. Garner could have played both versions himself.

Still, one thought was holding Mr. Hackford back. “If you want Ethel Barrymore to give you an incredible, heartfelt and painful performance, that comes from the soul of the actor,” he said. “It’s not something you can get by animation.”

IMAGE Metrics began in the living room of Gareth Edwards, a shy, baby-faced, 34-year-old biophysicist from Manchester, England. He, Alan Brett and Kevin Walker, all postdoctoral students from the University of Manchester, were conducting research into image analysis, a technique first developed to help computers analyze spinal X-rays. “We were very much scientists looking for the big problem,” he said. “Big in terms of the problem, and big in terms of the benefit.”

They decided to start a company, of which Mr. Edwards is the chief technical officer. He doesn’t work out of his living room anymore; now he works in the Santa Monica offices. (His colleagues remain in England along with a half-dozen other computer and physics Ph.D.’s.) But some things remain the same. “Image analysis is a difficult scientific problem,” he said. “You’re trying to analyze complex objects: the human spine, or the mapping of the human face. How do you teach a computer to understand the context of an image when that image is complex?”

Many surveillance devices rely on facial recognition software, but it produces a lot of false positives. Mr. Edwards and his colleagues took a different approach, one that starts with the generic model of a human head and layers onto that a mathematical distillation of an individual’s expressions. He compared his approach to describing a new bicycle. The person who’s listening is likely to picture the new bicycle based on other bicycles she has already seen.

“It’s model-based computer vision,” Mr. Edwards said. “The idea is, if you know an object, you can picture it. The key for animation was that realization: that we needed to build a computer system with the prior concept. The mathematical structure describes the basic concept of the face and maps the subtle variations.”
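The idea Mr. Edwards describes — a generic face serving as a prior, with an individual captured as a compact mathematical description of how he or she varies from it — resembles a statistical shape model. The Python sketch below is purely illustrative: the data is synthetic, the names and the choice of five variation modes are invented, and it is not Image Metrics’ actual (proprietary) system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: each row stands in for one face,
# flattened into a vector of landmark coordinates.
n_faces, n_coords = 50, 20
training_faces = rng.normal(size=(n_faces, n_coords))

# The "prior concept": the generic (mean) face.
mean_face = training_faces.mean(axis=0)

# Principal modes of variation: the main ways faces in the
# training set differ from the generic face.
centered = training_faces - mean_face
_, _, vt = np.linalg.svd(centered, full_matrices=False)
modes = vt[:5]  # keep the five strongest variation modes

def encode(face):
    """Distill an individual face into a few coefficients
    describing how it varies from the generic model."""
    return modes @ (face - mean_face)

def decode(coeffs):
    """Reconstruct an approximate face from those coefficients."""
    return mean_face + modes.T @ coeffs

# A new face is summarized by just five numbers relative to the prior.
new_face = training_faces[0]
coeffs = encode(new_face)
approx = decode(coeffs)
```

The design choice this illustrates is the one Mr. Edwards’s bicycle analogy points at: rather than analyzing each image from scratch, the system only has to estimate a handful of coefficients against a model it already "knows."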

The first step has been using Image Metrics to allow live actors to animate virtual characters. Thus Kiefer Sutherland himself has been able to drive the performance of the animated version of his television character, Jack Bauer, in the computer game “24,” based on the hit show. Warner Brothers is using Image Metrics, along with several other companies, to animate a new character in the forthcoming “Harry Potter and the Order of the Phoenix,” a monstrous relation of Hagrid, animated by an actor.

Larry Kasanoff is the producer and director of “Foodfight!,” which will be the first full-length movie to use Image Metrics technology. Sitting in his Santa Monica production office, surrounded by plush toys of characters (who will be played by Charlie Sheen, Hilary Duff and Eva Longoria), he talked about the difference between image analysis and standard computer-generated imagery, or C.G.I.

In a C.G.I. film, he said, “every time someone would say something, banks of people would have to figure out how the lips move, how the eyes move — and it’s not even that good.”

“Now we don’t have to spend three years having people meticulously hand-animate Charlie Sheen’s lines,” he added. “He says, ‘Food fight!’ in real time, live action, and it’s applied, via Image Metrics technology, to the character.”

So whereas a film like “Cars” cost $120 million and took dozens of animators five years to make, Mr. Kasanoff says that “Foodfight!,” which has not yet begun production, will be finished by February.

And movies are just the beginning. “For creating characters that don’t exist, this is unparalleled at the moment,” said Alex Horton, the animation director for Rockstar Games, which has been using Image Metrics for two years in top-selling titles like “Grand Theft Auto: San Andreas” and “The Warriors.” Games, he explained, don’t require the level of detail that movies do, but they demand far more screen time than the average film.

“There’s no taking away the fact that a team of animators can sit and make some very convincing animation if they want to,” he said. “But I challenge anyone to do the volumes that I need in the time that I need, at this level of quality, and to capture the nuance of the voice actor.”

IT sometimes seems that every six months or so another technology comes along that promises to revolutionize Hollywood and supplant what came before. “Toy Story” gave C.G.I. characters an early sense of humanity. Great excitement accompanied Stuart Little and his remarkable fur. Another fanfare erupted over Gollum, the gnomelike hobbit played by Andy Serkis through computer magic in the “Lord of the Rings” movies. In recent years the focus has been on motion capture, for which actors are wired with tiny digital sensors. Lately yet another system has emerged, called Contour, that tracks actors’ facial and body movements by coating them with phosphorescent powder.

But Hollywood producers seem to agree that this is something truly different.

“It’s a giant leap from the motion capture technology used today,” said Sam Falconello, the chief operating officer of Cinergi Productions, which made the “Terminator” series and is considering using Image Metrics to make “Terminator 4.” “I really believe in this technology. It is scalable. It makes our effects budgets go further.”

It’s also far easier on the actors. Instead of being painted with a chemical or covered in sensors, they need only do what they would ordinarily do: act.

Mr. Kasanoff said that for comedy especially convenience was a central issue. “Try to get an actor to be funny and relaxed with 900 dots on his face,” he observed. “Now, when we direct the actors, they don’t even know the camera is there. They just act.”

Debbie Denise, a senior vice president at Sony Imageworks who tracks new technology developed both in house and elsewhere for the movie studio, said that her company’s motion capture technique has advanced to the point where it can credibly track subtle facial expressions. It is being used in the current production of “Beowulf,” a computer-generated version of the ancient tale directed by Robert Zemeckis, who directed “The Polar Express.”

But she agreed that the Image Metrics approach was “very promising.”

“It’s been a challenge for everyone in this field to get away from markers,” she said. “How can you just videotape somebody? The way they’re doing it is very interesting.”

As for reanimating former movie stars? “That sounds terrific,” said Chris deFaria, head of visual effects for Warner Brothers. “I’d love to see it.” But, he added, “There are real complexities involved with that.”

Undoubtedly so. But at least one former movie star thinks the idea holds some promise. Arnold Schwarzenegger, now the governor of California, has conducted tests with Image Metrics to use his Conan the Barbarian character in political ads.
