Deep fakes use a form of artificial intelligence to replace one person's face with another's. "Face swap" is the term Industrial Light & Magic prefers for the technique, the visual effects company's chief creative officer said at Fortune's Brainstorm A.I. summit on Monday.
Speaking on stage at the conference in San Francisco, Rob Bredow, also senior vice president of creative innovation at Lucasfilm, said the company tends to use the term “face swap” rather than “deep fakes” because “it’s not just a matter of running a deep algorithm…that’s one of the tools we can use for sure, but we are using kind of a wide variety of tools.”
Bredow said he expected machine learning and A.I. to be extremely transformative, but the reality has gone beyond that: the technology has completely changed ILM's workflow. A face swap that once took 10 artists six weeks can now, once the system is trained on the material, produce results in seconds across minutes of footage.
During the interview, a clip was shown of Mark Hamill portraying Luke Skywalker in The Mandalorian, only as a younger version of himself. Bredow explained that Hamill was not actually on camera; a stand-in performed the scene while Hamill voiced the performance.
The stand-in actor mimicked Hamill's movements and expressions, and it was the studio's job to turn the stand-in's face into Hamill's. They did so using machine learning technology and an in-house system built with Disney Research and Industrial Light & Magic's own engineers. But the output still wasn't good enough for the highest-quality film and streaming work, so artists went back in frame by frame with other techniques, sometimes adding more detail.
“We get to collaborate with filmmakers who hopefully want to make something that no one has ever seen before,” he said. “And that’s when they pick up the phone and call us….we get to be their visual storytelling partners. And that’s all sorts of different techniques…it’s everything from matte paintings on glass 30 years ago to the latest A.I. and machine learning techniques that we’re using today.”
The company is all about innovation, Bredow said. He shared a story about an artist online who posted a version of their work, “a very, very good version on top of ours,” he noted. Bredow’s reaction? “How can I get a hold of this guy?” he told the audience. And two weeks later, that artist started at ILM.
“When somebody does great work, that’s the kind of person you want to bring in,” he said.
Bredow also spoke about Abba: the famous Swedish band is touring again for the first time in roughly 40 years, with a bit of a twist. The performers onstage are digital recreations of the band members' younger selves.
“The illusion is you’re getting to go back to the 1970s and see Abba do a live concert,” he said. “And I mean, all of us were wondering as we were working on the show for three and a half years, is this going to work? Is this going to feel like somebody’s pressing play and we’re watching a screen, or is the audience going to be into it?”
The team was able to create a show that could never have been done without the same machine learning technologies he mentioned previously, and in the process, Bredow even became an Abba fan.
So what's next? The fifth Indiana Jones movie. The trailer dropped this month and reveals a look into the past with a younger depiction of actor Harrison Ford, made possible through A.I.
“Hopefully you can continue to see our focus on those finer details that make a performance of an actor playing, in this case, himself at a younger age as believable as possible,” Bredow said.