By John McCarthy, Opinion Editor

September 9, 2021 | 7 min read

Adland has long rubbed its hands at the prospect of deepfake technology helping to deliver its most ambitious creative ideas. Maria Chmir, chief executive and founder of Deepcake, explains how she did just that, helping a Russian telecoms giant deliver an ad starring a deepfaked Bruce Willis (with his blessing, of course).

MegaFon, working with the Instinct creative agency (part of BBDO), struck a world first in securing the rights to a Hollywood star’s image and producing his performance using a carefully selected understudy and deepfake technology.

Russian engineers from Deepcake, using movie magic, had Kazakh comedian Azamat Musagaliyev star alongside a digital, de-aged and deepfaked Bruce Willis.

But how did the team navigate the challenges – and is there a wider benefit to the technology the industry should quickly realize?

The work

Before anything else, the agency needs to acquire the image rights to the actor. Such rights are commonly pursued and bartered for now that technology – CGI, deepfake and otherwise – can replicate a star with minimal involvement. The same approach has been used to resurrect beloved stars in iconic roles (Peter Cushing in Rogue One: A Star Wars Story, for example). In this instance, Willis’s estate signed a contract enabling the use of his digital image, and that team had final sign-off on what was created.

Once the rights were secured, the real graft started.

Deepcake fed 34,000 images of Willis – in the right light, at the correct profile, with the desired emotions on his face – into a neural network to train it to build an accurate model of his face. Many of those images came from Die Hard and The Fifth Element: vintage Willis.
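
For the technically curious, the classic open-source approach to this training step pairs one shared encoder with a separate decoder per identity. Below is a minimal PyTorch sketch of that idea; Deepcake’s actual architecture is proprietary, so the layer sizes, toy tensors and training loop are illustrative assumptions only.

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap setup
# popularized by open-source deepfake tools. Not Deepcake's pipeline;
# all shapes and data below are toy stand-ins.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

# One encoder learns a shared "face space"; each identity gets its own decoder.
encoder = Encoder()
decoder_star = Decoder()     # would train on the archive stills of the star
decoder_double = Decoder()   # would train on footage of the understudy

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_star.parameters())
    + list(decoder_double.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

star_batch = torch.rand(8, 3, 64, 64)    # stand-ins for aligned face crops
double_batch = torch.rand(8, 3, 64, 64)

for step in range(100):  # real training runs for days, not a hundred steps
    opt.zero_grad()
    loss = loss_fn(decoder_star(encoder(star_batch)), star_batch) \
         + loss_fn(decoder_double(encoder(double_batch)), double_batch)
    loss.backward()
    opt.step()

# At render time the decoders are crossed: encode the understudy's face,
# then decode it with the star's decoder to produce the swapped face.
swapped = decoder_star(encoder(double_batch))
```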

Then 39 actors were tried in the role before Konstantin Solovyov, a professional with more than 100 roles to his name, was selected. A deepfake can produce a mask, not a performance; the acting talent has to be able to channel the star, which has its difficulties, as we’ve learned from the resurrections of Bruce Lee and Audrey Hepburn.

The team then fed the production through an AI filter to improve facial realism and skin-texture detail, adding a natural shine to the eyes. The teeth in particular often need extra attention. Each iteration could take three to five days to generate, but from storyboard to finished project the turnaround was about five weeks.
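
Deepcake hasn’t published that filter, but the slot it occupies in the pipeline – a per-frame enhancement pass over the rendered video – can be illustrated with a toy stand-in. The OpenCV unsharp-mask loop below is only that stand-in; the filenames and strength values are hypothetical.

```python
# Toy per-frame enhancement pass, assuming plain OpenCV. The production
# "AI filter" is a learned model; this unsharp mask merely shows where a
# detail-boosting step sits between render and delivery.
import cv2

cap = cv2.VideoCapture("render_raw.mp4")       # hypothetical input render
fps = cap.get(cv2.CAP_PROP_FPS)
size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
out = cv2.VideoWriter("render_enhanced.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Unsharp mask: subtract a blurred copy to exaggerate skin and teeth detail.
    blurred = cv2.GaussianBlur(frame, (0, 0), sigmaX=3)
    out.write(cv2.addWeighted(frame, 1.5, blurred, -0.5, 0))

cap.release()
out.release()
```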

The final product

Chmir says the whole team was excited to get the work into the world. And “fortunately, Bruce was satisfied”.

He could have vetoed the project at any moment. And, Chmir claims, many viewers believe Willis took the time to feature in person (she adds that it wasn’t the brand’s objective to mislead these viewers).

Willis said: “I liked the precision with which my character turned out. It’s a mini-movie in my usual action-comedy genre. For me, it is a great opportunity to go back in time.

“With the advent of modern technology, even when I was on another continent, I was able to communicate, work and participate in the filming. It’s a very new and interesting experience, and I thank our entire team.”

So will more stars pay attention to this project and lease out their rights to excited brands in distant corners of the planet they’d be unlikely to visit?

“Imagine what the budget would be if Bruce would fly to Russia to shoot,” she adds. Industry rumor prices Willis at $1m a day to star in a movie, so the saving can’t be far off that figure.

A trend?

We’ve still not seen a huge number of deepfake ad campaigns. That could be about to change.

Chmir is “confident that face-shifting will soon become commonplace in advertising and film projects”. She’s staked her entire business on that theory, after all.

For markets like Russia, it opens up new opportunities. For a start, any human captured in an image since the 1800s could theoretically be brought to life using these techniques.

Of course, the quality of the inputs greatly affects the quality of the final product – even VHS-quality footage can complicate matters, as the agency learned when it resurrected a 70s hero. “In that instance, we could see the pain in the eyes of the director, who had to adapt his idea to the limitations of technology.”

And the tricks of the trade are becoming ever more apparent.

Keep the lighting simple and clear, don’t partially obscure the face, and avoid having it jerk around or change profile needlessly. Generating the face from every angle demands hundreds of hours of processing. The engineers understand the limitations, but are excited by the potential too. “Our challenge is to overcome these barriers so that the production team has complete freedom for creativity,” promises Chmir.
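
Those rules of thumb can even be checked mechanically at the ingest stage. The sketch below, assuming OpenCV’s bundled face detector and made-up thresholds, screens raw footage for frames containing a single, clearly lit, roughly frontal face – the kind of material a face-swap pipeline prefers.

```python
# Screen raw footage for deepfake-friendly frames: one detectable frontal
# face, neither murky nor blown out. Thresholds and filename are guesses.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("raw_take.mp4")
usable = total = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) != 1:
        continue  # face missing, turned away, obscured, or multiple faces
    x, y, w, h = faces[0]
    if 60 < gray[y:y+h, x:x+w].mean() < 200:  # reject bad lighting
        usable += 1

cap.release()
print(f"{usable}/{total} frames look trainable")
```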

She also sees the potential of using the technology in force majeure situations, such as an actor’s sudden illness or an accident on set.

This campaign de-aged Willis significantly, inspiring Chmir to wonder if some actors will favor artificial intelligence filters over the cosmetic scalpel.

There are more applications. Imagine the pilot of a series has already been shot, but the producer recasts the main character. Head replacement can save the situation inexpensively. In Zack Snyder’s Army of the Dead, Tig Notaro was green-screened in at great expense to replace comedian Chris D’Elia. Deepfake technology could theoretically reduce those costs to a simple head swap.

And finally, everyone knows that Tom Cruise likes to film his own stunts. Insurance liability is a huge reason why he runs his own production company, as some studios wouldn't take on his risk. Chmir says: “Tom Cruise is insured for $30m a movie. If an understudy (although I understand that Tom is a risk taker) with an exact facial AI-replica of the actor will do all the stunts instead, the producers can save a great deal of money.”

She doesn’t believe that deepfakes are a “fad that will pass... it is an effective tool to optimize content production, which will evolve and become firmly in the hands of professionals”.

Why brands should consider generative tech

What Chmir calls generative tech creates “completely new opportunities for directors, casting managers and producers”.

It’s “cheaper, faster and often more realistic” to deliver a project than using CGI – especially when bringing a human character (back) to life. “Even the best SFX can’t always manage this. In the beautiful The Irishman, the senior actors were de-aged in many scenes but they often still had the bodies and mannerisms of older people.”

There are practical applications too. Pepsi used the tech to have football star Lionel Messi greet fans in dozens of languages. Expect to see the technology used to lip-sync actors’ mouths to translated audio. “This is the age of the new, automatic content localization without translators and understudies. With full lip-and-speech synchronization.”
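
As a sketch of how such a localization chain might hang together – translate the line, synthesize dubbed audio, then re-render the mouth to match – consider the outline below. The translation and text-to-speech helpers are hypothetical placeholders, and the final step invokes the open-source Wav2Lip project’s command-line interface as one publicly available lip-sync option, not Deepcake’s own tooling.

```python
# Hypothetical localization pipeline: machine translation -> voice-cloned
# TTS -> lip re-sync. Only the Wav2Lip CLI at the end refers to a real,
# public project; everything else is a labeled stub.
import subprocess

def translate(line: str, target_lang: str) -> str:
    raise NotImplementedError("plug in any machine-translation service")

def synthesize_speech(text: str, voice: str, out_wav: str) -> None:
    raise NotImplementedError("plug in any (voice-cloning) TTS service")

def localize(video_in: str, line: str, lang: str, video_out: str) -> None:
    dubbed = translate(line, lang)
    synthesize_speech(dubbed, voice="star_voice_clone", out_wav="dub.wav")
    # Wav2Lip re-renders the mouth region to match the new audio track.
    subprocess.run([
        "python", "inference.py",
        "--checkpoint_path", "checkpoints/wav2lip_gan.pth",
        "--face", video_in,
        "--audio", "dub.wav",
        "--outfile", video_out,
    ], check=True)
```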

One day, Russians may get to enjoy Willis’s “Yippee-ki-yay, motherfucker!” seamlessly synced in their mother tongue thanks to this technology.
