There’s a video of former President Barack Obama calling Donald Trump “a total and complete dipshit” that has more than 9 million views on YouTube. Coming from the mind of Nope director Jordan Peele, it’s one of the most-viewed deepfake videos on the internet.
Peele created the faux public service announcement with deepfake technology, which uses artificial intelligence to recreate a person’s face and voice and make it look like they’re doing or saying something they haven’t. The technology is most commonly used to target women: 96% of deepfake videos on the internet mapped the faces of female celebrities onto pornographic scenes, according to a 2019 report from deepfake-detection firm Deeptrace.
In late January, non-consensual deepfake porn hit the creator community when news broke that Twitch streamer Brandon Ewing had paid for deepfaked images and videos of female Twitch streamers and creators. While that content has since been removed, its existence revealed that unethical deepfakes aren’t limited to Hollywood superstars; content creators on the internet are just as susceptible.
As deepfake technology advances and lets untrained users create realistic imagery, it threatens creators’ livelihoods, leaving brands to navigate the misuse of creator likenesses. Those familiar with pairing brands and talent say that marketers might avoid working with creators mired in controversy, even if it’s through no fault of their own.
“People can be judged on the way that their likenesses are used even if it’s not within their control,” said C. Riana Manuel-Peña, director of brand and unified marketing at Zebra Partners. “As someone who evaluates talent for projects and brand deals, we look at everything that could potentially come from partnership with a person. Unfortunately, when your brand is no longer within your control, that may affect future possibilities.”
It may affect current partnerships, too, as contracts generally lack standardized clauses or practices around deepfakes. Deepfaked Twitch streamers have worked with brands including Spotify, Dunkin’ and HyperX. While none of those brands have made statements about the incident, contracts do contain clauses allowing brands to back out of agreements should an unforeseen event complicate the relationship.
Creators are responsible for pursuing deepfake protection, Manuel-Peña said, adding that she expects standard contingencies for involuntary deepfaking to arrive soon, especially as the technology becomes more realistic and easier to use.
For now, creator marketing agencies handle likeness and brand safety issues on a case-by-case basis. Because creators with large followings command higher paychecks and face a higher risk of being deepfaked, budgeting for these issues adds an extra challenge, especially if a brand decides to terminate the relationship.
“In an ideal situation, we would compensate that creator for any work done and then hire an alternate,” said Danielle Wiley, founder and CEO of Sway Group. “When you get into people who are getting paid five or six figures, that becomes not doable. I don’t have an answer off the top of my head other than saying that would really, really suck.”
For now, the power lies with brands to decide if they want to continue to partner with a creator who’s been deepfaked and risk their image by association, especially if the deepfake is realistic enough to fool consumers.
But brands needn’t rush to end creator partnerships prematurely, as clauses to pause a deal and discuss next steps together benefit all parties involved, said Jess Phillips, CEO of the Social Standard. Once paused, PR teams can strategize how to communicate a brand’s decision to the creator and to the public.
“It’s a PR spin of saying ‘we looked into it, here’s what happened, we’re going to continue our support and partnership with this creator,’” she said.
Manuel-Peña agreed that communicators will have an easier time siding with whoever has done no wrong, but added that teams may publicly end relationships, even if it’s to protect the creator from further attention.
“Depending on what the instance of falsified image is, it’s really a tricky situation because it could bring them even more heat,” she said. “That would not be good for either party.”
There is a lighter side to the issue. With creator consent, deepfake technology has the potential to create fun parodies and comedic campaigns that suit platforms like TikTok, Phillips said. It’s also been used to revive iconic characters in franchises whose actors have died.
But given how often the technology is used to place women in pornography without their consent, the creator marketing industry will need to find solutions to the deepfake problem quickly, experts say.
“We know how quickly technology moves,” Manuel-Peña said. “We know how quickly people use it for harm rather than good, and it’s something that everyone should be prepared for yesterday because it’s clearly happening now.”