In 1997, legendary performer Fred Astaire danced with a Dirt Devil in a commercial for the vacuum company. By the end of the 20th century, celebrity endorsements were certainly not unusual, except for the rather important detail that Astaire had been dead for 10 years at that point. Unsurprisingly then, reaction was mixed, bordering on negative. But a precedent had been set, so by the time Peter Cushing was digitally inserted into 2016’s Rogue One, the resulting controversy was barely a blip. A year later, though, we learned about deepfakes.
A (Very) Simple Guide to Deepfakes
A deepfake is a video that’s been created using deep learning–that’s the “deep” part–plus algorithms and possibly magic. I don’t know–I’m not a scientist. The end result, though, is a counterfeit. It’s like face-swapping, except you don’t even have to use the whole face. Buzzfeed’s Jordan Peele/President Obama video, for instance, just switched out Obama’s mouth.
Deepfakes are not seamless by any measure. If you’ve ever used a photo filter, then you can understand. The doggy filter looks great straight-on, but if you turn your head, the effect is broken. Deepfake videos often have the same issue. The transitions are glitchy. The faces don’t match exactly. They can have the Uncanny Mariana Trench effect of the new Cats trailer. But they’re getting better, because they’re learning how to.
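For the slightly more scientifically inclined: the usual face-swap trick is an autoencoder with one shared encoder and a separate decoder per person. Train on both people’s faces, then run person A’s face through person B’s decoder and, voilà, counterfeit. The numpy sketch below is a toy linear version of that idea–the sizes, the random “faces,” and every variable name here are invented for illustration, not how any real deepfake app works.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 16, 4  # hypothetical sizes: 16-number "faces", 4-number encoding

# Two toy "datasets": flattened faces of person A and person B.
faces_a = rng.normal(size=(64, DIM))
faces_b = rng.normal(size=(64, DIM)) + 1.0  # B's faces live in a shifted region

# One SHARED encoder, one decoder PER identity -- the core face-swap setup.
enc = rng.normal(scale=0.1, size=(DIM, LATENT))
dec_a = rng.normal(scale=0.1, size=(LATENT, DIM))
dec_b = rng.normal(scale=0.1, size=(LATENT, DIM))

def mse(x, enc, dec):
    """Reconstruction error: how badly encode-then-decode mangles x."""
    return np.mean((x @ enc @ dec - x) ** 2)

def train_step(x, enc, dec, lr=0.01):
    """One gradient step shrinking ||x @ enc @ dec - x||^2 (in place)."""
    z = x @ enc                    # encode
    err = z @ dec - x              # decode, compare to the original
    grad_dec = z.T @ err / len(x)
    grad_enc = x.T @ (err @ dec.T) / len(x)
    dec -= lr * grad_dec
    enc -= lr * grad_enc

init_a, init_b = mse(faces_a, enc, dec_a), mse(faces_b, enc, dec_b)

# "They're learning how to": alternate training so the encoder sees both faces.
for _ in range(2000):
    train_step(faces_a, enc, dec_a)
    train_step(faces_b, enc, dec_b)

loss_a, loss_b = mse(faces_a, enc, dec_a), mse(faces_b, enc, dec_b)

# The swap itself: encode one of A's faces, decode it with B's decoder.
fake = faces_a[:1] @ enc @ dec_b
print(f"A: {init_a:.2f} -> {loss_a:.2f}, B: {init_b:.2f} -> {loss_b:.2f}")
```

Real tools use deep convolutional networks and thousands of video frames instead of linear algebra on random numbers, but the swap step–encode with the shared encoder, decode with the other person’s decoder–is the same idea.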
The Peele-Obama video, made with a free app, took 56 hours of refining before it was ready for its close-up. However, as technology gets better, like it always does, it’ll get faster and easier. Soon, we may not be able to tell the difference between what’s real and what’s surreal. And that could have big consequences for media of all kinds.
The Rise of Unreality
After all, the first deepfakes to become well-known were manipulations of porn. A Reddit user–whose username, “deepfakes,” gave the phenomenon its name–posted videos in which mainstream actresses’ faces had been superimposed over the faces of porn actresses. Eventually Reddit banned the user under its policy against involuntary pornography. But you can’t unring a bell. Anyone can now make a deepfake.
This would seem to have the most serious consequences for politicians and celebrities, whose images can be altered to show them saying or doing anything. However, they’re not the only ones whose images could be at risk. It’s easy to imagine, for example, how deepfakes could become a new weapon for your average bully. That may be the worst-case scenario, but there are less dire effects we can imagine as well.
Artists as Altered Carbon
For instance, music shows, like Coachella and the Billboard Music Awards, have used holograms and similar effects to spice up performances in recent years. Black Mirror’s season five episode, “Rachel, Jack and Ashley Too,” imagines a world in which that’s the norm. After all, why put up with a temperamental pop star when you can just replace her with a computer program?
And it isn’t just for extreme attitude adjustments. It’s easy, for example, to picture a future in which scheduling issues never prevent actors from appearing in any films they choose. As a result, one or two actors could literally, virtually dominate the industry. We could go off on a tangent about the unfairness of that–and it would be unfair–but the bigger problem is the devastating impact it would have on creativity. Think, for instance, of all the performances we’d never see, the breakout actors who’d never get their chance to break out.
Return of the Living Dead Performers
Ultimately, if the use of deepfakes became widespread in TV and movies, the logical conclusion would be the return of the Fred Astaire or Peter Cushing issue. That is to say, even death couldn’t prevent a comeback. That might not seem so unpleasant at first–we’re all going to miss those Stan Lee cameos–but it gets a bit gross when you think about consent. If deepfakes become sophisticated enough, then living people could make provisions for the post-mortem use of their images. But what about the people who are already dead?
This question–or at least, the ethics of it–has come up before. So far, the answer seems to be, “We dunno.” And I don’t know, either. If a public figure wants to make provisions for the future use of their image, then I believe it’s their prerogative. But I also believe there’s something distasteful about using someone’s image after their death. For a movie or other artistic work, there’s a gray area, a place for discussion. But for hawking a vacuum or a chocolate bar? It’s gross.
The Future Legality of Deepfakes
Because there is that gray area, though, we’re now seeing talk of legislation. Once again this year, a bill was introduced in the New York state legislature regarding the “right of publicity.” While the bill never got to the voting stage–due in part to objections over its chilling effect on free speech–it did include some language that could apply to deepfakes. Had it passed, it would have banned the use of “digital replicas” in sexually explicit contexts.
It would have also made a person’s image their property and thus, transferable. So again, while that’s seemingly okay if that’s what someone wants, what if they don’t? If your image is transferable, then you can lose it. That means that if a celebrity declared bankruptcy or got divorced, they could lose the rights to their own identity.
If that seems like something I’ve conjured out of a Black Mirror nightmare, then I should assure y’all it’s not. It’s already part of our world. Back in 2006, for example, Ron Goldman’s family sued for and won control of the rights to OJ Simpson’s purported memoir If I Did It. The Goldman family won a civil judgment against Simpson for wrongful death, so the book proceeds served as partial payment for the millions Simpson still owes the Goldmans.
However, the Goldmans were less successful in another legal venture–obtaining Simpson’s publicity rights. The judge in that case denied the request, citing an earlier opinion that basically said making a man into a product who would “serve the economic needs and interest of others” is a little too close to “involuntary servitude.”
Essentially, even if a public figure is a terrible person, they deserve the right to control their own image. But how much longer will that last? With deepfakes, the answer might be: not much.
Salomé Gonstad is a freelance writer who grew up in the swampy wilds of south Alabama. She now splits her time between the Appalachian wilds (of Alabama) and the considerably more refined streets of New York City. When she's not yelling about pop culture on the internet, she's working on a supernatural thriller about her hometown. Also, we're pretty sure she's a werewolf.