Deepfakes Are Coming for You (Even if You’re Not Rich and Famous)

ImageShield
Feb 13, 2021

By Rachel Chalsma

Looking back, it was probably the 1960s when photography lost its innocence. Photos have been manipulated since the medium's birth in the middle of the 19th century, but it was during the decade of flower power and campus sit-ins that photo manipulation really took off, with the airbrushing of photos of political figures and celebrities a common practice. By today's standards, those efforts were downright primitive and unconvincing. (Look at an airbrushed mantelpiece photo in any old movie for proof.) Sophisticated editing software and the high-resolution cameras on our smartphones have produced a massive leap forward (if that's the right word) in the photorealism and believability of manipulated content compared with a half-century ago.

The terms “manipulated media” and “deepfake” are sometimes used interchangeably when we talk about manipulated imagery, but they are not the same thing. A deepfake is typically a doctored video created using some form of artificial-intelligence software, which can make someone appear to do or say anything (Cook, 2019). Manipulated media, by contrast, usually refers to edited still images, often in the service of political propaganda. Understanding the distinction is helpful as we look at how both have evolved over the years.

Historically, we can see how editing tools progressed from the hands of glossy magazines to those of the average Facebook user today. In August 1989, Oprah Winfrey graced the cover of TV Guide. It was only partly Oprah, though: the magazine had placed her head onto the body of actress Ann-Margret, taken from a 1979 photo. The composite was created without permission from either woman. In June 1994, a doctored photo of O.J. Simpson appeared on the cover of Time magazine shortly after his arrest. The photo had been edited to make him appear “darker” and more “menacing.” Notorious edits continued into the new millennium, from the cigarette being removed from Paul McCartney's hand on the Abbey Road album cover, to the photoshopping of Sarah Palin's head onto the body of an overweight, gun-toting woman in an American flag bikini in 2008 (Farid, 2010).

We may look at incidents like these and think that kind of thing comes with the territory for celebrities; however, with recent advances in tools and technology, it's no longer just the rich and famous who are being victimized by the theft or manipulation of their images.

As the technologies advance in power and ease of use, even semi-sophisticated users now have the ability to alter the content and message of entire videos. Targets of this kind of media manipulation are still more likely to be celebrities than you and me, but every day we see more cases like that of Kate from Texas. Kate was a victim of revenge porn: one day at work, a colleague showed her a deepfaked porn video featuring her face on someone else's body (Cook, 2019). The video is still on the web today.

Kate is by no means the only non-famous woman who has been victimized by these new technologies. For example, last year more than 100,000 women worldwide had their photos stolen and turned into publicly available nude images by an artificial-intelligence bot.

Increasingly, malicious actors are going to use tools that grow more powerful every day to wreak havoc on the lives of ordinary people, as well as the rich and famous. We're creating ImageShield because we think that the right to protect one's images from abuse or unlawful use is a basic online right. ImageShield will provide best-available technical protection for photos (and later for videos), and monitor where and how those images are being used across the web and social media. And it will be free.

Learn more about ImageShield at http://imageshield.net.
