As we learned back in the late aughts when Apple first had to explain the concept of apps to us in the “There’s an app for that” ad campaign, apps make doing things easier. This is great when it comes to streamlining stuff like banking and finding an apartment, less so when that same spirit of convenience via technology is applied to activities like, say, making deepfake porn.
Unfortunately, a new, unnamed app reportedly does exactly that. According to MIT Tech Review, the app, whose platform and operating system were not disclosed, was recently discovered by deepfake researcher Henry Ajder, and it is exactly the kind of user-friendly deepfake generator researchers in his field have been dreading. So far, the app exists in "relative obscurity," and MIT Tech Review refers to the platform only as "Y," presumably in an attempt to avoid inadvertently launching it into the mainstream.
For the uninitiated, “deepfake porn” is a form of AI-generated synthetic media used to create pornographic representations of real people. Celebrities are often the victims of deepfake porn, but anyone’s likeness could end up getting the deepfake treatment. Naturally, women are the most frequent victims of this technology. Per MIT Tech Review, research company Sensity AI estimates between 90 and 95 percent of all online deepfake videos are nonconsensual porn, and around 90 percent of those feature women. Becoming the victim of deepfake porn can be traumatic and violating, and deepfakes can also be used as a form of revenge porn leveraged to blackmail and humiliate victims.
In the past, employing deepfake technology in this manner required some degree of technical skill, but the new app known as Y reportedly makes it easier than ever, rendering harmful deepfake technology accessible to any average Joe with a smartphone or computer. Per MIT Tech Review:
Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. A user can then select any video to generate a preview of the face-swapped result within seconds—and pay to download the full version.
The results aren’t perfect, but activists and researchers say even the least-realistic deepfakes can still wreak havoc on victims. “This kind of abuse—where people misrepresent your identity, name, reputation, and alter it in such violating ways—shatters you to the core,” activist Noelle Martin told the outlet. “It affects your interpersonal relations; it affects you with getting jobs. Every single job interview you ever go for, this might be brought up.”
Ajder is hoping to get the app pushed offline (the site put up a notice on its homepage in August saying it's no longer available to new users), but he fears that now that the technology has become so easily accessible, trying to banish these platforms will essentially become a game of dark-web whack-a-mole. Still, he believes pushing these platforms underground is better than allowing them to enter the mainstream: "Even if it gets driven underground, at least it puts that out of the eyes of everyday people."
Thanks for reading InsideHook.