Non-consensual deepfake porn, AI-manipulated content that most often features prominent female celebrities and public figures, remains a persistent problem, and almost no one is doing anything about it.
The non-consensual videos have only grown in popularity and mainstream accessibility since first emerging on Reddit in 2018. According to new analysis from deepfake detection company Sensity, reported by Wired, up to 1,000 deepfake videos are uploaded to leading porn sites every month, where they rack up millions of views. The numbers show that deepfake content continued to grow in popularity throughout 2020, increasingly expanding beyond its origins in darker corners of the internet and into mainstream porn.
Unfortunately, while the problem keeps growing, efforts to combat it are not keeping pace.
“The attitude of these websites is that they don’t really consider this a problem,” Giorgio Patrini, CEO and chief scientist at Sensity, told Wired. “Until there is a strong reason for them [porn websites] to try to take them down and to filter them, I strongly believe nothing is going to happen,” Patrini added. “People will still be free to upload this type of material without any consequences to these websites that are viewed by hundreds of millions of people.”
As advances in artificial intelligence technology make it easier to create high-quality, inexpensive deepfakes, experts fear the problem will only continue to grow, eventually targeting new kinds of victims.
“What this shows is the looming problem that is going to come for non-celebrities,” Clare McGlynn, a professor at Durham Law School, told Wired. “This is a serious issue for celebrities and others in the public eye. But my long-standing concern, speaking to individual survivors who are non-celebrities, is the risk of this is what is coming down the line.”