Forget Fake News, Deepfake Videos Are Really All About Non-Consensual Porn

Forbes wrote:


New research reveals the real problem with deepfake videos goes way beyond fake news


The supposed threat to fair elections, in particular, the 2020 U.S. presidential race, from “deepfakes” is getting a lot of attention online. Yet fake news is not where producers of deepfake video are putting their efforts. The overwhelming majority of deepfake video clips are, according to new research, pornographic. Unsurprisingly, then, this is about as big a privacy issue as you can get.

The shocking truth about deepfake videos

California has just passed a law, Assembly Bill 730, which criminalizes the creation and distribution of video content (as well as still images and audio) that is faked to pass as genuine footage of politicians. AB730 makes it a crime to do this within 60 days of an election, with an exception for news media. I can’t help but think this is overkill for a problem that doesn’t yet exist to any great extent.

Deepfakes, which is what we are talking about here, are painted as one of the biggest “fake news” threats to fair and democratic elections. A deepfake is, put simply (though the technology required is anything but simple), the manipulation of genuine video footage by “AI software” either to make someone appear to say something they did not, or to place them in video footage they were not originally a part of.

I have no doubt that there will be people using the technology for such political purposes but, even as the 2020 U.S. election draws ever closer, these will remain but a tiny percentage of the deepfake videos produced. New research would appear to back up my thinking here: Deeptrace researchers found that the number of deepfake videos online had all but doubled in seven months (14,698, up from 7,964 in December 2018), but that surge was driven primarily by porn. And that presses both my gender equality and privacy buttons.

Indeed, “The State of Deepfakes” report, newly published by Deeptrace, reveals that some 96% of those deepfake videos online can be categorized as “non-consensual” pornography: existing porn videos manipulated to appear to feature female celebrities. The top four deepfake porn sites were found to have had more than 134 million views of these videos, which exclusively target women; some 99% of the deepfake videos targeted female celebrities from the acting and music professions. Without revealing the names of the victims, Deeptrace researchers reported that the top ten individuals most frequently targeted included three American actresses, three South Korean musicians and two British actresses.

The deepfake assault on a woman’s right to privacy

Henry Ajder, head of research analysis at Deeptrace, told the BBC that “a lot of people are forgetting that deepfake pornography is a very real, very current phenomenon that is harming a lot of women.” And I, for one, agree. It is this weaponization against women, this patriarchal demeaning of an entire gender by the seamless insertion of their faces into pornographic video footage, that is the real scandal here; not the potential to influence voters who are already exposed to so many other avenues of less than honest persuasion.

This is a privacy issue that goes just about as far down the road of literally stripping a person bare as can be imagined. Deepfakes have a genuine impact on women whose privacy has been abused. The report quotes Danielle Citron, professor of law at Boston University, as stating that deepfakes are “terrifying, embarrassing, demeaning and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe.” While the leaking of information from government databases, sometimes affecting the entire population of a country, is bad enough from the privacy perspective, the personal impact is still likely less traumatic than being the subject of a deepfake porn video.

Giorgio Patrini, founder, CEO, and chief scientist at Deeptrace, said: “We hope this report stimulates further discussion on this topic and emphasizes the importance of developing a range of countermeasures to protect individuals and organizations from the harmful applications of deepfakes.” I would hope that this includes enforcement of existing privacy legislation, and even the passing of new laws if necessary, to have deepfake porn taken offline.

What about shallowfakes?

The truth is, I suspect that political “shallowfakes” will rule the 2020 election fake news roost. By this I don’t mean things such as the ZAO face-swapping app, but rather the kind evidenced by the manipulated Nancy Pelosi audio in a video of a speech shared on May 23.

Synthetic media has also been linked to the cybersecurity threatscape: among the examples cited in the Deeptrace report is a LinkedIn account, quickly deactivated, thought to have been used as part of a nation-state espionage operation.

“This technology isn’t going away, so we need to prepare for a world where deepfakes are commonplace,” said Paul Bischoff, a privacy advocate at Comparitech.com. Suggesting it would be unwise as well as impractical to make deepfake software illegal (despite California having done just that, albeit in a time-limited fashion), Bischoff said: “What we need to do is build awareness, good judgment, and a healthy sense of skepticism.”
