
US Law to Require Take-down of Nonconsensual Porn, Including Deepfakes

The US Congress has passed a bill that would attack the public distribution of revenge porn and deepfake porn through both civil compliance measures and criminal penalties.  If signed by the president, the bill would take effect as federal law one year from signature.  The bill, whose acronymic title is the “TAKE IT DOWN” Act, passed the House in February and the Senate in April, both by overwhelming votes.  

The bill applies to the posting of so-called “NCII” – non-consensual intimate images – regardless of whether they are real or forged.  It requires that companies such as social media platforms take down NCII upon request, much as they already do with copyright infringement under existing law.  The president, whose White House supported the bill, is expected to sign it.  

WHY IT MATTERS

The rise of “revenge porn” – posting nude photos or videos of someone, often an ex, without that person's consent – over the last decade has been difficult to manage through legal means.  Celebrities, too, have been targeted by hackers who find and post their nude photos.  Once something is posted, it tends to live forever in the digital world.  Flagging an image as a violation of a platform's “terms of use” may or may not get it removed, since the operator decides what violates its terms.  Further, the patchwork of laws governing the unauthorized posting of someone's photo varies from state to state and must be enforced state by state, through individual lawsuits.  The rise of AI-generated porn and nude images makes the problem immeasurably worse: if the image is not an actual photo of a person, there may be no law that squarely addresses it.  

This law would change much of that in one fell swoop: it requires, operationally, that platforms have a mechanism for handling NCII take-down requests.  Because it carries potential criminal penalties for those who post NCII, it may deter the creation and posting of NCII in the first place.  Particularly for images involving minors (the high school ex, for example), that deterrence can be valuable.  Finally, because victims can require – not merely request – the removal of NCII depicting them, whether the image is a real photo or not, they gain a measure of power over the harmful behavior of others.  


“TAKE IT DOWN” is a backronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks. Though both the short title and the full title describe goals of the bill – a right to removal and a focus on deepfakes, respectively – neither captures its full scope. The bill includes both criminal and civil elements. On the criminal side, it makes it illegal to “knowingly publish” authentic or synthetic nonconsensual intimate images. On the civil side, it requires covered platforms to remove flagged NCII within 48 hours of receiving a valid request.
