Earlier this year, Representative Alexandria Ocasio-Cortez introduced the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act to enable victims of “digital forgery” to sue deepfake creators.
Now, research by The American Sunlight Project (ASP) has demonstrated just how close to home the issue really is for at least 25 Congresswomen who have been the subjects of nonconsensual intimate imagery.
The ASP research identified more than 35,000 mentions of 26 members of Congress on deepfake pornography sites.
Reflecting a wider pattern of nonconsensual deepfake porn overwhelmingly targeting women, only one of the victims identified was male. The study also found that younger lawmakers were more likely to be victims.
Commenting on the recent surge in deepfake pornography, ASP said the “weaponization” of generative AI affects women and girls from all walks of life.
“For every headline-grabbing incident, thousands more—including underage girls—suffer in silence, enduring profound and often invisible trauma,” the organization stated.
ASP’s findings underscore the importance of legislation like the DEFIANCE Act that can be used to hold the perpetrators of deepfake abuse accountable.
Under the act, victims will be able to pursue civil remedies against those who produce or possess nonconsensual deepfake porn with the intent to distribute it.
Anyone found liable could be ordered to pay up to $150,000 in damages, or up to $250,000 if the incident was also connected to sexual assault, stalking, or harassment.
The bipartisan bill passed the Senate unanimously in July. It must now pass the House before the president can sign it into law.
Outside of the U.S., other governments are also taking steps to crack down on deepfake abuse and prosecute those who are responsible for it.
In the U.K., the previous government initially moved to create a new offense with potentially unlimited fines and the prospect of jail time for the worst offenders.
However, this summer’s surprise general election interrupted legislative progress, and the cause has yet to be picked up by the new government.
Meanwhile, public outrage over deepfake pornography in South Korea has prompted the government to investigate Telegram over its role in distribution.
While other areas of AI regulation have been more contentious, there is broad support for legislation targeting deepfake sexual abuse from across the political spectrum.