
UK Lawmaker Introduces Bill Targeting Deepfake Sexual Abuse — ‘No Woman Is Safe’

Published September 6, 2024 2:04 PM
By James Morales
Verified by Samantha Dunn

Key Takeaways

  • Baroness Owen is set to introduce a new Bill in parliament to criminalize the creation and solicitation of non-consensual deepfake pornography. 
  • Owen argued that loopholes in existing laws make it difficult to prosecute the perpetrators of deepfake abuse.
  • A growing number of countries have adopted specific legislation to deal with the problem.

UK peer Baroness Owen is set to introduce a new Bill in parliament to outlaw the creation and solicitation of non-consensual deepfake pornography.

In a series of publications and media appearances, Owen argued that loopholes in existing legislation have allowed perpetrators to get away with deepfake sexual abuse that overwhelmingly targets women.

Existing Laws Insufficient

In an article in the Daily Mail, Owen cited the case of a woman whose photos were “nudified” with AI, describing how the victim struggled to bring the perpetrator to justice.

Ultimately, the person who created the deepfakes was charged with sending offensive messages under the Communications Act. However, as Owen argued, this only highlights the need for new legislation.

“Laws relating to the making of obscene images are woefully inadequate,” she said. For example, although the Online Safety Act 2023 made it illegal to share intimate images online without consent, “it left glaring omissions” that have allowed sexually explicit deepfakes to fall through the gaps.

International Crackdown

Baroness Owen’s proposed Bill reflects an emerging international consensus that dedicated legislation is needed to tackle deepfake abuse.

An EU directive first proposed in 2022 requires member states to prohibit the “non-consensual sharing of intimate or manipulated material.” As stated in Article 7(b) of the directive on combating violence against women and domestic violence, images and videos that “[make] it appear as though another person is engaged in sexual activities without that person’s consent” must be punishable as criminal offences.

Meanwhile, in the US, the Senate recently approved the DEFIANCE Act, which allows victims to sue the creators of non-consensual deepfakes as well as those who distribute or receive manipulated images.

In a Channel 4 News interview, Owen acknowledged that while criminalizing deepfake sexual abuse represents an important first step, technology providers also have a role to play. “Big Tech does need to take responsibility,” she stated, reflecting rising criticism of platforms viewed as complicit in the trend.

Platforms Come Under Scrutiny

Amid growing concerns about deepfake sexual abuse, the platforms used to create and distribute abusive content are increasingly coming under scrutiny.

In South Korea, Telegram has been blamed for a surge of explicit deepfakes in schools, prompting police in the country to investigate the messaging app.

Professor Clare McGlynn, whose research focuses on online abuse against women, told CCN that companies like Meta, Google, and Microsoft have also failed to prevent malicious deepfakes.

“Meta could remove [ads for deepfake services] and do so much more to remove and prevent deepfakes,” she stated. Meanwhile, Google could proactively “downrank or delist” dedicated deepfake websites and apps from its platforms.

“The tech companies that own the image generators need to do more to stop them being used to generate deepfake porn,” she added.
