
UK’s Labour Party Eyes Ban on ‘Nudification’ AI Tools Amid Deepfake Concerns

Last Updated March 18, 2024 3:27 PM
James Morales

Key Takeaways

  • A Labour Together policy paper has proposed banning so-called nudification apps.
  • The paper also recommended other measures to help crack down on harmful deepfakes.
  • Nonconsensual nudified images are illegal in the UK, but tackling the issue remains a challenge.

So-called “nudify” apps that use AI to undress input photos have created an epidemic of non-consensual sexually explicit deepfakes, which are illegal in many countries. Given this tendency for abuse, politicians in the UK are considering an outright ban on AI nudification tools.

As discussed in a recent policy paper, if the Labour Party wins the next election, it may introduce a general prohibition on nudify apps, as well as stricter rules for AI developers to ensure their image- and video-generation engines aren’t used to make harmful deepfakes.

Labour Policy Paper Recommends Nudify Ban

The policy paper, produced by the Labour Together think tank, proposes a general ban on dedicated nudification tools that let users generate explicit images depicting real people.

Labour Together’s proposals recognize that the majority of nudified images are created without their subjects’ consent and are therefore illegal in the UK.

But the paper doesn’t just target app users. It also recommends introducing new obligations for AI developers to help prevent general-purpose computer vision models from being abused. In addition, it proposes measures web hosting companies could be required to take to help ensure they aren’t involved in the production or distribution of harmful deepfakes.

Deepfakes in UK Law

Under the UK’s Online Safety Act, which received royal assent in October, the communications regulator Ofcom is responsible for penalizing digital platform operators who don’t do enough to protect their users.

In its new role as the UK’s online safety watchdog, the regulator is empowered to fine video platforms up to 5% of their relevant revenue. 

While detailed guidelines aren’t expected to be published until next year, Ofcom is expected to crack down on social media companies that don’t meet their safeguarding responsibilities, as well as firms that enable websites hosting deepfake porn.

Tech Firms Implicated in Deepfake Crackdown

By creating obligations for social media platforms and porn sites, the Online Safety Act aims to prevent the distribution of illegal content. But Labour’s latest policy paper suggests the party could apply further pressure to technology firms.

In an email to CCN, Professor Clare McGlynn, a legal scholar researching online sexual abuse, highlighted that large technology companies have an important role to play in cracking down on nudify apps and illegal deepfakes.

“Google needs to stop high-ranking dedicated deepfake websites and apps,” she stressed. But at the same time, “the tech companies that own the image generators need to do more to stop them being used to generate deepfake porn.”
