
Deepfake Sharing is Illegal in the UK, but Creating Them is Not

By James Morales

Key Takeaways

  • The UK’s Online Safety Act makes sharing or threatening to share nonconsensual explicit deepfakes illegal.
  • However, simply creating indecent deepfake images or videos isn’t currently a crime.
  • A growing number of voices are now calling for tougher rules to prevent deepfake sexual abuse.

The emergence of easy-to-use AI tools that can generate believable images, voice and video has thrown up a host of uncertainties for the world’s legal systems.

In the UK, the first law to seriously address the issue was the 2023 Online Safety Act, which outlawed the distribution of nonconsensual deepfake pornography. But now, some lawmakers want to go further.

What is the UK Law on Deepfakes?

Section 66B, which the Online Safety Act inserted into the Sexual Offences Act 2003, prohibits sharing a film or photograph “which shows, or appears to show, another person in an intimate state.” It also makes threatening to share nonconsensual explicit content a crime.

However, although the law criminalizes sharing and threatening to share deepfake porn, it stops short of banning the creation of such images or videos.

Looking to close the loophole, the opposition Labour Party recently tabled an amendment to the Criminal Justice Bill that would make creating such deepfakes illegal too.

Labour Party Calls for Stricter Deepfake Rules

“Making deepfake intimate images and videos is an appalling violation of somebody’s autonomy and privacy and it should not be tolerated,” shadow home secretary Yvette Cooper commented.

Warning about the negative effect of deepfake pornography on women and girls, she said the government must deliver “a clear and unambiguous message that such activity is harmful and it is wrong.”

Against the backdrop of a surge in deepfake abuse, similar calls have emerged from across the political spectrum.

UK Lawmakers Mull Nudification Ban

During a session of the House of Lords last month, the Conservative peer Baroness Charlotte Owen pressed the government to go further than the current legislation and ban the AI “nudify” apps used to make intimate deepfakes.

Representing the government, Viscount Camrose responded to Owen’s question:

“The Law Commission consulted widely on this, looking at the process of taking, making, possessing and sharing deepfakes, and their conclusion was that the focus of legislative efforts ought to be on the sharing, which it now is.”

“But that said,” he acknowledged, “it is a fast-moving space, the capabilities of these tools are growing rapidly, as are the number of users, so we will obviously continue to monitor that.”

More recently, a Labour policy paper recommended a broad nudification ban of the kind proposed by Baroness Owen.

In addition to the proposed ban on nudify apps, the paper suggested creating obligations for AI developers to ensure their technology isn’t abused. Meanwhile, web hosts could be required to take additional steps to remove illegal deepfake websites.

Parallels in the US

Mirroring developments in the UK, US lawmakers have focused on how deepfake pornography is shared rather than how it is created.

At least 10 states have passed legislation banning the distribution of sexually explicit deepfakes.

Going forward, however, a new bill spearheaded by Rep. Alexandria Ocasio-Cortez could make both the distribution and production of nonconsensual AI-generated pornography a federal crime.
