Call for sharing of deepfake porn to be made illegal in the UK

The sharing of so-called “deepfake porn” should be made illegal in the UK, according to a government-backed review which warned current laws do not go far enough to cover “disturbing and abusive new behaviours born in the smartphone era”.

The Law Commission on Thursday laid out a series of recommendations relating to deepfake porn, where computers generate realistic but fake sexualised images or video content of an individual without their consent.

The independent body, which looks at whether legislation needs to be overhauled, has been reviewing existing laws on the non-consensual taking, making and sharing of intimate images since 2019.

There is currently no single criminal offence in England and Wales that applies to non-consensual intimate images. The report proposes widening the range of motivations covered by these offences to include things like financial gain, as well as extending automatic anonymity to all victims of intimate image abuse.

Only victims of voyeurism and upskirting are provided with these protections under existing law, and prosecutors must prove the perpetrators acted to cause distress or for sexual gratification.

The review comes as advances in deep learning have meant that deepfakes are increasingly available online and cheap to use, with fake videos of politicians and celebrities proliferating on the internet.

The use of these tools in porn, where often a person’s face is superimposed on to a porn actor’s body in a video, has led the Department for Digital, Culture, Media and Sport select committee as well as campaign groups to call for it to be criminalised.

“Altered intimate images are almost always shared without consent,” said Professor Penney Lewis, the law commissioner for criminal law. “[They] often cause the same amount of harm as unaltered intimate images shared without consent.”

The phenomenon has “dramatic under-reporting” as victims do not have anonymity under current laws, which “do not go far enough to cover disturbing and abusive new behaviours born in the smartphone era,” she added.

The review comes as the long-awaited Online Safety Bill makes its way through parliament. Many of the Law Commission’s previous recommendations have already been added to the legislation, including criminalising revenge porn and cyberflashing, where an unsolicited indecent image is sent without the recipient’s consent.

The government said the Online Safety Bill “will force internet firms to protect people better from a range of image-based abuse — including deepfakes” and that it will consider the proposals.

Companies including Twitter, Reddit and PornHub have already banned deepfake porn generated without the person’s consent. In the US, Virginia and California have made it illegal, while Scotland has criminalised the distribution of deepfake porn.

Last month the European Union also strengthened its disinformation rules to include deepfakes. Under a new EU code of practice, regulators can fine technology companies up to 6 per cent of their global turnover if they do not crack down on deepfakes.
