Sunday 8 February 2026
Afternoon Edition

ZOTPAPER

News without the noise


World

UK Criminalizes Deepfake Image Abuse as Victims Call for Stronger Protections

New law takes effect making creation of non-consensual AI intimate images illegal, but campaigners say civil remedies still lacking

Zotpaper Staff · 2 min read
The UK has criminalized the creation of non-consensual intimate deepfake images as new legislation takes effect, but victims and campaigners are calling for stronger protections including civil takedown orders for abusive imagery.

The law makes it illegal to create AI-generated sexually explicit images of a person without their consent—a practice that has exploded with the proliferation of easy-to-use image generation tools. Victims have included celebrities, politicians, and ordinary people targeted by harassers.

Campaigners from Stop Image-Based Abuse delivered a petition with over 73,000 signatures to Downing Street, demanding additional civil routes to justice. Currently, even when creation is criminalized, victims struggle to get images removed from platforms and devices.

The gap between criminal and civil law matters enormously. A criminal conviction requires proving a case beyond reasonable doubt and relies on stretched police resources. Civil takedown orders could provide faster relief, allowing victims to demand platform removal without going through criminal courts.

The UK joins a growing number of jurisdictions grappling with AI-enabled image abuse. South Korea has criminalized distribution. Several US states have passed similar laws. But enforcement remains challenging when content spreads instantly across global platforms.

Analysis

Why This Matters

Deepfake intimate imagery represents a new category of harm enabled by AI. How societies regulate it will shape norms around synthetic media broadly.

Background

Early deepfake concerns focused on political disinformation. But non-consensual intimate imagery has proven far more prevalent, with tools specifically designed to undress photos becoming widely available.

Key Perspectives

Victims emphasize that the psychological harm is equivalent to, or worse than, that caused by traditional intimate image abuse. Tech platforms argue they remove content when reported but cannot pre-screen all uploads.

What to Watch

Whether prosecutions actually occur under the new law, how platforms respond to takedown demands, and whether civil remedies follow.
