Deepfake pornography has affected nearly 4,000 celebrities, among them 255 British individuals, including Channel 4 presenter Cathy Newman.
Channel 4's research found this illicit material on five major deepfake websites, some of which accrued 100 million views in just three months. The Online Safety Act, which came into force on January 31, criminalizes sharing such images without consent in the UK, but not creating them.
The volume of AI-generated videos has risen sharply, with 143,733 new videos uploaded in the first nine months of 2023.
Sophie Parrish, a 31-year-old from Merseyside, became a victim of this content before the law was introduced and described feeling degraded and worthless. Consultations are now under way on how Ofcom will implement and enforce the Online Safety Act.
Ofcom responded by stressing that firms must proactively tackle such harmful material. Meanwhile, Google and Meta (which owns Facebook and Instagram) outlined their policies and efforts against non-consensual explicit content, including options to remove it from search results, new safeguards, and the removal of ads and accounts that violate their policies.