AI-generated child sex abuse images targeted with new laws


The UK government has announced the introduction of four new laws to combat the threat posed by child sexual abuse images created by artificial intelligence (AI). Under the new legislation, it will be illegal to possess, create or distribute AI tools specifically designed to create child sexual abuse material (CSAM), punishable by up to five years in prison. It will also be illegal to possess paedophile manuals about the use of AI to abuse children, punishable by up to three years’ imprisonment.

Home Secretary Yvette Cooper said that the government will “act to ensure the safety of children online by ensuring our laws keep pace with the latest threats.” It will also become an offence to run websites on which paedophiles can share child sexual abuse content or offer advice on grooming, punishable by up to 10 years in prison. In addition, the Border Force will be given powers to instruct suspected child sexual abusers to unlock digital devices for inspection when entering the UK.

AI-generated CSAM can involve images that are partly or completely computer-generated. ‘Nudify’ software takes real images and manipulates them, while in other cases the voices of children who have been abused have been used to create realistic images. Such images can also be used to blackmail children, who may then be forced into further abuse.

Experts have welcomed some aspects of the new laws but pointed out a range of significant gaps. Professor Clare McGlynn of Durham University said the UK should ban “nudify” apps and deal with the “normalisation of sexual activity with young-looking girls on the mainstream porn sites”.

According to the charity the Internet Watch Foundation, the number of AI-generated child sexual abuse images on the open web continues to grow, with reports of such material rising from 51 in 2023 to 245 in 2024, an increase of around 380%. The charity warned that such AI content normalises and encourages child sexual abuse, and urged tech companies to do more to tackle it.

Read the full article from the BBC here.