Policy paper

Crime and Policing Bill: Child sexual abuse material factsheet 

Published 25 February 2025

What are we going to do? 

Measures in the bill will: 

Modernise the legal basis under which Border Force detects digitally held child sexual abuse material (CSAM) at the UK Border (both inbound and outbound) to ensure it is fit for the challenges of the digital age.   

Introduce a new criminal offence targeting AI models that have been optimised to create CSAM. These optimised models produce hyper-realistic CSAM that often contains the likeness of real children. Such models are not currently illegal in the UK. 

Update the existing law criminalising ‘paedophile manuals’ to cover AI-generated CSAM, reflecting the evolving nature of online child sexual abuse (CSA).  

Bring in new law to ensure there are no safe spaces for offenders to network and facilitate abuse. We will criminalise those who provide, maintain and/or moderate online services, such as websites, which are being used to share child sexual abuse imagery or commit other child sexual abuse offences.   

How are we going to do it? 

CSAM at the border 

The bill amends the Customs and Excise Management Act 1979 (CEMA) to provide Border Force with the power to require that an individual who is reasonably suspected of possessing digitally stored child sexual abuse material (CSAM) unlocks their digital devices for technology-enabled inspection. Refusal to allow such an inspection would constitute an offence of obstruction under section 31 of the Commissioners for Revenue and Customs Act 2005. 

The power would be exercised only at the UK Border (the point of entry into, or departure from, the country) by a warranted Customs Officer. 

CSA image generators 

This new offence will make it illegal to adapt, possess, supply or offer to supply a CSA image generator, punishable by a maximum sentence of five years in prison.  

In this offence, CSAM will be defined in accordance with existing legislation, i.e. photographs/pseudo-photographs (section 1 of the Protection of Children Act 1978) or prohibited images (section 62 of the Coroners and Justice Act 2009).  

There are defences to this offence for Ofcom and the intelligence agencies, as in the Protection of Children Act 1978. There is also a delegated power for the Secretary of State to permit relevant organisations to possess CSA image generators for an appropriate purpose, for example testing to determine the capabilities of the models in order to prevent future crime.  

Paedophile manuals 

The bill amends the existing ‘paedophile manuals’ offence under section 69 of the Serious Crime Act 2015 to include pseudo-photographs and prohibited images. This will ensure consistency between the approach taken for ‘real’ CSAM and AI-generated abuse imagery, which can also feature real children. 

Moderators and administrators 

The bill creates a new offence which criminalises carrying out a relevant internet activity with the intention of facilitating child sexual abuse. 

‘Relevant internet activity’ could include maintaining a CSA site, writing code for such a site, or controlling or providing access to it.  

Child sexual abuse is defined as conduct that would constitute an offence specified in Schedule 6 to the bill or conduct outside of the UK that would constitute such an offence if it took place in the UK. 

Background 

CSAM at the border 

Many of those who pose a direct risk to children travel frequently across the UK Border to commit child sexual abuse offences abroad. The border represents a unique chokepoint through which individuals leaving or entering the UK must pass, at which point they can be questioned and their baggage searched for ‘prohibited goods’ as defined by the Customs and Excise Management Act 1979 (“CEMA”).   

Before the development of digital media devices, CSAM would typically present in the form of printed photographs, video cassettes or DVDs.  As such, it would be detected routinely via a baggage search under CEMA.  However, CSAM is now usually held digitally and within the memory of digital devices such as phones, tablets and laptops – all of which are protected by passcodes or biometrics.  Whilst, acting under CEMA, a Border Force officer is able to compel an individual to present digital devices, the Act does not enable the officer to require that the digital device is unlocked in furtherance of a search to detect CSAM.   

In recent years, the Home Office has developed the Child Abuse Image Database (“CAID”), a repository of all known CSAM detected during UK police investigations. The CAID now holds millions of unique files. In parallel, the capability now exists to undertake a rapid scan of a digital device, such as a phone, to determine whether known CAID material is held within its memory. Because the scan does not download the device’s contents, it takes approximately 15 seconds to identify whether CAID material is present. This capability has now been operationalised at the UK Border, with trials generating significant intelligence on individuals representing a sexual risk to children, leading to investigations and arrests.   
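
The factsheet does not specify the matching technology behind these scans. Purely as an illustration of the general approach (matching files on a device against a database of known hashes, surfacing only a match/no-match alert), the sketch below uses plain cryptographic file hashes. Every name in it (KNOWN_HASHES, scan_device, the mount path) is hypothetical, and operational systems typically use perceptual hashing so that re-encoded copies of an image still match.

```python
import hashlib
from pathlib import Path

# Hypothetical sketch only: CAID's actual matching technology is not described
# in this factsheet. Plain SHA-256 is used here for brevity; real systems
# typically rely on perceptual hashes supplied with the hash database.

KNOWN_HASHES: set[str] = set()  # hypothetical: hash values loaded from a database


def file_hash(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streamed in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_device(root: Path) -> bool:
    """Return True if any file under `root` matches a known hash.

    Only a match/no-match signal is surfaced: nothing is copied off the
    device, and the operator never sees the matched content itself.
    """
    return any(
        file_hash(p) in KNOWN_HASHES
        for p in root.rglob("*")
        if p.is_file()
    )


if __name__ == "__main__":
    mounted = Path("/mnt/device")  # hypothetical mount point for the device
    # The officer would see only a boolean alert, never file names or images.
    print("ALERT: known material present" if scan_device(mounted)
          else "No known material detected")
```

Nothing here should be read as the capability’s actual design; it only illustrates why such a scan can be fast (hash lookups rather than content downloads) and why material not already in the database would not be flagged.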

The bill enables Border Force officers to require an individual entering or leaving the UK to unlock their digital devices for examination, where the officer reasonably suspects that a device may contain child sexual abuse material.      

CSA image generators 

Child sexual abuse offenders are using artificial intelligence models to generate photorealistic child sexual abuse material. This imagery often depicts the most severe and graphic forms of child sexual abuse; it can be used to generate abuse images of “real” children, including those known to the victim; it serves to normalise the sexual abuse of children; and it increases the volume of child sexual abuse material available, which could hinder the identification of “real” victims.    

Offenders are optimising off-the-shelf AI models to enhance their ability to create CSAM, for example by training the models on vast quantities of abuse imagery, or the likeness of a specific child. They also sell these models to other offenders, making significant profits.  

Whilst AI CSAM is illegal to make, possess and distribute in the UK (as per Protection of Children Act 1978), these fine-tuned models are not currently illegal. 

Paedophile manuals 

In March 2023, the Internet Watch Foundation (IWF) [footnote 1] discovered a guide, widely shared on dark web forums, detailing how to generate AI CSAM using one particular mainstream AI image generation model. In May 2023, it discovered that a guide to creating models using personal child sexual abuse material datasets had been shared, i.e. guidance on using existing abuse images to create new material, further re-victimising the children pictured. Other small or one-page guides have also been identified in circulation.  

Section 69 of the Serious Crime Act 2015 created the offence of being “in possession of any item that contains advice or guidance about abusing children sexually”, which was intended to target so-called “paedophile manuals”. This offence defines “abusing children sexually” as including any offences under section 1 of the Protection of Children Act 1978, excluding pseudo-photographs. A pseudo-photograph is defined by the Protection of Children Act 1978 as “an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph”. An AI-generated image is therefore most likely to be considered a pseudo-photograph and excluded from this offence. It is also possible that AI-generated CSAM could be classified as a prohibited image under the Coroners and Justice Act 2009; prohibited images are also out of scope of the original section 69 offence. 

Moderators and administrators 

The National Assessment Centre (NAC) found that it is almost certain that the vast majority of online CSA offending is underpinned by networking between offenders. 

Electronic services used by offenders:  

a. Provide a forum which facilitates criminal activity. Offenders in forums, closed messaging groups or other online environments share and link to indecent images of children.  

b. Reinforce narratives legitimising or escalating the abuse of children. These sites can encourage the production of first-generation imagery, therefore inciting contact abuse against children. For example, first generation imagery is often used as currency to be traded for access into exclusive online CSA VIP forums or restricted sections of forums. It is entirely possible that individuals would commit contact offences against children specifically to satisfy this requirement.  

c. Facilitate the commercialisation of child sexual abuse material. Law enforcement partners are increasingly seeing offenders choosing to operate CSA sites as a means of monetising the sexual abuse of children. These individuals are motivated by financial gain, as opposed to, or in addition to, a sexual interest in children. The commercialisation of child sexual abuse material can involve offering access to VIP areas (as referenced above) or facilitators requesting payment in exchange for allowing offenders to direct real-world child abuse. 

d. Encourage an evolution of tradecraft and operational security to avoid detection. Forums enable offenders to develop their tradecraft and share techniques such as the creation of imagery, operational security, and methods of abusing children, including by travelling abroad. These forums can provide those new to offending with the resources and materials needed to escalate their offending, and they offer advice on how to avoid detection by law enforcement. They have also been seen to share contact details of victims, allowing direct contact and further abuse by new offenders.    

Key statistics 

CSAM at the border 

During consent-based trials, over 80% of individuals consented to their digital devices being examined; around 20% refused. Of 44 devices examined with consent, 70% yielded actionable intelligence, of which 33% was relevant to child protection. The National Crime Agency and UK policing assess the intelligence dividend secured from the trials to be high value. 

The National Strategic Assessment indicates the presence in the UK of up to 840,000 individuals with a sexual interest in children. Only a minority of these individuals have been identified, and less than 1% are subject to travel restriction orders. 

CSA image generators 

On a single dark web forum, the IWF identified 3,152 AI CSAM images in a one-month period. 

The Internet Watch Foundation identified an anonymous webpage that contained links to optimised AI models that featured the likeness of 128 different named victims of CSA. 

Paedophile manuals 

Reports of AI-generated CSAM found online by the IWF have more than quadrupled in a year. IWF analysts confirmed 245 reports of AI-generated child sexual abuse in 2024, compared with 51 in 2023, a rise of 380%. 

The IWF [footnote 2] reviewed AI-generated images posted to a single dark web CSAM forum over a one-month period. In total, 2,562 images were assessed as criminal pseudo-photographs and 416 were assessed as criminal prohibited images under UK law.  

Moderators and administrators 

In 2023, the IWF [footnote 3] confirmed 275,652 URLs as containing child sexual abuse imagery, having links to the imagery, or advertising it.  

23% of images reviewed by the IWF in 2023 were Category A – these depict children subjected to penetrative sexual activity, bestiality or sexual activity involving sadism. 

Of images involving babies or toddlers, 92% were assessed as Category A.  

The National Crime Agency identified 2.88 million registered accounts globally across ten of the most harmful CSA dark web sites, including sites dedicated to the abuse of babies and toddlers and to sadistic abuse.  

Frequently asked questions 

CSAM at the border 

Will scanning a digital device lead to access to other material held within its memory? 

No. The device scan will be undertaken by technology which looks for CAID-known content only. It does not download that content, nor does it look for non-CAID content. If CAID-known content is found, the officer performing the scan will be notified via an alert but will not be exposed to the content itself. 

Will the capability detect CSAM, or other evidence of criminality, that is not known to CAID? 

The capability will only detect CAID-known material. If there is non-CAID criminal material on the device but no CAID-known material, it will not be detected (for example, first-generation CSAM that has not yet been added to CAID). 

Will a person refusing to unlock their digital device be arrested? 

Refusal without reasonable cause would trigger the offence of obstruction, which is an arrestable offence. This would also result in seizure of the digital device to enable a full forensic download.  Such a download may identify other offences alongside possession of CSAM. 

What happens to an individual if they’re arrested by Border Force for possession of CSAM? 

Any person arrested for a customs offence by Border Force will be referred to UK Police or the National Crime Agency for investigation and, where agreed by the Crown Prosecution Service, prosecution. 

CSA image generators 

Is it a crime to use AI to create child sexual abuse material? 

UK law is clear: creating, possessing, or distributing child sexual abuse images, including those that are AI-generated, is already illegal. However, we are committed to giving law enforcement agencies the powers needed to combat child sexual abuse.

Will this measure punish developers for offenders misusing their models? 

This offence will not criminalise AI developers. It is targeted at offenders who optimise AI models specifically to enhance their ability to create child sexual abuse material.  

Does AI CSAM actually harm real children? 

Yes. AI CSAM can have a direct impact on real children. Offenders use AI to create photorealistic abuse imagery that often features real children, for example children known to the offender or existing victims. We also know that offenders are using AI imagery to groom and blackmail children. AI CSAM serves to normalise child sexual abuse, and often contains the most severe forms of CSAM. 

Paedophile manuals 

What is a paedophile manual?  

A paedophile manual is an item (for example, a document) “that contains advice or guidance about abusing children sexually”.  

What is a pseudo-photograph?  

A pseudo-photograph is defined by the Protection of Children Act 1978 as “an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph”. This includes images made by Artificial Intelligence (AI). 

Moderators and administrators 

What if someone uses an electronic service to share child sexual abuse material without the knowledge of the service provider?  

An individual must intend to facilitate child sexual exploitation and abuse to be convicted of this offence. Where an internet service is used to carry out child sexual exploitation and abuse without the knowledge or intention of the service provider, the provider will not be criminally responsible.  

Are child sexual abuse websites only available on the dark web? 

Whilst these sites do exist on the dark web, they are increasingly prevalent on the clear web too. End-to-end encryption on mainstream apps offers offenders a level of security and anonymity previously only possible via the dark web.  

Isn’t it already illegal to run a child sexual abuse site? 

Moderators and administrators of CSA sites are most commonly charged with indecent image offences under the Protection of Children Act 1978. The bill introduces a specific offence to ensure that the highest-harm CSA offenders are charged with offences commensurate with the severity of their offending.

Footnotes