
Online Safety Act 2023: CSEA Reporting and Terms of Service

CSEA Reporting and Terms of Service [Sections 66-74]

Rule: All regulated services must report detected CSEA content to the National Crime Agency. Category 1 services must enforce their terms of service consistently and transparently.

Effective: The CSEA reporting duty came into force in March 2024; the terms of service duties have been in force since January 2025.


Part 1: CSEA Content Reporting [Sections 66-70]

Section 66: Duty to Report CSEA Content to NCA

66.1 — Who Must Report?

ALL regulated services must report:

| Service Type | Must Report |
| --- | --- |
| User-to-user services | ✅ YES |
| Search services | ✅ YES |
| UK providers | ✅ All detected CSEA |
| Non-UK providers | ✅ UK-linked CSEA only |

No exemptions:

  • Service size irrelevant
  • Category classification irrelevant
  • UK presence irrelevant

Critical: If your service is regulated under the Online Safety Act, you MUST report CSEA content.

66.2 — What Must Be Reported?

UK providers must report:

All detected and unreported CSEA content.

Non-UK providers must report:

UK-linked CSEA content.

“Detected” means:

| Detection Method | Reportable? |
| --- | --- |
| Automated systems (AI, hash matching) | ✅ YES |
| User reports | ✅ YES |
| Human moderator review | ✅ YES |
| Third-party notification | ✅ YES |
| Accidentally encountered by staff | ✅ YES |

Important timing: Only content detected after Section 66 came into force (March 2024) must be reported.

66.3 — UK-Linked CSEA Content (Non-UK Providers)

CSEA content is “UK-linked” if evidence suggests any of the following:

| Link Type | Examples |
| --- | --- |
| 1. Published from UK | Content uploaded using UK IP address |
| 2. Offender in UK | Creator/distributor is UK national OR located in UK |
| 3. Victim in UK | Child depicted is in UK OR UK national |

Practical assessment:

Detected CSEA content (if you're a non-UK provider)

Evidence of UK links?
├─ Upload from UK IP → YES, report
├─ User profile says UK location → YES, report
├─ Content shows UK landmarks → YES, report
├─ Payment from UK card → YES, report
└─ No UK indicators → NO, don't report (but must still remove)

Why the difference for non-UK providers? To avoid overwhelming the NCA with reports about content that has no UK connection.
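
As a minimal sketch, this triage logic might look like the following. The evidence flags are hypothetical fields your own systems would populate; they are not terms defined by the Act:

```python
from dataclasses import dataclass

@dataclass
class EvidenceFlags:
    """Illustrative UK-link signals a provider's own systems might gather."""
    uk_upload_ip: bool = False          # upload traced to a UK IP address
    uk_profile_location: bool = False   # user profile claims a UK location
    uk_landmarks_in_content: bool = False
    uk_payment_method: bool = False     # e.g. a UK-issued payment card

def is_uk_linked(flags: EvidenceFlags) -> bool:
    """Reasonable-belief standard: any single UK indicator means report."""
    return any((
        flags.uk_upload_ip,
        flags.uk_profile_location,
        flags.uk_landmarks_in_content,
        flags.uk_payment_method,
    ))

# A non-UK provider reports when this returns True, and removes the
# content from the platform in every case.
print(is_uk_linked(EvidenceFlags(uk_upload_ip=True)))  # True
```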

66.4 — Reporting Timeframes

General rule:

Reports must be made “as soon as reasonably practicable” after detection.

In practice (from regulations):

| Content Severity | Reporting Deadline |
| --- | --- |
| Immediate harm (e.g., live abuse) | Within 2 hours |
| Other CSEA content | Within 24 hours |
| Batch reports (large volume) | Within 7 days (but high severity still urgent) |

“Reasonably practicable” factors:

  • Volume of detections
  • Verification requirements
  • Technical system capacity

Not acceptable excuses:

  • ❌ “We don’t have automated reporting yet” — Must build it
  • ❌ “We’re a small startup” — Size doesn’t matter
  • ❌ “It’s expensive to implement” — Mandatory duty
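
To make the deadlines above concrete, a small sketch with illustrative enum names (the timeframes are the ones stated in the table, not a substitute for the regulations):

```python
from datetime import datetime, timedelta, timezone
from enum import Enum

class Severity(Enum):
    IMMEDIATE_HARM = "immediate harm"  # e.g. live abuse
    STANDARD = "other CSEA content"
    BATCH = "batch report"             # high-severity items still urgent

DEADLINES = {
    Severity.IMMEDIATE_HARM: timedelta(hours=2),
    Severity.STANDARD: timedelta(hours=24),
    Severity.BATCH: timedelta(days=7),
}

def report_due_by(detected_at: datetime, severity: Severity) -> datetime:
    """Latest acceptable transmission time for the NCA report."""
    return detected_at + DEADLINES[severity]

print(report_due_by(datetime.now(timezone.utc), Severity.IMMEDIATE_HARM))
```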

66.5 — How to Report

Reporting mechanism:

Via National Crime Agency’s reporting portal (technical details in regulations).

Report must include:

| Information Category | Details Required |
| --- | --- |
| Content details | URLs, file hashes, content type (image/video/text) |
| User information | Username, account details, IP addresses, payment info |
| Detection method | How content was found (AI, user report, etc.) |
| Timestamp | When detected, when uploaded (if known) |
| UK link evidence | For non-UK providers: why you think it’s UK-linked |

Data format: Specified in regulations — typically structured JSON via API.
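
For illustration only, a report payload might be assembled like this. The real schema, field names, and endpoint are defined in the regulations and NCA documentation, so every field below is an assumption:

```python
import json
from datetime import datetime, timezone

# Hypothetical payload shape; the authoritative schema is set by the
# reporting regulations and the NCA's API documentation.
report = {
    "content": {
        "urls": ["https://example.com/removed/abc123"],
        "sha256": "e3b0c44298fc1c149afbf4c8996fb924"
                  "27ae41e4649b934ca495991b7852b855",
        "content_type": "image",
    },
    "user": {
        "username": "example_user",
        "ip_addresses": ["203.0.113.7"],
    },
    "detection": {
        "method": "hash_match",  # automated system, user report, etc.
        "detected_at": datetime.now(timezone.utc).isoformat(),
    },
    "uk_link_evidence": ["upload_from_uk_ip"],  # non-UK providers only
}

print(json.dumps(report, indent=2))
```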

66.6 — Systems and Processes Requirement

Providers must implement:

Systems to identify, report, and track CSEA content.

Minimum system components:

| Component | Purpose |
| --- | --- |
| Detection | AI/hash matching to find CSEA (e.g., PhotoDNA, hashing against NCMEC database) |
| Triage | Human review to confirm detections (reduce false positives) |
| Reporting interface | Integration with NCA reporting API |
| Record-keeping | Logs of what was reported, when, and outcome |
| Preservation | Retain reported content for law enforcement access |

Example workflow:

  1. User uploads image
  2. Automated hash check (PhotoDNA)
  3. Match found → flag for review
  4. Human moderator confirms CSEA
  5. System generates NCA report
  6. Report transmitted within 24 hours
  7. Content preserved for law enforcement
  8. Content removed from platform
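
A self-contained sketch of that workflow, using SHA-256 exact matching and placeholder helpers in place of a real hash list, triage queue, and NCA client (PhotoDNA itself is a proprietary perceptual hash, so SHA-256 stands in purely for illustration):

```python
import hashlib

# In practice this is an industry hash list (e.g. NCMEC-derived); it is
# empty here so the sketch runs without real data.
KNOWN_CSEA_HASHES: set[str] = set()

def human_moderator_confirms(digest: str) -> bool:
    return True  # placeholder for the human triage queue

def submit_nca_report(digest: str) -> None:
    print(f"report transmitted for {digest}")  # placeholder NCA API call

def preserve_for_law_enforcement(data: bytes, digest: str) -> None:
    print(f"preserved {len(data)} bytes for {digest}")  # see Section 67

def remove_from_platform(digest: str) -> None:
    print(f"removed {digest}")

def handle_upload(data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in KNOWN_CSEA_HASHES:
        return  # no match: normal publication path
    if not human_moderator_confirms(digest):
        return  # false positive: log it and tune the detector
    submit_nca_report(digest)                   # within statutory deadline
    preserve_for_law_enforcement(data, digest)
    remove_from_platform(digest)
```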

Section 67: Reporting Regulations

67.1 — Secretary of State’s Regulations

Regulations specify:

  1. Information requirements — What data fields must be in reports
  2. Format requirements — Technical specifications (JSON schema, API endpoints)
  3. Transmission methods — How to send reports securely
  4. Urgency procedures — Expedited reporting for immediate harm cases
  5. Record-keeping — What logs to maintain
  6. Data retention — How long to preserve reported content

Current regulations: The Online Safety (CSEA Reporting) Regulations 2024

67.2 — Consultation Requirements

Before issuing regulations, Secretary of State must consult:

  • ✅ National Crime Agency (operational needs)
  • ✅ OFCOM (regulatory perspective)
  • ✅ Industry representatives (technical feasibility)

Why consultation matters: Balance effective law enforcement with technical practicality.

67.3 — Data Preservation Requirements

Regulations require:

Providers must retain reported content and associated user data for law enforcement access.

Retention period: Typically 90 days minimum after reporting (regulations specify exact duration).

What to preserve:

| Data Type | Examples |
| --- | --- |
| Content itself | Original file, not just hash |
| Metadata | Upload time, device info, EXIF data |
| User account data | Email, IP addresses, payment info |
| Communication logs | If CSEA shared via messages |

Storage requirements:

  • Secure storage (encrypted)
  • Access controls (only law enforcement + limited staff)
  • Audit trail of who accessed
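
One way to sketch those storage requirements, using a locally generated Fernet key purely for illustration (a production system would use managed keys, durable storage, and real access controls):

```python
from cryptography.fernet import Fernet  # pip install cryptography
from datetime import datetime, timezone

class PreservationStore:
    """Sketch of encrypted retention with an access audit trail."""

    def __init__(self) -> None:
        self._key = Fernet.generate_key()  # in production: a managed KMS key
        self._fernet = Fernet(self._key)
        self._blobs: dict[str, bytes] = {}
        self.audit_log: list[tuple[str, str, str]] = []  # (when, who, action)

    def _audit(self, who: str, action: str) -> None:
        self.audit_log.append(
            (datetime.now(timezone.utc).isoformat(), who, action)
        )

    def preserve(self, report_id: str, content: bytes, who: str) -> None:
        self._blobs[report_id] = self._fernet.encrypt(content)
        self._audit(who, f"preserve {report_id}")

    def retrieve(self, report_id: str, who: str) -> bytes:
        # Access controls (law enforcement + limited staff) would gate this.
        self._audit(who, f"retrieve {report_id}")
        return self._fernet.decrypt(self._blobs[report_id])

store = PreservationStore()
store.preserve("rpt-001", b"original file bytes", who="trust-and-safety")
```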

Section 68: NCA Information Sharing with OFCOM

68.1 — Information Sharing Gateway

Section 68 enables the NCA to share information with OFCOM when:

  • OFCOM investigating provider compliance
  • OFCOM assessing service’s CSEA risks
  • NCA has evidence relevant to Online Safety Act enforcement

Example scenarios:

| Scenario | NCA Can Share |
| --- | --- |
| Provider repeatedly fails to report detected CSEA | ✅ NCA tells OFCOM about non-reporting |
| OFCOM investigating service’s child safety measures | ✅ NCA shares data on CSEA prevalence on that service |
| NCA prosecuting CSEA case involving platform | ✅ NCA can share relevant evidence with OFCOM |

68.2 — Privacy Safeguards

Restrictions:

  • NCA only shares when necessary for OFCOM’s Online Safety Act functions
  • Data protection laws still apply
  • Sensitive operational details protected

Purpose: Enable effective enforcement — OFCOM can’t monitor CSEA reporting compliance without NCA cooperation.


Section 69: False Reporting Offence

69.1 — Criminal Liability

It is an offence to:

Provide false information in CSEA reports intentionally or recklessly.

Mens rea (mental state) required:

| Mental State | Guilty? |
| --- | --- |
| Intentional (knowingly false) | ✅ GUILTY |
| Reckless (careless, didn’t check) | ✅ GUILTY |
| Honest mistake (reasonable error) | ❌ NOT GUILTY |
| Technical glitch (system error) | ❌ NOT GUILTY |

What counts as “false information”:

| Example | False Information? |
| --- | --- |
| Report fake CSEA to frame someone | ✅ YES — intentional |
| Report without verifying (turns out false) | ✅ YES — reckless |
| AI false positive, reported in good faith | ❌ NO — honest mistake |
| Wrong user ID due to database error | ❌ NO — technical error |

69.2 — Penalties

Summary conviction:

  • Max 12 months imprisonment
  • Or fine (unlimited)
  • Or both

Conviction on indictment:

  • Max 2 years imprisonment
  • Or fine (unlimited)
  • Or both

69.3 — Why This Offence Exists

Deterrence:

  • Prevent malicious false reports (weaponizing the system)
  • Ensure providers verify before reporting
  • Protect individuals from false accusations

Balance: Don’t want to deter legitimate reporting due to fear of prosecution for mistakes.

Good-faith safe harbor: Honest errors (e.g., AI false positives reviewed in good faith) are NOT criminal.


Section 70: Definitions — CSEA Content

70.1 — What is CSEA Content?

CSEA content has the meaning in Section 59.

Categories:

| Type | Examples |
| --- | --- |
| Category A (most serious) | Sexual abuse of children under 13 |
| Category B | Sexual abuse of children 13-15 |
| Category C | Sexual abuse of children 16-17 |
| Indecent images | Any indecent photograph or pseudo-photograph of a child |
| Prohibited images | Non-photographic but realistic depictions of child sexual abuse |

Age determination:

  • Child = person under 18
  • If uncertain, assume child if reasonable to believe under 18

70.2 — UK Provider Definition

UK provider means:

| Type | UK Provider If |
| --- | --- |
| Individual | Habitually resident in the UK |
| Company | Incorporated in the UK (Companies Act 2006) |
| Partnership | Formed under UK law |

Why it matters: UK providers report ALL CSEA; non-UK providers report only UK-linked.

70.3 — UK-Linked Content (Detailed)

Evidence suggesting UK links:

  1. Publication from UK

    • Uploaded via UK IP address
    • Device geolocation shows UK
    • Account registered with UK address
  2. Offender UK connection

    • User profile indicates UK nationality
    • UK payment method used
    • Communication suggests UK location
    • Previous offences in UK (if known)
  3. Victim UK connection

    • Child depicted appears to be in UK (landmarks, language, etc.)
    • Victim reported missing in UK
    • Other evidence suggests UK victim

Reasonable belief standard: Don’t need certainty — if evidence suggests UK link, report.

Better to over-report than under-report: NCA can triage; failure to report is criminal breach.


Part 2: Terms of Service [Sections 71-74]

Section 71: Duty to Enforce Terms of Service

71.1 — Who Does This Apply To?

ONLY Category 1 services.

As of 2026: ~20 largest platforms (Meta, Google/YouTube, X, TikTok, Snapchat, Reddit, etc.)

71.2 — Core Duty

Obligation:

Operate the service using proportionate systems and processes designed to ensure that content removal, access restriction, or user suspension occurs only as stated in the terms of service.

What this means:

| Provider Action | Must Be In Terms? |
| --- | --- |
| Remove content | ✅ YES |
| Restrict access (shadow ban, limit reach) | ✅ YES |
| Suspend user account | ✅ YES |
| Ban user permanently | ✅ YES |

Principle:

Users must know the rules. Platforms can’t make up enforcement as they go.

71.3 — Exceptions to Duty

Providers CAN act outside terms when:

| Scenario | Allowed? |
| --- | --- |
| 1. Illegal content removal | ✅ YES — Sections 9-10 duties override |
| 2. Child protection | ✅ YES — Sections 11-13 duties override |
| 3. Avoiding liability | ✅ YES — Defamation, copyright, etc. |
| 4. Fraudulent advertising | ✅ YES — Sections 38-39 duties override |

Example:

Terms say: “We remove content that violates our community guidelines.”

Scenario: User posts legal but offensive content NOT covered by community guidelines.

  • Can’t remove under Section 71 (not in terms)
  • CAN remove if it’s illegal content (duty overrides)
  • CAN remove if it harms children (duty overrides)

71.4 — UK Users and UK Operations

Scope limitation:

Duty applies only to UK users and operations in the UK.

Practical effect:

| User Location | Terms Enforcement Duty Applies? |
| --- | --- |
| UK user | ✅ YES |
| Non-UK user | ❌ NO (but provider may choose to apply globally) |

Why the limitation? The OSA only regulates UK operations; it cannot dictate global content policies.


Section 72: Terms of Service Content and Consistency

72.1 — User Rights Information (All Services)

ALL regulated services must include in terms:

Information about users’ rights to sue for breach of contract if their content is wrongfully removed or their account is suspended without cause.

Wording example:

“If we remove your content or suspend your account in breach of these terms, you may have the right to take legal action against us for breach of contract. We recommend seeking legal advice if you believe we’ve wrongfully removed your content.”

Why required: Users often don’t realize they have contractual rights — terms must make this clear.

72.2 — Consistent Enforcement (Category 1 Only)

Category 1 services must:

1. Clear and accessible terms:

Terms must be written in plain language with sufficient detail for users to understand when enforcement occurs.

What “clear and accessible” means:

| Requirement | Examples |
| --- | --- |
| Plain language | No legal jargon, short sentences, everyday words |
| Specific | Not “we may remove harmful content” but “we remove content promoting suicide” |
| Findable | Easy to locate (not buried in 50-page document) |
| Accessible formats | Screen reader compatible, translations available |

2. Consistent application:

Operate systems ensuring stated policies are applied consistently.

Example:

Bad (inconsistent):

  • Terms say: “No hate speech”
  • In practice: Remove some hate speech, ignore similar violations
  • Result: Inconsistent enforcement ❌

Good (consistent):

  • Terms say: “We remove content promoting violence against protected groups”
  • In practice: Apply rule uniformly using AI + human review
  • Result: Consistent enforcement ✅

3. Easy reporting mechanisms:

Users must be able to easily report content and policy violations.

Requirements:

| Feature | Must Have |
| --- | --- |
| In-app reporting | Button/link on every piece of content |
| Report categories | Specific violation types (hate speech, CSEA, spam, etc.) |
| Confirmation | User receives acknowledgment that the report was received |
| No barriers | No login wall for reporting serious content |

4. Accessible complaints procedures:

Users must have clear process for complaining about enforcement decisions.

Complaint handling:

  1. User's content removed
  2. User disagrees → files complaint
  3. Provider reviews complaint
  4. Provider gives a clear explanation (why removed, which rule violated)
  5. If error → restore content + notify user
  6. If no error → explain reasoning + inform of appeal rights

Timeline expectations: While not statutory, OFCOM guidance suggests 48 hours for initial response.
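A compact sketch of that complaint flow, with the 48-hour target encoded as a deadline; the class and field names are illustrative:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

RESPONSE_TARGET = timedelta(hours=48)  # OFCOM guidance target, not statute

@dataclass
class Complaint:
    content_id: str
    rule_cited: str  # which term of service was applied
    filed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    @property
    def respond_by(self) -> datetime:
        """Latest time for the initial response under the 48-hour target."""
        return self.filed_at + RESPONSE_TARGET

def resolve(complaint: Complaint, removal_was_error: bool) -> str:
    if removal_was_error:
        return "restore content and notify user"
    return (f"explain that rule '{complaint.rule_cited}' was violated "
            "and inform the user of further appeal rights")

c = Complaint(content_id="post-42", rule_cited="no hate speech")
print(c.respond_by, "|", resolve(c, removal_was_error=False))
```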

72.3 — Consumer Content Exclusion

These requirements DON’T apply to:

Terms addressing consumer content (goods/services offered via platform).

Example:

Platform hosts marketplace where businesses sell products.

  • Platform terms about user behavior → Section 72 applies ✅
  • Terms governing seller-buyer transactions → Section 72 doesn’t apply ❌

Why exclusion? Consumer contracts governed by Consumer Rights Act 2015, not OSA.


Section 73: OFCOM Guidance on Terms of Service

OFCOM must publish guidance helping Category 1 providers comply with Sections 71-72.

Guidance covers:

| Topic | Content |
| --- | --- |
| Terms clarity | How to write plain language policies |
| Consistent enforcement | Systems to ensure uniform application |
| Reporting mechanisms | Best practices for user reporting tools |
| Complaints handling | Effective procedures for appeals |
| Transparency | What information to provide users about decisions |

Status: OFCOM issued guidance in 2024.


Section 74: Definitions

74.1 — Regulated User-Generated Content

Means:

Content generated by users of a regulated service (user-to-user or search service).

Includes:

  • Posts, comments, reviews
  • Uploaded media (images, videos)
  • User profiles, bios
  • Messages (if not one-to-one exempt)

74.2 — Consumer Content

Means:

Content relating to goods or services supplied to consumers.

Examples:

  • Product listings in marketplace
  • Service descriptions
  • Seller terms and conditions
  • Consumer contract details

Why defined separately: Excluded from terms of service duties (governed by consumer law instead).

74.3 — Affected Persons

Includes:

| Category | Who |
| --- | --- |
| Users | Anyone with an account on the service |
| Content subjects | Individuals depicted/mentioned in content (even without account) |
| Parents | Parents of child users |
| Support providers | Organizations supporting affected persons (e.g., victim support) |

Why broad definition? Terms of service enforcement affects many beyond just account holders.


Practical Application for AI Agents

CSEA Reporting Workflow

For content moderation AI agents:

Detect potential CSEA content

Confidence level?
├─ HIGH (95%+) → Flag for immediate reporting
├─ MEDIUM (70-95%) → Human review queue (URGENT)
└─ LOW (<70%) → Human review queue (standard)

Human moderator reviews

Confirmed CSEA?
├─ NO → Log false positive, improve AI
└─ YES → Report to NCA
    ├─ Severity assessment
    │   ├─ Immediate harm → Report within 2 hours
    │   └─ Other CSEA → Report within 24 hours
    ├─ Preserve content (secure storage)
    ├─ Preserve user data (account, IP, payment)
    └─ Remove content from platform
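
The triage step can be encoded directly from the confidence bands above; the cut-offs are the tree's illustrative values and should be tuned and validated against your own model:

```python
from enum import Enum

class Route(Enum):
    IMMEDIATE_REPORTING = "flag for immediate reporting"
    URGENT_HUMAN_REVIEW = "human review queue (URGENT)"
    STANDARD_HUMAN_REVIEW = "human review queue (standard)"

def triage(confidence: float) -> Route:
    """Route a detection by classifier confidence (illustrative cut-offs)."""
    if confidence >= 0.95:
        return Route.IMMEDIATE_REPORTING
    if confidence >= 0.70:
        return Route.URGENT_HUMAN_REVIEW
    return Route.STANDARD_HUMAN_REVIEW

assert triage(0.97) is Route.IMMEDIATE_REPORTING
assert triage(0.80) is Route.URGENT_HUMAN_REVIEW
assert triage(0.40) is Route.STANDARD_HUMAN_REVIEW
```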

CSEA Detection Technology

Recommended approaches:

| Technology | Use Case | Accuracy |
| --- | --- | --- |
| PhotoDNA | Known CSEA image detection (hashing) | ~99% (NCMEC database) |
| Microsoft Azure Content Moderator | Image analysis for CSEA | ~95% |
| Google Cloud Vision API | Safe search + custom CSEA models | ~90% |
| Thorn Safer | Child safety technology | ~95% |
| Custom AI models | Trained on proprietary datasets | Varies (validate carefully) |

Critical:

  • ✅ Use multiple detection methods (defense in depth)
  • ✅ Always include human review for high-stakes decisions
  • ✅ Continuously update detection models
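
Exact cryptographic hashes miss re-encoded or resized copies, which is one reason defense in depth matters. A sketch of perceptual hash matching using the open-source imagehash library as a stand-in (PhotoDNA itself is proprietary and licensed; the threshold here is illustrative):

```python
# pip install pillow imagehash
from PIL import Image
import imagehash

# Perceptual hashes tolerate re-encoding and resizing, unlike SHA-256.
KNOWN_HASHES: set[imagehash.ImageHash] = set()  # populated from a vetted list
MAX_DISTANCE = 5  # Hamming-distance threshold; tune and validate carefully

def matches_known(path: str) -> bool:
    """True if the image is within MAX_DISTANCE of any known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```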

Terms of Service Transparency Checklist

For Category 1 services:

Terms clarity:

  • Written in plain language (8th-grade reading level)
  • Specific rules (not vague “harmful content”)
  • Examples provided for each rule
  • Translations available for primary user languages
  • Accessible formats (screen readers, etc.)

Consistent enforcement:

  • Automated enforcement systems (AI + human review)
  • Written enforcement guidelines for moderators
  • Quality assurance checks on moderation decisions
  • Regular audits for consistency
  • Public transparency reports showing enforcement data

Reporting mechanisms:

  • Report button on every piece of content
  • Report categories match terms of service rules
  • Confirmation message when report submitted
  • No login required for serious violations (CSEA, terrorism)

Complaints procedures:

  • Clear explanation of how to appeal
  • 48-hour initial response target
  • Human review of all appeals
  • Detailed explanation of decision
  • Information about further appeal rights

Common Compliance Mistakes

CSEA Reporting:

“We don’t proactively scan so we don’t need to report”

  • Wrong: Must report anything detected, regardless of how

“We’re end-to-end encrypted so we can’t report”

  • Partial: Can’t scan encrypted messages, BUT must report anything you DO detect (e.g., in metadata, unencrypted features)

“We’ll report when we have 100% certainty”

  • Wrong: Report when reasonably confident — NCA will investigate

“Small providers don’t need to report”

  • Wrong: ALL regulated services must report, regardless of size

Terms of Service:

“Our terms are vague to give us flexibility”

  • Wrong: Terms must be specific enough for users to understand

“We enforce terms inconsistently based on context”

  • Wrong: Must have systems for consistent application

“Users can email us to appeal”

  • Wrong: Must have accessible, clear procedure (not just email address)

“We don’t explain removal decisions to protect privacy”

  • Wrong: Must explain which rule was violated (can omit sensitive details)

Compliance Checklist

For All Regulated Services:

CSEA reporting:

  • Implement CSEA detection systems (PhotoDNA or equivalent)
  • Human review process for suspected CSEA
  • Integration with NCA reporting portal
  • Reporting within statutory timelines (2 hours for urgent, 24 hours standard)
  • Content and data preservation (90 days minimum)
  • Staff training on CSEA identification
  • False positive tracking (improve AI accuracy)
  • Record-keeping of all reports submitted

Terms of service (all services):

  • Terms include information about user rights to sue
  • Terms clearly state when content may be removed
  • Terms accessible and findable

For Category 1 Services:

Terms enforcement:

  • Terms written in plain language
  • Specific rules with examples
  • Systems for consistent enforcement
  • Easy in-app reporting mechanism
  • Clear complaints procedure
  • 48-hour complaint response target
  • Transparency reports on enforcement decisions
  • OFCOM guidance reviewed and followed

Key Takeaways

CSEA Reporting:

  1. All regulated services must report — No exemptions based on size
  2. Detection triggers reporting — Regardless of how detected
  3. UK providers report all CSEA — Non-UK providers report UK-linked only
  4. 2-24 hour reporting timelines — Urgent cases within 2 hours
  5. Preserve content and data — For law enforcement access (90 days+)
  6. False reporting is criminal — But honest mistakes protected

Terms of Service:

  7. Category 1 only — Smaller services exempt
  8. Enforce only as stated in terms — No arbitrary removals
  9. Terms must be clear and specific — Plain language, examples provided
  10. Consistent application required — Systems to ensure uniform enforcement
  11. Easy reporting and complaints — Accessible mechanisms for users
  12. Illegal content/children protection override — Safety duties trump terms


Citation

Part 4, Chapter 2 — CSEA Reporting, Online Safety Act 2023

Part 4, Chapter 3 — Terms of Service, Online Safety Act 2023

Contains public sector information licensed under the Open Government Licence v3.0 where applicable. This is not legal advice. Always refer to official sources for authoritative text.
