
Online Safety Act 2023: Terrorism and CSEA Content Notices

Terrorism and CSEA Content Notices [Sections 121-129]

Rule: OFCOM can require services to use specific technology to identify and remove terrorism or CSEA content if current measures are inadequate. Notices issued only after expert assessment and proportionality review.

Effective: March 2024


Section 121: Power to Issue Content Technology Notices

121.1 — What Are Content Notices?

Two types of notices OFCOM can issue:

Notice Type | Requires Provider To
Accredited technology notice | Use technology already approved by Secretary of State for identifying/removing terrorism or CSEA content
Development/sourcing notice | Develop or source new technology meeting Secretary of State’s standards

Purpose: Urgent power to ensure services effectively tackle the most harmful content (terrorism and CSEA).

121.2 — Which Services Can Receive Notices?

All regulated services where relevant:

Service Type | Can Receive Notice?
User-to-user services | ✅ YES
Search services | ✅ YES
Combined services (both user-to-user and search) | ✅ YES

No size exemptions:

  • Category 1, 2A, 2B, or uncategorized = all can receive notices
  • Small startups to tech giants = all subject to this power

121.3 — Accredited Technology Notice (User-to-User)

Provider must EITHER:

Option A: Use accredited technology

“Use accredited technology to identify all terrorism content communicated by means of the service and to swiftly take down that content.”

Option B: Deploy accredited technology with human review

Use accredited technology in combination with human moderators.

Key requirements:

Element | Meaning
“All terrorism content” | System must identify ALL instances (not sample/portion)
“Swiftly take down” | Immediate removal (within hours)
“Accredited technology” | Pre-approved by Secretary of State as effective

Practical effect: OFCOM forces provider to deploy specific technology (e.g., PhotoDNA for CSEA images, specific AI models for terrorism).

121.4 — Accredited Technology Notice (Search Services)

Provider must EITHER:

Option A: Use accredited technology

“Use accredited technology to identify relevant search content and to take measures in relation to that content.”

Option B: Technology + human review

Use accredited technology with moderators.

“Measures in relation to search content”:

Measure | Example
De-index | Remove from search results
De-rank | Move to bottom of results
Content warnings | Flag as potentially illegal
Block access | Prevent click-through to content

Difference from user-to-user: Search services don’t “take down” content (they don’t host it), but must prevent it appearing/being accessible via search.
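As a hypothetical sketch only, the four measures above can be mapped onto search-result state like this (the `SearchResult` type and `apply_measure` function are illustrative names, not drawn from the Act or any real search stack):

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    url: str
    rank: int
    indexed: bool = True      # appears in results at all
    flagged: bool = False     # content warning shown
    clickable: bool = True    # click-through allowed

def apply_measure(result: SearchResult, measure: str) -> SearchResult:
    """Apply one of the four notice measures to a single result."""
    if measure == "de-index":
        result.indexed = False        # remove from search results
    elif measure == "de-rank":
        result.rank = 10_000          # push to the bottom of results
    elif measure == "warn":
        result.flagged = True         # flag as potentially illegal
    elif measure == "block":
        result.clickable = False      # prevent click-through
    else:
        raise ValueError(f"unknown measure: {measure}")
    return result
```

Note that, unlike a user-to-user takedown, every measure here leaves the underlying content on the third-party site; only the search service's presentation of it changes.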

121.5 — Development/Sourcing Notice

If no accredited technology exists yet:

OFCOM can require provider to:

Develop or source technology meeting Secretary of State’s technical standards.

Timeline: Notices can require development/sourcing over a period of up to 36 months.

Example scenario:

Situation | OFCOM Action
New form of terrorism content emerges (e.g., AI-generated terror propaganda) | No accredited tech yet for this
OFCOM assesses major platforms failing to address it | Issue development notice
Notice requires | Develop AI capable of identifying this content type within 12 months

Provider responsibilities:

  • Invest in R&D
  • Report progress to OFCOM
  • Deploy technology once developed

121.6 — Technology Deployment Options

Technology can be used:

  1. Alone — Automated systems only (AI/algorithms)
  2. With human moderators — Technology flags content, humans review

Practical balance:

Approach | Pros | Cons
Technology alone | Fast, scales to billions of posts | False positives (remove legal content)
Technology + human review | Accuracy, context consideration | Slower, expensive

Most services use hybrid:

  • AI scans everything
  • High-confidence detections = auto-remove
  • Medium-confidence = human review
  • Low-confidence = monitoring
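The hybrid routing above can be sketched as a simple threshold function (thresholds here are illustrative; real systems tune them against measured false-positive rates):

```python
def route_detection(confidence: float,
                    high: float = 0.95,
                    medium: float = 0.60) -> str:
    """Route an automated detection to one of the three lanes above."""
    if confidence >= high:
        return "auto-remove"     # high-confidence: remove automatically
    if confidence >= medium:
        return "human-review"    # medium-confidence: queue for a moderator
    return "monitor"             # low-confidence: log and watch
```

Lowering the `high` threshold removes more harmful content automatically but also removes more legal content, which is exactly the over-blocking trade-off OFCOM must weigh under the proportionality rules below.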

Section 122: Skilled Person Report Requirement

122.1 — Pre-Notice Investigation

BEFORE issuing notice, OFCOM must:

Obtain a report from a skilled person appointed by OFCOM.

Who is a “skilled person”?

Qualification | Examples
Technical expertise | AI researchers, content moderation tech experts
Content expertise | Counter-terrorism specialists, child protection experts
Independence | No financial/business ties to provider or tech vendors

Report must assess:

  1. Current measures — What provider already does
  2. Effectiveness — Are current measures working?
  3. Gaps — Where are failures occurring?
  4. Technology suitability — Would accredited tech address gaps?
  5. Proportionality — Is notice necessary and proportionate?

122.2 — Report Contents

The skilled person’s report must cover:

Topic | Questions Addressed
Service characteristics | What type of service? User base composition? Functionalities?
Content prevalence | How much terrorism/CSEA content is on the service?
Current systems | What detection/removal systems does provider use now?
Effectiveness | Are current systems identifying and removing harmful content?
Technology need | Would accredited technology significantly improve outcomes?
Implementation impact | What would deploying this technology mean for the service?

Timing: Report typically takes 2-4 months to complete (site visits, data analysis, testing).

122.3 — Provider Cooperation

Provider must cooperate with skilled person:

  • ✅ Provide access to systems (APIs, moderation dashboards)
  • ✅ Supply data on content prevalence, detection rates, removal times
  • ✅ Explain current approaches
  • ✅ Allow site visits if needed

Failure to cooperate: OFCOM enforcement powers apply (fines up to £18 million or 10% of qualifying worldwide revenue, whichever is greater).


Section 123: Warning Notice Procedure

123.1 — Warning Notice Contents

Before issuing final notice, OFCOM must issue warning notice containing:

Element | Details
Technology specification | Which accredited technology OR what standards technology must meet
Content scope | Terrorism content, CSEA content, or both
Requirements | What provider must do (use tech, develop tech, etc.)
Compliance period | How long provider has to comply
Skilled person report summary | Key findings justifying notice
Right to make representations | How provider can object/respond

123.2 — Provider Representations

Provider has opportunity to:

  • ✅ Challenge skilled person findings
  • ✅ Argue technology is inappropriate for their service
  • ✅ Propose alternative measures
  • ✅ Raise proportionality concerns
  • ✅ Submit evidence of effectiveness of current measures

Timeline: Typically 28 days to submit representations.

OFCOM must consider: All representations before issuing final notice.

123.3 — OFCOM Decision After Representations

After reviewing provider’s representations, OFCOM must:

Decision | When Made
Issue final notice | Provider’s arguments not persuasive, notice necessary
Modify notice | Some concerns valid, adjust requirements
Withdraw notice | Provider demonstrated current measures sufficient

Transparency: OFCOM must publish explanation of decision.


Section 124: Proportionality Requirements

124.1 — Matters OFCOM Must Consider

For accredited technology notices, OFCOM must consider:

1. Service characteristics:

Factor | Why Relevant
Kind of service | Dating app vs video platform = different tech needs
Functionalities | Live streaming vs static posts = different detection challenges
User base composition | Millions of users vs thousands = different scale

2. Content prevalence:

Service Type | Assessment
User-to-user | How prevalent is terrorism/CSEA content? How widely is it disseminated?
Search | How often does terrorism/CSEA appear in search results?

3. Risk level:

Factor | Assessment
Harm likelihood | How likely are UK individuals to encounter this content?
Harm severity | How severe is the harm caused? (terrorism/CSEA = highest severity)

4. Current safeguards:

Factor | Question
Existing systems | What is provider already doing?
Effectiveness | Are current systems working to some degree?

5. Freedom of expression impact:

Concern | Assessment
Over-blocking | Will technology remove legal content?
Chilling effect | Will users self-censor due to fear of false positives?
Journalistic content | Will news reporting on terrorism be removed?

6. Privacy impact:

Concern | Assessment
Content scanning | Does technology require reading private messages?
Data collection | What user data must be collected/analyzed?
Legal compliance | Does deployment violate UK GDPR, DPA 2018, or other privacy laws?

7. Journalistic content protection (user-to-user only):

Concern | Assessment
Content of democratic importance | Will removal affect public interest journalism?
Source protection | Will technology endanger journalistic source confidentiality?

8. Alternative measures:

Question | Assessment
Less intrusive options | Could provider achieve significant harm reduction without this technology?
Proportionality | Is notice the least intrusive measure that would be effective?

124.2 — Special Rules for CSEA Development Notices

For notices requiring development/sourcing of CSEA technology:

OFCOM does NOT need to consider:

  • ❌ Freedom of expression concerns
  • ❌ Privacy law compliance
  • ❌ Journalistic content impact
  • ❌ Alternative measures

Why exemption?

CSEA content has NO freedom of expression protection — it’s always illegal.

BUT: Still must consider service characteristics, content prevalence, risk level.

Practical effect: Higher bar to challenge CSEA technology notices — provider can’t argue “privacy concerns” or “over-blocking legal content” for CSEA specifically.

124.3 — Balancing Test

OFCOM must weigh:

Harm prevention benefit
    vs
Rights interference cost

Only issue notice if:
Benefits > Costs
AND
No less intrusive alternative achieves similar benefits
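As an illustration only, the two-part test above reduces to a conjunction (the numeric scores and the function name are hypothetical; OFCOM's actual weighing is qualitative, not arithmetic):

```python
def notice_is_proportionate(benefit: float,
                            cost: float,
                            less_intrusive_works: bool) -> bool:
    """Two-part test: harm-prevention benefits must exceed the
    rights-interference cost, AND no less intrusive alternative
    may achieve similar benefits."""
    return benefit > cost and not less_intrusive_works
```

Either limb failing is fatal: even a clearly beneficial notice fails the test if a less intrusive measure would achieve similar harm reduction.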

Example:

Scenario | Proportionate?
Massive terrorism content problem, tech solves it, minimal over-blocking | ✅ YES
Small terrorism problem, tech would remove 10% legal content, major privacy invasion | ❌ NO
Medium CSEA problem, tech highly accurate, no freedom of expression issue (CSEA illegal) | ✅ YES

Section 125: Final Notice Requirements

125.1 — Notice Must Specify

When issuing final notice, OFCOM must include:

Element | Details
Technology to be used | Specific accredited tech OR standards tech must meet
Content scope | Terrorism content, CSEA content, or both
Service scope | Which parts of service (e.g., only public posts, or also private messages)
Compliance period | Reasonable time to deploy technology
Compliance measures | What provider must do to demonstrate compliance
Review date | When OFCOM will assess compliance

125.2 — Compliance Period

“Reasonable period” means:

Technology Type | Typical Period
Accredited technology (already exists) | 3-6 months
Development/sourcing notice | 12-36 months

Factors affecting timeline:

Factor | Impact
Service size | Larger = more time needed (integrate with complex systems)
Technology complexity | Custom AI = longer than off-the-shelf
Privacy engineering | Privacy-preserving deployment = more time

Provider can request extension: If unexpected technical challenges arise.

125.3 — Technology Deployment Requirements

Provider must:

  1. Deploy technology — Implement as specified in notice
  2. Apply service-wide — Cover all relevant content (can’t just do subset)
  3. Maintain effectiveness — Keep technology updated and working
  4. Report to OFCOM — Demonstrate deployment and effectiveness

Ongoing obligations: Technology must remain in use for duration specified in notice (up to 36 months for development notices).


Section 126: Compliance Review

126.1 — OFCOM Review Before Expiry

Before notice expires, OFCOM must:

Review whether provider has complied with notice requirements.

Review includes:

Assessment | Question
Technology deployed? | Did provider implement required technology?
Effective deployment? | Is technology actually identifying/removing content?
Service-wide application? | Applied across entire service, not just subset?
Continued need? | Is notice still necessary?

126.2 — Possible Outcomes

After review, OFCOM can:

Outcome | When
Confirm compliance | Provider met requirements, notice expires
Issue new notice | Continued need for technology, extend or modify
Enforcement action | Provider failed to comply, penalties apply

Notice renewal: If content problem persists, OFCOM can issue new notice extending technology requirement.


Section 127: Complaints Procedures

127.1 — User Right to Challenge Removals

Services subject to content notices must:

Provide accessible complaints procedures for users whose content is removed.

Procedure must allow users to:

  • ✅ Challenge removal decisions
  • ✅ Request human review (if automated removal)
  • ✅ Appeal if challenge rejected

Timeline: Services must respond to complaints within 48 hours (OFCOM guidance).

127.2 — Complaints Handling

Process:

User's content removed by notice-mandated technology

User files complaint: "This was removed incorrectly"

Service reviews:
├─ Technology error (false positive)? → Restore content, apologize
├─ Borderline case? → Human moderator reviews
└─ Correctly removed (terrorism/CSEA)? → Explain why, uphold removal

User can appeal to second-tier review

Transparency: Services must explain:

  • Why content was removed (which rule violated)
  • How to appeal (clear process)
  • Appeal timelines (when decision expected)
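A minimal sketch of the first-tier flow above (`Complaint`, `handle_complaint`, and the verdict labels are hypothetical names, not OFCOM terminology):

```python
from dataclasses import dataclass, field

@dataclass
class Complaint:
    content_id: str
    outcome: str = "pending"
    history: list = field(default_factory=list)

def handle_complaint(complaint: Complaint, verdict: str) -> Complaint:
    """First-tier review; verdict is one of
    'false_positive' | 'borderline' | 'correct'."""
    if verdict == "false_positive":
        complaint.outcome = "restored"       # technology error: restore content
    elif verdict == "borderline":
        complaint.outcome = "human_review"   # escalate to a moderator
    else:
        complaint.outcome = "upheld"         # explain rule violated, uphold
    complaint.history.append(verdict)        # retain an audit trail for appeals
    return complaint
```

Keeping the per-complaint history matters in practice: the second-tier review and any transparency reporting both need a record of what the first tier decided and why.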

Section 128: Appeals to Court

128.1 — Provider Right to Appeal

Providers can appeal content notices to Upper Tribunal (Section 168).

Grounds for appeal:

Ground | Argument
Not proportionate | Notice is excessive given service characteristics
Technology inappropriate | Accredited tech doesn’t work for this service type
Procedural error | OFCOM didn’t follow proper process (e.g., no skilled person report)
Current measures sufficient | Provider already effectively addressing content

Tribunal can:

  • ✅ Uphold notice (provider must comply)
  • ✅ Modify notice (change requirements)
  • ✅ Quash notice (cancel entirely)

Timeline: Appeal must be filed within 28 days of final notice issuance.


Section 129: OFCOM Reporting on Technology Development

129.1 — Annual Report Requirement

OFCOM must publish annual report on:

Technology for identifying terrorism and CSEA content.

Report includes:

Topic | Details
Accredited technologies | List of approved technologies, how they work
Development progress | Technologies being developed pursuant to notices
Effectiveness | How well technologies are working
Innovation | New technologies emerging
Challenges | Obstacles to effective detection/removal

Purpose:

  • Transparency for Parliament and public
  • Inform industry about available/emerging technologies
  • Drive innovation in content safety tech

129.2 — Technology Accreditation Process

Secretary of State accredits technology after:

  1. Technical testing — Validate effectiveness at identifying content
  2. Accuracy assessment — Measure false positive/negative rates
  3. Privacy review — Ensure compliance with privacy laws
  4. Independent evaluation — Third-party experts assess

Illustrative examples of technologies that could be accredited (effectiveness figures are indicative only, not official accreditation results):

Technology | Use Case | Indicative Effectiveness
PhotoDNA | Known CSEA image detection (hash matching) | ~99%
Microsoft Content Moderator | CSEA detection in images/video | ~95%
Google Jigsaw classifiers | Harmful/extremist text detection | ~90%
Thorn Safer | CSEA detection across multiple media types | ~95%
Custom AI models | Emerging terrorism content (e.g., AI-generated propaganda) | Varies

Practical Application for AI Agents

When Might Your Service Receive a Notice?

Red flags that could trigger OFCOM investigation:

Indicator | Why Concerning
High content prevalence | Significant amounts of terrorism/CSEA content found
Slow removal times | Content reported but not removed quickly
Repeat offenders | Same users repeatedly post harmful content
Platform design | Features facilitate spread of harmful content
Inadequate moderation | No effective systems for detecting/removing content

Example scenario:

Video sharing platform has rising terrorism content

NGOs report content to OFCOM

OFCOM investigates, finds provider's detection systems miss 40% of terrorism videos

OFCOM appoints skilled person to assess

Report confirms significant gaps in detection

OFCOM issues warning notice requiring deployment of accredited hash-matching technology for terrorism videos

Provider objects, claims current AI sufficient

OFCOM considers representations, finds AI only 60% effective

OFCOM issues final notice: Deploy the accredited technology within 4 months

Provider must comply or face penalties

Responding to a Warning Notice

If you receive a warning notice:

Step 1: Review skilled person report

  • Understand findings
  • Identify gaps in current measures
  • Assess whether criticisms are valid

Step 2: Evaluate proposed technology

  • Is accredited tech compatible with your service?
  • What would deployment cost/require?
  • Are there privacy or freedom of expression concerns?

Step 3: Prepare representations

Argument Type | Examples
Factual challenge | “Skilled person overstated content prevalence — actual rate is X%”
Technical objection | “Accredited tech doesn’t work for our service type (e.g., live streaming)”
Proportionality | “Our current measures remove 95% of content — notice not proportionate”
Alternative measures | “We propose a different approach that’s equally effective”
Privacy concerns | “Deployment would require scanning encrypted messages, violating privacy law”

Step 4: Submit representations

  • Within 28 days
  • Include evidence (data, expert opinions, technical documentation)
  • Be specific and substantive

Step 5: Engage with OFCOM

  • Meetings to discuss concerns
  • Demonstrate current effectiveness where possible
  • Propose compromises if appropriate

Complying with Final Notice

If OFCOM issues final notice despite your representations:

Deployment process:

1. Acquire technology (if accredited tech) or develop (if development notice)

2. Integration planning
├─ Technical integration (APIs, data flows)
├─ Privacy engineering (minimize data collection)
├─ False positive management (human review for borderline cases)
└─ User communication (explain new measures)

3. Testing
├─ Accuracy testing (measure detection rates)
├─ False positive testing (ensure legal content not over-blocked)
└─ Performance testing (ensure doesn't break service)

4. Deployment
├─ Phased rollout (start with subset, expand)
├─ Monitor effectiveness
└─ Adjust as needed

5. Reporting to OFCOM
├─ Demonstrate deployment
├─ Provide effectiveness data
└─ Explain any challenges

Timeline management:

Milestone | Typical Timeline
Acquire technology | 1-2 months
Integration | 2-3 months
Testing | 1 month
Deployment | 1 month
Total | 5-7 months

Extension requests: If unforeseen technical challenges, request extension with detailed justification.

Privacy-Preserving Deployment

Key challenge: Content notices may require scanning content, raising privacy concerns.

Privacy-preserving approaches:

Technique | How It Helps
On-device scanning | Scan content on user’s device, not server
Hash matching only | Compare hashes, not content itself
End-to-end encryption preservation | Scan before encryption or after decryption (client-side)
Data minimization | Only collect data necessary for detection
Anonymization | Remove identifying information from scans

Example: Encrypted messaging service

User sends image via encrypted message

On user's device (before encryption):
    PhotoDNA generates hash of image

Hash compared to known CSEA hashes

Match found → Report to provider (hash only, not image)

Provider investigates using hash

If confirmed CSEA → Remove, report to NCA

Message remains encrypted throughout (privacy preserved)
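The on-device flow above can be sketched with exact hash matching. This is a minimal sketch: real deployments use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding, whereas plain SHA-256 used here only matches byte-identical files, and the hash list shown is illustrative rather than a real database.

```python
import hashlib
from typing import Optional

# Hashes of known illegal images, distributed to the client in advance.
# (Illustrative bytes; a real list comes from an accredited database.)
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_before_encryption(image_bytes: bytes) -> Optional[str]:
    """Hash the image on-device before encryption; on a match the
    provider receives only the hash, never the image itself."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest if digest in KNOWN_HASHES else None
```

The privacy property is that the return value is a hash or nothing: unmatched content never leaves the device unencrypted, and even matched content is reported only as a digest for the provider to investigate.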

Common Compliance Mistakes

“We can’t deploy because we’re end-to-end encrypted”

  • Challenge but not impossible: Use client-side scanning (before encryption)

“Technology is too expensive”

  • Cost alone generally not valid objection (but can argue disproportionate for very small services)

“We’ll wait for OFCOM to enforce before deploying”

  • Delays harm prevention, increases penalties if ultimately found non-compliant

“We’ll only deploy in UK to minimize cost”

  • Allowed but geo-fencing complex — often easier to deploy globally

“We object because technology has false positives”

  • All technology has some false positives — provide human review for edge cases

Compliance Checklist

If you receive a warning notice:

Immediate actions:

  • Review skilled person report thoroughly
  • Assess current measures’ effectiveness (gather data)
  • Evaluate proposed technology compatibility
  • Identify privacy/freedom of expression concerns
  • Prepare representations (within 28 days)

If final notice issued:

Deployment:

  • Acquire/develop required technology
  • Plan privacy-preserving implementation
  • Integrate with existing systems
  • Test effectiveness (accuracy, false positives)
  • Phased rollout
  • Monitor and adjust

Ongoing compliance:

  • Maintain technology effectiveness
  • Provide accessible complaints procedure
  • Report to OFCOM as required
  • Prepare for compliance review before notice expiry

Key Takeaways

  1. Urgent power — OFCOM can require specific technology for terrorism/CSEA
  2. Skilled person report first — Independent assessment before notice issued
  3. Warning notice required — Provider can make representations before final notice
  4. Proportionality essential — OFCOM must consider freedom of expression, privacy, alternatives
  5. CSEA exception — Fewer safeguards for CSEA notices (no freedom of expression consideration)
  6. Reasonable compliance period — 3-6 months for accredited tech, up to 36 months for development
  7. Appeals available — Providers can challenge notices in Upper Tribunal
  8. User complaints required — Services must provide process for users to challenge removals
  9. High bar for OFCOM — Multiple procedural safeguards before notice issued
  10. Technology accreditation — Secretary of State must approve technologies before OFCOM can mandate them

Citation

Part 7, Chapter 5 — Terrorism and CSEA Content Notices, Online Safety Act 2023

Contains public sector information licensed under the Open Government Licence v3.0 where applicable. This is not legal advice. Always refer to official sources for authoritative text.
