Online Safety Act 2023: Terrorism and CSEA Content Notices
Terrorism and CSEA Content Notices [Sections 121-129]
Rule: OFCOM can require services to use specific technology to identify and remove terrorism or CSEA content if current measures are inadequate. Notices issued only after expert assessment and proportionality review.
Effective: March 2024
Section 121: Power to Issue Content Technology Notices
121.1 — What Are Content Notices?
Two types of notices OFCOM can issue:
| Notice Type | Requires Provider To |
|---|---|
| Accredited technology notice | Use technology already approved by Secretary of State for identifying/removing terrorism or CSEA content |
| Development/sourcing notice | Develop or source new technology meeting Secretary of State’s standards |
Purpose: Urgent power to ensure services effectively tackle the most harmful content (terrorism and CSEA).
121.2 — Which Services Can Receive Notices?
All regulated services where relevant:
| Service Type | Can Receive Notice? |
|---|---|
| User-to-user services | ✅ YES |
| Search services | ✅ YES |
| Combined services (both user-to-user and search) | ✅ YES |
No size exemptions:
- Category 1, 2A, 2B, or uncategorized = all can receive notices
- Small startups to tech giants = all subject to this power
121.3 — Accredited Technology Notice (User-to-User)
Provider must EITHER:
Option A: Use accredited technology
“Use accredited technology to identify all terrorism content communicated by means of the service and to swiftly take down that content.”
Option B: Deploy accredited technology with human review
Use accredited technology in combination with human moderators.
Key requirements:
| Element | Meaning |
|---|---|
| “All terrorism content” | System must identify ALL instances (not sample/portion) |
| “Swiftly take down” | Immediate removal (within hours) |
| “Accredited technology” | Pre-approved by Secretary of State as effective |
Practical effect: OFCOM forces provider to deploy specific technology (e.g., PhotoDNA for CSEA images, specific AI models for terrorism).
121.4 — Accredited Technology Notice (Search Services)
Provider must EITHER:
Option A: Use accredited technology
“Use accredited technology to identify relevant search content and to take measures in relation to that content.”
Option B: Technology + human review
Use accredited technology with moderators.
“Measures in relation to search content”:
| Measure | Example |
|---|---|
| De-index | Remove from search results |
| De-rank | Move to bottom of results |
| Content warnings | Flag as potentially illegal |
| Block access | Prevent click-through to content |
Difference from user-to-user: Search services don’t “take down” content (they don’t host it); instead, they must prevent it from appearing in, or being accessible via, search results.
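The four measures in the table above can be sketched as a simple dispatcher. This is an illustrative sketch only: `Measure`, `choose_measure`, and the confidence thresholds are hypothetical policy choices, not drawn from the Act or from any accredited technology.

```python
from enum import Enum

class Measure(Enum):
    """Measures a search service might apply instead of takedown (illustrative)."""
    DE_INDEX = "de-index"     # remove from search results entirely
    DE_RANK = "de-rank"       # demote to the bottom of results
    WARN = "content-warning"  # flag as potentially illegal
    BLOCK = "block-access"    # prevent click-through to the content

def choose_measure(confidence: float, confirmed_illegal: bool) -> Measure:
    """Map a detection to a measure. The thresholds are hypothetical,
    not values prescribed by the Act or OFCOM guidance."""
    if confirmed_illegal:
        return Measure.DE_INDEX  # confirmed terrorism/CSEA: remove from index
    if confidence >= 0.9:
        return Measure.BLOCK     # high confidence: block click-through pending review
    if confidence >= 0.6:
        return Measure.WARN      # medium confidence: warn users
    return Measure.DE_RANK       # low confidence: demote while monitoring
```

A real service would calibrate the thresholds against measured false-positive rates and log every decision for OFCOM reporting.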
121.5 — Development/Sourcing Notice
If no accredited technology exists yet:
OFCOM can require provider to:
Develop or source technology meeting Secretary of State’s technical standards.
Timeline: Notices can require development/sourcing over a period of up to 36 months.
Example scenario:
| Situation | OFCOM Action |
|---|---|
| New form of terrorism content emerges (e.g., AI-generated terror propaganda) | No accredited technology exists for it yet |
| OFCOM assesses major platforms and finds them failing to address it | Issue a development notice |
| Notice terms | Develop technology capable of identifying this content type within 12 months |
Provider responsibilities:
- Invest in R&D
- Report progress to OFCOM
- Deploy technology once developed
121.6 — Technology Deployment Options
Technology can be used:
- Alone — Automated systems only (AI/algorithms)
- With human moderators — Technology flags content, humans review
Practical balance:
| Approach | Pros | Cons |
|---|---|---|
| Technology alone | Fast, scales to billions of posts | False positives (remove legal content) |
| Technology + human review | Accuracy, context consideration | Slower, expensive |
Most services use hybrid:
- AI scans everything
- High-confidence detections = auto-remove
- Medium-confidence = human review
- Low-confidence = monitoring
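The hybrid pattern above reduces to a routing function over the detector’s confidence score. A minimal sketch, with threshold values that are purely illustrative (real services calibrate them per content type against measured false-positive rates):

```python
def route_detection(score: float,
                    auto_remove_at: float = 0.98,
                    review_at: float = 0.70) -> str:
    """Route an automated detection by its confidence score.
    The default thresholds are hypothetical, not prescribed by the Act."""
    if score >= auto_remove_at:
        return "auto-remove"   # high confidence: remove immediately
    if score >= review_at:
        return "human-review"  # medium confidence: queue for a moderator
    return "monitor"           # low confidence: log and watch for patterns
```

Lowering `auto_remove_at` removes more content faster at the cost of more false positives; the balance is exactly the trade-off in the table above.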
Section 122: Skilled Person Report Requirement
122.1 — Pre-Notice Investigation
BEFORE issuing notice, OFCOM must:
Obtain a report from a skilled person appointed by OFCOM.
Who is a “skilled person”?
| Qualification | Examples |
|---|---|
| Technical expertise | AI researchers, content moderation tech experts |
| Content expertise | Counter-terrorism specialists, child protection experts |
| Independence | No financial/business ties to provider or tech vendors |
Report must assess:
- Current measures — What provider already does
- Effectiveness — Are current measures working?
- Gaps — Where are failures occurring?
- Technology suitability — Would accredited tech address gaps?
- Proportionality — Is notice necessary and proportionate?
122.2 — Report Contents
The skilled person’s report must cover:
| Topic | Questions Addressed |
|---|---|
| Service characteristics | What type of service? User base composition? Functionalities? |
| Content prevalence | How much terrorism/CSEA content is on the service? |
| Current systems | What detection/removal systems does provider use now? |
| Effectiveness | Are current systems identifying and removing harmful content? |
| Technology need | Would accredited technology significantly improve outcomes? |
| Implementation impact | What would deploying this technology mean for the service? |
Timing: Report typically takes 2-4 months to complete (site visits, data analysis, testing).
122.3 — Provider Cooperation
Provider must cooperate with skilled person:
- ✅ Provide access to systems (APIs, moderation dashboards)
- ✅ Supply data on content prevalence, detection rates, removal times
- ✅ Explain current approaches
- ✅ Allow site visits if needed
Failure to cooperate: OFCOM enforcement powers apply (fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater).
Section 123: Warning Notice Procedure
123.1 — Warning Notice Contents
Before issuing final notice, OFCOM must issue warning notice containing:
| Element | Details |
|---|---|
| Technology specification | Which accredited technology OR what standards technology must meet |
| Content scope | Terrorism content, CSEA content, or both |
| Requirements | What provider must do (use tech, develop tech, etc.) |
| Compliance period | How long provider has to comply |
| Skilled person report summary | Key findings justifying notice |
| Right to make representations | How provider can object/respond |
123.2 — Provider Representations
Provider has opportunity to:
- ✅ Challenge skilled person findings
- ✅ Argue technology is inappropriate for their service
- ✅ Propose alternative measures
- ✅ Raise proportionality concerns
- ✅ Submit evidence of effectiveness of current measures
Timeline: Typically 28 days to submit representations.
OFCOM must consider: All representations before issuing final notice.
123.3 — OFCOM Decision After Representations
After reviewing provider’s representations, OFCOM must:
| Decision | When Made |
|---|---|
| Issue final notice | Provider’s arguments not persuasive, notice necessary |
| Modify notice | Some concerns valid, adjust requirements |
| Withdraw notice | Provider demonstrated current measures sufficient |
Transparency: OFCOM must publish explanation of decision.
Section 124: Proportionality Requirements
124.1 — Matters OFCOM Must Consider
For accredited technology notices, OFCOM must consider:
1. Service characteristics:
| Factor | Why Relevant |
|---|---|
| Kind of service | Dating app vs video platform = different tech needs |
| Functionalities | Live streaming vs static posts = different detection challenges |
| User base composition | Millions of users vs thousands = different scale |
2. Content prevalence:
| Service Type | Assessment |
|---|---|
| User-to-user | How prevalent is terrorism/CSEA content? How widely is it disseminated? |
| Search | How often does terrorism/CSEA appear in search results? |
3. Risk level:
| Factor | Assessment |
|---|---|
| Harm likelihood | How likely are UK individuals to encounter this content? |
| Harm severity | How severe is the harm caused? (terrorism/CSEA = highest severity) |
4. Current safeguards:
| Factor | Question |
|---|---|
| Existing systems | What is provider already doing? |
| Effectiveness | Are current systems working to some degree? |
5. Freedom of expression impact:
| Concern | Assessment |
|---|---|
| Over-blocking | Will technology remove legal content? |
| Chilling effect | Will users self-censor due to fear of false positives? |
| Journalistic content | Will news reporting on terrorism be removed? |
6. Privacy impact:
| Concern | Assessment |
|---|---|
| Content scanning | Does technology require reading private messages? |
| Data collection | What user data must be collected/analyzed? |
| Legal compliance | Does deployment violate GDPR, DPA 2018, or other privacy laws? |
7. Journalistic content protection (user-to-user only):
| Concern | Assessment |
|---|---|
| Content of democratic importance | Will removal affect public interest journalism? |
| Source protection | Will technology endanger journalistic source confidentiality? |
8. Alternative measures:
| Question | Assessment |
|---|---|
| Less intrusive options | Could provider achieve significant harm reduction without this technology? |
| Proportionality | Is notice the least intrusive measure that would be effective? |
124.2 — Special Rules for CSEA Development Notices
For notices requiring development/sourcing of CSEA technology:
OFCOM does NOT need to consider:
- ❌ Freedom of expression concerns
- ❌ Privacy law compliance
- ❌ Journalistic content impact
- ❌ Alternative measures
Why exemption?
CSEA content has NO freedom of expression protection — it’s always illegal.
BUT: Still must consider service characteristics, content prevalence, risk level.
Practical effect: Higher bar to challenge CSEA technology notices — provider can’t argue “privacy concerns” or “over-blocking legal content” for CSEA specifically.
124.3 — Balancing Test
OFCOM must weigh:
Harm prevention benefit
vs
Rights interference cost
↓
Only issue notice if:
Benefits > Costs
AND
No less intrusive alternative achieves similar benefits
Example:
| Scenario | Proportionate? |
|---|---|
| Massive terrorism content problem, tech solves it, minimal over-blocking | ✅ YES |
| Small terrorism problem, tech would remove 10% legal content, major privacy invasion | ❌ NO |
| Medium CSEA problem, tech highly accurate, no freedom of expression issue (CSEA illegal) | ✅ YES |
Section 125: Final Notice Requirements
125.1 — Notice Must Specify
When issuing final notice, OFCOM must include:
| Element | Details |
|---|---|
| Technology to be used | Specific accredited tech OR standards tech must meet |
| Content scope | Terrorism content, CSEA content, or both |
| Service scope | Which parts of service (e.g., only public posts, or also private messages) |
| Compliance period | Reasonable time to deploy technology |
| Compliance measures | What provider must do to demonstrate compliance |
| Review date | When OFCOM will assess compliance |
125.2 — Compliance Period
“Reasonable period” means:
| Technology Type | Typical Period |
|---|---|
| Accredited technology (already exists) | 3-6 months |
| Development/sourcing notice | 12-36 months |
Factors affecting timeline:
| Factor | Impact |
|---|---|
| Service size | Larger = more time needed (integrate with complex systems) |
| Technology complexity | Custom AI = longer than off-the-shelf |
| Privacy engineering | Privacy-preserving deployment = more time |
Provider can request extension: If unexpected technical challenges arise.
125.3 — Technology Deployment Requirements
Provider must:
- Deploy technology — Implement as specified in notice
- Apply service-wide — Cover all relevant content (can’t just do subset)
- Maintain effectiveness — Keep technology updated and working
- Report to OFCOM — Demonstrate deployment and effectiveness
Ongoing obligations: Technology must remain in use for duration specified in notice (up to 36 months for development notices).
Section 126: Compliance Review
126.1 — OFCOM Review Before Expiry
Before notice expires, OFCOM must:
Review whether provider has complied with notice requirements.
Review includes:
| Assessment | Question |
|---|---|
| Technology deployed? | Did provider implement required technology? |
| Effective deployment? | Is technology actually identifying/removing content? |
| Service-wide application? | Applied across entire service, not just subset? |
| Continued need? | Is notice still necessary? |
126.2 — Possible Outcomes
After review, OFCOM can:
| Outcome | When |
|---|---|
| Confirm compliance | Provider met requirements, notice expires |
| Issue new notice | Continued need for technology, extend or modify |
| Enforcement action | Provider failed to comply, penalties apply |
Notice renewal: If content problem persists, OFCOM can issue new notice extending technology requirement.
Section 127: Complaints Procedures
127.1 — User Right to Challenge Removals
Services subject to content notices must:
Provide accessible complaints procedures for users whose content is removed.
Procedure must allow users to:
- ✅ Challenge removal decisions
- ✅ Request human review (if automated removal)
- ✅ Appeal if challenge rejected
Timeline: Services must respond to complaints within 48 hours (OFCOM guidance).
127.2 — Complaints Handling
Process:
User's content removed by notice-mandated technology
↓
User files complaint: "This was removed incorrectly"
↓
Service reviews:
├─ Technology error (false positive)? → Restore content, apologize
├─ Borderline case? → Human moderator reviews
└─ Correctly removed (terrorism/CSEA)? → Explain why, uphold removal
↓
User can appeal to second-tier review
Transparency: Services must explain:
- Why content was removed (which rule violated)
- How to appeal (clear process)
- Appeal timelines (when decision expected)
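The first-tier triage flow above can be sketched as follows, assuming a removal record carries the classifier’s score and whether the content matched a verified hash list (both field names, and the 0.8 borderline threshold, are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class ComplaintOutcome:
    action: str       # "restore", "escalate", or "uphold"
    explanation: str  # transparency text sent to the user

def triage_complaint(classifier_score: float, hash_match: bool) -> ComplaintOutcome:
    """First-tier triage of a user complaint about a technology-driven removal."""
    if hash_match:
        # Matched a verified terrorism/CSEA hash list: uphold, explain appeal route.
        return ComplaintOutcome(
            "uphold",
            "Content matched a verified illegal-content hash list; "
            "you may appeal to second-tier review.")
    if classifier_score < 0.8:
        # Likely classifier false positive: restore the content.
        return ComplaintOutcome(
            "restore", "Removal was a technology error; content restored.")
    # Borderline classifier decision: a human moderator reviews.
    return ComplaintOutcome(
        "escalate",
        "Your complaint is being reviewed by a human moderator; "
        "a decision is expected within the published timeline.")
```

Every outcome carries an explanation, matching the transparency requirements listed above.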
Section 128: Appeals to Court
128.1 — Provider Right to Appeal
Providers can appeal content notices to the Upper Tribunal (Section 168).
Grounds for appeal:
| Ground | Argument |
|---|---|
| Not proportionate | Notice is excessive given service characteristics |
| Technology inappropriate | Accredited tech doesn’t work for this service type |
| Procedural error | OFCOM didn’t follow proper process (e.g., no skilled person report) |
| Current measures sufficient | Provider already effectively addressing content |
Tribunal can:
- ✅ Uphold notice (provider must comply)
- ✅ Modify notice (change requirements)
- ✅ Quash notice (cancel entirely)
Timeline: Appeal must be filed within 28 days of final notice issuance.
Section 129: OFCOM Reporting on Technology Development
129.1 — Annual Report Requirement
OFCOM must publish annual report on:
Technology for identifying terrorism and CSEA content.
Report includes:
| Topic | Details |
|---|---|
| Accredited technologies | List of approved technologies, how they work |
| Development progress | Technologies being developed pursuant to notices |
| Effectiveness | How well technologies are working |
| Innovation | New technologies emerging |
| Challenges | Obstacles to effective detection/removal |
Purpose:
- Transparency for Parliament and public
- Inform industry about available/emerging technologies
- Drive innovation in content safety tech
129.2 — Technology Accreditation Process
Secretary of State accredits technology after:
- Technical testing — Validate effectiveness at identifying content
- Accuracy assessment — Measure false positive/negative rates
- Privacy review — Ensure compliance with privacy laws
- Independent evaluation — Third-party experts assess
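The accuracy assessment step rests on standard confusion-matrix metrics. The Act does not prescribe formulas; these are the conventional definitions, computed here from a labelled test set:

```python
def accuracy_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Confusion-matrix metrics for a detection technology.
    tp/fp/tn/fn = true/false positives and negatives on a labelled test set."""
    return {
        "false_positive_rate": fp / (fp + tn),  # legal content wrongly flagged
        "false_negative_rate": fn / (fn + tp),  # illegal content missed
        "precision": tp / (tp + fp),            # flagged items that are truly illegal
        "recall": tp / (tp + fn),               # illegal items actually caught
    }

# Illustrative numbers: a detector that flags 990 of 1,010 illegal items and
# wrongly flags 10 of 9,990 legal items has ~98% recall, 99% precision, and
# a false-positive rate of about 0.1%.
metrics = accuracy_metrics(tp=990, fp=10, tn=9980, fn=20)
```

Headline percentages like those in the table below compress all four numbers into one figure, which is why independent evaluation looks at the full matrix.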
Example technologies (illustrative; effectiveness figures are indicative, not published accreditation results):
| Technology | Use Case | Effectiveness |
|---|---|---|
| PhotoDNA | Known CSEA image detection (hash matching) | ~99% |
| Microsoft Content Moderator | CSEA detection in images/video | ~95% |
| Google Jigsaw Perspective API | Harmful/toxic text classification (adaptable to terrorism-related text) | ~90% |
| Thorn Safer | CSEA detection across multiple media types | ~95% |
| Custom AI models | Emerging terrorism content (e.g., AI-generated propaganda) | Varies |
Practical Application for AI Agents
When Might Your Service Receive a Notice?
Red flags that could trigger OFCOM investigation:
| Indicator | Why Concerning |
|---|---|
| High content prevalence | Significant amounts of terrorism/CSEA content found |
| Slow removal times | Content reported but not removed quickly |
| Repeat offenders | Same users repeatedly post harmful content |
| Platform design | Features facilitate spread of harmful content |
| Inadequate moderation | No effective systems for detecting/removing content |
Example scenario:
Video sharing platform has rising terrorism content
↓
NGOs report content to OFCOM
↓
OFCOM investigates, finds provider's detection systems miss 40% of terrorism videos
↓
OFCOM appoints skilled person to assess
↓
Report confirms significant gaps in detection
↓
OFCOM issues warning notice requiring PhotoDNA deployment
↓
Provider objects, claims current AI sufficient
↓
OFCOM considers representations, finds AI only 60% effective
↓
OFCOM issues final notice: Deploy PhotoDNA within 4 months
↓
Provider must comply or face penalties
Responding to a Warning Notice
If you receive a warning notice:
Step 1: Review skilled person report
- Understand findings
- Identify gaps in current measures
- Assess whether criticisms are valid
Step 2: Evaluate proposed technology
- Is accredited tech compatible with your service?
- What would deployment cost/require?
- Are there privacy or freedom of expression concerns?
Step 3: Prepare representations
| Argument Type | Examples |
|---|---|
| Factual challenge | “Skilled person overstated content prevalence — actual rate is X%” |
| Technical objection | “Accredited tech doesn’t work for our service type (e.g., live streaming)” |
| Proportionality | “Our current measures remove 95% of content — notice not proportionate” |
| Alternative measures | “We propose different approach that’s equally effective” |
| Privacy concerns | “Deployment would require scanning encrypted messages, violating privacy law” |
Step 4: Submit representations
- Within 28 days
- Include evidence (data, expert opinions, technical documentation)
- Be specific and substantive
Step 5: Engage with OFCOM
- Meetings to discuss concerns
- Demonstrate current effectiveness where possible
- Propose compromises if appropriate
Complying with Final Notice
If OFCOM issues final notice despite your representations:
Deployment process:
1. Acquire technology (if accredited tech) or develop (if development notice)
↓
2. Integration planning
├─ Technical integration (APIs, data flows)
├─ Privacy engineering (minimize data collection)
├─ False positive management (human review for borderline cases)
└─ User communication (explain new measures)
↓
3. Testing
├─ Accuracy testing (measure detection rates)
├─ False positive testing (ensure legal content not over-blocked)
└─ Performance testing (ensure doesn't break service)
↓
4. Deployment
├─ Phased rollout (start with subset, expand)
├─ Monitor effectiveness
└─ Adjust as needed
↓
5. Reporting to OFCOM
├─ Demonstrate deployment
├─ Provide effectiveness data
└─ Explain any challenges
Timeline management:
| Milestone | Typical Timeline |
|---|---|
| Acquire technology | 1-2 months |
| Integration | 2-3 months |
| Testing | 1 month |
| Deployment | 1 month |
| Total | 5-7 months |
Extension requests: If unforeseen technical challenges, request extension with detailed justification.
Privacy-Preserving Deployment
Key challenge: Content notices may require scanning content, raising privacy concerns.
Privacy-preserving approaches:
| Technique | How It Helps |
|---|---|
| On-device scanning | Scan content on user’s device, not server |
| Hash matching only | Compare hashes, not content itself |
| End-to-end encryption preservation | Scan before encryption or after decryption (client-side) |
| Data minimization | Only collect data necessary for detection |
| Anonymization | Remove identifying information from scans |
Example: Encrypted messaging service
User sends image via encrypted message
↓
On user's device (before encryption):
PhotoDNA generates hash of image
↓
Hash compared to known CSEA hashes
↓
Match found → Report to provider (hash only, not image)
↓
Provider investigates using hash
↓
If confirmed CSEA → Remove, report to NCA
↓
Message remains encrypted throughout (privacy preserved)
Common Compliance Mistakes
❌ “We can’t deploy because we’re end-to-end encrypted”
- Challenging but not impossible: use client-side scanning (before encryption)
❌ “Technology is too expensive”
- Cost alone generally not valid objection (but can argue disproportionate for very small services)
❌ “We’ll wait for OFCOM to enforce before deploying”
- Delays harm prevention, increases penalties if ultimately found non-compliant
❌ “We’ll only deploy in UK to minimize cost”
- Allowed, but geo-fencing is complex; it is often easier to deploy globally
❌ “We object because technology has false positives”
- All technology has some false positives — provide human review for edge cases
Compliance Checklist
If you receive a warning notice:
Immediate actions:
- Review skilled person report thoroughly
- Assess current measures’ effectiveness (gather data)
- Evaluate proposed technology compatibility
- Identify privacy/freedom of expression concerns
- Prepare representations (within 28 days)
If final notice issued:
Deployment:
- Acquire/develop required technology
- Plan privacy-preserving implementation
- Integrate with existing systems
- Test effectiveness (accuracy, false positives)
- Phased rollout
- Monitor and adjust
Ongoing compliance:
- Maintain technology effectiveness
- Provide accessible complaints procedure
- Report to OFCOM as required
- Prepare for compliance review before notice expiry
Key Takeaways
- Urgent power — OFCOM can require specific technology for terrorism/CSEA
- Skilled person report first — Independent assessment before notice issued
- Warning notice required — Provider can make representations before final notice
- Proportionality essential — OFCOM must consider freedom of expression, privacy, alternatives
- CSEA exception — Fewer safeguards for CSEA notices (no freedom of expression consideration)
- Reasonable compliance period — 3-6 months for accredited tech, up to 36 months for development
- Appeals available — Providers can challenge notices in Upper Tribunal
- User complaints required — Services must provide process for users to challenge removals
- High bar for OFCOM — Multiple procedural safeguards before notice issued
- Technology accreditation — Secretary of State must approve technologies before OFCOM can mandate them
Citation
Part 7, Chapter 5 — Terrorism and CSEA Content Notices, Online Safety Act 2023
Related: