DSA: Content Moderation Rules
Content Moderation Rules [Art 14-15, 20-21, 23]
Rule: The DSA sets requirements for how platforms moderate content, including terms and conditions, transparency reports, internal complaint mechanisms, and restrictions on suspension decisions.
Terms and Conditions [Art 14]
All intermediary services must include in their T&Cs information about:
| Required Information | Description |
|---|---|
| Content restrictions | Policies on content types, account suspension |
| Decision-making | Procedures and safeguards for restrictions |
| Human review | Where and how human review is available |
| Internal rules | Algorithmic decision-making and human review rules |
Enforcement Consistency [Art 14(4)]
Providers must act in a:
- Diligent manner
- Objective manner
- Proportionate manner
With due regard for the rights and legitimate interests of all parties, including fundamental rights.
Transparency Reporting [Art 15]
All intermediary services must publish annual transparency reports:
| Report Element | Applicable To |
|---|---|
| Orders received from authorities | All intermediaries |
| Notices received | Hosting providers |
| Content moderation at own initiative | Hosting providers |
| Complaints received | Platforms |
| Automated detection use | Where applicable |
| Out-of-court dispute outcomes | Platforms |
Internal Complaint Mechanism [Art 20]
Online platforms must provide access to an internal complaint-handling system:
Who Can Complain [Art 20(1)]
- Recipients affected by platform decisions
- Notice submitters whose notices were rejected
Decisions Covered [Art 20(1)]
Complaints allowed against:
- Removal/disabling of content
- Suspension/termination of service
- Suspension/termination of account
- Suspension of monetization
- Rejection of a notice
Requirements [Art 20(4)-(6)]
| Requirement | Detail |
|---|---|
| Free of charge | No fees for complainants |
| Easy to access | User-friendly interface |
| Timely | Decisions without undue delay |
| Not automated | Decisions must not be taken solely by automated means; supervision by qualified staff required |
| Reverse option | Must be able to reverse decisions |
Out-of-Court Dispute Resolution [Art 21]
Recipients have the right to select a certified out-of-court dispute settlement body:
- Certified by the Digital Services Coordinator of the Member State where the body is established
- Independent and impartial
- Platforms must engage with the body in good faith, but it cannot impose a binding settlement [Art 21(2)]
- If the body decides in the recipient's favour, the platform bears the body's fees and the recipient's reasonable expenses; recipients reimburse the platform only where they acted manifestly in bad faith [Art 21(5)]
Measures Against Misuse [Art 23]
Platforms must suspend, for a reasonable period, recipients that frequently provide manifestly illegal content:
Prior Warning Requirement [Art 23(1)]
Before suspending, platforms must:
- Issue a prior warning
- Provide a statement of reasons for the suspension [Art 17]
- Limit the suspension to a reasonable period of time
Proportionality [Art 23(3)]
Suspension decisions must be:
- Assessed case by case, in a timely, diligent and objective manner
- Proportionate, taking account of all relevant facts and circumstances (number and proportion of items, gravity of the misuse, the recipient's intention)
- Time-limited: Art 23 covers suspension only for a reasonable period, not permanent bans for minor infractions
Notice and Complaint Misuse [Art 23(2)]
Platforms must suspend the processing of notices and complaints from individuals or entities that frequently submit manifestly unfounded notices or complaints:
- After issuing a prior warning
- For a reasonable (proportionate) period of time
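The warn-then-suspend sequence for notice misuse can be sketched as a small decision function. The numeric threshold below is a loud assumption: the DSA sets no figures and requires a case-by-case assessment, so treat this only as an illustration of the ordering of steps:

```python
# Hypothetical sketch of a misuse check for frequent submitters of
# manifestly unfounded notices. The 0.8 threshold is an illustrative
# assumption only; the DSA requires case-by-case assessment.
def misuse_action(unfounded_notices: int, total_notices: int,
                  warned: bool) -> str:
    """Decide the next step: warn first, then suspend for a limited period."""
    if total_notices == 0 or unfounded_notices / total_notices < 0.8:
        return "no_action"            # not 'frequent' misuse on these facts
    if not warned:
        return "issue_prior_warning"  # warning must precede any suspension
    return "suspend_temporarily"      # proportionate, time-limited only

print(misuse_action(9, 10, warned=False))  # issue_prior_warning
print(misuse_action(9, 10, warned=True))   # suspend_temporarily
```

Note there is no permanent-ban branch: the only suspension the sketch emits is temporary, mirroring the "reasonable period of time" limit.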
Automated Content Moderation [Art 14(1), Art 15(1)(e), Art 20(6)]
When using automated systems:
| Requirement | Detail |
|---|---|
| Transparency | Disclose algorithmic decision-making in T&Cs [Art 14(1)] |
| Human review | Complaint decisions taken under supervision of qualified staff [Art 20(6)] |
| Accuracy | Report accuracy and possible rate of error in transparency reports [Art 15(1)(e)] |
| Not sole decision | Complaint decisions must not rest solely on automated means [Art 20(6)] |
Prohibition on General Monitoring [Art 8]
Important limitation:
No general obligation shall be imposed on providers to monitor the information they transmit or store, nor actively to seek facts or circumstances indicating illegal activity.
Content moderation is voluntary or in response to specific notices/orders.