EU DSA: Content Moderation Rules

Content Moderation Rules [Art 14-15, 20-21, 23]

Rule: The DSA sets requirements for how platforms moderate content, including terms and conditions, transparency reports, internal complaint mechanisms, and restrictions on suspension decisions.

Terms and Conditions [Art 14]

All intermediary services must include in their T&Cs information about:

Required Information | Description
Content restrictions | Policies on content types, account suspension
Decision-making | Procedures and safeguards for restrictions
Human review | Where and how human review is available
Internal rules | Rules on algorithmic decision-making and human review

Enforcement Consistency [Art 14(4)]

In applying and enforcing restrictions, providers must act in a:

  • Diligent manner
  • Objective manner
  • Proportionate manner

With due regard for the rights and legitimate interests of all parties, including fundamental rights.

Transparency Reporting [Art 15]

All intermediary services must publish annual transparency reports:

Report Element | Applicable To
Orders received from authorities | All intermediaries
Notices received | Hosting providers
Content moderation at own initiative | Hosting providers
Complaints received | Platforms
Automated detection use | Where applicable
Out-of-court dispute outcomes | Platforms
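For teams building reporting tooling, the scoping in the table above can be encoded as a simple lookup. The section keys and provider-type labels below are assumptions made for this sketch, not terminology from the DSA itself.

```python
# Hypothetical mapping of Art 15 report elements to provider types,
# following the table above. Names are illustrative assumptions.
REPORT_SECTIONS = {
    "orders_from_authorities": {"intermediary", "hosting", "platform"},
    "notices_received": {"hosting", "platform"},
    "own_initiative_moderation": {"hosting", "platform"},
    "automated_detection_use": {"intermediary", "hosting", "platform"},  # where applicable
    "complaints_received": {"platform"},
    "out_of_court_outcomes": {"platform"},
}

def required_sections(provider_type: str) -> list[str]:
    """List the report sections a given provider type must cover."""
    return [name for name, types in REPORT_SECTIONS.items()
            if provider_type in types]
```

A platform would need every section, while a mere-conduit intermediary would only report authority orders and (where applicable) automated detection use.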

Internal Complaint Mechanism [Art 20]

Online platforms must give recipients access to an internal complaint-handling system for at least six months after the contested decision:

Who Can Complain [Art 20(1)]

  • Recipients affected by platform decisions
  • Notice submitters whose notices were rejected

Decisions Covered [Art 20(1)]

Complaints allowed against:

  • Removal/disabling of content
  • Suspension/termination of service
  • Suspension/termination of account
  • Suspension of monetization
  • Rejection of a notice

Requirements [Art 20(4)-(6)]

Requirement | Detail
Free of charge | No fees for complainants
Easy to access | User-friendly interface
Timely | Decisions without undue delay
Not automated | Decisions cannot be solely automated if human review is requested
Reverse option | Platform must be able to reverse its decisions
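For engineers implementing a complaint pipeline, the safeguards above translate into checks a backend can enforce. The sketch below is a minimal illustration; the `Complaint` and `Resolution` names and fields are assumptions for this example, not DSA terminology.

```python
from dataclasses import dataclass

@dataclass
class Complaint:
    decision_ref: str            # the moderation decision being contested
    human_review_requested: bool
    fee_charged: float = 0.0     # must stay 0: the system is free of charge

@dataclass
class Resolution:
    uphold: bool                 # False means the original decision is reversed
    decided_by_human: bool

def validate_resolution(complaint: Complaint, resolution: Resolution) -> None:
    """Raise if a proposed resolution would breach the Art 20 safeguards."""
    if complaint.fee_charged != 0:
        raise ValueError("Art 20: complaints must be free of charge")
    if complaint.human_review_requested and not resolution.decided_by_human:
        raise ValueError("Art 20(6): resolution may not be solely automated "
                         "when human review is requested")
```

A purely automated resolution is rejected here whenever the complainant asked for human review; the reverse option is modeled by allowing `uphold=False`.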

Out-of-Court Dispute Resolution [Art 21]

Recipients have the right to select an out-of-court dispute settlement body:

  • Certified by Digital Services Coordinator
  • Independent and impartial
  • Cannot impose a binding settlement, but platforms must engage in good faith
  • Platform bears costs unless complaint manifestly unfounded

Suspension Restrictions [Art 23]

The DSA restricts how platforms may suspend or terminate accounts:

Prior Warning Requirement [Art 23(1)]

Before suspension, the platform must provide:

  • Reasonable time to respond
  • A statement of reasons

An exception applies for serious violations.

Proportionality [Art 23(2)]

Suspension must:

  • Be proportionate to the violation
  • Take all circumstances into account
  • Not be permanent for minor infractions

Misuse Suspension [Art 23(3)]

Platforms may suspend accounts that frequently submit manifestly unfounded notices or complaints:

  • After issuing prior warning
  • Proportionate duration
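The Art 23 conditions above amount to a gate a moderation backend would evaluate before suspending anyone. The sketch below illustrates that gate in Python; the severity scale, field names, and return convention are assumptions for this example, not DSA text.

```python
from dataclasses import dataclass

@dataclass
class SuspensionRequest:
    account_id: str
    severity: str            # assumed scale: "minor" | "moderate" | "serious"
    permanent: bool
    prior_warning_sent: bool

def may_suspend(req: SuspensionRequest) -> tuple[bool, str]:
    """Check the Art 23 conditions summarized above before suspending."""
    # Art 23(2): a permanent suspension for a minor infraction is disproportionate
    if req.permanent and req.severity == "minor":
        return False, "disproportionate: permanent suspension for minor infraction"
    # Art 23(1): prior warning required (serious violations excepted, per the text above)
    if not req.prior_warning_sent and req.severity != "serious":
        return False, "prior warning and statement of reasons required first"
    return True, "ok"
```

The same gate would apply to misuse suspensions under Art 23(3), where the prior warning and a proportionate, time-limited duration are likewise required.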

Automated Content Moderation [Art 14(5), Art 20(6)]

When using automated systems:

Requirement | Detail
Transparency | Disclose use in T&Cs
Human review | Available for complaints
Accuracy | VLOPs must report accuracy metrics
Not sole decision | Cannot rely solely on automation for complaints if human review is requested

Prohibition on General Monitoring [Art 8]

Important limitation:

No general obligation shall be imposed on providers to monitor information they transmit or store, nor actively to seek facts indicating illegal activity.

Content moderation is voluntary or in response to specific notices/orders.

Citation

Articles 14-15, 20-21, 23, Digital Services Act (Regulation (EU) 2022/2065)

Contains public sector information licensed under the Open Government Licence v3.0 where applicable. This is not legal advice. Always refer to official sources for authoritative text.
