EU DSA: Trusted Flaggers and Transparency Reporting [Articles 22-25]

Rule: Online platforms must establish systems for trusted flaggers to submit priority notices, take measures against misuse of their systems, and publish detailed transparency reports on content moderation activities.

Trusted Flaggers [Article 22]

Article 22(1): Definition and Treatment

“Trusted flagger”: Entity that has demonstrated particular expertise and competence for purposes of detecting, identifying, and notifying illegal content.

Platforms must:

  • Enable entities to apply for trusted flagger status
  • Process notices from trusted flaggers with priority
  • Act upon notices without undue delay

“Priority” treatment means:

  • Trusted flagger notices reviewed before general user reports
  • Dedicated review process
  • Faster response times
  • Direct communication channel

Rationale: Organizations with expertise can help platforms identify illegal content more effectively
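The priority treatment described above can be sketched as a two-tier queue. This is a minimal illustration, not a DSA-mandated design; the class and field names are hypothetical.

```python
import heapq
import itertools

# Hypothetical two-tier queue: trusted-flagger notices always dequeue
# before general user reports, FIFO within each tier.
TRUSTED, GENERAL = 0, 1

class NoticeQueue:
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # monotonic counter keeps FIFO order

    def submit(self, notice_id, trusted=False):
        tier = TRUSTED if trusted else GENERAL
        heapq.heappush(self._heap, (tier, next(self._seq), notice_id))

    def next_notice(self):
        _, _, notice_id = heapq.heappop(self._heap)
        return notice_id

q = NoticeQueue()
q.submit("user-report-1")
q.submit("tf-notice-1", trusted=True)  # arrives later, reviewed first
q.submit("user-report-2")
first = q.next_notice()  # "tf-notice-1"
```

A real implementation would also track per-tier SLAs so that "faster response times" for trusted flaggers can be demonstrated in transparency reports.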

Article 22(2): Eligibility Criteria

To qualify as trusted flagger, entity must:

| Criterion | Details |
| --- | --- |
| Particular expertise | Specialized knowledge in detecting illegal content |
| Independence | Independent of platform |
| Objectivity | Diligent and objective reports |
| Quality track record | Accurate notices with low error rate |

Examples of potential trusted flaggers:

  • NGOs focused on specific harms (child safety, hate speech, terrorism)
  • Government agencies (consumer protection, IP offices)
  • Industry associations (anti-counterfeiting groups)
  • Fact-checking organizations
  • Cybersecurity firms

Not eligible:

  • Entities with conflicts of interest
  • Those with poor accuracy record
  • Entities that misuse reporting systems

Article 22(3): Status Revocation

Platform must revoke trusted flagger status if entity:

  • No longer meets eligibility criteria
  • Submits an insufficient number of notices (dormant)
  • Submits notices that are insufficiently precise or substantiated
  • Otherwise misuses status

Due process required:

  • Inform entity of intention to revoke
  • Provide opportunity to respond
  • State reasons for revocation

Article 22(4): Publication of Trusted Flaggers

Platforms must publish:

  • Names of trusted flaggers they work with
  • Updated at least every six months
  • Publicly accessible

Purpose: Transparency about which entities have priority access

Measures Against Misuse [Article 23]

Article 23(1): Suspension of Service

Platforms must suspend, for reasonable period and after prior warning:

  • Provision of service to recipients who frequently provide manifestly illegal content
  • Processing of notices and complaints from individuals or entities who frequently submit manifestly unfounded notices or complaints

“Frequently” means:

  • Pattern of repeated behavior
  • Not isolated incidents
  • Platform determines based on policies

“Manifestly illegal” means:

  • Content clearly illegal without detailed analysis
  • Obvious violations

“Manifestly unfounded” means:

  • Notices/complaints obviously without merit
  • Malicious or abusive reporting

Article 23(2): Proportionate Application

When deciding suspension, platform must consider:

| Factor | Consideration |
| --- | --- |
| Number of illegal items | How much manifestly illegal content |
| Severity | Type of illegality |
| Consequences | Impact of violations |
| Recipient's intent | Deliberate vs. negligent |
| Freedom of expression | Right to information, opinion |

Suspension must be:

  • Proportionate to violation severity
  • After prior warning (except serious violations)
  • For reasonable period
  • Clearly communicated with reasons

Examples:

| Scenario | Proportionate Response |
| --- | --- |
| Individual uploads 3 videos promoting terrorism | Immediate suspension, referral to authorities |
| User posts 20 copyright-infringing photos | Warning, then temporary suspension if behavior continues |
| Entity submits 50 false DMCA notices | Suspend notice submission privileges |
| User makes 10 groundless complaints | Warning, then suspend complaint access |

Article 23(3): Statement of Reasons

Suspension decision must include:

  • Facts and circumstances leading to decision
  • Reference to violated terms or illegal content type
  • Information on redress
  • Duration of suspension

Same requirements as Art 17 (statement of reasons for content restrictions)

Article 23(4): Complaints About Suspension

Suspended recipients may file complaint through internal complaint-handling system (Art 20)

Platform must:

  • Process complaint fairly
  • Reverse suspension if unjustified
  • Restore access promptly if suspension lifted

Transparency Reporting for Online Platforms [Article 24]

Article 24(1): Biannual Reporting Obligation

Online platforms must publish transparency reports at least once every six months, in a clear, easily accessible, and machine-readable format.

Article 24(2): Content to be Included

Report must contain information on:

(a) Content Moderation Activities

| Category | Data Required |
| --- | --- |
| Orders from authorities | Number received, by Member State and content type |
| Notices received | Number, legal ground invoked, median time for decision |
| Own-initiative removals | Number of items removed or restricted, content type |
| Complaints received | Number through Art 20 system, basis, decisions, median time |
| Out-of-court dispute resolution | Number of disputes submitted, outcomes |
| Suspensions | Number under Art 23, reasons |

(b) Content Moderation Automated Means

If using automated tools:

  • Purpose of each tool
  • Indicators of accuracy
  • Safeguards for errors

Examples:

  • Hash-matching for CSAM
  • Text filters for hate speech
  • Image recognition for violent content
  • Spam detection algorithms
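Hash-matching, the first tool type above, can be sketched as follows. Production systems use perceptual hashes (e.g., PhotoDNA) that survive re-encoding; the exact SHA-256 comparison and the blocklist contents here are illustrative only.

```python
import hashlib

# Hypothetical blocklist of digests of known-illegal items.
known_illegal_hashes = {hashlib.sha256(b"known-illegal-sample").hexdigest()}

def matches_blocklist(content: bytes) -> bool:
    # Exact-match digest lookup; a perceptual hash would tolerate edits.
    return hashlib.sha256(content).hexdigest() in known_illegal_hashes

flagged = matches_blocklist(b"known-illegal-sample")  # True
passed = matches_blocklist(b"ordinary upload")        # False
```

Under Art 24(2)(b), a platform using such a tool would report its purpose, accuracy indicators (e.g., false-positive rate), and the safeguards applied when matches occur.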

(c) Trust and Safety Resources

| Resource | Details |
| --- | --- |
| Staff numbers | Content moderators, by category and language |
| Qualifications | Training, expertise |
| Languages covered | Languages moderation is available in |

Purpose: Demonstrate adequate resourcing for content moderation

Article 24(3): Disaggregation

Data must be disaggregated by:

  • Member State (where identifiable)
  • Content type (illegal content categories, terms violations)
  • Language of notice/complaint
  • Whether automated means used

Rationale: Enable analysis of patterns, effectiveness, potential biases
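Disaggregation amounts to grouping event records by the required dimensions. A minimal sketch, with hypothetical record fields:

```python
from collections import Counter

# Hypothetical notice records; the field names are illustrative only.
notices = [
    {"member_state": "DE", "content_type": "hate_speech", "automated": True},
    {"member_state": "DE", "content_type": "hate_speech", "automated": False},
    {"member_state": "FR", "content_type": "counterfeit", "automated": True},
]

# Counts disaggregated along two of the required dimensions.
by_state_and_type = Counter(
    (n["member_state"], n["content_type"]) for n in notices
)
by_automation = Counter(n["automated"] for n in notices)
```

Tagging each event with this metadata at logging time, rather than reconstructing it later, makes the biannual aggregation straightforward.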

Article 24(4): Format and Accessibility

Reports must be:

| Requirement | Details |
| --- | --- |
| Machine-readable | Structured data (JSON, XML, CSV) |
| Publicly available | On platform website |
| Easy to find | Clearly labeled, accessible without registration |
| Archived | Previous reports available for at least 5 years |

Best practices:

  • Interactive dashboards
  • Downloadable datasets
  • API access for researchers
  • Visualizations of key trends
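A machine-readable report can be as simple as a structured export. The keys below are an assumed schema for illustration, not one prescribed by the DSA:

```python
import json

# Illustrative report fragment with assumed field names and figures.
report = {
    "reporting_period": {"start": "2024-01-01", "end": "2024-06-30"},
    "notices_received": {"total": 1240, "median_decision_hours": 36},
    "suspensions": {"total": 18},
}

machine_readable = json.dumps(report, indent=2)  # publish next to CSV/XML exports
parsed = json.loads(machine_readable)            # round-trips losslessly
```

Publishing the same figures in JSON, CSV, and a human-readable page satisfies both the machine-readability and accessibility requirements.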

Article 24(5): First Report Deadline

First report due:

  • Within 4 months of regulation application (by June 17, 2024 for most platforms)
  • Then every 6 months thereafter

Additional Transparency for VLOPs [Article 25]

Article 25: Enhanced Transparency Obligations

Very Large Online Platforms (45 million+ average monthly active users in the EU) must include additional information in transparency reports:

| Additional Requirement | Details |
| --- | --- |
| Systemic risks | Overview of measures to mitigate risks identified in Art 34 risk assessments |
| Content recommendations | Number of impressions, main parameters for recommender systems |
| Advertising | Number of ads displayed, disaggregated by type, targeting used |
| Data access | Information on data access granted to vetted researchers under Art 40 |

Purpose: Greater accountability for platforms with significant societal impact

Article 25 Reporting Requirements Detail

(a) Risk Mitigation Measures

Report must describe:

  • Risks identified in risk assessment (Art 34)
  • Measures implemented to mitigate risks
  • Effectiveness of mitigation measures
  • Changes to measures during reporting period

Types of systemic risks (from Art 34):

  • Dissemination of illegal content
  • Actual or foreseeable negative effects on fundamental rights
  • Manipulation of service (coordinated inauthentic behavior)
  • Public security, health, minors protection, civic discourse impacts

(b) Recommender Systems Transparency

Data on content ranking:

  • Number of content impressions
  • Main parameters used for recommendations
  • Options for modifying/influencing parameters
  • Changes to recommender systems

“Main parameters” examples:

  • Engagement signals (likes, shares, time spent)
  • User connections/network
  • Content characteristics
  • Recency
  • Personalization factors

(c) Advertising Transparency

Enhanced ad reporting:

  • Total number of ads displayed
  • Breakdown by ad type
  • Use of recommender systems for ad delivery
  • Use of profiling and targeting
  • Reach by demographic (aggregated)

Already covered by Art 26, but Art 25 requires aggregate reporting

(d) Data Access for Researchers

Report on vetted researcher access (Art 40):

  • Number of access requests
  • Number approved/denied
  • Types of data provided
  • Research topics
  • Safeguards applied

Purpose: Demonstrate compliance with researcher access obligations

Practical Compliance

Implementing Trusted Flagger System

Checklist:

  1. ✅ Create application process for trusted flagger status
  2. ✅ Establish criteria for assessment
  3. ✅ Review applications against criteria
  4. ✅ Designate internal team to handle trusted flagger notices
  5. ✅ Create priority review queue
  6. ✅ Establish faster SLAs for trusted flagger notices
  7. ✅ Monitor trusted flagger performance
  8. ✅ Publish list of trusted flaggers (update every 6 months)
  9. ✅ Create revocation procedure
  10. ✅ Document all decisions

Recommended trusted flagger agreement:

  • Responsibilities of flagger
  • Platform’s commitments
  • Data protection provisions
  • Status review process
  • Termination provisions

Implementing Suspension Measures

Policy development:

  1. ✅ Define “frequently” (e.g., 3+ violations in 30 days)
  2. ✅ Define “manifestly illegal” by content type
  3. ✅ Define “manifestly unfounded” for notices
  4. ✅ Establish warning system
  5. ✅ Create suspension duration guidelines
  6. ✅ Implement statement of reasons template
  7. ✅ Connect to complaint system (Art 20)
  8. ✅ Train moderators on policy
  9. ✅ Monitor for consistency
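Step 1 above, a definition of "frequently", can be implemented as a sliding-window check. The thresholds are the illustrative ones from the checklist, not values fixed by the DSA, which leaves "frequently" to platform policy:

```python
from datetime import datetime, timedelta

# Assumed policy thresholds: 3+ manifestly illegal items in 30 days.
THRESHOLD = 3
WINDOW = timedelta(days=30)

def meets_frequency_threshold(violation_times, now):
    # Count only violations inside the rolling window.
    recent = [t for t in violation_times if now - t <= WINDOW]
    return len(recent) >= THRESHOLD

now = datetime(2024, 6, 1)
history = [datetime(2024, 5, 5), datetime(2024, 5, 20), datetime(2024, 5, 30)]
frequent = meets_frequency_threshold(history, now)  # True: 3 hits in window
```

A rolling window avoids penalizing old, isolated incidents, which aligns with the requirement that suspension target patterns rather than one-offs.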

Proportionality assessment:

  • Severity scoring system
  • Escalation ladder (warning → temp suspension → permanent)
  • Manual review for serious cases
  • Fundamental rights considerations

Creating Transparency Reports

Data collection systems:

  1. ✅ Log all relevant events (notices, removals, complaints, orders)
  2. ✅ Tag with required metadata (Member State, content type, language)
  3. ✅ Track timing (receipt to decision)
  4. ✅ Record automated tool usage
  5. ✅ Maintain resource statistics (staff, languages)
  6. ✅ Export capabilities for reporting

Report production:

  1. ✅ Extract data for reporting period
  2. ✅ Disaggregate as required
  3. ✅ Calculate medians and aggregates
  4. ✅ Describe automated tools
  5. ✅ Include resource information
  6. ✅ For VLOPs: add systemic risk, recommender, ad, research data
  7. ✅ Generate machine-readable format
  8. ✅ Create human-readable version
  9. ✅ Internal review
  10. ✅ Publish on website
  11. ✅ Archive previous reports
  12. ✅ Notify Digital Services Coordinator
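Step 3 above (medians and aggregates) can be sketched as follows, using hypothetical receipt/decision timestamps:

```python
from statistics import median
from datetime import datetime

# Hypothetical (received, decided) timestamp pairs for notices.
handled = [
    (datetime(2024, 3, 1, 9, 0), datetime(2024, 3, 1, 21, 0)),  # 12 h
    (datetime(2024, 3, 2, 9, 0), datetime(2024, 3, 3, 9, 0)),   # 24 h
    (datetime(2024, 3, 4, 9, 0), datetime(2024, 3, 4, 15, 0)),  # 6 h
]

hours = [(decided - received).total_seconds() / 3600
         for received, decided in handled]
median_hours = median(hours)  # Art 24 asks for medians, not means
```

The median is required precisely because it is robust to a few long-running edge cases that would skew an average.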

Publishing best practices:

  • Dedicated transparency page
  • Summary dashboard with key metrics
  • Downloadable structured data
  • Explanatory text for context
  • Comparative data (previous periods)
  • Interactive visualizations

Common Mistakes

Treating trusted flaggers same as users:

  • Must provide priority treatment
  • Faster review and action required
  • Direct communication channel

No clear criteria for trusted flagger status:

  • Must have objective eligibility criteria
  • Document assessment process
  • Transparent about decisions

Suspending without warning:

  • Prior warning required (except serious cases)
  • Clear communication of violations
  • Statement of reasons mandatory

Not considering proportionality:

  • Must balance severity, intent, fundamental rights
  • Escalation approach recommended
  • Permanent bans only for serious/repeated violations

Transparency reports missing required data:

  • All Art 24(2) elements mandatory
  • Must disaggregate by Member State, content type, language
  • Machine-readable format required

Reports not accessible:

  • Must be publicly available
  • Easy to find on website
  • No registration barriers
  • Archived for 5 years

VLOPs not including enhanced data:

  • Art 25 additional requirements mandatory
  • Systemic risk mitigation reporting
  • Recommender system data
  • Researcher access information

Sources

Contains public sector information licensed under the Open Government Licence v3.0 where applicable. This is not legal advice. Always refer to official sources for authoritative text.
