DSA: Common Scenarios
Practical guidance for applying the DSA to real-world situations.
Scenario 1: User Reports Illegal Content
Question: A user reports a post containing hate speech. What must we do?
Answer:
- Acknowledge the notice without undue delay
- Review the content diligently and objectively
- Decide whether to remove/disable
- Inform both the reporter and the poster of the decision
- Provide a statement of reasons if you act: explain the legal ground, the facts relied on, and the redress options available
Citation: Art 16, Art 17
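The steps above can be sketched as a simple workflow. This is an illustrative sketch only, not a compliance implementation; the function and field names (`Notice`, `handle_notice`, `review_fn`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str

def handle_notice(notice: Notice, review_fn) -> list:
    """Hypothetical sketch of a notice-and-action flow (Art 16/17).
    review_fn(content_id, reason) -> bool: True means remove/disable."""
    events = ["acknowledged"]                              # confirm receipt without undue delay
    remove = review_fn(notice.content_id, notice.reason)   # diligent, objective review
    events.append("removed" if remove else "kept")
    events.append("reporter_informed")                     # inform reporter and poster of the decision
    if remove:
        events.append("statement_of_reasons")              # Art 17: legal ground, facts, redress
    return events
```

Each recorded event corresponds to one bullet in the answer above; in practice each step would involve logging, notification templates, and human review tooling.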
Scenario 2: Removing Content Without Notice
Question: Can we proactively remove content without waiting for a notice?
Answer: Yes. Platforms can moderate content on their own initiative. However:
- Still must provide statement of reasons to affected user
- Must follow T&C consistently
- Cannot be required to do general monitoring (Art 8)
- Voluntary moderation doesn’t create liability (Art 7)
Citation: Art 7, Art 8, Art 17
Scenario 3: Labeling Ads
Question: How must we label advertisements on our platform?
Answer:
- Mark clearly as “Ad”, “Sponsored”, or equivalent
- Show who paid for the ad (person/company name)
- Show why the user was targeted (main parameters)
- Display in real-time with each ad
- Do NOT target using sensitive categories (health, politics, religion)
- Do NOT target minors via profiling
Citation: Art 26, Art 28(2)
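The two targeting prohibitions can be expressed as a pre-flight check on an ad request. A minimal sketch, assuming hypothetical category labels; the DSA points to the GDPR Art 9 special categories of personal data.

```python
# Illustrative labels for GDPR Art 9 special categories (not an exhaustive list)
SENSITIVE_CATEGORIES = {"health", "political_opinions", "religion",
                        "ethnicity", "sexual_orientation"}

def targeting_allowed(categories: set, user_is_minor: bool, uses_profiling: bool) -> bool:
    """Hypothetical pre-flight check mirroring Art 26(3) and Art 28(2)."""
    if uses_profiling and user_is_minor:
        return False    # no profiling-based ads to minors (Art 28(2))
    if categories & SENSITIVE_CATEGORIES:
        return False    # no targeting on special-category data (Art 26(3))
    return True
```

Note that a minor may still see non-profiled (e.g., contextual) advertising; only profiling-based targeting is barred.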
Scenario 4: Explaining the Algorithm
Question: A user asks why they see certain content. What must we explain?
Answer: Your T&Cs must disclose:
- Main parameters of your recommender system (engagement, interests, recency, etc.)
- Options for users to modify/influence recommendations
- For VLOPs: must offer at least one non-profiling option (e.g., chronological)
The explanation should be in plain, intelligible language.
Citation: Art 27, Art 38
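The VLOP non-profiling option amounts to a user-controlled toggle in the ranking pipeline. A minimal sketch with illustrative field names (`posted_at`, `engagement`); a real recommender would be far more involved, but the fallback path must not rely on profiling.

```python
def rank_feed(posts: list, use_profiling: bool) -> list:
    """Hypothetical Art 38-style toggle: when the user opts out of
    profiling, fall back to a purely chronological ordering."""
    if not use_profiling:
        # Chronological feed: newest first, no personal data used
        return sorted(posts, key=lambda p: p["posted_at"], reverse=True)
    # Engagement-weighted ordering stands in for a real recommender here
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)
```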
Scenario 5: Are We a VLOP?
Question: We have 50 million users globally but only 30 million in the EU. Are we a VLOP?
Answer: No. The threshold is 45 million average monthly active recipients in the EU. Your EU user count (30M) is below threshold.
If you reach 45M EU users, you must:
- Report user numbers to Commission
- Await formal designation
- Comply within 4 months of designation
Citation: Art 33
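The threshold test itself is trivial to state in code; the hard part in practice is methodology for counting average monthly active recipients. A sketch:

```python
VLOP_THRESHOLD = 45_000_000  # Art 33: average monthly active recipients in the EU

def meets_vlop_threshold(eu_monthly_active_recipients: int) -> bool:
    """Only EU recipients count; global user numbers are irrelevant."""
    return eu_monthly_active_recipients >= VLOP_THRESHOLD
```

For the platform in this scenario: 30 million EU users is below the threshold, despite 50 million globally.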
Scenario 6: Small Platform Exemptions
Question: We’re a small hosting provider with 100 users. Do all DSA rules apply?
Answer: Reduced obligations apply. The DSA has tiered obligations:
| Your Size | Obligations |
|---|---|
| All intermediaries | Contact point, legal rep, T&C transparency |
| Hosting (any size) | + Notice-and-action, statement of reasons |
| Platforms (non-micro/small) | + Complaints, trusted flaggers, ad transparency |
| VLOPs (45M+ EU users) | + Risk assessment, audits, enhanced transparency |
Small/micro enterprises (< 50 employees, < €10M turnover) are exempt from some platform-specific obligations.
Citation: Art 19
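The cumulative tiers in the table can be modeled as nested obligation sets. A sketch with illustrative labels; the micro/small carve-out (Art 19) is modeled as falling back to the hosting-tier duties, since those still apply.

```python
def obligations(tier: str, micro_or_small: bool = False) -> list:
    """Hypothetical lookup of cumulative DSA obligation tiers."""
    base = ["contact point", "legal rep", "T&C transparency"]
    hosting = base + ["notice-and-action", "statement of reasons"]
    platform = hosting + ["complaints", "trusted flaggers", "ad transparency"]
    vlop = platform + ["risk assessment", "audits", "enhanced transparency"]
    if tier == "intermediary":
        return base
    if tier == "hosting":
        return hosting
    if tier == "platform":
        # Art 19: micro/small enterprises exempt from platform-specific duties
        return hosting if micro_or_small else platform
    if tier == "vlop":
        return vlop
    raise ValueError(f"unknown tier: {tier}")
```

So the 100-user hosting provider in this scenario carries the hosting-tier duties but escapes the platform-only ones.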
Scenario 7: User Appeals Content Removal
Question: A user disagrees with our removal decision. What must we offer?
Answer:
- Internal complaint-handling system (Art 20)
  - Free of charge
  - Easy to access
  - Timely, diligent, and objective decisions
  - Decisions cannot be made solely by automated means
- Out-of-court dispute settlement (Art 21)
  - User can choose any certified dispute settlement body
  - Decisions are not binding, but the platform must engage with the body in good faith
Citation: Art 20, Art 21
Scenario 8: Marketplace Seller Verification
Question: We run an online marketplace. What do we need from sellers?
Answer: Before allowing sellers to offer products, you must collect and verify:
| Required Information | Verification |
|---|---|
| Name, address, contact | Collect |
| ID document (individuals) | Collect |
| Trade register (companies) | Verify in database |
| Bank account details | Collect |
| Self-certification of compliance | Collect |
You must make seller identity visible to buyers and use best efforts to verify information.
Citation: Art 30-31
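The collect-vs-verify split in the table can be sketched as an onboarding gate. All names here are hypothetical, and `register_lookup` stands in for a best-efforts query against an official database.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trader:
    name: str
    address: str
    contact: str
    id_document: Optional[str]        # individuals: copy of ID document
    trade_register_no: Optional[str]  # companies: trade register number
    payment_account: str
    self_certification: bool          # self-certifies compliant products only

def onboarding_complete(t: Trader, register_lookup) -> bool:
    """Hypothetical Art 30 gate before a trader may list products.
    register_lookup(number) -> bool models an official-database check."""
    if not (t.name and t.address and t.contact and t.payment_account):
        return False
    if not t.self_certification:
        return False
    if t.trade_register_no is not None:
        return register_lookup(t.trade_register_no)  # best-efforts verification
    return t.id_document is not None  # individuals: ID document on file
```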
Scenario 9: Crisis Event Response
Question: There’s a public health emergency. Does the Commission have special powers?
Answer: Yes. During crises (war, terrorism, pandemic, natural disaster), the Commission can:
- Require VLOPs to assess their contribution to the threat
- Require specific, proportionate mitigation measures
- Require reporting on actions taken
This is temporary and must be necessary and proportionate.
Citation: Art 36
Scenario 10: Researcher Data Access
Question: A university researcher wants access to our platform data. Must we provide it?
Answer: For VLOPs: Yes, under certain conditions:
- Researcher must be “vetted” (affiliated with research organization, independent, nonprofit purpose)
- Request made through Digital Services Coordinator
- Data access subject to security and privacy safeguards
- Proportionate to research purpose
For non-VLOPs: No mandatory data access obligation.
Citation: Art 40
Quick Reference Table
| Scenario | Applies To | Key Articles |
|---|---|---|
| Content removal notice | Hosting + | Art 16-17 |
| Labeling ads | Platforms + | Art 26 |
| Recommender transparency | Platforms + | Art 27 |
| Non-profiling option | VLOPs only | Art 38 |
| Ad repository | VLOPs only | Art 39 |
| Risk assessment | VLOPs only | Art 34 |
| Seller verification | Marketplaces | Art 30-31 |
| Small enterprise exemption | Platforms | Art 19 |