DSA: Minors Protection and Online Marketplace Obligations [Articles 28-32]
Rule: Online platforms accessible to minors must implement specific protections; all platforms must publish clear terms and conditions; and online marketplaces must ensure trader traceability and verify compliance.
Protection of Minors [Article 28]
Article 28(1): High Level of Privacy, Safety, and Security
Online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of:
- Privacy - Protect minors’ personal data
- Safety - Protect from harmful content and contact
- Security - Secure platform environment
“Accessible to minors” means:
- The platform does not exclude minors
- The platform can reasonably be expected to be used by minors
- Coverage is not limited to platforms targeting minors
In practice, most general-purpose platforms are covered (social media, video sharing, gaming, marketplaces).
Article 28(2): Prohibition of Targeting Minors with Ads
Platforms shall not present advertisements based on profiling using a minor’s personal data, where they are aware with reasonable certainty that the user is a minor.
“Profiling” (GDPR Art 4(4)) means:
- Any form of automated processing of personal data
- Used to evaluate personal aspects of a natural person
- In particular to analyse or predict preferences, interests, behaviour, or location
Prohibition covers:
- Behavioral advertising based on browsing history
- Interest-based advertising from data analysis
- Demographic targeting using minor’s data
- Location-based targeted ads using minor’s location
Not prohibited:
- Contextual advertising (based on content being viewed, not user)
- Non-targeted advertising (shown to all users)
- Age-appropriate ads without profiling
Examples:
| Scenario | Permitted? |
|---|---|
| Show toy ad during cartoon video | ✅ Yes (contextual, not profiling) |
| Show sneaker ad based on teen’s browsing history | ❌ No (profiling minor’s data) |
| Show educational app ad to all users under 18 | ✅ Yes (age-appropriate, not individual profiling) |
| Target specific minor based on interests | ❌ No (profiling) |
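The permitted/prohibited split in the table above can be expressed as a simple ad-serving gate. This is an illustrative sketch, not DSA-mandated logic; `AdRequest` and its field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdRequest:
    user_is_minor: bool          # outcome of the platform's age-assurance step
    uses_profiling: bool         # ad selected via automated evaluation of the user's data
    contextual_topic: Optional[str] = None  # topic of the content being viewed, if any

def ad_permitted(req: AdRequest) -> bool:
    # Art 28(2): never serve profiling-based ads to a known minor.
    # Contextual and non-targeted ads remain permitted.
    return not (req.user_is_minor and req.uses_profiling)
```

Note the rule keys on profiling, not on ad content: a contextual toy ad passes for a minor, while a profiled sneaker ad does not.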
Measures to Implement
Practical measures for minors protection:
| Measure Category | Examples |
|---|---|
| Age verification | Robust age assurance mechanisms |
| Privacy by default | Restrictive privacy settings for minors |
| Content filtering | Filter age-inappropriate content |
| Contact restrictions | Limit who can contact minors |
| Reporting tools | Easy-to-use reporting for minors |
| Parental controls | Enable parent/guardian oversight |
| Design choices | Avoid addictive design patterns for minors |
| Education | Safety information for minors and parents |
Age-appropriate design:
- Simple, clear privacy information
- Prominent safety features
- Nudges toward safer behavior
- Time limits and usage management
- Community standards enforced
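"Privacy by default" for minors usually means a separate, more restrictive settings baseline at account creation. The specific values below are hypothetical (the DSA requires a high level of privacy, not particular settings); the setting names are illustrative:

```python
# Illustrative default account settings; values are assumptions, not DSA text.
MINOR_DEFAULTS = {
    "profile_visibility": "private",
    "direct_messages": "contacts_only",
    "location_sharing": False,
    "profiling_ads": False,
    "discoverable_in_search": False,
}

ADULT_DEFAULTS = {
    "profile_visibility": "public",
    "direct_messages": "everyone",
    "location_sharing": False,
    "profiling_ads": True,
    "discoverable_in_search": True,
}

def default_settings(is_minor: bool) -> dict:
    # Minors start from the most restrictive configuration; settings may be
    # relaxed individually later, with parental oversight where appropriate.
    return dict(MINOR_DEFAULTS if is_minor else ADULT_DEFAULTS)
```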
Terms and Conditions [Article 29]
Article 29(1): Clear, Plain, Intelligible Terms
Online platforms must include in terms and conditions, in plain and intelligible language:
- Information on content moderation policies
- Algorithmic decision-making
- Internal complaint-handling procedures
“Plain and intelligible” means:
- Easy to understand for average user
- Not legal jargon
- Structured logically
- Examples provided
- Translations where appropriate
Article 29(2): Content Restrictions Information
Terms must include information on:
| Category | Details Required |
|---|---|
| Prohibited content | What content/activities not permitted |
| Moderation policies | How content moderation conducted |
| Algorithmic tools | Automated content moderation explained |
| Human review | Role of human moderators |
Must describe:
- What is not allowed (specific categories)
- How violations detected
- What happens when content removed
- Appeal processes
Example categories:
- Illegal content (child sexual abuse, terrorism, hate speech)
- Terms violations (spam, harassment, impersonation)
- Copyright infringement
- Misinformation (platform-specific policies)
Article 29(3): Algorithmic Decision-Making Information
If using automated means, terms must explain:
- Main parameters considered
- Reasons for importance
- How user can influence recommendations
Applies to:
- Content ranking/recommendations
- Search result ordering
- Ad targeting
- Content moderation automation
Example explanation:
“We use algorithms to rank posts in your feed. Main factors include:
- How recently the post was published
- Your previous interactions with similar content
- Engagement from your connections
You can adjust preferences in Settings > Feed Preferences.”
Article 29(4): Monetization Information
If platform allows monetization, terms must explain:
- Conditions for eligibility
- Revenue sharing arrangement
- Visibility/promotion effects of monetization
Covers:
- Ad revenue sharing (YouTube Partner, etc.)
- Subscriptions/tips
- Marketplace commissions
- Premium features
Article 29(5): Enforcement and Suspension
Terms must explain:
- How terms enforced
- Penalties for violations
- Suspension/termination policies
- Appeal rights
Clarity required on:
- Warning system
- Temporary vs. permanent suspensions
- What triggers each enforcement level
- How to appeal
Online Marketplace Traceability [Article 30]
Article 30(1): Traceability of Business Users
Online marketplaces (online platforms allowing consumers to conclude distance contracts with traders) must obtain the following information before allowing a trader to offer products or services:
| Information Category | Details |
|---|---|
| Identity | Name, address, telephone number, email |
| Identification | Copy of an ID document or other electronic identification |
| Payment account | IBAN or equivalent payment account details |
| Trade register | Register name and registration number, if applicable |
| Self-certification | Commitment to offer only products/services that comply with Union law |
“Trader” means:
- Acting for purposes relating to trade, business, craft, profession
- Professional selling (not occasional private sales)
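The Art 30(1) collection step can be modelled as a record plus an onboarding gate that blocks listing until mandatory fields are present. A minimal sketch; the class and field names are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraderRecord:
    # Art 30(1) information, collected before the trader may list offers.
    name: str
    address: str
    telephone: str
    email: str
    payment_account: str                          # IBAN or equivalent
    trade_register_number: Optional[str] = None   # if applicable
    self_certified_compliant: bool = False        # commitment to offer only compliant products

REQUIRED = ("name", "address", "telephone", "email", "payment_account")

def missing_fields(rec: TraderRecord) -> list:
    # Onboarding gate: listing stays blocked until every mandatory field is
    # filled and the self-certification has been given.
    gaps = [f for f in REQUIRED if not getattr(rec, f).strip()]
    if not rec.self_certified_compliant:
        gaps.append("self_certified_compliant")
    return gaps
```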
Article 30(2): Verification Obligation
The marketplace must make best efforts to verify that the information is:
- Reliable
- Complete
- Up-to-date
“Best efforts” means:
- Using reliable sources (official databases, registries)
- Cross-checking information
- Following up on inconsistencies
- Regular updates
Verification methods:
| Information Type | Verification Method |
|---|---|
| Business registration | Check trade register, company database |
| Identity | Request ID documents, cross-reference |
| Contact details | Verify email, call phone number |
| Payment account | Test transaction, verify account holder |
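One cheap, automatable component of "best efforts" on the payment-account row is the standard IBAN mod-97 structural check (ISO 13616). Passing it does not prove the account exists or belongs to the trader; it only screens out typos before the more expensive verification steps:

```python
def iban_checksum_ok(iban: str) -> bool:
    """ISO 13616 mod-97 structural check for an IBAN string."""
    s = iban.replace(" ", "").upper()
    if not (15 <= len(s) <= 34) or not s.isalnum():
        return False
    # Move country code and check digits to the end, then map letters A=10..Z=35.
    rearranged = s[4:] + s[:4]
    digits = "".join(str(int(c, 36)) for c in rearranged)
    return int(digits) % 97 == 1
```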
Article 30(3): Information Storage
The marketplace must store the collected information:
- In a secure manner
- For six months after the end of the contractual relationship with the trader
- Available to authorities on lawful request
Security requirements:
- Access controls
- Encryption at rest
- Audit logs
- GDPR compliance
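The retention rule can be reduced to a single predicate used by a deletion job. A sketch under stated assumptions: "six months" is approximated as 183 days, and `relationship_ended` is `None` while the trader is still active:

```python
from datetime import date, timedelta
from typing import Optional

SIX_MONTHS = timedelta(days=183)  # approximation used in this sketch

def must_retain(relationship_ended: Optional[date], today: date) -> bool:
    # Keep Art 30 trader data while the relationship is active and for six
    # months after it ends; beyond that, GDPR storage limitation points
    # toward deletion or anonymisation.
    if relationship_ended is None:
        return True
    return today <= relationship_ended + SIX_MONTHS
```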
Article 30(4): Trader Information Display
For each offer, the marketplace must clearly display:
- Whether the person making the offer is a trader
- If so, the information obtained under Art 30(1)
Display requirements:
- Visible on product/service listing
- Before consumer concludes contract
- Clear labeling (e.g., “Sold by [Business Name]”)
Example display:
Sold by: ABC Electronics Ltd
Business address: 123 High Street, London, UK
Registration: 12345678
Contact: support@abcelectronics.com
Article 30(5): Suspension for Non-Compliance
The marketplace must suspend a trader who:
- Provides incomplete or incorrect information
- Does not respond to verification requests
- Remains non-compliant after notice
Process:
- Notice to trader
- Opportunity to provide information
- Suspension if still non-compliant
- Statement of reasons
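The notice-remedy-suspend sequence above is essentially a small state machine. A hypothetical sketch of the escalation logic (the action names and flags are illustrative, not DSA terminology):

```python
def next_action(notice_sent: bool, deadline_passed: bool, info_complete: bool) -> str:
    # Escalation ladder for a trader whose Art 30 information is deficient:
    # notice first, then a chance to remedy, then suspension with reasons.
    if info_complete:
        return "allow_listings"
    if not notice_sent:
        return "send_notice"
    if not deadline_passed:
        return "await_response"
    return "suspend_with_statement_of_reasons"
```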
Compliance by Design [Article 31]
Article 31: Design to Comply with Law
Online platforms allowing distance contracts with traders must design and organise their online interface so that traders can comply with their obligations on pre-contractual information, compliance, and product safety information under applicable Union law.
In practice:
- Onboarding and listing flows let traders provide required consumer-rights information
- Product listing forms enable compliance with product safety law
- Clear rules about what is not permitted
- Trader obligations explained
Examples:
- Marketplace terms require traders to provide consumer rights information
- Template helps traders comply with distance selling requirements
- Clear guidance on prohibited products (unsafe goods)
- Mandatory fields for required disclosures
Platform responsibility:
- Not liable for trader violations
- But must facilitate compliance through design
- Provide tools and information
Random Checks by Marketplaces [Article 32]
Article 32(1): Sampling and Testing Obligation
Online marketplaces must conduct random checks, including through mystery shopper programs, to identify:
- Illegal products
- Products infringing intellectual property rights
- Non-compliant products/services
“Random checks” means:
- Sample-based testing
- Not examining every listing
- Statistical approach to coverage
- Risk-based targeting
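"Random but risk-based" sampling is commonly implemented as weighted random sampling without replacement: higher-risk listings are more likely to be drawn, but every listing keeps a nonzero chance, preserving the random element. A sketch using the Efraimidis-Spirakis key trick; `risk_weight` is a hypothetical caller-supplied scoring function:

```python
import random

def sample_listings(listings, k, risk_weight, seed=None):
    """Weighted random sample of k listings without replacement.

    Efraimidis-Spirakis: assign each item the key u**(1/w) for uniform u and
    weight w, then take the k largest keys.
    """
    rng = random.Random(seed)
    keyed = sorted(listings,
                   key=lambda item: rng.random() ** (1.0 / risk_weight(item)),
                   reverse=True)
    return keyed[:k]
```

For example, weighting new sellers or complaint-heavy categories 5x makes them roughly five times as likely to be checked without ever exempting the rest.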
Article 32(2): Compliance with Product Safety Law
Checks must verify compliance with:
- Product Safety Regulation
- Machinery Regulation
- Other applicable Union product compliance law
Focus areas:
| Product Category | Compliance Concerns |
|---|---|
| Toys | Safety standards, age restrictions, choking hazards |
| Electronics | CE marking, electrical safety, RoHS compliance |
| Cosmetics | Ingredient restrictions, labeling requirements |
| Food | Food safety law, allergen labeling |
| Textiles | Flammability, chemical restrictions |
Article 32(3): Mystery Shopper Programs
Mystery shopper programs involve:
- Actually purchasing products anonymously
- Testing/analyzing what is received
- Comparing to listing description
- Identifying non-compliance
Best practices:
- Regular program (quarterly, semi-annual)
- Risk-based selection (high-risk categories, new sellers, complaints history)
- Professional testing labs where appropriate
- Document findings
- Follow up with enforcement
Article 32(4): Actions on Non-Compliance
When illegal/non-compliant products identified:
- Suspend listing immediately
- Notify trader of violation
- Provide opportunity to remedy (if possible)
- Permanent removal if serious or repeated
- Notify relevant authorities (product safety authorities)
- Notify affected consumers if sold
Serious violations:
- Product safety risks
- Counterfeit goods
- Prohibited products
Practical Compliance
Implementing Minors Protection (Art 28)
Checklist:
- ✅ Determine if platform accessible to minors
- ✅ Implement age verification/assurance
- ✅ Set restrictive default privacy settings for minors
- ✅ Disable profiling-based ads for minors
- ✅ Enable contextual ads only for minors
- ✅ Provide parental controls
- ✅ Filter age-inappropriate content
- ✅ Restrict contact from strangers for minors
- ✅ Prominent reporting tools
- ✅ Safety education resources
Age assurance methods:
- Self-declaration with verification
- Document verification (ID check)
- Age estimation technology
- Parental verification
- Third-party age verification services
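Whatever assurance method is used, the platform ultimately derives a minor/adult flag from a date of birth. A minimal sketch of the age computation (the function names are illustrative; self-declared dates are the weakest signal and would be combined with stronger assurance in practice):

```python
from datetime import date

def age_on(dob: date, today: date) -> int:
    # Completed years, accounting for whether the birthday has occurred yet.
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def treat_as_minor(declared_dob: date, today: date) -> bool:
    # 18 is the age of majority assumed here; combine with document checks,
    # age estimation, or parental verification for age-sensitive features.
    return age_on(declared_dob, today) < 18
```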
Drafting Compliant Terms (Art 29)
Content moderation section:
- ✅ List prohibited content categories
- ✅ Explain detection methods
- ✅ Describe enforcement actions
- ✅ Explain appeal process
- ✅ Use plain language with examples
Algorithmic systems section:
- ✅ Describe recommender systems
- ✅ Explain main ranking parameters
- ✅ Provide user control options
- ✅ Explain how to customize
Monetization section:
- ✅ Eligibility requirements
- ✅ Revenue sharing rates
- ✅ Payment terms
- ✅ Tax responsibilities
Implementing Marketplace Traceability (Art 30)
Onboarding process:
- ✅ Create trader information collection form
- ✅ Require all mandatory fields
- ✅ Implement verification checks
- ✅ Cross-reference with official databases
- ✅ Request supporting documents
- ✅ Review and approve manually if needed
- ✅ Store information securely
- ✅ Enable trader listing only after verification
Ongoing management:
- ✅ Periodic re-verification (annually)
- ✅ Update prompts when information changes
- ✅ Monitor for red flags (complaints, suspicious activity)
- ✅ Suspend non-compliant traders
- ✅ Maintain database access for authorities
Listing display:
- ✅ Clear “Sold by” information
- ✅ Business name and address
- ✅ Contact information
- ✅ Registration number (if applicable)
- ✅ Standardized format across platform
Conducting Random Checks (Art 32)
Program design:
- ✅ Establish risk-based sampling methodology
- ✅ Define check frequency
- ✅ Select high-risk categories
- ✅ Budget for mystery shopping purchases
- ✅ Engage testing labs where needed
- ✅ Document procedures
Execution:
- ✅ Select sample randomly within risk categories
- ✅ Purchase anonymously
- ✅ Test against applicable standards
- ✅ Document findings
- ✅ Take enforcement actions
- ✅ Notify authorities of serious violations
- ✅ Report in transparency reports
Common Mistakes
Minors protection:
- Assuming an age gate alone is sufficient (substantive protections must be implemented)
- Continuing profiling-based ads for minors
- Not providing minor-specific safety features
Terms and conditions:
- Legal jargon incomprehensible to users
- Vague content policies
- No explanation of algorithms
- Hidden in lengthy documents
Marketplace traceability:
- Not verifying trader information (collection alone insufficient)
- Allowing sales before verification complete
- Not displaying trader information prominently
- Failing to suspend non-compliant traders
Random checks:
- Never actually conducting checks
- Only checking after complaints (not random)
- Not following up on identified violations
- Not notifying product safety authorities
Citations
- Digital Services Act - Article 28
- Digital Services Act - Article 29
- Digital Services Act - Article 30
- Digital Services Act - Article 31
- Digital Services Act - Article 32