EU DSA: Minors Protection and Online Marketplace Obligations [Articles 28-32]

Rule: Online platforms accessible to minors must implement specific protections; all platforms must have clear terms and conditions; and online marketplaces must ensure trader traceability and compliance verification.

Protection of Minors [Article 28]

Article 28(1): High Level of Privacy, Safety, and Security

Online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of:

  • Privacy - Protect minors’ personal data
  • Safety - Protect from harmful content and contact
  • Security - Secure platform environment

“Accessible to minors” means:

  • Platform does not exclude minors
  • Can reasonably be expected to be used by minors
  • Not limited to platforms targeting minors

In practice, most general-purpose platforms are covered (social media, video sharing, gaming, marketplaces).

Article 28(2): Prohibition of Targeting Minors with Ads

Platforms shall not present advertisements based on profiling using a minor’s personal data.

“Profiling” (GDPR Art 4(4)) means:

  • Automated processing of personal data
  • To evaluate personal aspects of a natural person (interests, preferences, behaviour, location, etc.)

Article 28(2) applies this definition to advertising: ads must not be selected for a minor on the basis of such profiling.

Prohibition covers:

  • Behavioral advertising based on browsing history
  • Interest-based advertising from data analysis
  • Demographic targeting using minor’s data
  • Location-based targeted ads using minor’s location

Not prohibited:

  • Contextual advertising (based on content being viewed, not user)
  • Non-targeted advertising (shown to all users)
  • Age-appropriate ads without profiling

Examples:

| Scenario | Permitted? |
| --- | --- |
| Show toy ad during cartoon video | ✅ Yes (contextual, not profiling) |
| Show sneaker ad based on teen’s browsing history | ❌ No (profiling minor’s data) |
| Show educational app ad to all users under 18 | ✅ Yes (age-appropriate, not individual profiling) |
| Target specific minor based on interests | ❌ No (profiling) |
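The decision logic behind the table reduces to a single test once age status is known. A minimal sketch (the field names are illustrative, not DSA terminology):

```python
from dataclasses import dataclass

@dataclass
class AdRequest:
    user_is_minor: bool    # established via an age-assurance signal
    uses_profiling: bool   # ad was selected by evaluating the user's personal data

def ad_permitted(req: AdRequest) -> bool:
    """Apply the Art 28(2) rule from the table above: profiling-based ads
    are prohibited for minors; contextual/non-targeted ads remain allowed."""
    if not req.user_is_minor:
        return True  # Art 28(2) restricts only ads presented to minors
    return not req.uses_profiling
```

Note that contextual ads pass this test because they are chosen from the content being viewed, not from the user's data.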

Measures to Implement

Practical measures for minors protection:

| Measure Category | Examples |
| --- | --- |
| Age verification | Robust age assurance mechanisms |
| Privacy by default | Restrictive privacy settings for minors |
| Content filtering | Filter age-inappropriate content |
| Contact restrictions | Limit who can contact minors |
| Reporting tools | Easy-to-use reporting for minors |
| Parental controls | Enable parent/guardian oversight |
| Design choices | Avoid addictive design patterns for minors |
| Education | Safety information for minors and parents |

Age-appropriate design:

  • Simple, clear privacy information
  • Prominent safety features
  • Nudges toward safer behavior
  • Time limits and usage management
  • Community standards enforced
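The "privacy by default" measure above amounts to applying a restrictive preset whenever an account is identified as a minor's. A sketch with hypothetical setting names:

```python
# Hypothetical setting names; the point is that the minor preset is the
# restrictive one and is applied by default, not opt-in.
STANDARD_DEFAULTS = {
    "profile_visibility": "public",
    "messages_from": "anyone",
    "share_location": True,
    "personalised_ads": True,
}
MINOR_DEFAULTS = {
    "profile_visibility": "private",
    "messages_from": "contacts_only",
    "share_location": False,
    "personalised_ads": False,   # also required by Art 28(2)
}

def default_settings(is_minor: bool) -> dict:
    """Return a copy of the appropriate preset for a new account."""
    return dict(MINOR_DEFAULTS if is_minor else STANDARD_DEFAULTS)
```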

Terms and Conditions [Article 29]

Article 29(1): Clear, Plain, Intelligible Terms

Online platforms must include in terms and conditions, in plain and intelligible language:

  • Information on content moderation policies
  • Algorithmic decision-making
  • Internal complaint-handling procedures

“Plain and intelligible” means:

  • Easy to understand for average user
  • Not legal jargon
  • Structured logically
  • Examples provided
  • Translations where appropriate

Article 29(2): Content Restrictions Information

Terms must include information on:

| Category | Details Required |
| --- | --- |
| Prohibited content | What content/activities are not permitted |
| Moderation policies | How content moderation is conducted |
| Algorithmic tools | How automated content moderation works |
| Human review | Role of human moderators |

Must describe:

  • What is not allowed (specific categories)
  • How violations detected
  • What happens when content removed
  • Appeal processes

Example categories:

  • Illegal content (child sexual abuse, terrorism, hate speech)
  • Terms violations (spam, harassment, impersonation)
  • Copyright infringement
  • Misinformation (platform-specific policies)

Article 29(3): Algorithmic Decision-Making Information

If using automated means, terms must explain:

  • Main parameters considered
  • Reasons for importance
  • How user can influence recommendations

Applies to:

  • Content ranking/recommendations
  • Search result ordering
  • Ad targeting
  • Content moderation automation

Example explanation:

“We use algorithms to rank posts in your feed. Main factors include:

  • How recently posted
  • Your previous interactions with similar content
  • Engagement from your connections

You can adjust preferences in Settings > Feed Preferences.”
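The example disclosure above maps onto a ranking function with three weighted signals. A toy sketch (weights and field names are illustrative, not any real platform's algorithm):

```python
import math
import time

DEFAULT_WEIGHTS = {"recency": 0.5, "affinity": 0.3, "social": 0.2}

def feed_score(post, weights, now=None):
    """Toy score combining the three parameters named in the example terms:
    recency, the user's past interactions with similar content, and
    engagement from the user's connections."""
    now = time.time() if now is None else now
    recency = math.exp(-(now - post["posted_at"]) / 3600.0)  # hourly decay
    return (weights["recency"] * recency
            + weights["affinity"] * post["similar_interactions"]
            + weights["social"] * post["connection_engagement"])
```

A user-facing preference control can simply adjust the weights, which is what "how the user can influence recommendations" amounts to in this sketch.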

Article 29(4): Monetization Information

If platform allows monetization, terms must explain:

  • Conditions for eligibility
  • Revenue sharing arrangement
  • Visibility/promotion effects of monetization

Covers:

  • Ad revenue sharing (YouTube Partner, etc.)
  • Subscriptions/tips
  • Marketplace commissions
  • Premium features

Article 29(5): Enforcement and Suspension

Terms must explain:

  • How terms enforced
  • Penalties for violations
  • Suspension/termination policies
  • Appeal rights

Clarity required on:

  • Warning system
  • Temporary vs. permanent suspensions
  • What triggers each enforcement level
  • How to appeal

Online Marketplace Traceability [Article 30]

Article 30(1): Traceability of Business Users

Online marketplaces (platforms allowing consumers to conclude distance contracts with traders) must obtain and verify:

Before allowing a trader to offer products/services, collect:

| Information Category | Details |
| --- | --- |
| Identity | Name, address, telephone, email |
| Payment account | IBAN or equivalent |
| Trade register | Registration number, if applicable |
| Self-certification | Declaration of being a trader or not |

“Trader” means:

  • Acting for purposes relating to trade, business, craft, profession
  • Professional selling (not occasional private sales)
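The Art 30(1) collection step is mechanical: block listings until every mandatory field is present. A minimal sketch (field names are illustrative):

```python
# Fields from the Art 30(1) information table; the trade register number is
# mandatory only "if applicable", so it would be checked separately.
REQUIRED_TRADER_FIELDS = (
    "name", "address", "telephone", "email",
    "payment_account", "self_certification",
)

def missing_fields(record: dict) -> set:
    """Return the mandatory fields that are absent or blank; the trader's
    offerings should stay blocked until this set is empty."""
    return {f for f in REQUIRED_TRADER_FIELDS
            if not str(record.get(f, "")).strip()}
```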

Article 30(2): Verification Obligation

Marketplace must make best efforts to verify information is:

  • Reliable
  • Complete
  • Up-to-date

“Best efforts” means:

  • Using reliable sources (official databases, registries)
  • Cross-checking information
  • Following up on inconsistencies
  • Regular updates

Verification methods:

| Information Type | Verification Method |
| --- | --- |
| Business registration | Check trade register, company database |
| Identity | Request ID documents, cross-reference |
| Contact details | Verify email, call phone number |
| Payment account | Test transaction, verify account holder |
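For the payment-account row, a cheap automated first pass is the ISO 13616 mod-97 checksum on the IBAN. It only proves the number is well-formed, not that the trader owns the account, so it complements rather than replaces the methods above:

```python
def iban_checksum_ok(iban: str) -> bool:
    """ISO 13616 mod-97 check for an IBAN. A format-level plausibility test
    only: it does not verify the account holder's identity."""
    s = iban.replace(" ", "").upper()
    if not (5 <= len(s) <= 34 and s.isalnum()):
        return False
    rearranged = s[4:] + s[:4]  # move country code and check digits to the end
    digits = "".join(str(int(ch, 36)) for ch in rearranged)  # A=10 ... Z=35
    return int(digits) % 97 == 1
```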

Article 30(3): Information Storage

Marketplaces must store the information:

  • In a secure database
  • For 6 months after the end of the contractual relationship with the trader
  • Available to authorities on request

Security requirements:

  • Access controls
  • Encryption at rest
  • Audit logs
  • GDPR compliance
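The 6-month retention window above calls for calendar-month arithmetic rather than a fixed day count. A small helper (assumes the relationship-end date is tracked per trader):

```python
import calendar
from datetime import date

def retention_expiry(relationship_ended: date, months: int = 6) -> date:
    """Date from which the trader record may be purged: `months` calendar
    months after the given end date, clamping e.g. Aug 31 -> Feb 28."""
    m = relationship_ended.month - 1 + months
    year = relationship_ended.year + m // 12
    month = m % 12 + 1
    day = min(relationship_ended.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```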

Article 30(4): Trader Information Display

For each offer, marketplace must display clearly:

  • Whether offering person is trader or not
  • If trader, information obtained under Art 30(1)

Display requirements:

  • Visible on product/service listing
  • Before consumer concludes contract
  • Clear labeling (e.g., “Sold by [Business Name]”)

Example display:

Sold by: ABC Electronics Ltd
Business address: 123 High Street, London, UK
Registration: 12345678
Contact: support@abcelectronics.com

Article 30(5): Suspension for Non-Compliance

Marketplace must suspend trader for:

  • Providing incomplete or incorrect information
  • Not responding to verification requests
  • Non-compliance after notice

Process:

  • Notice to trader
  • Opportunity to provide information
  • Suspension if still non-compliant
  • Statement of reasons

Compliance by Design [Article 31]

Article 31: Design to Comply with Law

Online marketplaces must design and organise their online interfaces in a manner that enables traders to comply with their obligations under applicable law.

Means:

  • Terms enable users to comply with consumer protection law
  • Product descriptions enable compliance with product safety law
  • Clear rules about what is not permitted
  • Trader obligations explained

Examples:

  • Marketplace terms require traders to provide consumer rights information
  • Template helps traders comply with distance selling requirements
  • Clear guidance on prohibited products (unsafe goods)
  • Mandatory fields for required disclosures

Platform responsibility:

  • Not liable for trader violations
  • But must facilitate compliance through design
  • Provide tools and information

Random Checks by Marketplaces [Article 32]

Article 32(1): Sampling and Testing Obligation

Online marketplaces must conduct random checks, including through mystery shopper programs, to identify:

  • Illegal products
  • Products infringing intellectual property rights
  • Non-compliant products/services

“Random checks” means:

  • Sample-based testing
  • Not examining every listing
  • Statistical approach to coverage
  • Risk-based targeting
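"Random" and "risk-based" are reconciled by weighted sampling: every live listing has a non-zero chance of selection, but higher-risk listings are drawn more often. A sketch using a hypothetical per-listing risk score:

```python
import random

def pick_for_checking(listings, k, seed=None):
    """Draw k listings for an Art 32-style check. 'risk' is a hypothetical
    score (e.g. from category, seller age, complaint history); the floor
    keeps the sample genuinely random by leaving no listing with zero
    probability. random.choices samples with replacement, so de-duplicate
    the result if each listing should be checked at most once."""
    rng = random.Random(seed)
    weights = [max(item.get("risk", 0.0), 0.05) for item in listings]
    return rng.choices(listings, weights=weights, k=k)
```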

Article 32(2): Compliance with Product Safety Law

Checks must verify compliance with:

  • General Product Safety Regulation
  • Machinery Regulation
  • Other applicable Union product compliance law

Focus areas:

| Product Category | Compliance Concerns |
| --- | --- |
| Toys | Safety standards, age restrictions, choking hazards |
| Electronics | CE marking, electrical safety, RoHS compliance |
| Cosmetics | Ingredient restrictions, labeling requirements |
| Food | Food safety law, allergen labeling |
| Textiles | Flammability, chemical restrictions |

Article 32(3): Mystery Shopper Programs

Mystery shopper programs involve:

  • Actually purchasing products anonymously
  • Testing/analyzing what is received
  • Comparing to listing description
  • Identifying non-compliance

Best practices:

  • Regular program (quarterly, semi-annual)
  • Risk-based selection (high-risk categories, new sellers, complaints history)
  • Professional testing labs where appropriate
  • Document findings
  • Follow up with enforcement

Article 32(4): Actions on Non-Compliance

When illegal/non-compliant products identified:

  1. Suspend listing immediately
  2. Notify trader of violation
  3. Provide opportunity to remedy (if possible)
  4. Permanent removal if serious or repeated
  5. Notify relevant authorities (product safety authorities)
  6. Notify affected consumers if sold

Serious violations:

  • Product safety risks
  • Counterfeit goods
  • Prohibited products

Practical Compliance

Implementing Minors Protection (Art 28)

Checklist:

  1. ✅ Determine if platform accessible to minors
  2. ✅ Implement age verification/assurance
  3. ✅ Set restrictive default privacy settings for minors
  4. ✅ Disable profiling-based ads for minors
  5. ✅ Enable contextual ads only for minors
  6. ✅ Provide parental controls
  7. ✅ Filter age-inappropriate content
  8. ✅ Restrict contact from strangers for minors
  9. ✅ Prominent reporting tools
  10. ✅ Safety education resources

Age assurance methods:

  • Self-declaration with verification
  • Document verification (ID check)
  • Age estimation technology
  • Parental verification
  • Third-party age verification services

Drafting Compliant Terms (Art 29)

Content moderation section:

  1. ✅ List prohibited content categories
  2. ✅ Explain detection methods
  3. ✅ Describe enforcement actions
  4. ✅ Explain appeal process
  5. ✅ Use plain language with examples

Algorithmic systems section:

  1. ✅ Describe recommender systems
  2. ✅ Explain main ranking parameters
  3. ✅ Provide user control options
  4. ✅ Explain how to customize

Monetization section:

  1. ✅ Eligibility requirements
  2. ✅ Revenue sharing rates
  3. ✅ Payment terms
  4. ✅ Tax responsibilities

Implementing Marketplace Traceability (Art 30)

Onboarding process:

  1. ✅ Create trader information collection form
  2. ✅ Require all mandatory fields
  3. ✅ Implement verification checks
  4. ✅ Cross-reference with official databases
  5. ✅ Request supporting documents
  6. ✅ Review and approve manually if needed
  7. ✅ Store information securely
  8. ✅ Enable trader listing only after verification

Ongoing management:

  1. ✅ Periodic re-verification (annually)
  2. ✅ Update prompts when information changes
  3. ✅ Monitor for red flags (complaints, suspicious activity)
  4. ✅ Suspend non-compliant traders
  5. ✅ Maintain database access for authorities

Listing display:

  1. ✅ Clear “Sold by” information
  2. ✅ Business name and address
  3. ✅ Contact information
  4. ✅ Registration number (if applicable)
  5. ✅ Standardized format across platform

Conducting Random Checks (Art 32)

Program design:

  1. ✅ Establish risk-based sampling methodology
  2. ✅ Define check frequency
  3. ✅ Select high-risk categories
  4. ✅ Budget for mystery shopping purchases
  5. ✅ Engage testing labs where needed
  6. ✅ Document procedures

Execution:

  1. ✅ Select sample randomly within risk categories
  2. ✅ Purchase anonymously
  3. ✅ Test against applicable standards
  4. ✅ Document findings
  5. ✅ Take enforcement actions
  6. ✅ Notify authorities of serious violations
  7. ✅ Report in transparency reports

Common Mistakes

Minors protection:

  • Assuming age gate sufficient (must implement substantive protections)
  • Continuing profiling-based ads for minors
  • Not providing minor-specific safety features

Terms and conditions:

  • Legal jargon incomprehensible to users
  • Vague content policies
  • No explanation of algorithms
  • Hidden in lengthy documents

Marketplace traceability:

  • Not verifying trader information (collection alone insufficient)
  • Allowing sales before verification complete
  • Not displaying trader information prominently
  • Failing to suspend non-compliant traders

Random checks:

  • Never actually conducting checks
  • Only checking after complaints (not random)
  • Not following up on identified violations
  • Not notifying product safety authorities

Sources

Contains public sector information licensed under the Open Government Licence v3.0 where applicable. This is not legal advice. Always refer to official sources for authoritative text.
