Stock Markets May 12, 2026 06:26 AM

EU to curb addictive social media features aimed at children

Brussels to target endless scrolling, autoplay and push alerts; Meta faces probe over age enforcement

By Hana Yamamoto

The European Commission plans new regulation this year to limit design elements on social platforms that encourage prolonged use by children. President Ursula von der Leyen outlined measures that would address features such as endless scrolling, autoplay and push notifications, and said the Commission is investigating Meta over alleged failures to enforce its minimum age rules. An EU-developed age verification app built to privacy standards is part of the proposal, which the Commission aims to finalise by summer, pending advice from its expert panel.


Key Points

  • The European Commission intends to regulate social media design features that can encourage addictive behavior among children, with measures targeting endless scrolling, autoplay and push notifications - sectors impacted: social media platforms and online safety policy.
  • The Commission is investigating Meta for reportedly not enforcing its 13-year minimum age requirement on Instagram and Facebook - sectors impacted: major digital platforms and regulatory compliance functions.
  • An EU-built age verification app with privacy safeguards will be made available for integration into member states' digital wallets, enabling platforms to enforce age checks - sectors impacted: digital identity, privacy technology and platform trust-and-safety tools.

Brussels moves to rein in addictive design

The European Commission will pursue regulation later this year aimed at social media design features that may foster addictive use among children, Commission President Ursula von der Leyen said on Tuesday. Speaking at the European Summit on Artificial Intelligence and Children in Denmark, von der Leyen singled out mechanics such as endless scrolling, autoplay and push notifications as targets for regulatory action, and specifically named TikTok as a platform under scrutiny.

Regulatory scrutiny of age enforcement

Von der Leyen also said the Commission is examining Meta for alleged lapses in enforcing its minimum age requirement of 13 on Instagram and Facebook. The inquiry is part of a broader probe into platforms that permit access by children to harmful material, including videos that promote eating disorders or self-harm, the Commission president said.

Technical response - age verification app

As part of its approach, the Commission has developed an age verification application built to privacy standards that member states can incorporate into their digital wallets. Online platforms would be able to use this verification mechanism to confirm user age, von der Leyen said.

Timeline and process

The Commission plans to prepare a legal proposal by summer, but that timeline is contingent on advice from its Special Panel of experts on Child Safety Online. The panel's guidance will shape the final form of the proposal before it is presented.

International context

Von der Leyen noted that this initiative comes amid a global wave of legislative attention on child safety and social media harms. She framed the move as part of coordinated efforts to examine how design choices and platform practices affect children.


Details in the coming months will depend on the expert panel's recommendations and how member states choose to adopt the Commission's technical solution for age checks.

Risks

  • Timing and content of the legal proposal remain uncertain - the Commission expects to prepare a proposal by summer but is awaiting advice from the Special Panel of experts on Child Safety Online - sectors impacted: regulatory and compliance teams at social platforms.
  • Adoption and integration of the EU age verification app across member states are not guaranteed - differences in implementation could affect how consistently platforms can enforce age checks - sectors impacted: digital identity providers and platform enforcement operations.
  • Ongoing investigations into how platforms permit children to access harmful content introduce regulatory and legal uncertainty for major social media companies - sectors impacted: social media operators and online content moderation services.
