Stock Markets March 27, 2026

Governments Worldwide Tighten Age Limits and Controls on Children’s Social Media Use

From Australia’s comprehensive prohibition for under-16s to a patchwork of national rules across Europe, Asia and the Americas, policymakers are moving to restrict minors’ access and force platforms to verify age and remove addictive features

By Priya Menon

A growing number of countries have introduced or are preparing laws to restrict children’s access to social media platforms, citing concerns about health, safety and addictive product features. Measures range from full bans for users under certain ages to parental-consent regimes, mandatory age verification and curbs on features such as infinite scroll. Regulators are proposing fines and technical requirements, while platforms maintain minimum signup ages that many child protection advocates say are insufficient.

Key Points

  • Multiple countries have enacted or proposed laws to limit children’s access to social media, ranging from full bans for specific age groups to parental-consent and device-level restrictions - affecting major social media platforms and the digital advertising sector.
  • Regulatory tools include mandatory age verification, bans on addictive features such as infinite scroll, and financial penalties for non-compliance - measures that directly impact technology firms and compliance costs for platform operators.
  • International approaches are inconsistent, with a mix of national laws and a non-binding EU resolution, creating regulatory fragmentation that could complicate platform operations and enforcement across jurisdictions.

Governments across multiple continents are tightening rules governing children’s use of social media amid rising concern about health, safety and the design of platforms. Measures vary from outright bans for underage users to parental-consent schemes and device-level restrictions, and several governments have proposed penalties or technical requirements intended to force compliance.

Overview

One of the most far-reaching changes took effect in Australia, where a law requires major social media platforms to block users under the age of 16 from accessing services such as TikTok, YouTube and Instagram. Other countries have either enacted or proposed similar limits, while some governments are testing household-level interventions to measure effects on sleep, family life and schoolwork. Platform operators generally say the minimum age to sign up is 13, but child protection groups and data cited by national authorities indicate many younger children already have accounts.


Country-by-country measures

  • Australia - A law in force since December 10, 2025 requires major social media platforms to block users under 16. Companies that fail to comply face penalties of up to A$49.5 million, about $34.7 million at the stated conversion rate of $1 = 1.4267 Australian dollars.
  • Austria - The conservative-led, three-party government announced on March 27 that it will ban social media for children up to the age of 14. Vice Chancellor Andreas Babler and junior digitisation minister Alexander Proell said draft legislation would be finalised by June.
  • Brazil - The Digital Statute of Children and Adolescents came into force on March 17. It requires minors under 16 to link social media accounts to a legal guardian and bans addictive platform features such as infinite scroll.
  • Britain - Officials are considering an Australia-style ban on social media for children under 16 and tighter safety rules for AI chatbots for that age group, technology minister Liz Kendall said in February. The government also plans a pilot involving 300 teenagers to test bans, curfews and app time limits in the home to assess impacts on sleep, family life and schoolwork, the government said on March 24.
  • China - Regulators have implemented a "minor mode" programme that enforces device-level restrictions and app-specific rules to limit screen time according to age.
  • Denmark - The government said in November it would ban social media for children under 15, while allowing parents to grant access to some platforms for children aged 13 and 14.
  • France - In January the National Assembly approved legislation to ban children under 15 from social media. The bill still needs to pass the Senate before a final vote in the lower house.
  • Germany - Minors aged 13 to 16 may use social media only with parental consent, a restriction that child protection advocates say does not go far enough.
  • Greece - A senior government source told Reuters on February 3 that Greece is "very close" to announcing a ban for children under 15.
  • India - The state of Karnataka, which includes the tech hub Bengaluru, on March 6 became the first Indian state to ban social media for children under 16. The neighbouring states of Goa and Andhra Pradesh are also considering restrictions. India’s chief economic adviser called in January for age restrictions on social media platforms, describing them as "predatory" in how they keep users engaged.
  • Indonesia - The communications and digital ministry said on March 6 that it will restrict access for children under 16. Starting March 28, accounts owned by children under 16 on "high risk platforms" will be gradually deactivated. The minister named platforms that include TikTok, Facebook, Instagram and Roblox.
  • Italy - Children under 14 require parental consent to sign up for social media; no consent is required above that age.
  • Malaysia - The government said in November that it would ban social media for users under 16 starting in 2026.
  • Norway - In October 2024 the government proposed raising the age at which children can consent to social media terms to 15 from 13, while still allowing parents to sign on their behalf if the child is under the limit. Work has also begun on legislation to set an absolute minimum age of 15 for social media use.
  • Poland - The ruling party said on February 27 it is preparing legislation to ban access to social media for children under 15 and to require platforms to be responsible for age verification.
  • Portugal - Parliament approved a bill on February 12 requiring explicit parental consent for children aged 13 to 16 to access social media. Tech companies that fail to comply could face fines of up to 2% of their global revenue.
  • Slovenia - Deputy Prime Minister Matej Arcon said on February 6 that Slovenia is drafting a law to prohibit children under 15 from accessing social media.
  • Spain - Prime Minister Pedro Sanchez said early in February the government will ban social media for minors under 16 and will require platforms to implement age verification systems. It was not clear whether the measure would require approval by the lower house, which is highly fragmented.
  • The United States - The Children’s Online Privacy Protection Act prevents companies from collecting personal data from children under 13 without parental consent. Several states have passed laws requiring parental consent for minors to access social media, but those laws have faced court challenges on free speech grounds.

European Union-level action

The European Parliament in November agreed on a non-binding resolution calling for a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms and "AI companions", with access from age 13 permitted only with parental consent. Because the resolution is not legally binding, national approaches vary and a mix of proposals and existing laws coexists across member states.


Industry response and concerns

Major social media companies such as TikTok, Facebook and Snapchat state that users must be at least 13 to sign up. Child protection advocates assert those controls are insufficient, and official data in several European countries reportedly shows large numbers of children under 13 already have accounts. Governments are therefore pressing platforms to adopt more robust age verification and to remove or limit features deemed addictive.

Policymakers are using a range of tools - outright prohibitions, parental-consent requirements, device-level restrictions, feature bans and financial penalties - to force compliance. Enforcement mechanisms and the costs to platforms for meeting divergent national rules are among the practical issues that remain to be tested as the new measures are implemented.


Currency note

For reference, the article uses the conversion rate of $1 = 1.4267 Australian dollars when citing the maximum penalty in Australia.

Risks

  • Legal challenges - In the United States several state-level laws requiring parental consent have faced court challenges on free speech grounds; ongoing litigation could delay or alter enforcement, affecting platforms and digital publishers.
  • Compliance and enforcement uncertainty - Divergent national rules, differing age thresholds and varying technical requirements for age verification raise uncertainty for technology companies and advertisers about implementation costs and operational burden.
  • Policy fragmentation - The coexistence of strict national laws and non-binding EU-level guidance increases the risk that platforms must follow multiple, potentially conflicting systems, affecting global platform design, moderation policies and cross-border services.
