Authorities in a growing number of jurisdictions are imposing tighter controls on who can use social media, and under what conditions. The changes include bans for certain age cohorts, requirements for parental consent, mandated age verification, limits on addictive features and device-level restrictions intended to reduce screen time for minors.
Australia set the most far-reaching example when legislation that took effect on December 10, 2025 compelled major social media companies to block users under 16 from accessing platforms such as TikTok, YouTube and Instagram. Companies that do not comply face penalties capped at A$49.5 million ($34.9 million).
National approaches at a glance
Governments are taking a range of approaches, often calibrated to local political priorities and legal frameworks. Several countries have opted for strict age cut-offs, others rely on parental consent or technical measures, and some are phasing in restrictions.
- Austria - The conservative-led, three-party coalition announced on March 27 a plan to bar children aged 14 and under from social media. Vice Chancellor Andreas Babler and junior digitisation minister Alexander Proell said a draft bill would be finalised by June.
- Brazil - The Digital Statute of Children and Adolescents came into effect on March 17, requiring accounts owned by minors under 16 to be linked to a legal guardian and banning platform features deemed addictive, such as infinite scroll.
- Britain - The government is considering measures similar to Australia’s, including a ban for under-16s and tightened AI safety rules for children. Technology minister Liz Kendall said in February that such measures could arrive as soon as this year. The government also announced plans to test social media time limits, curfews and app restrictions in the homes of 300 teenagers to study effects on sleep, family life and schoolwork.
- China - Authorities have implemented a "minor mode" that combines device-level restrictions with app-specific rules to limit children's screen time depending on age.
- Denmark - In November the government said it would ban children under 15 from social media, while allowing parents to grant access for children aged 13 and 14.
- France - The National Assembly approved legislation in January to ban social media for those under 15 amid concerns about online bullying and mental health; the bill still needs Senate consideration and a final lower-house vote.
- Germany - Minors aged 13-16 may use social platforms only with parental consent, though child protection groups argue current controls do not go far enough.
- Greece - Prime Minister Kyriakos Mitsotakis said on April 8 that social media access will be prohibited for children under 15 from January 1, 2027.
- India - Karnataka state, which includes the Bengaluru tech hub, became the first Indian state to bar social media use for under-16s on March 6. Goa and Andhra Pradesh are reported to be considering similar steps. India’s chief economic adviser urged age restrictions in January, describing platforms as "predatory" in how they maintain user engagement.
- Indonesia - The communications and digital ministry announced on March 6 that it will restrict platform access for those under 16. From March 28, accounts for children under 16 on "high risk platforms," including TikTok, Facebook, Instagram and Roblox, will be gradually deactivated, Communications and Digital Minister Meutya Hafid said.
- Italy - Users under 14 need parental consent to create social media accounts; no consent is required for older teenagers.
- Malaysia - The government said in November it would prohibit social media use for people under 16 beginning in 2026.
- Norway - In October 2024 the government proposed raising the age of consent for social media terms to 15 from 13, with parents still able to consent on behalf of younger children. Officials are also working on legislation to establish an absolute minimum age of 15 for social media use.
- Poland - The ruling party announced on February 27 plans to draft a law banning social media for children under 15 and to require platforms to verify users’ ages.
- Portugal - Parliament approved a bill on February 12 mandating explicit parental consent for users aged 13-16 and giving regulators the power to fine tech firms up to 2% of global revenue for non-compliance.
- Slovenia - Deputy Prime Minister Matej Arcon said on February 6 that a law is being prepared which would bar those under 15 from accessing social media.
- Spain - Prime Minister Pedro Sanchez said in early February that social media access will be banned for minors under 16 and platforms will need to introduce age verification systems; it was unclear whether the measure requires approval by the lower house.
- United States - The Children's Online Privacy Protection Act already restricts data collection from children under 13 without parental consent. Several states have enacted laws requiring parental consent for minors to use social media, but those laws have faced court challenges on free speech grounds.
European Union stance
The European Parliament adopted a non-binding resolution in November calling for a harmonised minimum age of 16 for access to social media, video-sharing platforms and AI companions, with access permitted from age 13 subject to parental consent. As a resolution it does not carry legal force but signals the Parliament's policy preferences.
Industry response and concerns
Major social platforms including TikTok, Facebook and Snapchat state that users must be at least 13 years old to register. Child protection advocates argue those measures are inadequate, pointing to data showing that large numbers of children under 13 hold social media accounts in several European countries. Where governments require parental consent, linking minors' accounts to a legal guardian has become an enforcement tool, as in Brazil's statute.
In some countries the emphasis is on technical enforcement. China’s minor mode imposes device-level and app-specific controls; Indonesia is deactivating accounts deemed to belong to under-16s on designated "high risk" platforms; and several governments are pushing for mandatory age verification systems to be implemented by platforms.
What this means for markets and sectors
Regulatory actions of this scale touch several parts of the digital economy. Social media platforms and their advertising models could face user-base adjustments and compliance costs. Age verification and parental-consent systems will likely create demand for verification technology and compliance services. Policymakers point to harms such as online bullying and mental health effects as key drivers behind the rules.
Note on currency: $1 = 1.4201 Australian dollars.