Stock Markets April 2, 2026

Australia Escalates Enforcement of Under-16 Social Media Ban as International Attention Mounts

Compliance report showing many minors still online prompts legal probes of major platforms amid rising global interest

By Leila Farooq

Australia, which implemented the world’s first ban on under-16s using popular social media apps in December, has moved from a cooperative posture to heightened enforcement after a government regulator found substantial noncompliance. With at least eight other countries expressing interest in similar measures and recent U.S. court rulings against major tech firms, Canberra is investigating Instagram, Facebook, TikTok, YouTube and Snapchat for possible breaches and preparing evidence for potential legal action.

Key Points

  • Australia implemented a ban in December on social media use by people under 16 and initially reported platforms had deactivated 4.7 million suspected underage accounts.
  • The eSafety regulator's compliance report found nearly one-third of parents said their under-16 child still had at least one social media account, and two-thirds of those parents said platforms had not asked the child's age.
  • The government is investigating Meta's Instagram and Facebook, TikTok, Alphabet's YouTube and Snapchat for possible breaches and is gathering evidence for potential legal action; recent U.S. court rulings against Meta and Google have reinforced regulatory momentum.

Since Australia enacted a ban in December prohibiting children under 16 from using major social media platforms, the government has shifted from praising industry cooperation to initiating formal probes into platform compliance. The change follows a regulator's report showing many minors remain active online and comes as multiple foreign jurisdictions signal they are watching Canberra's policy closely.

In mid-January the government disclosed that platforms had deactivated 4.7 million suspected underage accounts, a figure that initially suggested meaningful industry cooperation and led some to expect an extended grace period for enforcement. But the first comprehensive compliance review by the eSafety regulator found that nearly one-third of parents reported their under-16 child still had at least one social media account. Among those cases, two-thirds of parents said the platform had not asked the child's age.

Responding to those findings, Communications Minister Anika Wells' office said this week it is gathering evidence against a number of apps for possible breaches of the law and may pursue legal action. Platforms named in the investigation include Meta's Instagram and Facebook, TikTok, Alphabet's YouTube and Snapchat. The eSafety regulator had earlier indicated it would take enforcement action only in instances of systemic noncompliance; officials now appear to be preparing to escalate.

Government officials had previously framed the ban's rollout as a cooperative effort with industry, and the January deactivations prompted expectations that regulators might offer platforms up to a year before strict enforcement. But a stream of headlines reporting that many minors continue to access social apps has complicated that calculus.

"The whole world's watching Australia in this experiment, and therefore it looks like weak government to back down or pretend that the failures in reasonable efforts aren't happening," said Jeannie Paterson, co-founder of the Centre for Artificial Intelligence and Digital Ethics, who regularly advises government on tech policy.

Paterson and other experts say international interest in Australia's policy, with at least eight countries having indicated they want similar restrictions, has likely reinforced Canberra's decision to press platforms rather than ease enforcement. A spokesperson for Minister Wells declined to say whether global attention had altered her stance, saying only that she would not comment further on that point.

While the public response in Australia has been largely supportive - with the ban receiving strong parental backing - the technology companies subject to the regulation have resisted. Meta and Snap have stated they are committed to complying with the law. TikTok declined to comment on the government action, and Alphabet did not respond to requests for comment.

The eSafety compliance report also found that complaints about cyberbullying and image-based abuse - problems the government had argued the ban would help address - remained unchanged. The regulator reported parents were often unable to notify platforms that their underage children still maintained accounts. It also found minors who failed automated age checks were being prompted to repeat the test until they achieved a result that allowed access.

Under the legislation, platforms must take "reasonable steps" to prevent under-16s from holding accounts or face fines of up to A$49.5 million (approximately $34 million). Communications Minister Wells has said the primary problem is not lack of parental compliance but actions by some large technology firms that undermine the policy.

Observers say developments in U.S. courts have likely strengthened the Australian government's resolve. Last week a U.S. trial verdict ordered Meta to pay $375 million in penalties over safety lapses that allowed child exploitation on Facebook, Instagram and WhatsApp. Separately, another decision found Meta and Google negligent for designing social media platforms that have harmful effects on young people. Those outcomes, while limited to specific U.S. proceedings, have added momentum to arguments that platforms bear responsibility for youth well-being.

"The court cases in New Mexico and California have helped the court of public opinion," said Julian Sefton-Green, a professor of new media at Deakin University who is advising the commissioner's two-year study on the ban's impact. "They're jury decisions, that social media is liable for the well-being of young people, so I think the government's going to take heart from that."

Legal scholars and former government advisers say the combination of the regulators findings and the international judicial rulings creates a stronger basis for Australia to pursue enforcement. Angela Flannery, a former general counsel for the communications department who now works with private-sector clients, said the government is encouraged that other jurisdictions are considering similar age restrictions.

"The government is quite heartened generally by the number of other jurisdictions that are looking at imposing restrictions on the under-16s globally," Flannery said. "But given Tuesday's disheartening report on compliance, they probably want to be seen to be taking action to keep encouraging other jurisdictions to enforce or to enact similar bans."

Regulatory researchers also note the potential for legal pressures in one country to produce product design changes that reduce access for under-16s everywhere. Rob Nicholls, a researcher of regulation at the University of Sydney, said litigation in the United States may prompt platforms to redesign features or age-verification systems to avoid legal risk, and that such changes would likely have the effect of reducing access for younger users globally.

"The effect of that design change will be to reduce access for under-16s," Nicholls said. "If you've got to do it to avoid litigation in the States, you may as well do the same thing around the world."

As the government compiles evidence for potential legal action, foreign policymakers and the public will be watching both regulators and industry closely. For now, Australia is positioning itself as the most active jurisdiction pursuing a legal and regulatory test of restricting social media access for under-16s, while confronting the practical challenges of enforcement and age verification.

($1 = 1.4531 Australian dollars)

Risks

  • Continued noncompliance by platforms could prompt prolonged legal battles and regulatory fines, affecting social media companies and related technology sectors.
  • Unchanged levels of cyberbullying and image-based abuse after the ban raise uncertainty about the policy's immediate effectiveness in improving child safety online, impacting policymakers and child-safety advocates.
  • Difficulties in verifying ages and preventing minors from circumventing age checks create enforcement challenges that could reduce the practical impact of the ban and complicate regulatory oversight.
