Stock Markets May 14, 2026 11:12 AM

Italian parents’ group takes Meta and TikTok to Milan court over minors’ access to social media

MOIGE seeks stronger age checks, algorithm transparency and limits on features it says drive excessive use among children

By Marcus Reed

An Italian parents’ association and several families initiated a class injunctive action in Milan’s business court seeking orders that would tighten age verification for users under 14, require removal of potentially manipulative algorithms, and mandate clearer information on harms from overuse. The case pits MOIGE and families against Meta and the owner of TikTok and raises jurisdictional and public health questions ahead of further hearings.

Key Points

  • Parents' group MOIGE and multiple families filed a class injunctive action in Milan seeking stricter age verification for users under 14 and removal of potentially manipulative algorithms - impacts tech platforms and consumer protection policy.
  • MOIGE claims about 3.5 million Italian children aged 7-14 are active on social media in breach of platform rules, framing the case as a public health issue - impacts youth welfare and related regulatory oversight.
  • TikTok says it enforces Community Guidelines, removes over 99% of violating content and invests in safety measures; Meta did not respond to requests for comment - impacts corporate compliance and reputational considerations for platform operators.

An association representing parents in Italy and a group of families appeared before the business court in Milan on Thursday in the first hearing of a lawsuit aimed at restricting minors' access to major social media platforms.

The legal action is a class injunctive suit filed by MOIGE, an Italian parents' movement, together with several families, against the companies behind Facebook, Instagram and TikTok. The plaintiffs are asking the court to require the platforms to put in place more stringent age-verification systems for users under 14, to remove algorithmic features they consider potentially manipulative, and to provide clear, transparent information about the risks associated with excessive use.

MOIGE told the court it seeks protection for roughly 3.5 million Italian children between the ages of 7 and 14 who, the group says, are present on social media in contravention of platform rules. The parents' movement characterized the issue as one of public health and urged judges to adopt an expedited process because of the risks it attributes to children's exposure online.

During the hearing, representatives for Meta and TikTok raised preliminary objections. The companies disputed the competence and jurisdiction of Italian courts to adjudicate their conduct and also contested certain new documents filed by MOIGE's legal team. Those filings, according to the parents' group, include material indicating the firms were aware of potentially harmful impacts of their algorithms on underage users and of product features designed to boost engagement.

TikTok issued a statement in response to the litigation, saying the case is ongoing and noting that it applies its Community Guidelines strictly, including provisions intended to protect mental and behavioural health. The company added that it proactively removes more than 99% of content that breaches those rules. TikTok also said it continues to invest in safety measures intended to diversify recommended content, block potentially harmful searches and connect users who may be vulnerable with available support resources.

Meta did not immediately respond to a request for comment at the time of the hearing.

MOIGE's lawyers countered the platforms' jurisdictional challenges by arguing that Italian courts do have full authority to hear the matter and reiterated their position that it represents a public health concern. They asked judges to fast-track further proceedings in light of the stakes for children.

The Milan court is expected to set a schedule for additional hearings at a later date.


The case comes as regulators and policymakers across Europe and beyond move to address platform design and youth exposure. European Commission President Ursula von der Leyen said this week that the EU executive is targeting addictive and harmful design practices by social media companies in the forthcoming Digital Fairness Act. Comparable policy activity is under way in Australia, France and Greece, and Spain announced plans in February to bar teenagers from using social media.

Risks

  • Jurisdictional challenge by Meta and TikTok disputing Italian courts' competence could delay or narrow the scope of legal remedies - affects legal and compliance functions within social media firms.
  • Outcome depends on court scheduling and evidence; MOIGE submitted new documents it says show platform awareness of algorithmic harms, but companies challenged these filings - creates uncertainty for regulatory and public-health policy responses affecting tech and child welfare sectors.
  • Pending and evolving regulatory initiatives, including the EU's proposed Digital Fairness Act and national measures noted in Spain, Australia, France and Greece, may change the legal and compliance landscape for platforms - introduces regulatory risk for social media companies and related investors.
