A jury in Santa Fe has held Meta responsible for violating New Mexico's consumer protection statutes and for facilitating child sexual exploitation on its platforms, assessing a $375 million penalty. The verdict, handed down on March 26, advances the case to a second stage scheduled for May, when a judge will hear the state's public nuisance claims in a bench trial - a phase that could result in court-ordered changes to the design of products popular with teenagers.
The remedy phase distinguishes New Mexico's action from the many private suits naming Meta. While individual plaintiffs have sought damages for harms such as addiction and mental health effects, the state's pursuit of structural changes to how platforms operate raises the possibility that a court could mandate design alterations to Facebook, Instagram and other apps popular with young users.
What New Mexico is seeking
In an interview, New Mexico Attorney General Raúl Torrez outlined a broad set of potential remedies the state may request from the court. The measures under consideration include:
- limits on the types of content recommended to minors;
- restrictions on the frequency and timing of notifications that prompt teenagers to log on;
- a curtailing of the "infinite scroll" experience for children;
- strengthened age verification procedures;
- and a plan to remediate harm already suffered by New Mexico residents.
Torrez also said the state would likely ask Judge Bryan Biedscheid to appoint an independent monitor or special master to oversee Meta's compliance with New Mexico consumer protection law over a period of years. "It's not out of the realm of possibility that we ask for and receive an even greater award" at the second stage of the trial than the first, Torrez said. "But my perspective has been to focus on the changes of the product itself."
Legal strategy and broader context
New Mexico is among a growing number of state attorneys general turning to public nuisance law - a legal doctrine that permits governments to sue over conduct they say unlawfully interferes with public health or safety - to press technology platforms for changes. That doctrine has previously been used in suits against industries accused of causing widespread societal harm.
Torrez acknowledged that using state courts to regulate global social media product design is "probably not the most efficient" route, but he said he did not want to "wait any longer for a system to deliver what it should have 15 years ago." While the New Mexico action centers on child predation and grooming, Torrez noted that many state attorneys general pursuing broader cases over youth mental health also hope to force changes to product features.
Since the verdict, Torrez said his office has received inquiries from other states and regulatory bodies overseas. "I have an expectation that Meta is in for a wave of litigation," he added. "I've been real clear with colleagues that they could set up undercover investigations on these platforms right now and yield the same results."
Meta's response and existing safety changes
Meta spokesperson Andy Stone said the company will appeal the jury verdict and that "we will continue to defend ourselves vigorously." The appeal is expected to raise questions about Section 230 of the Communications Decency Act, the federal law that has long limited platform liability for third-party content.
Stone also noted that since the lawsuit was filed, Meta has implemented multiple safety upgrades that overlap with some measures sought by the state. The company has introduced dedicated accounts for teen users with nighttime notifications turned off by default, added age verification features and indicated plans to filter out content deemed age-inappropriate. Meta also recently said it was removing end-to-end encryption from Instagram's messaging feature, attributing the change to lack of use; the move was welcomed by child safety advocates. The company said it will continue to offer encrypted messaging on WhatsApp but has not clarified its plans for Facebook Messenger.
Views on potential remedies
Not all observers expect the court to be able to impose sweeping changes at the level of algorithms. Max Willens, an analyst at eMarketer, expressed skepticism that New Mexico could compel modifications to the content recommendation systems that are central to Facebook and Instagram. "Algorithm modification is not a likely remedy, but it is among the list of possible changes that could be required," he said. "The second phase of this trial may be more consequential to social media platforms than the first."
Matthew Bergman of the Social Media Victims Law Center, who represented the plaintiff in a separate Los Angeles case alleging negligent product design by Meta, YouTube and other companies, observed that court-ordered relief can be harder to secure in individual suits. In that Los Angeles matter, a jury this week awarded a woman a combined $6 million in damages against Meta and Google, in what is seen as a test case for similar claims.
Next steps and uncertainties
The May bench trial will examine New Mexico's public nuisance claims and could produce remedies that reach into how social media platforms recommend, present and verify content for minors. Torrez indicated the state is focused on securing design changes and mechanisms to enforce compliance, though he stopped short of detailing a specific enforcement plan at the outset.
Meta has said it will contest the verdict on appeal while continuing to roll out safety features. How a higher court will weigh state-level public nuisance theories against federal protections such as Section 230, and whether a judge in the bench phase will order operational changes to complex algorithmic systems, are among the open questions facing the litigation.
The outcome of the second phase could reverberate beyond New Mexico's borders. State officials and regulators elsewhere are watching closely, and Torrez said his office has been contacted by counterparts both in the United States and internationally seeking guidance.