A New Mexico jury on Tuesday concluded that Meta Platforms violated the state consumer protection law and ordered the company to pay $375 million in civil penalties, finding that the social media company misrepresented the safety of its Facebook, Instagram and WhatsApp services and enabled sexual exploitation of children on those platforms.
The verdict, reached at the end of a six-week trial, represents the first jury decision on these particular claims against Meta. The case stems from litigation brought by the state attorney general, who argued the company allowed predators broad access to underage users and in many instances connected offenders with victims, sometimes producing real-world abuse and human trafficking.
New Mexico Attorney General Raúl Torrez, a Democrat, led the state's case. The state presented evidence from an undercover operation conducted by Torrez and his office in 2023 in which investigators set up accounts on Facebook and Instagram posing as users younger than 14. According to the state, these accounts received sexually explicit material and were contacted by adults seeking similar content, developments that led to criminal charges against multiple individuals.
State lawyers argued Meta publicly represented Instagram, Facebook and WhatsApp as safe for children and teenagers in New Mexico while concealing the scale of dangerous and sexual content present on its platforms. The state also pointed to internal Meta documents it said acknowledged problems with sexual exploitation and mental health harm, while the company failed to deploy basic safety measures such as age verification.
State attorneys further alleged that product design choices intended to maximize engagement, including features like infinite scroll and auto-play videos, kept young users on the services and encouraged addictive usage patterns that can contribute to depression, anxiety and self-harm. The New Mexico complaint said those design features fostered behavior that increased the risk of real-world harm to children.
The state sought both monetary relief and court orders requiring changes to Meta platforms to improve safety for youth users. During closing arguments, Linda Singer, an attorney for New Mexico, framed the case as a decade-long failure of transparency and protection. "Over the course of a decade, Meta has failed over and over again to act honestly and transparently," Singer told jurors. "It's failed to act to protect young people in this state. It is up to you to finish this job." Singer also told the jury it could award more than $2 billion in damages.
Meta has denied the state's allegations and maintains it has extensive safeguards in place to protect younger users. The company has argued that it is not liable for the content posted by users and that constitutional free-speech protections, as well as Section 230 of the Communications Decency Act, limit the state's ability to hold the company responsible for user-generated content. Meta told jurors that its algorithms and product features merely publish content, and that the harms the state alleges cannot be separated from the content itself.
During closing statements, Kevin Huff, counsel for Meta, emphasized the company's disclosures and efforts to limit harmful content. "What the evidence shows is Meta's robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public," Huff told the jury.
The New Mexico litigation unfolded against a backdrop of increased scrutiny of Meta and other social media firms over child and teen safety. Whistleblower testimony presented to Congress in 2021 helped spur public attention and regulatory interest, with critics alleging the companies understood potential harms to young users but did not take adequate steps to address them.
Separately, Meta faces thousands of other lawsuits in state and federal courts that contend the company and other social platforms intentionally engineered products to be addictive for young people, contributing to a nationwide mental health crisis. Several of those cases seek damages in the tens of billions of dollars, according to statements Meta has made in filings with financial regulators.
Meta has repeatedly invoked legal protections in its defense, including the First Amendment and Section 230, which generally limits liability for platforms based on third-party content. The company has argued that disclosures about limitations in content moderation demonstrate that it did not intentionally mislead the public.
In addition to the jury verdict, a separate bench trial is scheduled before Judge Bryan Biedscheid in May to decide the state's public nuisance claims. The state has said it will ask the judge to require Meta to change certain product features to ensure compliance with state law and to better protect the health and safety of residents.
The jury decision and the upcoming bench proceedings underscore the continuing legal and regulatory uncertainty facing major social platforms as they confront simultaneous allegations of facilitating illegal conduct and contributing to mental health harms among young people. Meta's legal strategy and its reliance on constitutional and statutory protections will likely play a central role in subsequent appeals and proceedings, which may determine whether the company must alter product design or pay additional damages beyond the civil penalties imposed by the jury.
Next steps
The May bench trial will address claims that Meta created a public nuisance affecting the health and safety of New Mexico residents and could result in court-ordered product changes if the judge finds in the state's favor. Meta may also pursue appeals and other legal defenses following the jury verdict.