New Mexico Jury Finds Meta Violated State Law on Child Safety

Mar 25, 2026, 2:29 AM


A New Mexico jury has determined that Meta, the parent company of Facebook, Instagram, and WhatsApp, knowingly harmed children's mental health and concealed critical information regarding child sexual exploitation on its platforms. This verdict, reached after a nearly seven-week trial, highlights a growing trend of accountability for tech companies in the face of increasing scrutiny over their impact on youth mental health and safety.
The jury sided with state prosecutors who argued that Meta prioritized profits over safety, thus violating parts of the state's Unfair Practices Act. They found that Meta engaged in "unconscionable" trade practices that exploited the vulnerabilities of children and made false or misleading statements about the safety of its platforms.
In total, jurors identified thousands of violations and applied the maximum penalty of $5,000 per violation, concluding it was warranted given the extent of harm to the teenagers involved. The resulting $375 million penalty is nonetheless significantly less than the $2 billion prosecutors were seeking.
The trial's outcome does not immediately force Meta to change its practices. A subsequent phase, slated for May, will determine whether the company created a public nuisance and if it should fund public programs to address the harm caused by its platforms.
Meta has stated its disagreement with the verdict and plans to appeal. A spokesperson for the company emphasized its commitment to safety and the ongoing challenges of managing harmful content on its platforms, asserting, "We work hard to keep people safe on our platforms."
The case is part of a broader wave of litigation against Meta, with more than 40 state attorneys general filing lawsuits alleging that the company contributes to a mental health crisis among young people by designing addictive features on its platforms. Critics argue that Meta's algorithms prioritize engagement over safety, allowing harmful content to proliferate among its younger users.
Sacha Haworth, executive director of The Tech Oversight Project, remarked on the implications of the verdict, stating, "Meta's house of cards is beginning to fall." She pointed to whistleblowers and documented evidence that highlight the company's failure to protect minors from online predators and harmful content.
The case relied on an undercover investigation in which agents posed as children on social media to document sexual solicitations and assess Meta's response. New Mexico Attorney General Raúl Torrez filed the lawsuit in 2023, claiming that Meta had not adequately addressed the dangers of social media addiction, a point Meta executives contested, although they acknowledged the existence of "problematic use" during the trial.
The jury was presented with various testimonies, including those of Meta executives, platform engineers, and local educators who discussed the negative impacts of social media on children, including sextortion schemes. Jurors also deliberated on whether users were misled by statements from Meta leadership regarding platform safety and the enforcement of age restrictions for users under 13.
ParentsSOS, a coalition of families affected by social media-related harm, hailed the verdict as a "watershed moment" in the fight for accountability from Big Tech. They expressed hope that this ruling underscores the dangers posed by social media to children and the need for stricter regulations and oversight.
As the legal landscape continues to evolve, the New Mexico jury's verdict may set a precedent for future cases against Meta and other tech companies, prompting a reevaluation of how social media platforms operate and their responsibilities toward young users. The outcome of the next phase of the trial could further influence the industry's approach to child safety and mental health in the digital age.

Related articles

Zuckerberg Discusses Teen Wellbeing with Apple CEO Tim Cook

Meta CEO Mark Zuckerberg testified in a landmark trial, revealing he reached out to Apple CEO Tim Cook regarding the wellbeing of teens and kids using social media. This discussion comes amid allegations that platforms like Instagram are harmful to young users, paralleling past legal battles against the tobacco industry.

Enhancing Mental Health Safety on Social Media: Key Strategies

Social media can significantly impact mental health, often negatively. To improve mental health safety online, users can adopt strategies like setting boundaries, curating their feeds, and fostering real-life connections. Understanding the risks associated with excessive use and prioritizing mental well-being are essential for healthier online experiences.

Meta and Google Found Liable for Damaging Young Woman's Mental Health

In a landmark trial, Meta and Google were held liable for the mental health struggles of a young woman who became addicted to their platforms. The jury awarded her $6 million, marking a significant step toward holding social media companies accountable for their role in youth mental health issues.