A New Mexico jury ruled Tuesday that Meta willfully failed to protect children from sexual predators on its platforms, ordering the company to pay $375 million in civil damages.
The verdict marked the first major jury decision in a wave of child safety trials targeting social media giants. The case, brought by New Mexico Attorney General Raul Torrez, stemmed from a 2023 undercover investigation in which state investigators created a fake profile of a 13-year-old girl on Meta’s platforms. Within hours, the account was flooded with inappropriate images and solicitations from predators.
“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” Torrez said in a statement. “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew.”
The jury found Meta violated New Mexico's Unfair Practices Act. Evidence presented at trial included internal company communications examining the fallout from CEO Mark Zuckerberg's 2019 decision to implement end-to-end encryption by default on Facebook Messenger. Internal messages, according to the state's attorneys, showed Meta employees warning that the change would affect the company's ability to report approximately 7.5 million instances of child sexual abuse material to law enforcement each year.
The trial is not finished. A second phase is scheduled to begin May 4, during which a judge, not a jury, will determine whether Meta created a public nuisance and whether the company must fund public programs to address the harms. New Mexico is also pushing for court-ordered reforms, including verified age checks, removal of predators, and changes to how encrypted messages can shield abuse from detection.
During closing arguments, New Mexico’s legal team had urged jurors to impose penalties exceeding $2 billion. The jury settled on $375 million based on the number of violations found.
“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said in a statement shared with CNBC. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”
A separate personal-injury case against social media companies is underway in Los Angeles.