
Meta hit with $375 million verdict in New Mexico child safety case
A jury in New Mexico delivered a major verdict on Tuesday, finding that Meta Platforms misled consumers about the safety of its social media platforms and failed to adequately protect children from sexual predators. The decision marks a significant moment in the growing legal battle over how technology companies safeguard younger users online.

The jury ordered Meta to pay $375 million in penalties, a figure well below the $2.2 billion initially sought by the state. The damages were calculated based on the number of violations identified under New Mexico’s unfair-practices act, a consumer protection law designed to prohibit deceptive or unconscionable business practices. Each violation carried a potential fine of $5,000, contributing to the final total.

At the center of the case was whether Meta had been transparent about the risks its platforms posed to minors. Jurors ultimately concluded that the company’s practices fell short, exposing children to harmful interactions and content. The ruling represents the first time a U.S. state has successfully taken a major technology company to trial and secured a verdict related to children’s safety on social media.

Meta has pushed back strongly against the outcome and plans to appeal. The company maintains that it has invested heavily in safety measures in recent years, including enhanced protections for younger users on Instagram. Features such as teen accounts, stricter privacy defaults, and improved content moderation tools have been highlighted as part of its ongoing efforts to address safety concerns.

The lawsuit was spearheaded by New Mexico Attorney General Raúl Torrez, who has made holding social media companies accountable a central focus of his tenure. The case stemmed from an investigation launched in 2023, during which state officials conducted an undercover operation: investigators created a profile posing as a 13-year-old girl and documented interactions that allegedly demonstrated the prevalence of predatory behavior and explicit content on Meta's platforms.

During the trial, the state presented a wide range of evidence, including internal company documents, testimony from former Meta employees, and input from independent experts. Prosecutors argued that these materials showed Meta was aware of risks to children but failed to act decisively. Meta countered that the evidence had been selectively presented and reflected issues the company had already worked to address.

The case also highlighted broader legal tensions surrounding the tech industry. Companies like Meta have long relied on Section 230 of the Communications Decency Act for protection from liability related to user-generated content. However, New Mexico’s legal strategy focused less on the content itself and more on the design and safety features of the platforms, signaling a shift in how such cases may be argued in the future.

The legal battle is far from over. A second phase of the case is scheduled to begin in May, when New Mexico will pursue additional claims under public nuisance law. The state is expected to seek further financial penalties and push for court-mandated changes to Meta’s platforms, including stricter age verification systems and stronger safeguards around private and encrypted messaging.

This case is one of several high-profile lawsuits against social media companies moving through the courts, reflecting increasing scrutiny of the industry. Observers have compared the growing wave of litigation to the legal battles faced by tobacco companies in the 1990s, suggesting that the outcome in New Mexico could have far-reaching implications for how tech companies operate and are regulated in the years ahead.