New Mexico jury finds Meta violated consumer law over child harm

A New Mexico jury ruled Meta violated state consumer protection law over child exploitation and harmful effects, setting up appeals and broader litigation.

A New Mexico jury has found that Meta violated the state’s consumer protection law, concluding the company harmed children’s mental health and failed to act on dangers tied to exploitation content.

The verdict, delivered after a nearly seven-week trial, centers on claims that Meta, owner of Instagram, Facebook and WhatsApp, put profits ahead of safeguards meant to protect minors. The jury said Meta violated parts of the New Mexico Unfair Practices Act, including findings that the company hid or downplayed what it knew about the risks of child sexual exploitation occurring on its platforms and how those risks can affect children's mental health.

The jury also agreed with allegations that Meta made false or misleading statements and that it engaged in “unconscionable” trade practices by taking advantage of children’s vulnerability and inexperience. Jurors determined there were thousands of violations, each counted separately toward a penalty of $375 million. Meta disputes the outcome, saying it discloses risks, works to remove harmful content and will appeal.

This New Mexico decision lands inside a wider legal and political push over whether social platforms should be held accountable for harms involving children. New Mexico’s case was described as one of the first to reach trial amid growing litigation targeting Meta and other major platforms. At the same time, school districts and lawmakers have been debating stricter limits on smartphone use in classrooms, an indicator that the issue has moved beyond courtrooms and into everyday policy.

The underlying tension in these lawsuits is not just whether harmful content exists, but whether platform design and enforcement choices meaningfully contribute to harm. Prosecutors argued that Meta’s systems, especially the algorithms that shape what users see, help distribute material that can be dangerous for children. Defense attorneys countered that safety efforts are both ethical and strategic, emphasizing that eliminating harmful actors and content is not only required but also good for business.

While the New Mexico trial examined state consumer protection law, other cases are unfolding in parallel under different legal frameworks. In federal court in Southern California, another jury has been deliberating over whether Meta and YouTube should be liable for harms connected to children’s use of their platforms. Testimony there included discussion of the age rules intended to keep users under 13 off certain services, and the practical difficulty of enforcing them because some users lie about their age.

A key element of the New Mexico case involved an undercover investigation in which state agents created social media accounts posing as children to document sexual solicitations and evaluate Meta’s response. That approach fed into broader arguments about detection and enforcement: what platforms do once suspicious activity appears, how effectively content is identified, and whether the company’s public statements match its internal understanding of the risks.

For families, the verdict’s relevance is immediate rather than theoretical. When young people encounter exploitation schemes such as sextortion, the damage can be rapid and deeply personal, often escalating across messages, images and attempts at coercion. Schools have also described disruptions tied to these dynamics, adding a second layer of fallout that goes beyond online safety and into attendance, classroom focus and student wellbeing.

Legal accountability may also reshape how platforms measure “safety” as a product feature. Meta’s position is that it invests in safety and that some harmful material still slips through, a common defense in platform cases. But the verdict suggests jurors accepted that disclosure gaps and enforcement choices can themselves be part of the harm, not merely an operational imperfection. If appellate courts uphold that reasoning, it could intensify pressure for more aggressive detection, clearer user protections, and sharper internal governance around child safety risks.

Another major question is how much the outcome depends on the specific statutes and claims in New Mexico. The trial included allegations tied to disclosures about enforcement of under-13 restrictions and the prevalence of content associated with teen suicide, along with arguments about algorithms prioritizing sensational or harmful material. The jury reportedly worked through a checklist of claims to decide whether Meta misled users, mishandled known risks and failed to act in ways the state believed were required.

Meta has said it will appeal, and the next phase of the broader dispute could include additional court proceedings about remedies, including whether the case supports orders aimed at changing conduct. In the meantime, the verdict adds weight to a mounting refrain from prosecutors and consumer advocates: that platforms should be judged not only by what they say they try to do, but by whether their systems and disclosures align with the real-world risks affecting children.
