Meta’s courtroom fight: child safety remedies at stake

Misryoum reports Meta faces a three-week public nuisance trial in New Mexico over proposed child safety changes, from age checks to CSAM detection.
Meta’s next courtroom round could prove far more costly than the headline $375 million, because it’s about what the judge might force Meta to change.
Misryoum reports that attorneys for Meta and New Mexico will return to a Santa Fe courthouse for a three-week public nuisance trial, with the state pushing for specific orders tied to Facebook, Instagram, and WhatsApp. Among the requested remedies are age verification for New Mexico users, limits on how much time minors can spend on the apps, and restrictions on engagement features such as infinite scroll and autoplay. The case also seeks a high bar for detecting new child sexual abuse material (CSAM), along with a prohibition on end-to-end encryption for users under 18.
This matters because, even if any ruling is limited to New Mexico, platform-wide design choices often spread well beyond one jurisdiction once companies decide how to comply.
While the earlier verdict set a major milestone, this phase focuses less on whether Meta is liable and more on whether the state’s proposed fixes are practical. Misryoum notes that New Mexico is expected to call witnesses, including experts who will address feasibility and fact witnesses who will support the state’s view of harm. After presentations from both sides, Judge Bryan Biedscheid will decide which proposals are relevant and feasible, a step that could take time.
A sweeping order could also influence how other lawsuits and negotiations unfold, even if the decision doesn’t automatically apply elsewhere. Meta has the option of adopting similar changes more broadly for consistency, or it could choose not to comply in New Mexico, a stance that could make the platform’s risk calculations even more visible.
At the same time, the dispute spotlights one of the hardest tradeoffs in online safety policy: balancing protection for minors against privacy and usability for everyone else. If encryption or age verification requirements are enforced in one place, the approach can become a template that other regulators either copy or challenge.
Misryoum also highlights that several requests raise politically charged tech-policy questions. Age verification could require collecting more personal data from adults and minors alike, an outcome privacy advocates have warned can increase exposure. The state’s push to limit end-to-end encryption for teens faces additional scrutiny because the effectiveness of such restrictions can depend on how bad actors adapt and whether safety improvements shift users to other services.
On that point, Meta’s position is that the proposed demands are misguided and could create exposure to other forms of exploitation. Misryoum reports that Meta also argues it would be unable to prove certain accuracy thresholds for CSAM detection as the state frames them, because the testing logic would require knowing what it did not catch. New Mexico, in contrast, is seeking remedies that it argues are monitorable and that would be backed by court oversight.
In the background is a broader policy struggle over how far responsibility should extend when content and recommendations are powered by algorithms. Misryoum will be watching closely how the judge treats not only the technology choices at issue but also the operational details: how systems verify ages, how time limits are measured, how detection rates are assessed, and how compliance is audited.
In the end, this trial is less about one bill or one app feature and more about setting the “rules of the road” for how platforms are expected to handle child safety. Whatever the outcome, Misryoum expects it to shape the next wave of negotiations, litigation, and regulation across the industry.