SANTA FE, N.M. (AP) — Meta is raising the prospect of shutting down its social media services in New Mexico in response to a push by state prosecutors for fundamental changes to the company’s platforms, including Instagram, to protect the mental health and safety of children.
The possibility emerged amid legal gamesmanship in the run-up to a bench trial next week on allegations that Meta poses a public nuisance. It’s the second phase of a case that already resulted in $375 million in civil penalties based on a jury’s determination that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms.
Prosecutors are asking the court to order a series of changes to child accounts on social media aimed at reining in addictive features, improving age verification and preventing child sexual exploitation through default privacy settings and closer oversight.
Meta executives have emphasized that the company continuously improves child safety and addresses compulsive social media use. The company says it’s being singled out among hundreds of apps that teens use.
In a court filing unsealed Thursday, Meta said it was unfeasible for the company to meet a proposed requirement for 99% accuracy in verifying that child users are at least 13 years old, among other demands.
“As a practical matter, this requirement effectively requires Meta to shut down its services — for all users in the state — or else comply with impossible obligations,” Meta said in the filing.
Such a shutdown across a population of 2.1 million residents in New Mexico could silence personal communication on Meta’s immensely popular platforms, which also include Facebook and WhatsApp, and also impact their use for commercial advertising.
By withdrawing from New Mexico, Meta would resolve any concerns about harm to children, but the move could appear intentionally hostile and might lead to unintended consequences, said Eric Goldman, codirector of the High Tech Law Institute at Santa Clara University School of Law in California.
Goldman noted that Canadian authorities accused Facebook in 2023 of putting profits over safety after the platform blocked local news content during record-setting wildfires and evacuations. Facebook was responding to a newly enacted law that requires tech giants to pay publishers for linking to or otherwise repurposing their content online.
A Los Angeles jury last month found both Meta and YouTube liable for harms to children using their services, validating longstanding concerns about the dangers of social media.