Jurors wade through daunting evidence in high-stakes Meta trial about social media risks to children


SANTA FE, N.M. (AP) — A daunting stream of testimony and evidence has been presented in a New Mexico case that explores what the social media conglomerate Meta knew about the effects of its platforms on children.

State prosecutors allege Meta failed to disclose the risks that its platforms pose for children, including mental health problems and sexual exploitation. Meta's attorneys have said the company has built-in protections for teenagers and weeds out harmful content, though the company acknowledges some dangerous content gets past its safety nets.

Attorneys prepared for closing arguments to jurors next week after Meta on Friday finished presenting its testimony and evidence, capping the trial's sixth week.

If jurors later find that Meta — which owns Instagram, Facebook and WhatsApp — violated New Mexico’s consumer protection laws, prosecutors say sanctions could add up to billions of dollars. Meta, however, says it would seek a different calculation.

The trial, which started Feb. 9, is one of the first in a torrent of lawsuits against Meta and comes as school districts and legislators want more restrictions on the use of smartphones in classrooms.

A slated second phase of the trial, possibly in May before a judge with no jury, would determine whether Meta created a public nuisance with its social media platforms and should pay for public programs to fix matters.

Here’s what to know about the possible outcomes of the trial:

A reckoning in courts for social media platforms

Meta is confronting two counts of violating the New Mexico Unfair Trade Practices Act that protects consumers from deceptive or predatory business practices. An additional count was dropped Friday by the judge from a draft of jury instructions.

After closing arguments, jurors will weigh whether Meta knowingly misrepresented the risks on its platforms — whether by omission or, at the least, by active concealment.

The case could sidestep immunity provisions that protect tech companies from liability for material posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as a First Amendment shield.

In California, a jury already is sequestered in deliberations on whether social media companies should be liable for harms caused to children using their platforms, in one of three bellwether court cases that could set the course for thousands of similar lawsuits.

New Mexico’s case is built on a different foundation — including a state undercover investigation where agents created social media accounts posing as children to document sexual solicitations and the response from Meta.
