Jurors wade through daunting evidence in high-stakes Meta trial about social media risks to children


SANTA FE, N.M. (AP) — A daunting stream of testimony and evidence has been presented in a New Mexico case that explores what the social media conglomerate Meta knew about the effects of its platforms on children.

State prosecutors allege Meta failed to disclose the risks that its platforms pose for children, including mental health problems and sexual exploitation. Meta’s attorneys have said the company has built-in protections for teenagers and weeds out harmful content, though the company acknowledges some dangerous content gets past its safety nets.

Attorneys prepared for closing arguments to jurors next week after Meta on Friday finished presenting its testimony and evidence, as the trial completed its sixth week.

If jurors later find that Meta — which owns Instagram, Facebook and WhatsApp — violated New Mexico’s consumer protection laws, prosecutors say sanctions could add up to billions of dollars. Meta, however, says it would seek a different calculation.

The trial, which started Feb. 9, is one of the first in a torrent of lawsuits against Meta and comes as school districts and legislators seek more restrictions on the use of smartphones in classrooms.

A second phase of the trial, slated for as early as May before a judge with no jury, would determine whether Meta created a public nuisance with its social media platforms and should pay for public programs to address the harms.

Here’s what to know about the possible outcomes of the trial:

A reckoning in courts for social media platforms

Meta is confronting two counts of violating the New Mexico Unfair Trade Practices Act, which protects consumers from deceptive or predatory business practices. The judge dropped an additional count on Friday from a draft of jury instructions.

After closing arguments, jurors will weigh whether Meta knowingly misrepresented the risks on its platforms, whether by omission or by active concealment.

The case could sidestep immunity provisions that protect tech companies from liability for material posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as a First Amendment shield.

In California, a jury is already sequestered in deliberations on whether social media companies should be liable for harms caused to children using their platforms, in one of three bellwether court cases that could set the course for thousands of similar lawsuits.

New Mexico’s case is built on a different foundation, including a state undercover investigation in which agents created social media accounts posing as children to document sexual solicitations and Meta’s response.
