Meta’s AI sending ‘junk’ tips to DoJ, US child abuse investigators say


Meta’s use of artificial intelligence software to moderate its social media platforms is generating large volumes of useless reports about cases of child sexual abuse, which are draining resources and hindering investigations, said officers from the US Internet Crimes Against Children (ICAC) taskforce.

“We get a lot of tips from Meta that are just kind of junk,” Benjamin Zwiebel, a special agent with the ICAC taskforce in New Mexico, said last week during his testimony in the state’s trial against Meta. The state’s attorney general alleges the company’s platforms are putting profits over child safety. Meta disputes these allegations, citing changes it has introduced on its platforms, such as teen accounts with default protections. The ICAC taskforce is a nationwide network of law enforcement agencies coordinated with the US Department of Justice to investigate and prosecute online child exploitation and abuse cases.

Another ICAC officer, speaking on the condition of anonymity to discuss internal matters, said: “Meta is providing thousands of tips each month. It’s pretty overwhelming because we’re getting so many reports, but the quality of the reports is really lacking in terms of our ability to take serious action.” The ICAC officer added that the total number of cybertips their department had received doubled from 2024 to 2025.

The unviable tips from Instagram, Facebook and WhatsApp in some instances contain information that is not criminal, Zwiebel and two ICAC officers said. Tips sometimes contain information indicating a crime may have occurred, yet vital images, videos or text are missing or redacted, the anonymous officers added.

“[Unviable tips from] Instagram have really skyrocketed recently, especially in the last couple of months, and that’s one of the biggest places where we’re seeing this information not being provided,” the ICAC officer added. “In those cases, we don’t have the information to further the investigation. It weighs on you to know that this crime occurred, but we can’t identify the perpetrator.”

Asked about Zwiebel’s testimony and the ICAC officers’ remarks, a Meta spokesperson said: “We’ve supported law enforcement to prosecute criminals for years: the DoJ has repeatedly praised our fast cooperation that has helped lead to arrests, and NCMEC has praised our streamlined and ‘improve[d]’ tip reporting process. In 2024, we received over 9,000 emergency requests from US authorities and resolved them within an average of 67 minutes – and even more quickly for cases involving child safety and suicide. Consistent with applicable law, we also report apparent child sexual exploitation imagery to NCMEC and support them to prioritize reports, from helping build their case management tool to labeling cybertips so they know which are urgent.”

The company pointed out that Zwiebel recommended the use of Meta’s teen accounts feature during his testimony, which he said he did “because it is the only option available, assuming that teens will not abstain from the use of social media”.

Raúl Torrez, the New Mexico attorney general who is leading the case against Meta, has acknowledged the company’s cooperation in providing leads on child abuse at the trial: “I do want to credit some of the social media applications and platforms, including Meta, to a certain extent they do report images to the National Center for Missing and Exploited Children.”

Filings released in the case on Friday show Meta executives sounding internal alarms over the company’s ability to police child sexual abuse and alert law enforcement in early 2019. At the time, the company was preparing to enable end-to-end encryption in Facebook Messenger, which hides users’ messages from anyone but the intended recipient via cryptography.

Filings raise new questions

“We are about to do a bad thing as a company. This is so irresponsible,” wrote Monika Bickert, Meta’s head of content policy.

Bickert wrote that there would be “no way to find the terror attack planning or child exploitation” if Messenger’s contents were encrypted, which could hinder working with law enforcement. Bickert also said Meta was making “gross misstatements of our ability to conduct safety operations”, according to the internal documents.

In another document, Meta employees estimated that encrypting Messenger would have rendered the company “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases, 9 threatened school shootings”.

Andy Stone, a Meta spokesman, told Reuters: “The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats.”

Child safety groups criticized the encryption of Messenger, which eventually rolled out in 2023.

Reporting child abuse en masse

By law, social media companies based in the United States are required to report any detected child sexual abuse material (CSAM) on their platforms to the National Center for Missing & Exploited Children (NCMEC). NCMEC serves as a national clearinghouse for reports, which it forwards to the appropriate law enforcement agencies across the United States and internationally. NCMEC does not have the authority to filter out any tips that may be unviable before they are sent to the relevant law enforcement agencies.

Meta is by far the largest reporter to NCMEC. In its data report for 2024, NCMEC said Meta made 13.8m reports across Facebook, Instagram and WhatsApp, out of 20.5m tips it received in total.

NCMEC said that in 2024, more than 1m CyberTipline reports were linked to a specific US state, and those reports were made available to the ICAC taskforces around the country, as well as other federal, state, and local law enforcement agencies, for investigation.

Meta and other social media companies use AI to detect and report suspicious material on their sites and employ human moderators to review some of the flagged content before sending it to law enforcement. The Guardian has previously reported that tips generated by AI that haven’t also been reviewed by a social media company employee often cannot be opened by a law enforcement officer without a warrant because of fourth amendment protections. This extra step also slows investigations of potential crimes, lawyers involved in such cases say.

A Meta spokesperson said: “It’s unfortunate that court rulings have increased the burden on law enforcement by requiring search warrants to open identical copies of content we’ve already reviewed and reported. Our image-matching system finds copies of known child exploitation at a scale that would be impossible to do manually, and we work to detect new child exploitation content through technology, reports from our community, and investigations by our specialist child safety teams.”

Legislative change spurs avalanche of tips

Under the Report Act (Revising Existing Procedures On Reporting via Technology), which came into force in November 2024, online service providers must notify NCMEC’s CyberTipline not only about child sexual abuse material but also about planned or imminent abuse, child sex trafficking and related exploitation. They must also preserve evidence for a longer period, and they face higher penalties if they knowingly fail to comply.

Since the act passed, the number of unviable tips supplied by Meta has increased dramatically, which could be because the company is acting to ensure it is not falling foul of the law, two ICAC officers said. Many of these tips could not be construed as a crime, such as adolescent girls talking about which celebrity they find most attractive.

“Based on my training and experience, it appears that they are being submitted through the use of AI, as these are common mistakes that an AI would make that a human observer would not,” Zwiebel said in court.

In contrast, Zwiebel’s department receives significantly fewer tips on legitimate cases of CSAM distribution from Meta than in previous years, he said.

Every tip that reaches an ICAC division must be reviewed, and the influx of unviable tips is taking time and resources away from investigating legitimate cases of child abuse, said two officers.

“It is killing morale. We are drowning in tips and we want to get out there and do this work,” said one ICAC officer. “We don’t have the personnel to sustain that. There’s no way that we can keep up with the flood that’s coming in.”


