Anthropic accuses DeepSeek and other Chinese firms of using Claude to train their AI


Anthropic claims DeepSeek and two other Chinese AI companies misused its Claude AI model in an attempt to improve their own products. In an announcement on Monday, Anthropic said the “industrial-scale campaigns” involved the creation of around 24,000 fraudulent accounts and more than 16 million exchanges with Claude, as reported earlier by The Wall Street Journal.

The three companies — DeepSeek, MiniMax, and Moonshot — are accused of “distilling” Claude, or training a smaller AI model based on a more advanced one. Though Anthropic says that distillation is a “legitimate training method,” it adds that it can “also be used for illicit purposes,” including “to acquire powerful capabilities from other labs in a fraction of the time, and at a fraction of the cost, that it would take to develop them independently.”

Anthropic adds that illicitly distilled models are “unlikely” to carry over existing safeguards. “Foreign labs that distill American models can then feed these unprotected capabilities into military, intelligence, and surveillance systems — enabling authoritarian governments to deploy frontier AI for offensive cyber operations, disinformation campaigns, and mass surveillance,” Anthropic writes.

DeepSeek, which caused a stir in the AI industry with its powerful but more efficient models, held over 150,000 exchanges with Claude and targeted its reasoning capabilities, according to Anthropic. It’s also accused of using Claude to generate “censorship-safe alternatives to politically sensitive questions about dissidents, party leaders, or authoritarianism.” In a letter to lawmakers last week, OpenAI similarly accused DeepSeek of “ongoing efforts to free-ride on the capabilities developed by OpenAI and other U.S. frontier labs.”

Moonshot and MiniMax had more than 3.4 million and 13 million exchanges with Claude, respectively. Anthropic is calling on other members of the AI industry, cloud providers, and lawmakers to address distillation, adding that “restricted chip access” could limit model training and “the scale of illicit distillation.”


