Anthropic accuses DeepSeek and other Chinese firms of using Claude to train their AI


Anthropic claims DeepSeek and two other Chinese AI companies misused its Claude AI model in an attempt to improve their own products. In an announcement on Monday, Anthropic says the “industrial-scale campaigns” involved the creation of around 24,000 fraudulent accounts and more than 16 million exchanges with Claude, as reported earlier by The Wall Street Journal.

The three companies — DeepSeek, MiniMax, and Moonshot — are accused of “distilling” Claude, or training a smaller AI model based on a more advanced one. Though Anthropic says that distillation is a “legitimate training method,” it adds that it can “also be used for illicit purposes,” including “to acquire powerful capabilities from other labs in a fraction of the time, and at a fraction of the cost, that it would take to develop them independently.”

Anthropic adds that illicitly distilled models are “unlikely” to carry over existing safeguards. “Foreign labs that distill American models can then feed these unprotected capabilities into military, intelligence, and surveillance systems — enabling authoritarian governments to deploy frontier AI for offensive cyber operations, disinformation campaigns, and mass surveillance,” Anthropic writes.

DeepSeek, which caused a stir in the AI industry for its powerful but more efficient models, held over 150,000 exchanges with Claude and targeted its reasoning capabilities, according to Anthropic. It’s also accused of using Claude to generate “censorship-safe alternatives to politically sensitive questions about dissidents, party leaders, or authoritarianism.” In a letter to lawmakers last week, OpenAI similarly accused DeepSeek of “ongoing efforts to free-ride on the capabilities developed by OpenAI and other U.S. frontier labs.”

Moonshot and MiniMax had more than 3.4 million and 13 million exchanges with Claude, respectively. Anthropic is calling on other members of the AI industry, cloud providers, and lawmakers to address distillation, adding that “restricted chip access” could limit model training and “the scale of illicit distillation.”


