
Judge calls Pentagon's Anthropic ban an 'attempt to cripple' the company


A federal judge sharply criticized the Trump administration's actions against Anthropic, questioning whether the AI company is being punished for standing up for safety.


The Anthropic vs. Pentagon story just took a dramatic turn. At Tuesday’s federal court hearing, Judge Rita Lin didn’t hold back — and the government didn’t come out looking great.

What happened

Judge Lin called the Pentagon’s actions against Anthropic an ‘attempt to cripple’ the company. She said the restrictions don’t seem tailored to any stated national security concern — they look more like punishment.

Here’s the backstory: Anthropic refused to strip safety guardrails from Claude that prevent its use for autonomous weapons and mass surveillance. In response, Defense Secretary Pete Hegseth designated Anthropic as a ‘supply chain risk’ — a label normally reserved for foreign companies. Trump then ordered all federal agencies to stop using Claude.

Why it matters

The judge raised several critical points. First: the bar for the security risk designation seemed ‘pretty low.’ Second: there’s a real concern that Anthropic is being punished for publicly criticizing the government — which raises First Amendment questions.

What’s particularly striking: Microsoft, Google DeepMind, and even OpenAI employees filed supporting briefs for Anthropic. The entire tech industry is standing behind a competitor here — that alone tells you how serious this is.

What’s next

Judge Lin said she expects to issue a ruling within the next few days. Anthropic had asked for a decision by March 26.

My take: regardless of the outcome, this case is setting an important precedent. If the government can punish AI companies for drawing safety lines, that has implications for the entire industry. Anthropic drew a line that other companies haven’t dared to — and it might end up being vindicated for it.
