This story writes itself – and it’s so absurd you wouldn’t believe it if CBS News hadn’t confirmed it.
Claude on the Battlefield
Two sources familiar with the US military’s use of AI confirm: the United States used Anthropic’s Claude over the weekend during strikes on Iran. The military is using the tool for intelligence assessment, target identification, and battle scenario simulation.
That alone would be big news. But the context makes it explosive.
The Timing Is Everything
Just hours before Claude was deployed on the battlefield, Donald Trump had ordered all federal agencies to stop using Anthropic’s technology. The reason: Anthropic had refused to grant the Pentagon unrestricted use of Claude – specifically for mass surveillance of US citizens and for fully autonomous weapons.
Trump called Anthropic a ‘Radical Left AI Company’. And then the very military that started the dispute used the technology anyway.
What This Means
A few things become clear here. First, Claude is apparently so deeply embedded in military infrastructure that replacing it overnight isn’t possible. Estimates suggest it could take three months or longer for the Pentagon to replace Claude’s capabilities with another AI platform.
Second, the dispute between Anthropic and the Pentagon was never really about WHETHER the military can use AI. Anthropic never fundamentally objected to military use. It was about the red lines – no mass surveillance, no autonomous weapons. Gizmodo nailed this point in a sharp analysis.
Third, we’re now living in a world where AI is actively used in warfare. This isn’t science fiction anymore. It’s reality.
My Take
I still find Anthropic’s stance remarkable. Standing up to the Pentagon while knowing your technology is already being used in combat takes real nerve. Whether you see that as brave or naive probably depends on where you stand.
What concerns me more: we’re talking about AI-assisted target identification in an actual war. That deserves a much broader societal debate than we’re currently having.
Sources: