OpenAI and the Pentagon: 'Just Trust Us' Isn't Good Enough

OpenAI is working with the Pentagon and asking the public to just trust them. The Intercept dug deeper.

OpenAI has a contract with the Pentagon. That’s been known for a while. What’s less known: the details of that contract remain opaque — and OpenAI’s response to questions boils down to: just trust us.

What we know

The Intercept published a detailed report over the weekend examining the lack of transparency around OpenAI’s collaboration with the US Department of Defense. Sam Altman has repeatedly stated that OpenAI doesn’t build weapons and that the technology is only used for “administrative tasks.” But what exactly does that mean? There are no concrete answers.

This isn’t a new playbook. Tech companies working with intelligence agencies and the military have always relied on trust over transparency. The Intercept draws parallels to earlier cases — from NSA surveillance to Palantir’s controversial government work.

Why this matters

The question isn’t whether AI will be used in defense — it will be, one way or another. The question is whether the public has a right to know how. And that’s where OpenAI’s answers fall short.

The company has revised its own usage policies multiple times to accommodate military partnerships. What was once a clear red line — no military, no surveillance — has become a gray area.

For the AI industry as a whole, this is a sensitive issue. When a company with “Open” in its name stonewalls questions about military contracts, it erodes trust — not just in OpenAI, but in the entire industry.

My take

I find this development concerning but not surprising. Anyone building AI at the level of GPT-4 and beyond will eventually work with government actors. That’s reality. But “just trust us” has never been a good answer, and the tech industry has proven that repeatedly over the past twenty years.

What we need are independent audits, clear guidelines, and transparency about which boundaries are actually being respected. Nothing more, nothing less.
