As OpenAI locks in its Pentagon partnership, questions are mounting across the AI industry and beyond about the appropriate role of artificial intelligence in modern warfare. Anthropic had tried to draw clear lines around those questions and was expelled from the government market for its trouble — leaving OpenAI to answer them through a partnership whose ethical limits remain to be proven in practice.
Anthropic’s lines were specific and defensible. The company would not allow its Claude AI to be used in weapons systems capable of making lethal decisions without human oversight, nor in programs designed for mass surveillance of civilian populations. These positions were grounded in international law, ethical AI norms, and the fundamental principle that human beings should remain responsible for decisions about the use of lethal force.
Pentagon officials argued that these conditions limited military capability in unacceptable ways. The Trump administration gave the argument its most forceful expression by banning all federal use of Anthropic technology and framing the company’s ethical stance as ideological defiance of the military’s constitutional authority. The ban ended Anthropic’s government relationships immediately and definitively.
OpenAI’s Pentagon deal, announced hours later, came with assurances from Sam Altman that the company’s own positions on autonomous weapons and mass surveillance were written into the contract. He framed the deal as a model for how government and industry can work together within ethical limits, and he called for those limits to be standardized across all government AI partnerships.
The questions that Anthropic tried to answer contractually — and that OpenAI is now attempting to address the same way — will not be resolved by any single deal. They require public debate, legislative action, and the kind of sustained accountability that contracts alone cannot provide. Anthropic’s expulsion from the market has, if nothing else, made those questions impossible to ignore.