This is very easy to explain. Anthropic outlines some limitations in their terms of service. Palantir accepted those terms. The DoD did not.
OpenAI claims their terms of service for the DoD contain the same limitations as Anthropic's proposed service agreement. Anthropic claims that this is untrue.
Now given that (a) the DoD terminated their deal with Anthropic, (b) stated that they terminated because Anthropic refused to modify their terms of service, and (c) then signed a deal with OpenAI, I am inclined to believe that there is in fact a substantial difference between the terms of service offered by Anthropic and OpenAI.
Are you sure about that? Everything I’ve seen suggests that the DoD has been using Anthropic’s models through Palantir.
My understanding is that Anthropic requested visibility into, and a say in, how their models were being used for classified tasks, while the DoD wanted to expand the scope of those tasks into areas that Anthropic found objectionable. Each proposal was unacceptable to the other side.
“We’ve actually held our red lines with integrity rather than colluding with them to produce ‘safety theater’ for the benefit of employees (which, I absolutely swear to you, is what literally everyone at [the Pentagon], Palantir, our political consultants, etc, assumed was the problem we were trying to solve),” Amodei reportedly wrote.
“The real reasons [the Pentagon] and the Trump admin do not like us is that we haven’t donated to Trump (while OpenAI/Greg have donated a lot),” he wrote, referring to Greg Brockman, OpenAI’s president, who, together with his wife, gave $25m to a Pac supporting Trump.
https://www.theguardian.com/technology/2026/mar/04/sam-altma...
Sam donated $1M to Trump's inaugural fund. Dario did not.
Yeah, it never made sense that Sam immediately claimed they had the same constraints, yet the DoW immediately accepted that.
From what I can see, OpenAI’s terms basically say “must comply with the law,” which gives them plenty of wiggle room with executive orders and whatnot.