Dario Amodei, Anthropic’s chief executive, has said he does not want the company’s A.I. to be used to surveil Americans or in autonomous weapons, saying this could “undermine, rather than defend, democratic values.”
Defense Secretary Pete Hegseth labeled the start-up a “supply chain risk,” a move that would sever ties between the company and the U.S. government.
Anthropic’s unwillingness to accede shows how the Department of Defense cannot easily force Silicon Valley firms to comply. Unlike defense contractors that have worked with the Pentagon for decades and are reliant on longstanding military contracts, the A.I. companies are contending with different internal pressures and external factors.


The article mentions that:
The real problem for Anthropic is the clearly vindictive “supply chain risk” designation it was immediately slapped with, which prohibits the Defense Department from buying services from anyone who themselves uses Anthropic’s services.
This can be contested in court, at least, and is almost sure to be ruled in Anthropic’s favor since it’s so blatantly unjustified. But that might not matter: it’ll take a while (costing contracts and momentum), and once the ruling is made I wouldn’t bet on the Trump administration obeying it anyway.
When pressed for specifics on the nature of the safeguards, OpenAI’s Altman replied, “We’ve included the phrase ‘pretty please don’t use this for killing people or spying on Americans’ in our contract with the Department of Defense. With this language in place, we’re confident that our company values of respecting human life and the privacy of all Americans are protected.” /s
I’d argue these quotes are on topic but don’t come close to addressing the logical inconsistencies.
That could depend on your take on this statement. I personally don’t understand how this could be done with high certainty, and most AI researchers I respect seem to have reached a similar conclusion.
Humorously, Anthropic wanted the gig, so it might have been OpenAI who got it instead. They’re basically the same people.