- The Pentagon and Anthropic are in a standoff over usage of Claude
- Claude was reportedly used in the operation to capture Nicolás Maduro
- Anthropic refuses to let its models be used in “fully autonomous weapons and mass domestic surveillance”
A rift between the Pentagon and several AI companies has emerged over how their models can be used as part of operations.
The Pentagon has asked AI providers Anthropic, OpenAI, Google, and xAI to allow their models to be used for “all lawful purposes”.
Anthropic has voiced fears that its Claude models could be used in autonomous weapons systems and mass domestic surveillance, with the Pentagon threatening to terminate its $200 million contract with the AI provider in response.
$200 million standoff over AI weapons
Speaking to Axios, an anonymous Trump administration advisor said one of the companies had agreed to allow the Pentagon full use of its model, with two others showing flexibility in how their AI models can be used.
The Pentagon’s relationship with Anthropic has been strained since January over the use of its Claude models, with the Wall Street Journal reporting that Claude was used in the US military operation to capture then-Venezuelan President Nicolás Maduro.
An Anthropic spokesperson told Axios that the company has “not discussed the use of Claude for specific operations with the Department of War”. The company did state that its Usage Policy with the Pentagon was under review, with specific reference to “our hard limits around fully autonomous weapons and mass domestic surveillance.”
Chief Pentagon spokesman Sean Parnell stated that “Our nation requires that our partners be willing to help our warfighters win in any fight.”
Security experts, policymakers, and Anthropic Chief Executive Dario Amodei have called for greater regulation of AI development and stronger safeguarding requirements, with specific reference to the use of AI in weapons systems and military technology.

Benedict Collins