Politics March 24, 2026

Judge to Hear Anthropic Challenge to Pentagon's Supply-Chain Blacklist

Federal court in San Francisco to consider request to pause national security designation after Anthropic refused military use restrictions on Claude

By Jordan Park

A federal judge will hear oral arguments on Anthropic's bid to block a Pentagon supply-chain risk designation that prevents the AI developer from bidding on certain military contracts. The company says the designation was unlawful retaliation for its stance on AI safety and opposition to domestic surveillance; the government says the move responded to Anthropic's refusal to accept contractual terms that the Pentagon views as necessary for operational certainty.

Key Points

  • A federal judge in San Francisco will hear Anthropic's request to block a Pentagon supply-chain risk designation affecting its Claude AI model.
  • Anthropic contends the designation unlawfully retaliates against its stances on AI safety and domestic surveillance and violates its First and Fifth Amendment rights; the Justice Department says the action followed Anthropic's refusal to accept contractual terms the Pentagon views as necessary.
  • The designation prevents Anthropic from competing for certain military contracts and could also affect civilian contracting through a separate lawsuit in Washington, D.C.

A federal judge in San Francisco will hear oral arguments in Anthropic's lawsuit over the Department of Defense's decision to label the artificial intelligence lab a national security supply-chain risk, according to court filings. The hearing is scheduled for 1:30 p.m. Pacific Time (2030 GMT) and concerns Anthropic's request for an initial order to pause the designation while litigation proceeds.

The designation, announced by Defense Secretary Pete Hegseth, followed Anthropic's refusal to remove certain restrictions on its Claude AI model - specifically, restrictions that bar the military from using the model for U.S. domestic surveillance or in autonomous weapons. The Pentagon applies the supply-chain risk label to companies it believes could expose military systems to infiltration or sabotage by adversaries.

Anthropic's complaint, filed in a California federal court, contends that Hegseth exceeded his authority in imposing the label. The company argues the decision was unlawful, lacked factual support and ran counter to previous military praise for Claude. Anthropic also says the designation has blocked it from competing for some military contracts and could cost the company billions of dollars in lost business and reputational harm, a financial impact it raised in court on March 9.

At the core of Anthropic's position is its assertion that modern AI models are not yet reliable enough for safe deployment in autonomous weapon systems, and that allowing the technology to be used for domestic surveillance would violate civil liberties. Anthropic asserts in the complaint that the designation is retaliation for those positions and therefore violates its First Amendment right to free speech. The company further argues it was not afforded an opportunity to contest the designation, a violation, it says, of its Fifth Amendment due process rights.

The San Francisco hearing will be presided over by U.S. District Judge Rita Lin, an appointee of former President Joe Biden. Anthropic has asked Judge Lin to issue an injunction preventing the supply-chain risk label from taking effect while the litigation continues.

The public designation marked the first time a U.S. company has been named a supply-chain risk under a relatively obscure government procurement statute intended to shield military systems from foreign sabotage, the filings note.

In response, the Justice Department argued in court papers that Anthropic's refusal to accept contract terms drove the designation. According to the government's filing, the company's contractual stance could leave the Pentagon uncertain about permissible uses of Claude and risk disabling military systems during operations.

Anthropic also has a separate case pending in Washington, D.C., challenging another Pentagon supply-chain risk designation that could exclude the company from civilian government contracts. That second lawsuit addresses a distinct designation and the possible consequences for Anthropic's ability to do business with non-military federal agencies.


Context

The litigation and filings frame the competing positions: Anthropic asserts constitutional violations and a lack of factual support for the designation, while the government maintains the action responded to contractual refusals that, in its view, posed operational risks. The outcome of the hearing will determine whether the designation remains in effect during the broader legal challenge.

Risks

  • Potential loss of military contracts and reputational harm for Anthropic - impacts defense contractors, AI developers and technology investors.
  • Operational uncertainty for the Pentagon if contractual terms governing Claude's use are not agreed - impacts defense operations and procurement planning.
  • Legal uncertainty while the designation remains in place during litigation - affects government contracting markets and companies that rely on federal procurement.
