In brief
- Microsoft backed Anthropic in court to protect billions tied to Claude and Azure.
- The Pentagon blacklist could ripple across the entire AI contractor ecosystem.
- Microsoft argued the DoD used a foreign-adversary security designation in an “unprecedented” way.
Microsoft has up to $5 billion invested in Anthropic, while Anthropic has committed to purchase $30 billion in Azure compute under the partnership. That context makes its decision to file an amicus curiae brief in support of Anthropic’s lawsuit against the U.S. Department of Defense look less like altruism and more like financial self-defense.
The brief, filed March 10 in San Francisco, argues that a temporary restraining order blocking enforcement of the Pentagon’s “supply chain risk” designation would serve the public interest.
Microsoft itself is a major DoD contractor, and that designation puts its own products at risk. Defense Secretary Pete Hegseth directed that no contractor, supplier, or partner doing business with the U.S. military may conduct any commercial activity with Anthropic, a sweep potentially broad enough to catch Microsoft’s own Copilot and Azure products, which ship with support for Claude.
The brief highlights a procedural contradiction that has received little attention in mainstream coverage: the Department of Defense gave itself a six-month phase-out period to transition away from Anthropic’s tools, but applied the designation to contractors immediately, with no equivalent runway.
Microsoft’s lawyers called this out directly, noting that tech suppliers must now scramble to audit, re-engineer, and reprocure products on a timeline the government did not impose on itself.
Microsoft also raised an alarm that cuts to the heart of the legal dispute. The supply chain risk authority invoked, 10 U.S.C. § 3252, has historically been reserved for foreign adversaries. Only one such designation has ever been issued publicly under related statutes, and that was against Acronis AG, a Swiss software firm with Russian ties. Using it against a San Francisco AI startup is, as Microsoft put it, “unprecedented.”
The brief’s most pointed argument is structural. If a contract dispute between one agency and one company can trigger a national-security blacklist, then every company doing business with the federal government has just inherited a new class of existential risk. Microsoft’s lawyers described an industry model built on interconnected services, where one banned component can freeze entire product lines.
There’s an irony here that is hard to ignore. Microsoft is simultaneously OpenAI’s biggest backer, with investments valued at roughly $135 billion, and now one of Anthropic’s loudest courtroom defenders.
OpenAI, for its part, rushed to sign a deal with the DoD hours after the Anthropic blacklist dropped, a move that drew internal backlash and led to a public acknowledgment from OpenAI CEO Sam Altman that the announcement “looked opportunistic and sloppy.” Microsoft backed both horses.
Here is a repost of an internal post:

We have been working with the DoW to make some additions to our agreement to make our principles very clear.

1. We are going to amend our deal to add this language, in addition to everything else:

“• Per applicable laws,…
— Sam Altman (@sama) March 3, 2026
The brief stops short of endorsing Anthropic’s specific AI safety positions on autonomous weapons and mass surveillance, the two red lines that triggered the standoff. Instead, it frames the case in terms any government contractor can understand: due process, orderly transitions, and the consequences of weaponizing procurement law over policy disagreements.
Microsoft’s request is a temporary restraining order, not a verdict. The tech giant wants the clock slowed down enough for the parties to negotiate, and for its own products to stay legally deployable while they do.
What’s at stake goes beyond one company’s contract. If courts allow the Pentagon’s move to stand, then every AI company selling into the government has just learned that safety guardrails can be reframed as national security threats. Microsoft’s brief makes clear that lesson is not lost on the broader tech industry, and that the company is not willing to learn it quietly.