{"id":46472,"date":"2026-02-28T04:21:12","date_gmt":"2026-02-28T04:21:12","guid":{"rendered":"https:\/\/agooka.com\/news\/business\/anthropic-hits-back-after-us-military-labels-it-a-supply-chain-risk\/"},"modified":"2026-02-28T04:21:12","modified_gmt":"2026-02-28T04:21:12","slug":"anthropic-hits-back-after-us-military-labels-it-a-supply-chain-risk","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/business\/anthropic-hits-back-after-us-military-labels-it-a-supply-chain-risk\/","title":{"rendered":"Anthropic Hits Back After US Military Labels It a &#8216;Supply Chain Risk&#8217;"},"content":{"rendered":"<p>United States Secretary of Defense Pete Hegseth directed the Pentagon to designate Anthropic as a \u201csupply-chain risk\u201d on Friday, sending shockwaves through Silicon Valley and leaving many companies scrambling to understand whether they can keep using one of the industry\u2019s most popular AI models.<\/p>\n<p>\u201cEffective immediately, no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic,\u201d Hegseth wrote in a social media post.<\/p>\n<p>The designation comes after weeks of tense negotiations between the Pentagon and Anthropic over how the US military could use the startup\u2019s AI models. In a blog post this week, Anthropic argued its contracts with the Pentagon should not allow for its technology to be used for mass domestic surveillance of Americans or fully autonomous weapons. The Pentagon asked that Anthropic agree to let the US military apply its AI to \u201call lawful uses\u201d with no specific exceptions.<\/p>\n<p>A supply chain risk designation allows the Pentagon to restrict or exclude certain vendors from defense contracts if they are deemed to pose security vulnerabilities, such as risks related to foreign ownership, control, or influence. 
It is intended to protect sensitive military systems and data from potential compromise.<\/p>\n<p>Anthropic responded in another blog post on Friday evening, saying it would \u201cchallenge any supply chain risk designation in court,\u201d and that such a designation would \u201cset a dangerous precedent for any American company that negotiates with the government.\u201d<\/p>\n<p>Anthropic added that it hadn\u2019t received any direct communication from the Department of Defense or the White House regarding negotiations over the use of its AI models.<\/p>\n<p>\u201cSecretary Hegseth has implied this designation would restrict anyone who does business with the military from doing business with Anthropic. The Secretary does not have the statutory authority to back up this statement,\u201d the company wrote.<\/p>\n<p>The Pentagon declined to comment.<\/p>\n<p>\u201cThis is the most shocking, damaging, and over-reaching thing I have ever seen the United States government do,\u201d says Dean Ball, a senior fellow at the Foundation for American Innovation and the former senior policy advisor for AI at the White House. \u201cWe have essentially just sanctioned an American company. If you are an American, you should be thinking about whether or not you should live here 10 years from now.\u201d<\/p>\n<p>People across Silicon Valley chimed in on social media expressing similar shock and dismay. \u201cThe people running this administration are impulsive and vindictive. I believe this is sufficient to explain their behavior,\u201d said Paul Graham, founder of the startup accelerator Y Combinator.<\/p>\n<p>Boaz Barak, an OpenAI researcher, said in a post that \u201ckneecapping one of our leading AI companies is right about the worst own goal we can do. 
I hope very much that cooler heads prevail and this announcement is reversed.\u201d<\/p>\n<p>Meanwhile, OpenAI CEO Sam Altman announced on Friday night that the company reached an agreement with the Department of Defense to deploy its AI models in classified environments, seemingly with carveouts. \u201cTwo of our most important safety principles are prohibitions on domestic mass surveillance and human responsibility for the use of force, including for autonomous weapon systems,\u201d said Altman. \u201cThe DoW agrees with these principles, reflects them in law and policy, and we put them into our agreement.\u201d<\/p>\n<h2>Confused Customers<\/h2>\n<p>In its Friday blog post, Anthropic said a supply chain risk designation, under the authority of 10 USC 3252, only applies to Department of Defense contracts directly with suppliers, and doesn\u2019t cover how contractors use its Claude AI software to serve other customers.<\/p>\n<p>Three experts in federal contracts say it\u2019s impossible at this point to determine which Anthropic customers, if any, must now cut ties with the company. Hegseth\u2019s announcement \u201cis not mired in any law we can divine right now,\u201d says Alex Major, a partner at the law firm McCarter &amp; English, which works with tech companies.<\/p>\n<p>Amazon, Microsoft, Google, and Nvidia\u2014all companies that provide services to the US military and work with Anthropic\u2014did not immediately respond to WIRED\u2019s request for comment. 
Anduril and Shield AI, two prominent AI-focused defense-tech companies, both declined to comment.<\/p>\n<p>Supply chain risk designations typically do not go into effect immediately, and the US government is required to complete risk assessments and notify Congress before military partners have to cut ties with a company or its products, according to Charlie Bullock, senior research fellow at the Institute for Law and AI.<\/p>\n<p>But the situation could still discourage other tech companies from working with the Pentagon, according to Greg Allen, senior adviser at the Wadhwani AI Center at the Center for Strategic and International Studies (CSIS). \u201cThe Defense Department just sent a huge message to every company that if you dip your toe in the defense contracting waters, we will grab your ankle and pull you all the way in, anytime we want,\u201d he says.<\/p>\n<p>Several legal experts tell WIRED that Anthropic is likely to sue the government. Hegseth previously suggested that the DoD could attack Anthropic by invoking the Defense Production Act, which would force the company to provide its technology to the Pentagon. 
Allen says the flip-flopping undermines the Pentagon\u2019s argument that Anthropic is a genuine supply chain risk.<\/p>\n<p>A lawsuit could take months or years to resolve, however, and Anthropic\u2019s business could suffer in the meantime if companies are forced to sever ties.<\/p>\n<p>The dispute raises critical questions for a plethora of prominent US military partners, such as Nvidia, Amazon, Google, and Palantir, which work closely with Anthropic.<\/p>\n<p>One tech executive, whose company\u2019s software is used by the US military and who requested anonymity because of the sensitivity of the situation, said that until the Department of Defense\u2019s directive goes beyond a post on social media, their company is in a holding pattern and has lawyers examining the issue.<\/p>\n<p>As a comparison, the executive pointed to Section 889 of the National Defense Authorization Act, a procurement prohibition that bars federal agencies from contracting with companies that use certain Chinese telecom equipment as a \u201csubstantial or essential component\u201d of any system. If this new mandate is similar, that could be a \u201chigh bar to clear,\u201d the executive says, because even if a tech company is using Anthropic\u2019s Claude Code internally, it might not be defined as an \u201cessential\u201d part of the product it is ultimately selling to the government.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>United States Secretary of Defense Pete Hegseth directed the Pentagon to designate Anthropic as a \u201csupply-chain risk\u201d on Friday, sending shockwaves through Silicon Valley and leaving many companies scrambling to understand whether they can keep using one of the industry\u2019s most popular AI models. 
\u201cEffective immediately, no contractor, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":46473,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[],"class_list":{"0":"post-46472","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/46472","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=46472"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/46472\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/46473"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=46472"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=46472"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=46472"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}