{"id":38487,"date":"2025-11-13T11:31:41","date_gmt":"2025-11-13T11:31:41","guid":{"rendered":"https:\/\/agooka.com\/news\/business\/openais-open-weight-models-are-coming-to-the-us-military\/"},"modified":"2025-11-13T11:31:41","modified_gmt":"2025-11-13T11:31:41","slug":"openais-open-weight-models-are-coming-to-the-us-military","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/business\/openais-open-weight-models-are-coming-to-the-us-military\/","title":{"rendered":"OpenAI\u2019s Open-Weight Models Are Coming to the US Military"},"content":{"rendered":"<p>When OpenAI unveiled its first open-weight models in years this August, it wasn\u2019t just tech companies that were paying attention. The release also excited US military and defense contractors, which saw a chance to use them for highly secure operations.<\/p>\n<p>Initial results show that OpenAI\u2019s tools lag behind competitors in desired capabilities, some military vendors tell WIRED. But they are still pleased that models from a key industry leader are finally an option for them.<\/p>\n<p>Lilt, an AI translation company, contracts with the US military to analyze foreign intelligence. Because the company\u2019s software handles sensitive information, it must be installed on government servers and work without an internet connection, a practice known as air-gapping. Lilt previously developed its own AI models or used open source options such as Meta\u2019s Llama and Google\u2019s Gemma. But OpenAI\u2019s tools were off the table because they were closed source and could only be accessed online.<\/p>\n<p>The ChatGPT maker\u2019s new open-weight models, gpt-oss-120b and gpt-oss-20b, changed that. Both can run locally, meaning users have the freedom to install them on their own devices without needing a cloud connection. 
And with access to the models\u2019 weights\u2014key parameters that determine how they react to different prompts\u2014users can tailor them for specific purposes.<\/p>\n<p>OpenAI\u2019s return to the open-source market could ultimately increase competition and lead to better performing systems for militaries, health care companies, and others working with sensitive data. In a recent McKinsey survey of roughly 700 business leaders, more than 50 percent said their organizations use open source AI technologies. Models have different strengths based on how they were trained, and organizations often use several together, including open-weight ones, to ensure reliability across a wide variety of situations.<\/p>\n<p>Doug Matty, chief digital and AI officer for the so-called Department of War, the name the Trump administration is using for the Department of Defense, tells WIRED that the Pentagon plans to integrate generative AI into battlefield systems and back-office functions like auditing. Some of these applications will require models that are not tied to the cloud, he says. \u201cOur capabilities must be adaptable and flexible,\u201d Matty says.<\/p>\n<p>OpenAI did not respond to requests for comment about how its open source models may be used by the defense industry. Last year, the company reversed a broad ban on its technology being used for military and warfare applications, a move that prompted criticism from activists concerned about harms caused by AI.<\/p>\n<p>For OpenAI, offering a free and open model could have several benefits. The ease of access could cultivate a larger community of experts in its technologies. 
And because users don\u2019t have to sign up as formal customers, they may be able to operate with secrecy, which could keep OpenAI from facing criticism over potentially controversial customers\u2014like, say, the military.<\/p>\n<p>Earlier this year, Matty\u2019s unit at the Pentagon struck one-year deals worth up to $200 million each with OpenAI, Elon Musk\u2019s xAI, Anthropic, and Google. The goal is to create prototypes of AI systems for different purposes, including automating war-fighting tools. Until OpenAI\u2019s recent launch, Google was the only new tech partner that offered a cutting-edge open model as an option. The other companies license models that are run from the cloud and can\u2019t be customized to the same extent as open models.<\/p>\n<p>In Lilt\u2019s case, CEO Spence Green says a military analyst may input a prompt like, \u201cTranslate these documents to English and ensure that there are no mistakes. Then have the most knowledgeable person about hypersonics check the work.\u201d Lilt\u2019s proprietary models, which are trained for government applications, handle the translation. Google\u2019s Gemma currently automates routing, deciding which information goes to models, analysts, and other teams. The aim is to address a shortage of language experts and a backlog of data to process.<\/p>\n<p>OpenAI\u2019s latest open source models aren\u2019t well suited for Lilt\u2019s needs. They process only text, and the military also needs to sort through images and audio. Lilt also found the models underperform in some languages and in situations with limited computing power. But the results haven\u2019t demoralized Green. \u201cWith gpt-oss, there\u2019s a lot of model competition right now,\u201d Green says. \u201cMore options, the better.\u201d<\/p>\n<p>Other companies that work with the military say they got good results from the gpt-oss models, but they aren\u2019t aware of any Pentagon projects using them that have moved past the demo stage. 
\u201cIt\u2019s pretty early,\u201d says Jordan Wiens, cofounder of Vector 35, which supplies reverse engineering tools to the Pentagon and has integrated gpt-oss into its offerings.<\/p>\n<p>EdgeRunner AI, which is developing a virtual personal assistant for the military that doesn\u2019t require a cloud connection, says it achieved sufficient performance with gpt-oss after feeding it a cache of military documents to modify its capabilities, according to a paper the company published in October. The US Army and the Air Force will begin testing the modified model this month, says Tyler Saltsman, EdgeRunner\u2019s CEO.<\/p>\n<p>Open models may be particularly valuable in situations that require an immediate response or when internet interference could be an issue. That includes AI systems running on drones or satellites, says Kyle Miller, a research analyst at Georgetown University\u2019s Center for Security and Emerging Technology. Open source AI models offer the military \u201ca degree of accessibility, control, customizability, and privacy that is simply not available with closed models,\u201d he says.<\/p>\n<p>Beyond direct deals with AI providers, the military also has access to about 125 open source models and about 25 closed options through an intermediary AI platform called Ask Sage, says Nicolas Chaillan, the company\u2019s founder and a former chief software officer for the US Air Force and Space Force.<\/p>\n<p>Chaillan says there are serious drawbacks to using open source models, particularly for the US military. They hallucinate and make incorrect predictions more often than the best commercial models, he claims. And while they are often free for most uses, the infrastructure needed to run the biggest models may end up costing the same or more than licensing a commercial model over the cloud. \u201cIt\u2019s like going from PhD level to a monkey,\u201d Chaillan says. 
\u201cIf you spend more money and get a worse model, it makes no sense.\u201d<\/p>\n<p>He believes that the military should keep an eye on open models, but focus its efforts on using the more capable options that Microsoft, Amazon, and Google offer through cloud networks developed specifically for sensitive government tasks.<\/p>\n<p>Other military suppliers and experts disagree, contending that closed models can lead to dependence issues and won\u2019t meet the boutique needs of the armed forces.<\/p>\n<p>Pete Warden, who runs the transcription and translation technology developer Moonshine, says his contacts in the defense world have become more cautious about trusting big tech companies after seeing how Musk used his Starlink satellite network to influence government leaders. \u201cIndependence from suppliers is key,\u201d Warden says. His solution has been letting government agencies control a perpetual copy of Moonshine\u2019s model in exchange for a one-time fee.<\/p>\n<p>William Marcellino, who develops AI applications for the research group RAND, says open models that can be more easily controlled would help the military and spy agencies with projects such as translating materials for influence operations into regional dialects, a task that general commercial models may struggle to execute with precision. \u201cIt\u2019s good to have choices,\u201d he says.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When OpenAI unveiled its first open-weight models in years this August, it wasn\u2019t just tech companies that were paying attention. The release also excited US military and defense contractors, which saw a chance to use them for highly secure operations. 
Initial results show that OpenAI\u2019s tools lag behind [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":38488,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[],"class_list":{"0":"post-38487","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/38487","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=38487"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/38487\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/38488"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=38487"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=38487"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=38487"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}