{"id":44393,"date":"2026-02-01T22:51:11","date_gmt":"2026-02-01T22:51:11","guid":{"rendered":"https:\/\/agooka.com\/news\/technologies\/microsoft-will-continue-buying-from-nvidia-despite-custom-chip-launch\/"},"modified":"2026-02-01T22:51:11","modified_gmt":"2026-02-01T22:51:11","slug":"microsoft-will-continue-buying-from-nvidia-despite-custom-chip-launch","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/technologies\/microsoft-will-continue-buying-from-nvidia-despite-custom-chip-launch\/","title":{"rendered":"Microsoft will continue buying from Nvidia despite custom chip launch"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/dataconomy.com\/wp-content\/uploads\/2026\/01\/1125622.jpg\" alt=\"Microsoft will continue buying from Nvidia despite custom chip launch\" title=\"Microsoft will continue buying from Nvidia despite custom chip launch\"\/><\/p>\n<p>Microsoft deployed its first batch of homegrown Maia 200 AI chips in one data center this week to handle AI inference tasks amid Nvidia supply shortages, with CEO Satya Nadella affirming continued purchases from Nvidia and AMD.<\/p>\n<p>The company plans to roll out additional Maia 200 chips in the coming months. Microsoft describes the Maia 200 as an \u201cAI inference powerhouse.\u201d This design optimizes the chip for the compute-intensive work of running AI models in production environments. Microsoft released processing-speed specifications showing the Maia 200 outperforms Amazon\u2019s latest Trainium chips and Google\u2019s latest Tensor Processing Units, or TPUs.<\/p>\n<p>Cloud giants including Microsoft pursue their own AI chip designs due to the difficulty and expense of obtaining Nvidia\u2019s latest chips. A persistent supply crunch persists without signs of easing. 
Other providers face similar constraints in acquiring high-performance Nvidia hardware for their data centers.<\/p>\n<p>Nadella emphasized Microsoft\u2019s ongoing reliance on external suppliers despite the in-house development. He stated, \u201cWe have a great partnership with Nvidia, with AMD. They are innovating. We are innovating.\u201d Nadella addressed competition in the sector, saying, \u201cI think a lot of folks just talk about who\u2019s ahead. Just remember, you have to be ahead for all time to come.\u201d He clarified the company\u2019s strategy with the remark, \u201cBecause we can vertically integrate doesn\u2019t mean we just only vertically integrate,\u201d meaning that Microsoft\u2019s ability to build systems entirely in-house does not commit it to forgoing external suppliers.<\/p>\n<p>The Maia 200 supports Microsoft\u2019s Superintelligence team, which focuses on developing the company\u2019s frontier AI models. The team is led by Mustafa Suleyman, who previously co-founded Google DeepMind, and uses the chips to advance Microsoft\u2019s internal AI capabilities.<\/p>\n<blockquote>\n<p>It&#039;s a big day. Our Superintelligence team will be the first to use Maia 200 as we develop our frontier AI models. https:\/\/t.co\/3Vt38ajobR<\/p>\n<p>\u2014 Mustafa Suleyman (@mustafasuleyman) January 26, 2026<\/p>\n<\/blockquote>\n<p>Microsoft develops these models to reduce dependence on external providers. The company currently partners with OpenAI and Anthropic, along with other model makers, for advanced AI systems.<\/p>\n<p><strong>Featured image credit<\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Microsoft deployed its first batch of homegrown Maia 200 AI chips in one data center this week to handle AI inference tasks amid Nvidia supply shortages, with CEO Satya Nadella affirming continued purchases from Nvidia and AMD. The company plans to roll out additional Maia 200 chips in the coming months. 
Microsoft describes the Maia [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":44394,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[37],"tags":[],"class_list":{"0":"post-44393","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-technologies"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/44393","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=44393"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/44393\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/44394"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=44393"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=44393"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=44393"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}