{"id":46519,"date":"2026-02-28T18:41:13","date_gmt":"2026-02-28T18:41:13","guid":{"rendered":"https:\/\/agooka.com\/news\/technologies\/meta-signs-multibillion-dollar-deal-to-rent-google-tpus-for-ai-training\/"},"modified":"2026-02-28T18:41:13","modified_gmt":"2026-02-28T18:41:13","slug":"meta-signs-multibillion-dollar-deal-to-rent-google-tpus-for-ai-training","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/technologies\/meta-signs-multibillion-dollar-deal-to-rent-google-tpus-for-ai-training\/","title":{"rendered":"Meta signs multibillion-dollar deal to rent Google TPUs for AI training"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/dataconomy.com\/wp-content\/uploads\/2026\/02\/1125218.jpg\" alt=\"Meta signs multibillion-dollar deal to rent Google TPUs for AI training\" title=\"Meta signs multibillion-dollar deal to rent Google TPUs for AI training\"\/><\/p>\n<p>Meta Platforms has signed an agreement to rent Google\u2019s tensor processing units through Google Cloud to develop new AI models. The deal marks a major expansion of Google\u2019s TPU commercialization strategy, which previously restricted the chips to its own internal use and select cloud customers. Meta is also in discussions to purchase TPUs outright for installation in its own data centers as soon as next year.<\/p>\n<p>The agreement allows Meta to diversify its compute supply beyond Nvidia and AMD, supporting a projected $135 billion in AI infrastructure spending by 2026. Google is forming a joint venture to lease TPUs to other AI customers, underscoring a broader push to challenge Nvidia\u2019s dominance in the AI hardware market. Some Google Cloud executives estimate expanding TPU sales could capture as much as 10% of Nvidia\u2019s annual revenue.<\/p>\n<p>On February 24, Meta and AMD announced a multiyear deal for up to six gigawatts of AMD\u2019s Instinct GPUs. 
The deal is valued at up to $60 billion over five years and includes an equity warrant that could give Meta as much as 10% of AMD\u2019s stock. A week earlier, Meta expanded its partnership with Nvidia to deploy millions of Blackwell and next-generation Vera Rubin processors.<\/p>\n<p>Meta CEO Mark Zuckerberg stated the AMD deal is an important step for Meta to diversify its compute and deliver personal superintelligence. Morningstar analysts described Meta\u2019s approach as a \u201cmultipronged silicon strategy.\u201d The strategy leverages Nvidia for frontier model training, AMD for inference needs, Google\u2019s TPUs for possible Llama workloads, and Meta\u2019s in-house MTIA chips for core recommendation algorithms.<\/p>\n<p>Google first developed TPUs more than a decade ago for internal AI workloads. In October 2025, Anthropic signed a deal worth tens of billions of dollars for access to up to one million TPUs. In December, Google collaborated with Meta on a project called \u201cTorchTPU\u201d to ensure full compatibility between TPUs and the PyTorch software framework.<\/p>\n<p>The Wall Street Journal reported that Google has been expanding financial support for data center partners and exploring an investment in cloud startup Fluidstack to broaden TPU demand. The Guardian reported this week that Meta\u2019s multi-vendor approach reflects a scale of operations that demands several alternatives, a point echoed by analyst Nguyen in the report.<\/p>\n<p>Meta has plans for 30 data centers, with 26 located in the United States. The company\u2019s AI infrastructure spending is projected to reach as much as $135 billion in 2026.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Meta Platforms has signed an agreement to rent Google\u2019s tensor processing units through Google Cloud to develop new AI models. 
The deal marks a major expansion of Google\u2019s TPU commercialization strategy, which previously restricted the chips to its own internal use and select cloud customers. Meta is also in discussions to purchase TPUs outright for [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":46520,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[37],"tags":[],"class_list":{"0":"post-46519","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-technologies"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/46519","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=46519"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/46519\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/46520"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=46519"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=46519"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=46519"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}