OpenAI isn’t spending $100 billion to buy chips, it’s paying cash to lease them

The whole deal with Nvidia is built on spreading costs over time, and not dropping billions upfront. The artificial intelligence company wants access to Nvidia’s top-tier GPUs, but instead of buying them outright, it’s locking into long-term lease agreements. That way, the money goes out slowly, and the risk shifts to its partners.

The arrangement is staggered. As each new AI data center goes live, OpenAI gets access to more GPUs. The first center, being built in Abilene, Texas, is expected to go online in the second half of 2026. That’s when the cash starts flowing. The exact price of each center is still unknown, but OpenAI isn’t taking ownership of the hardware.
It’s renting them. Every GPU deployed will be leased, with payments spread across their useful life, around five years.

OpenAI delays costs by leasing Nvidia chips instead of buying

Jensen Huang, the CEO of Nvidia, described the deal as “monumental in size.” He said building a single gigawatt AI data center could cost about $50 billion. Of that, around $35 billion goes straight to Nvidia for its hardware. The rest covers everything else. But OpenAI isn’t paying that up front. By leasing the GPUs instead, the company avoids taking a financial hit all at once.

OpenAI will get an initial $10 billion from the deal with Nvidia. That money helps kick off the first wave of construction. And while some of the funds will be used for hiring, operations, and other expenses, the majority of it will go straight back to Nvidia, specifically, to Nvidia’s GPUs. Those GPUs are the engines behind AI training, powering models like ChatGPT and everything that runs on them.

Sarah Friar, OpenAI’s chief financial officer, said in Abilene that the plan wouldn’t be possible without partners. She pointed to Oracle, which is leasing the Abilene data center, and Nvidia, which is providing equity up front in return for long-term payments.
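For a rough sense of how the lease structure spreads that cash outlay, here is a back-of-envelope sketch in Python using the approximate figures above (Huang’s roughly $50 billion per gigawatt, around $35 billion of that going to Nvidia hardware, and a five-year useful life). The even annual split is an assumption for illustration, not a disclosed term of the deal.

```python
# Back-of-envelope sketch of the lease math described above.
# Figures are approximations from the article; the even annual split
# over the useful life is an illustrative assumption, not a deal term.

COST_PER_GW_BN = 50        # Huang's estimate for one gigawatt data center, in $B
NVIDIA_HARDWARE_BN = 35    # portion of that going to Nvidia for chips, in $B
USEFUL_LIFE_YEARS = 5      # approximate useful life of the GPUs

other_costs_bn = COST_PER_GW_BN - NVIDIA_HARDWARE_BN
annual_lease_bn = NVIDIA_HARDWARE_BN / USEFUL_LIFE_YEARS

print(f"Buying the chips outright: ${NVIDIA_HARDWARE_BN}B up front")
print(f"Leasing them instead:      ~${annual_lease_bn:.0f}B per year for {USEFUL_LIFE_YEARS} years")
print(f"Everything else (site, power, build-out): ~${other_costs_bn}B per gigawatt")
```

Under those assumptions, the same $35 billion of hardware shows up as roughly $7 billion a year of lease payments instead of one upfront purchase.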
“Folks like Oracle are putting their balance sheets to work to create these incredible data centers you see behind us,” Sarah said. “In Nvidia’s case, they’re putting together some equity to get it jumpstarted, but importantly, they will get paid for all those chips as those chips get deployed.”

Debt talks begin while Nvidia chips eat most of OpenAI’s cash

OpenAI is not profitable. It doesn’t have positive cash flow, and it doesn’t hold investment-grade credit. That’s why financing data centers through equity is so important. People inside the company said they’re preparing to take on debt to handle the rest of the costs. But thanks to the lease structure with Nvidia, banks are more comfortable lending. The terms look better when a company isn’t trying to buy everything outright. Sarah said the compute shortage is the bigger issue.
“What I think we should all be focused on today is the fact that there’s not enough compute,” she said. “As the business grows, we will be more than capable of paying for what is in our future — more compute, more revenue.”

But not everyone is thrilled about the way this is structured. Nvidia’s $4.3 trillion market cap has been built on selling chips to OpenAI, Google, Meta, Microsoft, and others. At the same time, OpenAI’s $500 billion private valuation is only possible because of cash injections from Microsoft and Nvidia. That money doesn’t sit still. It goes right back to Nvidia. Jamie Zakalik, an analyst at Neuberger Berman, told CNBC the deal shows OpenAI raising capital and pouring it straight into the same company providing the tech.
“It’s goosing up everyone’s earnings and everyone’s numbers,” Jamie said. “But it’s not actually creating anything.”

When asked about those concerns, OpenAI CEO Sam Altman didn’t push back. “We need to keep selling services to consumers and businesses — and building these great new products that people pay us a lot of money for,” Sam said. “As long as that keeps happening, that pays for a lot of these data centers, a lot of chips.”