OpenAI and Nvidia Announce Partnership to Deploy 10GW of Nvidia Systems
95 comments
September 22, 2025 · ddtaylor
kingstnap
It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, that's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.
The NVL72 packs 72 chips into 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
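Those figures can be checked directly; a quick sketch using the rack and cooling numbers above (all of them rough):

```python
# Napkin math: NVL72 rack power plus assumed cooling overhead,
# then the implied GPU count for 10 GW.
rack_power_kw = 120     # NVL72 rack, 72 GPUs
cooling_kw = 25         # assumed cooling overhead
gpus_per_rack = 72

per_gpu_kw = (rack_power_kw + cooling_kw) / gpus_per_rack
print(round(per_gpu_kw, 2))         # ~2.01 kW per GPU

gpus = 10e9 / (per_gpu_kw * 1e3)    # 10 GW total
print(f"{gpus:,.0f}")               # ~5.0 million GPUs
```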
ProofHouse
How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center, which has blue-wire cables everywhere.
thrtythreeforty
Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.
cj
"GPUs per user" would be an interesting metric.
(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.
That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.
Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimate 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate breakdown.
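The GPUs-per-user numbers above are easy to reproduce; note that every input here is a guess from the thread, not a measured figure:

```python
# Rough GPUs-per-user sketch; all inputs are assumptions.
gpus = 1_000_000           # "well over 1 million GPUs" by end of year
users = 800_000_000        # ~800 million users
active_fraction = 0.05     # assume ~5% of the day (~1.2 h) of active use

print(users / gpus)                    # 800.0 users per GPU
print(users * active_fraction / gpus)  # 40.0 concurrent users per GPU
```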
skhameneh
Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...
With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
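That division, spelled out (chip power only, none of the overheads mentioned):

```python
total_watts = 10e9       # 10 GW
watts_per_gpu = 600      # assumed per-GPU draw

print(f"{total_watts / watts_per_gpu:,.0f}")  # 16,666,667 GPUs
```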
boringg
What's the time frame?
iamgopal
And how much is that as a percentage of Bitcoin network capacity?
mrb
Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.
To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what the specialized Bitcoin ASICs mine.
cedws
I'm also wondering what kind of threat this could be to PoW blockchains.
seanwessmith
Somewhere between 20 and 25 ZFLOPS. They did say it's on the new architecture, which is roughly double the Blackwell architecture.
isodev
> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs
I know watts but I really can’t quantify this. How much of Nvidia is there in the amount of servers that consume 10GW? Do they all use the same chip? What if there is newer chip that consumes less, does the deal imply more servers? Did GPT write this post?
nick__m
A 72-GPU NVL72 rack consumes up to 130 kW, so 10 GW is a little more than 5,500,000 GPUs.
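Working from the rack level rather than per-GPU power gives a similar figure; a sketch using the 130 kW rack number above:

```python
total_w = 10e9          # 10 GW
rack_w = 130e3          # NVL72 rack at full load
gpus_per_rack = 72

racks = total_w / rack_w
print(f"{racks:,.0f}")                   # ~77,000 racks
print(f"{racks * gpus_per_rack:,.0f}")   # ~5.5 million GPUs
```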
mr_toad
You don’t need AI to write vague waffly press releases. But to put this in perspective an H100 has a TDP of 700 watts, the newer B100s are 1000 watts I think?
Also, the idea of a newer Nvidia card using less power is très amusant.
fufxufxutc
In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.
landl0rd
Nvidia has consistently done this with CoreWeave, Nscale; really, most of its balance-sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor, and it sort of makes sense as a hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.
It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.
rzerowan
It's the same loop-de-loop NVIDIA is doing with CoreWeave, as I understand it: "investing" in CoreWeave, which then "buys" NVIDIA merch for cloud rental, resulting in CoreWeave being one of the top 4 customers of NVIDIA chips.
FinnKuhn
For example, they did a similar deal with Nscale just last week.
https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...
Aurornis
This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.
Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.
klysm
Is it counting revenue multiple times? It's buying your own products really, but not sure how that counts as double counting revenue
landl0rd
It's booking equity as an asset then the money used to buy it as revenue. This is a weaker form of something like if I counted transfer pricing "revenue" in my top line.
rsstack
Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
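That arithmetic can be written out explicitly; the numbers are the hypothetical ones from the example above:

```python
# Hypothetical round-tripping example: only customer A's cash enters the system.
revenue_a, cogs_a = 100, 10       # A's real purchase
invested = revenue_a - cogs_a     # 90, "invested" in customer B
revenue_b = invested              # B spends the investment on your product
cogs_b = 9

reported_revenue = revenue_a + revenue_b
cash_in = revenue_a               # the only external money in the loop
print(reported_revenue, cash_in)  # 190 100
```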
FinnKuhn
And your valuation also rises as a consequence of your increased revenue.
lumost
It's real revenue, but you are operating a fractional-reserve revenue operation. If the entity you're investing in has trouble, or you have trouble, the whole thing falls over very fast.
fufxufxutc
The "investment" came from their revenue, and will be immediately counted in their revenue again.
weego
In this case, if we're being strict, the investment could then also show up as fixed assets on the same balance sheet.
Mistletoe
Isn’t our stock market basically propped up on this AI credits etc. house of cards right now?
selectodude
This is some Enron shit. Lets see NVDA mark to market these profits. Keep the spice flowing.
moduspol
Waiting patiently for the Ed Zitron article on this...
hooloovoo_zoo
These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.
TheRealGL
Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?
HDThoreaun
Build this thing in the middle of the desert and you would need around 100 square miles of solar panels plus a fuckload of batteries for it to be energy independent. The solar farm would cost around $10 billion, which is probably far less than the GPUs cost.
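The land-area figure is very sensitive to assumptions (capacity factor, panel efficiency, packing density); one set of round numbers puts it in the low hundreds of square miles:

```python
# Rough solar-farm sizing for a 10 GW constant load. Every input is an
# assumption; the answer moves a lot if you change them.
load_gw = 10
capacity_factor = 0.25      # assumed desert solar capacity factor
peak_insolation = 1000.0    # W/m^2 at noon
panel_efficiency = 0.22
ground_coverage = 0.5       # fraction of land actually under panels

nameplate_gw = load_gw / capacity_factor                          # 40 GW
w_per_m2 = peak_insolation * panel_efficiency * ground_coverage   # 110 W/m^2
area_km2 = nameplate_gw * 1e9 / w_per_m2 / 1e6
print(round(area_km2))            # ~364 km^2
print(round(area_km2 / 2.59))     # ~140 square miles
```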
boringg
Solar won't get you the necessary four nines of uptime and energy, sadly. I'm still 100% for this, but you'd need another model for energy delivery.
xnx
Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.
ecshafer
The desert is a vulnerable ecosystem. 100 square miles is a very large installation that would have large effects.
HDThoreaun
This is a man's world
retr0rocket
[dead]
Aeolun
On sand?
jonfromsf
Environmentalists are just against progress. A few desert species going extinct is not a big deal. It's an arid wasteland. When we eventually terraform it (with desalinated water from solar / fusion) those species are going to die out anyway.
nutjob2
Also, the fact that they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage were a useful measurement of processing power) is kind of gross.
"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"
catigula
Consumer electric grids.
davis
Exactly this. This is essentially a new consumer tax in your electric bill: the buildout of the electrical grid is being put on consumers as a monthly surcharge through rising electricity costs. Everyone in the country is paying for the grid infrastructure to power these data centers, owned by trillion-dollar companies who aren't paying for their own needs.
delfinom
Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.
DebtDeflation
Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?
ecshafer
By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.
me551ah
So OpenAI is breaking up with Microsoft and Azure?
freedomben
They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool
jsheard
It's resembling a Habsburg family tree more at this point
https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...
(pencil in another loop between Nvidia and OpenAI now)
FinnKuhn
I would say Microsoft cheated on OpenAI first ;)
https://www.reuters.com/business/microsoft-use-some-ai-anthr...
Handy-Man
It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.
It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.
aanet
I'm old enough to remember when vendor financing was both de rigueur and also frowned upon... (1990s: telecom sector, with all big players like Lucent, Nortel, Cisco, indulging in it, ending with the bust of 2001/2002, of course)
alephnerd
This absolutely feels like the Telco Bubble 2.0, and I've mentioned this on HN as well a couple times [0]
boringg
For sure a great infrastructure build-out. Let's hope the leftovers include better energy infrastructure, so that whatever comes next in 7 years, after the flame-out, has some great stuff to build on (similar to telco bubble 1.0), and that it's less damaging to planet Earth in the long arc.
zuInnp
Yeah, who cares about the environment... who needs water and energy, if your AI agent can give you a better pep talk.
xnx
What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."
vlovich123
Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.
toomuchtodo
What if it doesn't?
vlovich123
Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.
nutjob2
It's a good question since it's probably the 99% case.
patapong
Perhaps it means OpenAI will pay for the graphics cards in stock? Nvidia would become an investor in OpenAI, thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale its infrastructure.
dtech
They're investing in kind. They're paying with chips instead of money
solarexplorer
That they will invest $10 in OpenAI for each watt of NVIDIA chips that is deployed? EDIT: In steps of 1 GW, it seems.
jstummbillig
I am confused as to what the question is.
losteric
so nvidia's value supported by the value of AI companies, which nvidia then supports?
dsr_
It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least spare them substantial penalties.
re-thc
> What does this mean?
> to invest up to
i.e. 0 to something something
lawlessone
Nvidia, if you're listening: give me 10K and I'll bu... *invest* 10K + 10 euro worth of cash in your product.
For someone who doesn't know what a gigawatt worth of Nvidia systems is: how many high-end H100s (or whatever) does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.
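A rough answer, assuming H100-class chips: the "nearly 10 million" guess lands between the chip-only bound and a system-level bound (the ~2 kW/GPU all-in figure used elsewhere in the thread):

```python
# Chip-only bound: H100 TDP, no cooling/CPU/network overhead.
h100_tdp_w = 700
per_10gw_chip_only = 10e9 / h100_tdp_w
print(f"{per_10gw_chip_only:,.0f}")   # ~14.3 million H100s

# System-level bound: assume ~2 kW per GPU all-in.
per_10gw_system = 10e9 / 2000
print(f"{per_10gw_system:,.0f}")      # 5,000,000 GPUs
```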