Nvidia is gearing up to sell servers instead of just GPUs and components
26 comments · November 14, 2025
hvenev
Don't they already sell servers? https://www.nvidia.com/en-us/data-center/dgx-platform/
p4ul
I had the same reaction. Haven't they been selling DGX boxes for almost 10 years now? And they've been selling the rack-scale NVL72 beast for probably a few years.[1]
What is changing?
reactordev
Cutting out vendors like SuperMicro or HPE; they're going straight to the consumer now.
AlanYx
When nVIDIA sells DGX directly they usually still partner with SuperMicro, etc. for deployment and support. It sounds like they're going to be offering those services in-house now, competing with their resellers on that front.
thesuperbigfrog
What software will those Nvidia servers run?
Are they creating their own software stack or working with one or more partners?
kj4ips
They have an Ubuntu derivative called DGX OS that they use on their current lines.
jpecar
Servers? I thought they left even racks behind, they're now selling these "AI factories".
dmboyd
Aren’t they already supply constrained? Seems like this would be counterproductive in further limiting supply vs a strategy of commoditizing your complements. This seems closer to PR designed to boost the share price than to a cogent strategy.
mikeryan
Huh. I view it the other way. If you’re supply constrained go straight to the consumer and capture the value that the middlemen building on top of your tech are currently enjoying.
energy123
> Further limiting supply
Even if they don't increase their GPU production capacity, that's not "limiting" supply. It's keeping it the same. Only now they can sell each unit for a larger profit margin.
MattRix
They’re only supply constrained on the chips themselves. Selling fully integrated racks allows them to get even more money per chip.
dboreham
In MBA-speak this is "capturing more of the value chain".
czbond
Didn't they watch Silicon Valley to learn that lesson? Don't sell the box.
re-thc
Soon Nvidia will sell AI itself instead of servers.
Cthulhu_
To a point / by some definitions of the phrase AI they already do: https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
I wouldn't be surprised if we see some major acquisitions or mergers in the next few years involving Nvidia and one of the independent AI vendors like OpenAI.
michaelbuckbee
Considering they already have a pure service business in GeForce Now (game streaming), that doesn't seem in any way far-fetched.
Palomides
why? selling GPUs is way more profitable
giuliomagnifico
Selling a whole infrastructure is more profitable than selling components, and it also puts the customers in a “sandbox” with you.
reactordev
Why sell when you can rent?
lvl155
We’re not far from Nvidia exclusively bundling ChatGPT. It’s a classic playbook from Microsoft.
gruturo
ChatGPT doesn't really have much of a moat. If it becomes Microsoft or Nvidia exclusive, it just opens an opportunity for its competitors. I barely notice which LLM I'm using unless it's something super specific where one is known to be stronger.
mcintyre1994
I'm pretty sure they'd like to keep selling chips to all of OpenAI's competitors too.
ptero
ChatGPT is not the only game in town. Any exclusivity deal will likely backfire against ChatGPT.
MangoToupe
Why would Nvidia ever agree to that?
It's my opinion that nvidia does good engineering at the nanometer scale, but it gets worse the larger it gets. They do a worse job at integrating the same aspeed BMC that (almost) everyone uses than SuperMicro does, and the version of Aptio they tend to ship has almost nothing available in setup. With the price of a DGX, I expect far better. (Insert obligatory bezel grumble here)