The post-GeForce era: What if Nvidia abandons PC gaming?
174 comments · December 20, 2025 · TehCorwiz
wnevets
You're not thinking big enough. Their ultimate goal is gaming (or any computing really) available only in the cloud.
cyber_kinetist
I think China will then try to sell its own PC parts instead; its semiconductor industry is catching up, so who knows where it will be in a decade.
But then the US will probably reply with tariffs on those parts (or even ban them!), which is slowly becoming the norm for US economic policy and which won't reverse even after Trump.
Imustaskforhelp
There is definitely a part of me which feels that, with rising RAM prices and the like, it's getting hard for people to have a home lab.
It also feels like there is now even more friction in what was already a really competitive, high-friction business: building a cloud.
With RAM prices rising, and (as far as I know) unlikely to come down until 2027-2028 or whenever this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.
When I say cloud provider, I don't mean the trifecta of AWS, Azure, or GCP, but rather all the other providers who bought their own hardware, colocate it in a datacenter, and sell services targeted at low/mid-range VPS/VDS customers.
I had previously thought about building a cloud myself, but in this economy and the current situation, I'd much rather wait.
The best bet right now for most people building a cloud or providing such services is probably white-labeling another provider and offering services on top that make you special.
Servers are still rather cheap, but the mood I see among providers right now is that they're willing to absorb the costs for a while to avoid creating a frenzy (so prices stay low), while cautiously watching how the whole situation develops. If recent developments continue the way they have, I wouldn't be surprised if server providers raise prices, because the RAM prices of the underlying hardware have increased too.
thewebguyd
Feel the same way here. Can't help but get the vibe that big tech wants to lock consumers out, eliminate the ability to have personal computing/self-hosted computing. Maybe in tandem with governments, not sure, but it's certainly appetizing to them from a profit perspective.
The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.
ogogmad
Maybe instead of proposing something "schizo" about government conspiracies, you might realise it's simply far more profitable right now for companies to sell only to data centres: they don't need to spend money on advertising or share their revenues with third-party sellers. I don't know what culture gets people this paranoid.
sombragris
I doubt that this would ever happen. But...
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother to optimize for the low end anymore, and thus end up gatekeeping games and excluding millions of devices, because recent games require a discrete GPU even at the lowest settings.
mikepurvis
They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 at 1080p. It just turns out that when you take those system requirements and put them on a PC, especially one with a 1440p or 2160p ultrawide, they translate to pretty top-of-the-line hardware. Particularly if, as a PC gamer, you expect to run at 90fps and not the 30-40 that is typical for consoles.
nerdsniper
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
futureshock
That's a waste of image quality for most people. You have to sit very close to a 4K display to perceive the full resolution. On PC you could be two feet from a huge gaming monitor, but an extremely small percentage of console players have a TV size and viewing distance where they would get much out of full 4K. Much better to spend the compute on a higher framerate or higher detail settings.
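If you want to sanity-check that, here's a rough back-of-the-envelope sketch in Python. It assumes ~20/20 vision (about one arcminute of resolving power, i.e. roughly 60 resolvable pixels per degree); the TV size and couch distances are just illustrative numbers, not survey data.

    import math

    def useful_horizontal_pixels(diagonal_in, distance_ft, aspect=(16, 9)):
        """Rough count of horizontal pixels resolvable at ~20/20 acuity
        (about 60 pixels per degree of visual angle)."""
        aw, ah = aspect
        width_in = diagonal_in * aw / math.hypot(aw, ah)   # screen width in inches
        distance_in = distance_ft * 12
        # horizontal field of view subtended by the screen, in degrees
        fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
        return int(fov_deg * 60)

    # 65" TV from a typical 9 ft couch: well under 4K's 3840 columns
    print(useful_horizontal_pixels(65, 9))   # ~1760
    # Same TV from 4 ft away: 4K starts to pay off
    print(useful_horizontal_pixels(65, 4))   # ~3660

By this estimate, a typical living-room setup resolves less than half of a 4K image's horizontal detail, which is the point about framerate and detail settings being the better use of the compute.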
vachina
You wish. Games will just be published cloud-only and you can only play them via thin clients.
treyd
It's pretty consistently been shown that this just can't provide low enough latency for gamers to be comfortable with it. Every attempt at providing this experience has failed. There are few games where this can even theoretically be viable.
The economics of it also have issues: now you have to run a bunch more datacenters full of GPUs, and with an inconsistent usage curve a bunch of them sit idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
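To put rough numbers on the latency point, here's a minimal budget sketch in Python; every figure is an assumption for illustration, since real services vary a lot by network, codec, and region.

    # Illustrative cloud-streaming latency budget; every number is an assumption.
    local_frame_ms = 1000 / 60  # a locally rendered frame arrives every ~16.7 ms

    budget_ms = {
        "input -> server (network, one way)": 15,
        "server render + capture":            17,  # roughly one 60 fps frame
        "video encode":                        5,
        "server -> client (network, one way)": 15,
        "client decode + display":            10,
    }

    added = sum(budget_ms.values())
    print(f"streaming adds ~{added} ms on top of normal input lag, "
          f"vs ~{local_frame_ms:.0f} ms per frame locally")
    # ~62 ms of extra latency is tolerable for slower games but amounts to
    # several frames of delay in a twitchy shooter or fighting game.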
Imustaskforhelp
I am pretty sure the current demand for GPUs can pretty much absorb that idle-time issue at major datacenters because of the AI craze.
Not that it's good or bad, though, but we could probably have something more akin to GPU spot instances being offered for gaming purposes.
I do see a lot of companies offering GPU access billed per second, with instant shutdown/restart, I suppose, but overall I agree.
My brother recently came for the holidays and I played PS5 for the first time, streamed to his Mac from his place 70-100 km away. Honestly, the biggest latency factor was the wifi connection (which was tethered to his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)
iwontberude
This hurt my soul. Kudos.
georgefrowny
One wonders what would happen in an SHTF situation, or if someone stubs their toe on the demolition-charge switch at TSMC and all the TwinScans get minced.
Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?
gunalx
Until we end up spending trillions recreating the fab capacity of TSMC, they don't have a full monopoly (yet).
Bridged7756
True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.
Nowadays a game is only considered poorly optimized if it's literally unplayable or laggy; otherwise you're just forced to constantly upgrade your hardware with no discernible performance gain.
djmips
A lot of people who were good at optimizing games have aged out and/or 'got theirs' and retired early, or just left the demanding job for a better-paying one in a sector with more economic upside and less churn. On the other side, there's an unending, almost exponential stream of newcomers into the industry who believe the hype from engine makers, who hide the true cost of optimal game making and sell on 'ease'.
batiudrami
Crazy take; in the late 90s/early 00s your GPU could be obsolete 9 months after buying it. The "optimisation" you talk about was that the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force its way past anything built for that generation.
ronsor
> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.
Not because the developers were lazy, but because newer GPUs were that much better.
bombcar
Obsolete in that you'd probably not BUY it if building new, and in that you'd probably be able to get a noticeably better one, but even then games were made to run across a wide gamut of hardware.
For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.
justsomehnguy
> your GPU could be obsolete 9 months after buying
Or even before hitting the shelves, cue the Trio3D and Mystique, but that's another story.
archagon
I feel like Steam Deck support is making developers optimize again.
pzmarzly
> Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.
DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization was the norm, not the exception.
venturecruelty
I haven't been on HN even 60 seconds this morning and I've already found a pro-monopoly take. Delightful.
bilegeek
> The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.
Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.
pegasus
Is remote rendering a thing? I would have imagined the lag would make something like that impractical.
WackyFighter
The lag is high. Google was doing this with Stadia. A huge amount of money comes from online multiplayer games, and almost all of them require minimal latency to play well, so I doubt EA, Microsoft, or Activision are going to effectively kill those cash cows.
Game streaming works well for puzzle, story-esque games where latency isn't an issue.
gs17
GeForce NOW is supposedly decent for a lot of games (depending on connection and distance to server), although if Nvidia totally left gaming they'd probably drop the service too.
vachina
It will be if personal computing becomes unaffordable. The lag is simply mitigated by having PoPs everywhere.
preisschild
I agree re "optimizations", but I don't think there should be compromises on quality (if set to max/ultra settings).
forrestthewoods
> I think it would be a good thing.
This is an insane thing to say.
> Game and engine devs simply don't bother anymore to optimize for the low end
All games carefully consider the total addressable market. You can build a low-end game that runs great on a total ass-garbage onboard GPU. Suffice to say, those gamers are not an audience that spends a lot of money on games.
It’s totally fine and good to build premium content that requires premium hardware.
It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.
If Nvidia gamer GPUs disappeared and devs were forced to build games capable of running on shit-ass hardware, the net benefit to gamers would be very minimal.
What would actually benefit gamers is making good hardware available at an affordable price!
Everything about your comment screams “tall poppy syndrome”. </rant>
georgefrowny
> This is an insane thing to say.
I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship, really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
It's not inconceivable that the overall result is a better computing ecosystem in the long run, especially in the open source space, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter, but unless gamers stop being willing to throw large amounts of money at chasing the top end, someone will want that money even if Nvidia doesn't.
forrestthewoods
There is a full order of magnitude of difference between a modern integrated GPU and a high-end card, and almost two orders of magnitude (100x) compared to an older (~2019) integrated GPU.
> In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.
Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for ultralow-end may or may not have any benefit to high end. This isn’t like optimizing CPU instruction count that benefits everyone.
bombcar
I wonder what Balatro does that wouldn't be possible on a 486.
duskwuff
The swirly background (especially on the main screen), shiny card effects, and the CRT distortion effect would be genuinely difficult to implement on a system from that era. Balatro does all three with a couple hundred lines of GLSL shaders.
(The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)
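For a feel of the math involved, here's a rough CPU-side sketch in Python/NumPy of a barrel-style CRT warp. It is not Balatro's actual shader, just the same flavour of per-pixel coordinate remapping: on a GPU this is a handful of multiplies per fragment, while doing it in software at 640x480 is a few million floating-point operations per frame, sixty times a second, which is far beyond what a 486-era FPU could sustain.

    import numpy as np

    def crt_warp(img, strength=0.08):
        """Barrel-distort an RGB frame by remapping each output pixel to a
        radially pushed-out source coordinate (nearest-neighbour sampling)."""
        h, w = img.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
        u = xs / (w - 1) * 2 - 1          # normalize x to [-1, 1]
        v = ys / (h - 1) * 2 - 1          # normalize y to [-1, 1]
        r2 = u * u + v * v                # squared distance from screen center
        u, v = u * (1 + strength * r2), v * (1 + strength * r2)
        src_x = np.clip(((u + 1) / 2 * (w - 1)).round().astype(int), 0, w - 1)
        src_y = np.clip(((v + 1) / 2 * (h - 1)).round().astype(int), 0, h - 1)
        return img[src_y, src_x]

    frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
    warped = crt_warp(frame)              # same per-pixel math a shader would do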
oivey
Virtually all the graphics? Modern computers are very fast.
woah
I always chuckle when I see an entitled online rant from a gamer. Nothing against them, it's just humorous. In this one, we have hard-nosed defense of free market principles in the first part worthy of Reagan himself, followed by a Marxist appeal for someone (who?) to "make hardware available at an affordable price!".
venturecruelty
TIL: being anti-monopoly is both entitled and Marxist.
filleduchaos
...what exactly about "make hardware available at an affordable price" is "Marxist"?
fxtentacle
I don’t think they can.
NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
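For intuition on why big dies push you toward salvage parts, here's a minimal sketch of the classic Poisson yield model in Python. The defect density and die areas are purely illustrative assumptions, not real TSMC or Nvidia numbers.

    import math

    def perfect_die_yield(die_area_mm2, defects_per_cm2):
        """Poisson yield model: probability a die has zero defects."""
        expected_defects = (die_area_mm2 / 100) * defects_per_cm2
        return math.exp(-expected_defects)

    D0 = 0.1            # assumed defect density (defects per cm^2), illustrative
    midrange_die = 150  # mm^2
    huge_gpu_die = 750  # mm^2, approaching the reticle limit

    print(f"midrange die, no defects: {perfect_die_yield(midrange_die, D0):.0%}")  # ~86%
    print(f"huge GPU die, no defects: {perfect_die_yield(huge_gpu_die, D0):.0%}")  # ~47%
    # Over half the huge dies have at least one defect, but a defect in one
    # compute unit doesn't kill the chip: fuse that unit off and sell the die
    # as a cut-down SKU instead of scrapping it.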
cwzwarich
Nvidia doesn't share dies between their high-end datacenter products like B200 and consumer products. The high-end consumer dies have many more SMs than a corresponding datacenter die. Each has functionality that the other does not within an SM/TPC, nevermind the very different fabric and memory subsystem (with much higher bandwidth/SM on the datacenter parts). They run at very different clock frequencies. It just wouldn't make sense to share the dies under these constraints, especially when GPUs already present a fairly obvious yield recovery strategy.
jsheard
You can't turn a GB200 into a GB202 (which I assume is what you meant since GP102 is from 2016), they are completely different designs. That kind of salvage happens between variants of the same design, for example the RTX Pro 6000 and RTX 5090 both use GB202 in different configurations, and chips which don't make the cut for the former get used for the latter.
PunchyHamster
> So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
B200 doesn't have any graphics capabilities. The datacenter chips don't have graphics units at all; on those chips they would just be wasted die space.
As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable for NVIDIA.
woah
Why don't they sell these to datacenters as well, which could run a "low core section" with reduced power and cooling?
iwontberude
Well, the good thing for NVIDIA's AI business is that most of your chips can sit unused in warehouses and you still get rich. Six million H100s sold, but infrastructure (water-cooled datacenters) for only a third of them exists in the world.
jaapz
AMD will be very happy when they do. They are already making great cards; I'm currently running an RX 7800 XT (or something like that), and it's amazing. Linux support is great too.
acheron
I got an.. AMD (even today I still almost say “ATI” every time) RX6600 XT I think, a couple years ago? It’s been great. I switched over to Linux back in the spring and yes the compatibility has been fine and caused no issues. Still amazed I can run “AAA” games, published by Microsoft even, under Linux.
keyringlight
As with Nvidia, you've got to consider which partner companies AMD likes working with. AMD and Nvidia design chips, contract TSMC to make them, then sell the chips to the likes of ASUS/MSI/Gigabyte/etc. to put on the cards consumers buy. The other market AMD serves is Sony/MS for their consoles, and I'd argue those contracts are a major motivator driving Radeon development, since the console makers pay up front to get custom APU chips, and there's synergy there with Zen and, more recently, the AI demand. Ever since ATI bought up the company (ArtX) that made the GameCube GPU, it seems to me that the PC side has been keeping the motor running in between console contracts as far as gaming goes; given their low market share, they definitely don't seem to prioritize it or depend on it to thrive.
chasd00
My very gaming experienced and data oriented 13 year old wants to switch from Nvidia to AMD. I don’t understand all his reasons/numbers but I suppose that’s as good an endorsement as any for AMDs GPUs.
moffkalast
AMD will certainly be very happy to raise prices significantly when they have a de facto monopoly over the market segment, alright.
speedgoose
If it’s too expensive, I will play on my phone or my macbook instead of a gaming pc. They can’t increase the prices too much.
snvzz
RX 7900gre, can confirm as much.
mdip
Wow, yeah, I picked up one of these a few months before the new generation came out for $350. Everything shot up after that.
My son is using that card today, and I'm amazed at everything it can still power. I had a 5080, and just comparing a few games, I found that if he used Super Resolution correctly, he could set the other game settings the same as mine and his frame rate wasn't far off (in things like Fortnite, not Cyberpunk 2077).
There are many caveats there, of course. AMD's biggest problem is in the drivers/implementation for that card. Unlike NVidia's similar technology, it requires setting the game to a lower resolution which it then "fixes", and it tends to produce artifacts depending on the game and how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.
the_pwner224
For games that have FSR built in, you can enable it in the game settings; then it will only upscale the game content while rendering the HUD at native resolution. It can also use the better upscaling algorithms that rely on internal game-engine data / motion vectors, which should reduce artifacts.
The other cool thing is that they also have frame generation available in the driver to apply to any game, unlike DLSS FG, which only works in a few games. You can toggle it on in the AMD software just below the Super Res option. I quickly tried it in a few games and it worked great if you're already getting 60+ FPS, with no noticeable artifacts. Going from 30 to 60 doesn't work, though; too many artifacts. And the extra FPS are only visible in the AMD software's FPS counter overlay, not in other FPS counter overlays.
I recently got an Asus ROG Flow Z13 gaming "tablet" with the AMD Strix Halo APU. It has a great CPU, shared RAM, and a ridiculously powerful iGPU. It doesn't have the brute power of my previous desktop with a 4090, but it can handle the same games at 4K with upscaling on high settings (no raytracing). It's shockingly capable for its compact form factor.
thom
Is there any path for Microsoft and NVIDIA to work together and resurrect some sort of transparent SLI layer for consumer workloads? It’d take the pressure off the high end of the market a little and also help old cards hold value for longer, which would be a boon if, for example, your entire economy happened to be balanced on top of a series of risky loans against that hardware.
ryandrake
It would be great if more GPU competition would enter the field instead of less. The current duopoly is pretty boring and stagnant, with prices high and each company sorta-kinda doing the same thing and milking their market.
I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.
venturecruelty
We'd need our government to actually enforce antitrust laws that have been on the books for about a century. Good luck.
yegle
I've heard good things about Moore Threads. Who knows, maybe the consumer GPU market is not a duopoly after all, and Nvidia exiting the market would be a good thing in the longer term by introducing more competition.
My general impression is that US technology companies either take competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their lunch.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Labs, the mini PC market where Chinese brands dominate.
resfirestar
This is just DRAM hysteria spiraling out to other kinds of hardware; it will age like fine milk, just like the rest of the "gaming PC market will never be the same" stuff. Nvidia has Amazon, Google, and others trying to compete with them in the data center. No one is seriously trying to beat their gaming chips. It wouldn't make any sense to give that up.
wmf
It's not related to the DRAM shortage. Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI and there was controversy years before that about most "gaming" GPUs going to crypto miners. They won't exit the gaming market but from a shareholder perspective it does look like a good idea.
Animats
> Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI
Well, actually it's that the AI business made NVidia 10x bigger. NVidia now has a market cap of $4.4 trillion. That's six times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.
htrp
Yet another reason to not listen to your shareholders.
If it were up to them, CUDA would have been a money-losing initiative that was killed off in 2009.
internet101010
Furthermore, I would wager a giant portion of people who have entered the ML space in the last five years started out by using CUDA on their gaming rigs. Throwing away that entrenchment vector seems like a terrible idea.
willis936
It's a bad idea and yet everyone does it.
mananaysiempre
Took what, four years for PC cases to get back to reasonable prices after COVID? And that’s a relatively low-tech field that (therefore) admits new entrants. I don’t know, I’m not feeling much optimism right now (haven’t at any point after the crypto boom), perhaps because I’ve always leaned towards stocking up on (main) RAM as a cheap way to improve a PC’s performance.
venturecruelty
Yeah, sure, every tech company now acts like a craven monopolist hellbent on destroying everything that isn't corporate-driven AI computing, but not this time! This time will be different!
rhco
If Nvidia did drop their gaming GPU lineup, it would be a huge re-shuffling of the market: AMD's market share would 10x overnight, and it would open a very rare opportunity for minority (or brand-new?) players to get a foothold.
What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.
I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.
It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.
oersted
I don't understand why most people in this thread think this would be such a big deal. It would not change the market in significantly negative or positive ways. AMD has been at Nvidia's heels for a couple of decades and is still very competitive; they would simply fill Nvidia's shoes. Most game consoles have been AMD-centric for a long time anyway, AMD is fairly dominant in the mid range, and they've long had the best price/performance value.
Overall, I think AMD is more focused and energetic than their competitors right now. They are very close to overtaking Intel in the CPU race, in both the datacenter and consumer segments, and Nvidia might be next within the next 5 years, depending on how the AI bubble develops.
wewewedxfgdf
AMD would do the same thing as Nvidia but $50 cheaper.
0dayz
It remains to be seen, to be fair.
But if this does happen, it will in my opinion be the start of a slow death for the democratization of tech.
At best it means we're going to be relegated to last-generation tech, if even that, as this isn't a case of SAS vs SATA or U.2 vs M.2, but the raw underlying tech itself (chips).
venturecruelty
Some of the pro-monopoly takes in this thread are mindblowing. We get precisely what we deserve.
If they do, it'll likely be part of an industry-wide push to kill off the home-built PC market. It's no secret that MS and others want the kind of ecosystem Apple has, and governments want more backdoor access to tech. And which manufacturer wouldn't want to eliminate partial upgrades/repairs? Imagine that the only PC you can buy one day has everything tightly integrated, with no user-serviceable or replaceable parts short of a high-end soldering lab. Since it would be impractical to build your own, they could raise the purchase price beyond the reach of most people, and the PC market would succeed in its rental-PC aspirations.