
CPUs and GPUs to Become More Expensive After TSMC Price Hike in 2026

phkahler

The world is probably going to be split between EUV and non-EUV lithography for a long time. I'd like to see things like the Raspberry Pi stay on the older nodes and just wring out the performance. For reference, the Zen 1 CPUs were on GF's 14/12nm, and I believe they are still faster than the Pi, so there is room for improvement on old nodes.

wrigby

That’s true, but Zen1 TDP is a lot higher than an RPi. Hitting that level of performance in a power and heat envelope that works for a Pi probably demands a smaller process node, unfortunately.

wmf

I think it will split between TSMC (sold out forever) and Samsung/Intel (please please be our customer).

null

[deleted]

jmward01

I have been interested in city planning for a while, and the idea of an 'urban growth barrier' is a key concept with massive benefits. Basically, when you limit a resource, like available land, it simplifies thinking, forces inward development, and very often the 'impossible' problems go away because creative solutions come out. Maybe the chip industry will see something similar if a barrier of 'just wait till the new smaller process hits' gets put in place.

purrcat259

A bit OT but I live on a very land constrained island with the highest population density in Europe (see Malta).

There has been a major increase in demand for housing, and supply cannot be built fast enough to match. It's turned most of the island into a construction site, so rampant that I made an online tracker for urban planning permits so folks can get ahead on knowing what's going on around them.

Idk if you have any wisdom, but there's no creative solutionising happening, just the rich able to buy whatever property they want, driving prices up and pricing out the middle class, causing a whole lot of grief and downstream issues (such as a plummeting fertility rate because homes are too expensive).

Is there a magic toggle we missed to unlock this creativity, or am I being realistic in suspecting that limiting important resources just leads to harsher inequality?

IncreasePosts

Maybe Malta is just too small of a world for that magic toggle, since entrenched interests can basically just prevent the government from approving certain strategies, like building more large buildings with apartments like the Mercury Tower. But on the global stage, not even the biggest player(US) can dictate what China, India, the EU does, so if the US self-limits because of entrenched interests, that's just more juice for the squeeze for other players not aligned with the US.

nostrademons

So I remember urban growth barriers being all the rage in the late 00s, with Portland specifically held up as a shining example. Many Western cities (e.g. Portland, Seattle, SF & the Peninsula, Salt Lake City) followed this general pattern, though.

What I actually see, living in one of those land-constrained metro regions, is very little infill development or actually implemented creative thinking, and lots of homelessness and unaffordable homes. Sure, being supply-constrained gets people thinking about infill development and mass transit and densification and walkable neighborhoods. But there's always a barrier, usually multiple ones, to actually putting that into practice. Instead, you get the obvious effects: the supply of homes is constrained, their price rises to the point where only rich people can afford them, and everyone else either goes homeless or moves out of the area. Very often, a supply constraint simply means that people do without. Even when there is multi-family infill development, people don't want to live there. Everybody wants their SFH, even though intellectually they know it's unsustainable.

I suspect it's going to be the same here. More expensive CPUs and GPUs are just going to mean more expensive CPUs and GPUs with no real silver lining.

lotsofpulp

> I suspect it's going to be the same here. More expensive CPUs and GPUs are just going to mean more expensive CPUs and GPUs with no real silver lining.

Presumably, higher profits will incentivize other players to enter the market and increase supply, and/or the company earning high profits plows at least some back into R&D to at least create better chips, which can result in eventually lower prices for the previous generation of chips.

This isn’t a possibility with land and land rights, however, so I wouldn’t expect the same dynamics to play out.

TylerE

That's just delaying progress for everyone, as there basically are no alternatives to TSMC for the processes that are actually good. Definition of cutting off your nose to spite your face.

amelius

Yes. They should open a FabStore and make Apple pay 30% of their revenue.

PaulKeeble

The Nvidia 3000 series was on Samsung's process, since it was quite a bit cheaper than TSMC's at the time. Samsung has already said it intends to undercut TSMC, so assuming the processes are fairly equivalent, it doesn't seem unreasonable for consumer CPUs and GPUs to be made on Samsung's; it gives a price advantage to anyone who makes the move.

joaohaas

>post decides to use the most crusty GenAI image possible

Man, what a cancer. Straight up using the bare TSMC logo here would work just fine.

null

[deleted]

imbnwa

Looks like I’ll need to put that home server together sooner rather than later

null

[deleted]

mouse_

Most consumers would do fine with like .25 Apple M1's worth of compute.

makeitdouble

The price of new hardware increasing will stop old hardware from depreciating as much, meaning higher prices across the board.

jjgghhggc

I agree. Even my quad-core i7 from 2011 is still fast enough for everything the average consumer does. Heck, even an average i5 laptop from 2015 or so is more than sufficient.

SchemaLoad

idk about that. I had a 2019 i5 MacBook and that thing was so incredibly slow and hot. When the M1 came out it felt like jumping 10 years forward.

znpy

I only partially agree. I recently got nostalgic and bought a 2nd-gen Core i7 ThinkPad X220 from eBay (similar to what I had back in my university days).

It's generally very usable until you open Firefox and browse a modern website. Chrome/Chromium aren't any better.

The issue with old hardware is the web browser and web pages, particularly web pages. Modern websites are incredible resource drains; it's unbelievable.

I can easily read documentation from old-school cool websites like https://pubs.opengroup.org/onlinepubs/9699919799/ but anything fancier than that gets the fan wheels spinning...

ndriscoll

I don't notice any issues with an i5-6600k or an n100. Both are 4c/4t and extremely snappy. The only ways to get them above like 0.5 load are CPU intensive tasks like compiling, video editing, or games. The only thing I've found I can't do is real time viewing of 200 Mbit/s 4k60 10 bit 4:2:2 h265 off my camera because it's software decoded (software can only do ~half speed).
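For anyone wanting to reproduce this kind of decode check, ffmpeg's null-output benchmark measures pure decode speed with no display overhead. This is a sketch, not the commenter's actual workflow; the clip file name is hypothetical, and which hardware decoders appear depends entirely on the local build and GPU.

```shell
# Measure pure software-decode speed (file name is hypothetical).
# The "speed=" figure in ffmpeg's progress output is the real-time
# ratio; ~0.5x would match the "half speed" result described above.
ffmpeg -benchmark -i clip_4k60_10bit_422.mov -f null -

# List the hardware-acceleration methods this ffmpeg build supports
# (e.g. vaapi, cuda). 10-bit 4:2:2 HEVC decode is missing from many
# older GPUs, which is why such footage often falls back to software.
ffmpeg -hwaccels
```

If `-hwaccels` shows a usable method, adding e.g. `-hwaccel vaapi` before `-i` lets you compare hardware against software decode speed on the same clip.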

DrProtic

Does it have the Ultrabook version of the i7?

dangus

I disagree. I would submit that the average consumer uses a lot of computing power, and that as a more technical power user, with more discerning taste in applications and a view of the computer as a tool rather than an entertainment and lifestyle device, you might actually be using less than they do.

I would also submit that the person who is still using a computer from 2011 would never have bought an i7 in the first place. They’re not going to follow an upgrade path where they say “I had a Core i3 in 2011 that is too slow now, so I’ll replace it with a 2011 i7 that I couldn’t justify purchasing at the time.” Instead they’re more likely to say “I bought this laptop 15 years ago, I’ve battered the hell out of it, I’ve been frugal, and it’s time for a new one,” and when they buy that new one it will also be a very modest configuration.

So the fact that a top-of-the-line processor from 15 years ago is still serviceable is not really all that relevant. It’s not even going to be worth it on power consumption alone if you’re running it 24/7.

In other words, there’s more to a chip than raw performance.

zoeysmithe

Even the mythical "average user" uses tons of processing. Just "a few tabs" is like having several large applications open.

nomel

Hasn't been true for about a decade. Modern browsers suspend most processor activity of background tabs, so it's mostly just memory pressure.

devilsdata

This could finally force the developers of cross-platform GUI and webview libraries, drunk on Moore’s Law, to sober up.

Barrin92

A decent chunk of RAM, maybe, but my parents are happily browsing away on decade-plus-old Chromebooks. Ordinary web browsing you can do on a potato machine from 2008.

Even if you're a gamer, desktop performance has outstripped demand so much that you can play anything on a machine a few years old. It used to be way worse in the "olden" days, when only 10% of people had a system that could play Crysis.

tempest_

Most use phones.

KPGv2

Alongside presumably more RAM than came on a laptop with an M1. Certainly more GPU than came on a laptop with an M1.

unethical_ban

This is what I don't get about laptop power consumption.

I would love an x86 laptop that can operate at 5W or less, even if it isn't blazing fast. If they can squeeze more performance out of a CPU for the same power budget, why can't they make a CPU with the same capability with less power use?

makeitdouble

5W for a laptop is Raspberry Pi territory, and we're adding a screen to it. I hope we eventually get there for a fully usable 14" laptop, but that sounds like bleeding-edge territory at this point.

The CPU alone can operate below 5W:

https://www.phoronix.com/review/lunar-lake-profiles/6

bluGill

People say that, but money talks, and people don't buy those.

saltcured

Honestly, that's not a CPU problem.

A laptop running below 5W needs to turn down its storage, RAM, GPU, LAN, WiFi, USB controllers, etc. The OS needs to act more like a smartphone, put everything to sleep, and use consolidated polling and interrupts for async behaviors.

I think this is the real advantage Apple has with their vertical integration. They can play a lot of complementary games between hardware and OS.

zoeysmithe

The Steam Deck's chip runs at 15W maximum and has 4W, 6W, and 8W modes, I believe. Maybe not 5W, but pretty close. The Linux desktop on it is a perfectly unimpressive but usable daily-driver kind of thing. The problem is consumers don't want this. The failure of the netbook era proved it. People don't want to deal with slow speeds, so this kind of thing won't be a mainstream mass-produced item.

ARM emulating x86 is probably the way to go for this case.

znpy

You should look into Chinese laptops using the Intel N150/N250 CPUs. Those have a 6W TDP.

The issue with those CPUs is that they're pretty much never used in laptops meant to do anything serious. They're usually put into cheap plastic laptops with ridiculous batteries.

I would really like the form factor of 2011-2013 ThinkPads, with the chonky 90Wh battery on the back and the 12" screen.

With a modern CPU (low TDP and low power draw), that would be an unbeatable everyday carry.

umanwizard

Isn’t the Apple M series basically the best of both worlds? Extremely high performance, and extremely low power use when not making use of it.

ReptileMan

Have you checked RAM prices lately ...

freshpots

I bought 32 GB of DDR5 RAM two weeks ago and it is $100 more already.

null

[deleted]

drnick1

Thanks, TSMC.

dmitrygr

Absolutely nobody stops you from forming a competing company and charging less for equally good or better results

carlhjerpe

Absolutely everything in the modern economy stops him from forming a competing company. This naive "the market is the solution" idea needs to stop being regurgitated; it's toxic to society.

nomel

I think most everything shows that it's just really hard (see Intel), and that they're really good at it, so they have a moat they can take advantage of.

whatsupdog

[flagged]

UpsideDownRide

Chip making has a very high barrier to entry. Between fab cost and access to talent, you are really not looking at an enticing business opportunity.

The free market is not as free as the name would imply.

anon291

Freedom refers to a potentiality. You are conflating a potential (freedom) with a hoped-for outcome (a highly competitive market). A free market is a means toward a competitive market because it lowers the artificial barriers to entry. The barriers imposed by the universe simply are, and while we can invest in R&D and education, market forces cannot change the laws of physics.

lotsofpulp

High barriers to entry have nothing to do with free markets.

>In economics, a free market is an economic system in which the prices of goods and services are determined by supply and demand expressed by sellers and buyers.

https://en.wikipedia.org/wiki/Free_market

As long as a buyer and seller can negotiate whatever number they want, or walk away, it is a free market.

tailrecursion

Judging by the responses to the parent comment, evidently TSMC deserves the high prices it will be charging. Why?

It's unrealistic to expect just anyone to start up a new fab. But if not one person among billions will start one, it points to intrinsic difficulty, unpleasantness, or lack of prestige in the task. A correctively high price has all kinds of advantages for society.

I haven't seen any argument that difficult tasks shouldn't be priced high. Only name calling.

PaulKeeble

In this case it's more that it's a $100+ billion investment to possibly produce a competing fab, and it's encumbered with many thousands of patents, if not more, which will form a minefield for any entrant. So far the silicon fab market has just shrunk over the years, with more and more players failing to keep up with the latest process technology as the price of entry and the patent warheads have risen.

denkmoon

Yeah I'll get a loan off Dad for my first EUV machine and we'll snowball from there

marcodiego

Entering a risky multi-billion-dollar market is no easy task even if you have investors and teams with deep knowledge. That's the reason monopolies should be avoided.

umanwizard

TSMC isn’t a monopoly, they are just better than all their competitors. Nothing forces you to buy TSMC; you can buy Intel, Samsung, SMIC, GlobalFoundries and various others.

lotsofpulp

This monopoly is unavoidable due to the limited number of people in the world with the know-how to make these products, with the second constraint being that you need to risk $10B+ and 10+ years with a real chance of failure.

Edit: per link below, seems like you need $100B+ and 10+ years

wmf

I'm rooting for Rapidus. A lot of people will look foolish if they succeed.

Eisenstein

Great idea. Can you lend me $10 billion?

jgeada

A piddling sum in this domain; that amount won't even buy you enough for one production line, let alone somewhere to put it, staff for it, and all the necessary collateral to let people design for it.

And no amount of money will get you one of ASML's EUV machines any time soon, as the entire production run for the next handful of years has already been spoken for.

bluGill

I think that is on the cheap side. Unless you are content with 1990s node sizes.

iwontberude

Except for me. I am single handedly holding off all of these people from building a fab.