
The Unsustainability of Moore's Law

HarHarVeryFunny

Well, yeah. Moore's "law" is subject to the actual laws of physics, and the linearity of advances in transistor density over time is due to humans making it so: human-chosen targets for each new generation. But as we come up against the laws of physics and the cost of battling them, this linear trend will of course flatten into an asymptote.

Clearly what is driving advances in compute nowadays is not single-chip transistor density but instead multi-chiplet datacenter processors and much higher level multi-chip connectivity such as TPU pods and AI datacenter designs.

frognumber

> Another possibility that has long been on my personal list of “future articles to write” is that the future of computing may look more like used cars. If there is little meaningful difference between a chip manufactured in 2035 and a chip from 2065, then buying a still-functional 30-year-old computer may be a much better deal than it is today. If there is less of a need to buy a new computer every few years, then investing a larger amount upfront may make sense – buying a $10,000 computer rather than a $1,000 computer, and just keeping it for much longer or reselling it later for an upgraded model.

This seems improbable.

50-year-old technology works because 50 years ago, transistors were micron-scale.

Nanometer-scale nodes wear out much more quickly. Modern GPUs have a rated lifespan in the 3-7 year range, depending on usage.

One of my concerns is we're reaching a point where the loss of a fab due to a crisis -- war, natural disaster, etc. -- may cause systemic collapse. You can plot lifespan of chips versus time to bring a new fab online. Those lines are just around the crossing point; modern electronics would start to fail before we could produce more.
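
A rough, hedged back-of-envelope of that crossing point, sketched in Python: the 3-7 year lifespan is the figure from the comment above, while the fab build and ramp times are illustrative assumptions, not sourced numbers.

    # Does replacement fab capacity arrive before already-deployed chips wear out?
    # Lifespan range is from the comment above; the fab timeline is an assumption.
    chip_lifespan_years = (3, 7)   # rated GPU lifespan range cited above
    fab_build_years = 3            # assumed: construction time for a leading-edge fab
    fab_ramp_years = 1.5           # assumed: time to reach useful volume

    time_to_replace_capacity = fab_build_years + fab_ramp_years

    if time_to_replace_capacity > chip_lifespan_years[0]:
        print(f"Replacement capacity (~{time_to_replace_capacity} yr) arrives after "
              f"the shortest-lived chips ({chip_lifespan_years[0]} yr) start failing.")
    else:
        print("New capacity would arrive before existing chips begin to fail.")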

skissane

> Nanometer-scale nodes wear out much more quickly. Modern GPUs have a rated lifespan in the 3-7 year range, depending on usage.

I recently bought a new MacBook, my previous one having lasted me for over 10 years. The big thing that finally pushed me to upgrade wasn’t the hardware (which, as far as I could tell, had no major issues); it was the fact that it couldn’t run the latest macOS, and software support for the old version it could run was increasingly going away.

The battery and keyboard had been replaced, but (AFAIK) the logic board was still the original.

chii

> it couldn’t run latest macOS, and software support for the old version it could run was increasingly going away.

which is very annoying, as none of the newer OS versions has anything that warrants dumping working hardware and buying brand new just to run them! The exception is security updates, which I find a dubious thing for a company to stop producing (they have to create those patches for their newer OS versions anyway, so the cost of maintaining them for the older ones ought not be much, if anything at all). It looks more like a dark pattern to force hardware upgrades.

speed_spread

That's not just a dark pattern, it's the logical conclusion to Apple's entire business model. It's what you get for relying on the proprietary OS supplied by a hardware manufacturer. It's why Asahi Linux is so important.

scarface_74

You mean besides the fact that they completely transitioned to new processors, and that some of the new features use hardware that is only available on their ARM chips?

Also, he said that third-party software doesn’t support the older OS either, so even if Apple did provide security updates, he would still be in the same place.

sapiogram

> Modern GPUs have a rated lifespan in the 3-7 year range, depending on usage.

That statement absolutely needs a source. Is "usage" 100% load 24/7? What is the failure rate after 7 years? Are the failures unrepairable, i.e. not just a broken fan?

Mistletoe

I’ve never heard of this and I was an Ethereum miner. We pushed the cards as hard as they would go and they seemed fine after. As long as the fan was still going they were good.

Symmetry

> In a transistor, the voltage of the gate lying on top of the channel controls the conductivity of the channel beneath it, either creating an insulator or “depletion region”, or leaving the silicon naturally conductive.

That's... not how this works at all. As the gate voltage rises, the depletion region (where the positive or negative charge carriers, for p- or n-doped silicon, are pushed away) grows deeper, and then at the threshold voltage inversion happens: the opposite sort of charge carrier starts to accumulate along the oxide and allows conduction. Surrounding the channel with the gate leaves less space for a depletion region, so inversion happens at lower voltages, leading to higher performance. Same as people used to do with silicon on oxide.

The Wikipedia article has nice diagrams:

https://en.wikipedia.org/wiki/MOSFET
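
For reference, a minimal sketch of the textbook long-channel threshold condition being described (standard symbols, not taken from the article):

    % Inversion (conduction) begins once the gate voltage reaches the threshold V_T:
    \[
      V_T \;=\; V_{FB} \;+\; 2\phi_F \;+\; \frac{\sqrt{2\,q\,\varepsilon_{Si}\,N_A\,(2\phi_F)}}{C_{ox}}
    \]
    % V_FB: flat-band voltage, \phi_F: Fermi potential, N_A: body doping,
    % C_ox: gate-oxide capacitance per unit area.
    % Wrapping the gate around the channel improves electrostatic control, so, as the
    % parent notes, inversion is reached at a lower gate voltage.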

jama211

Moore’s law has been unsustainable for 20 years; I remember Pentium 4s at 4 GHz. But that hasn’t seemed to matter in terms of real day-to-day performance improvements. This article makes some great points about the scaling cost and the reduced market opportunity for there to be more than 2 or 3 makers in the market, but that’s a trend we’ve seen in every market in the world; to be honest, I’m surprised it took this long to get there.

As interesting as this breakdown of the current state of things is, it doesn’t tell us much we didn’t know or predict much about the future, and that’s the thing I most wanted to hear from an expert article on the subject, even if we can take it with a large pinch of salt.

WillAdams

The corollary is Wirth's Law:

>software is getting slower more rapidly than hardware is becoming faster.

>Wirth attributed the saying to Martin Reiser, who in the preface to his book on the Oberon System wrote: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness."

I wish there were more instances of developments like Mac OS X 10.6, where, rather than new features, the software was simply optimized for a given CPU architecture and the focus was on improving performance.

ReptileMan

God of War 2 was made for a 300 MHz CPU and 32 MB of RAM.

We haven't been bound by Moore's law because we just waste computing power, since programmers are expensive. No one is trying to optimize nowadays except in very niche places. And when push comes to shove we just start wasting slightly less. Like adding a JIT to a scripting language 15 years too late.

LegionMammal978

I've always wondered what the classic Moore's-law curve looks like when you take the FLOPs per constant dollar, totaled across the whole R&D/manufacturing/operation process. Sure, you can keep investing more and more into increasingly cutting-edge or power-hungry processes, but at some point it isn't worth the money, and money will generally be the ultimate limiting factor. Not that we'll ever really get these numbers, alas.
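
A hedged sketch of the metric being described; every number below is a placeholder assumption purely to illustrate the bookkeeping, not real data.

    # FLOPs per dollar where "dollar" includes amortized R&D/fab capex plus operating cost.
    def flops_per_total_dollar(peak_flops, utilization, lifetime_s,
                               unit_price, amortized_capex_rnd, energy_cost):
        total_flops = peak_flops * utilization * lifetime_s
        total_cost = unit_price + amortized_capex_rnd + energy_cost
        return total_flops / total_cost

    example = flops_per_total_dollar(
        peak_flops=1e15,             # assumed peak throughput of one accelerator
        utilization=0.3,             # assumed average utilization
        lifetime_s=5 * 365 * 86400,  # assumed 5-year service life
        unit_price=30_000,           # assumed purchase price
        amortized_capex_rnd=10_000,  # assumed per-unit share of fab + R&D cost
        energy_cost=8_000,           # assumed lifetime electricity and cooling
    )
    print(f"{example:.2e} FLOPs per fully loaded dollar")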

ggm-at-algebras

More parallelism. Less clock. More L1 cache per CPU and fewer disk stalls. Are there plenty of tricks left in the sea when clock speed goes flat?

Dual-ported TCAM memory isn't getting faster, we've reached 1,000,000 prefixes in the internet routing table, and IPv6 prefixes are four times bigger. Memory speed is a real issue.
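
A quick, hedged sizing sketch: the million-prefix figure is from the comment above, and the rest is rough TCAM arithmetic (each entry stores a value/mask pair, so roughly two stored bits per prefix bit), with an assumed equal table size for IPv6.

    ipv4_prefixes = 1_000_000   # figure cited above
    ipv4_bits, ipv6_bits = 32, 128

    ipv4_tcam_bits = ipv4_prefixes * ipv4_bits * 2
    ipv6_tcam_bits = ipv4_prefixes * ipv6_bits * 2   # assumes a similarly sized IPv6 table

    print(f"IPv4 table: ~{ipv4_tcam_bits / 8e6:.0f} MB of TCAM")
    print(f"IPv6 table: ~{ipv6_tcam_bits / 8e6:.0f} MB of TCAM")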

kristianp

> Roughly every five years, the cost to build a factory for making such chips doubles, and the number of companies that can do it halves.

So we may end up with Apple and Nvidia as the only ones that can afford to build a fab. Edit, correction: Microsoft is the current number 2 company by market cap.
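
A small sketch of the quoted doubling trend (often referred to as Rock's law); only the five-year doubling cadence comes from the quote, while the starting cost and year are illustrative assumptions.

    # Fab cost doubling every 5 years, per the quoted claim.
    base_cost_billion = 20   # assumed present-day cost of a leading-edge fab
    base_year = 2025         # assumed starting year

    for years_out in range(0, 21, 5):
        cost = base_cost_billion * 2 ** (years_out / 5)
        print(f"{base_year + years_out}: ~${cost:.0f}B per leading-edge fab")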

mepian

They can't afford to tank their margins like that, investors would be rather unhappy.

avereveard

> Roughly every two years, the density of transistors that can be fit onto a silicon chip doubles. This is Moore’s Law.

that... isn't Moore's law; it is about count/complexity, not density, and larger chips are a valid way to fulfill it.

https://hasler.ece.gatech.edu/Published_papers/Technology_ov...

https://www.eng.auburn.edu/~agrawvd/COURSE/E7770_Spr07/READ/...
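
A trivial illustration of the distinction being drawn, with made-up numbers: transistor count per chip can double through a larger die at unchanged density, with no process shrink at all.

    density_per_mm2 = 100e6   # assumed transistors per mm^2, purely illustrative
    for name, area_mm2 in (("die A", 400), ("die B (2x area)", 800)):
        print(f"{name}: {density_per_mm2 * area_mm2:.2e} transistors")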

layer8

avereveard

even if it were, that isn't about transistor density, but power density, which is not the same as

> the density of transistors that can be fit onto a silicon chip doubles

the whole article takes off from a flawed and fanciful misinterpretation and argues against that self-created windmill

sys_64738

Has "Moore's Law" been consistent since it reared its head, or has it been constantly tweaked to suit the narrative of it still being correct?

WillAdams

It was good up until 1975:

https://www.livescience.com/technology/electronics/what-is-m...

Since then, there have been some adjustments, but it still holds as a prediction of a general trend because, as noted in that article:

>One reason for the success of Moore’s prediction is that it became a guide — almost a target — for chip designers.

but as noted:

>The days when we could double the number of transistors on a chip every two years are far behind us. However, Moore’s Law has acted as a pacesetter in a decades-long race to create chips that perform more complicated tasks quicker, especially as our expectations for continual progress continue.

b0a04gl

The law delivered enough headroom that systems moved on. Once compute got cheap to rent and scale, there was less pressure to push frequency or density every cycle. So focus shifted. The gains kept coming, just not in the same shape.

orefalo

factor in power usage reduction, and it still works

mettamage

Is there also a law for how much more difficult it becomes to sustain Moore's law?

Ultimately, there's a cap. As far as I know, the universe is finite.

Symmetry

Landauer's principle governs how efficient computation can be, but we might have to transition to something other than transistors to hit that limit.

https://en.wikipedia.org/wiki/Landauer%27s_principle
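
For concreteness, the Landauer limit at room temperature works out to a tiny per-bit energy; the constants are standard, and the 300 K temperature is the only assumption here.

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
    T = 300              # assumed room temperature, kelvin

    landauer_limit_J = k_B * T * math.log(2)
    print(f"Minimum energy to erase one bit at {T} K: {landauer_limit_J:.2e} J")
    # ~2.9e-21 J, far below what today's transistors dissipate per switching event.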

mandmandam

> as far as I know, the universe is finite.

I don't think we know that. We don't even know how big the universe really is - we can only see so far. All we have is a best guess.

There may also be a multiverse out there (or right beside us).

And, creating universes might be a thing.

... I don't expect Moore's law to hold for ever either, but I don't believe in creating unnecessary caps.

matthewdgreen

I think you could very easily give a cap that hinges on our current understanding of basic physical limitations, and it would arrive surprisingly soon.

mandmandam

That's the thing about Moore's law - it has assumed from the beginning that our 'current understanding of basic physical limitations' is incomplete, and been proven correct on that front many times over.

layer8

In contexts like these, “universe” means the observable universe, which is finite in size. Also, creating universes (in the usual models) conserves energy, so you don’t actually gain anything by that.