
What if humanity forgot how to make CPUs?

palmotea

This has a ton of holes:

> Z-Day + 15Yrs

> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.

If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

> Z-Day + 30Yrs

> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.

You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.

Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first transistor was created in 1947; that's a 32-year span to the 68k. If people have the capacity and the need to jump through hoops to keep old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to be able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

lauriewired

> If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

Storage. You only need a few hundred working systems to keep a backbone alive. Electromigration doesn’t kill transistors if they are off and in a closet.

> You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

You don’t need to make new drives; there are already millions of DVD/Bluray devices available. The small microcontrollers on optical drives are on wide node sizes, which also make them more resilient to degradation.

> they're definitely f-ing going to be able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

kadoban

> If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

This kind of just breaks the thought experiment, because without the "why?" being even vaguely answered, it makes no sense. How do you game out a thought experiment that starts with the assumption that humanity just randomly stops being humanity in this one particular way? What other weird assumptions are we meant to make?

esseph

If you don't like the rules of the game, you don't have to play it.

3eb7988a1663

Surely knowing something is possible would speed up the process. Transistors had to go from a neat lab idea to finding more and more incremental use cases, eventually snowballing into modern chips. If you know from the beginning that computers are a neat idea, surely that would warrant more focused R&D.

CuriousRose

If humans forgot how to make new CPUs, it might finally be the incentive we need to write more efficient software. No more relying on faster chips to bail out lazy coding; apps would have to run lean. Picture programmers sweating over every byte like it's 1980 again.

burnt-resistor

Probably not. Devices would run out within a generation.

It ain't ever going to happen because people can write these things called books, and computer organization and architecture books already exist in many tens of thousands of copies. What should be captured in modern computer organization books are the applied-science aspects of the history so far and the tricks that made Apple's ARM series so excellent. The other thing is that TSMC needs to document its fab process engineering. Without the capture of niche, essential knowledge, they become strategic single points of failure. Leadership and logic dictate not allowing this kind of vulnerability to fester too deeply or too long.

saulpw

The essential tacit knowledge can't be captured in books. It has to be learned by experience, participating in (and/or developing) a functioning organization that's creating the technology.

djmips

That's already happening

mahirsaid

It would be a great tragedy if that ever became a reality in the near future. The bigger question is: what if you forgot how to make the machines that make the CPUs? That is the bigger challenge to overcome in this crisis. Only one company specializes in this field, which gives big companies like TSMC their ability to manufacture great CPUs. The trick is to recreate the machine that makes them and go from there, 10nm to 2nm capabilities.

trollbridge

I’m a little puzzled how “forgot how to make CPUs” also included “forgot how to make the mechanical part of hard drives, how to make flash memory, and how to make other chips”. I guess I don’t think of a 74xx series chip as a “CPU”?

geor9e

I read it as: we have millions of hard drives and flash drives with a dead controller chip, so we harvest their other parts as spares. We still know how to make the spare parts from scratch, but we have so many for free.

PaulKeeble

There is a bit of an issue that almost all the know-how exists within a couple of private companies, and if the industry took a downturn, such as a crash in an AI bubble causing a many-year lull, giant companies could fail and take that knowledge and scale with them. Some other business would presumably buy the facilities and hire the people, but maybe not. It's one of the issues with so much of science and engineering happening privately: we can't replicate the results easily.

bsder

This isn't unique to semiconductors.

If you turn off any manufacturing line, your company forgets really quickly how to make what that line made. GE discovered this when they tried to restart a water heater line in Appliance Park.

to11mtm

Heck, the US had this problem when they needed to renew/refurbish nuclear weapons due to more or less 'forgetting' how to make Fogbank.

AlotOfReading

FOGBANK was a little more complicated. The process that was written down just didn't work as expected. That was partially lost institutional knowledge that had never been recorded, but the original manufacturers also didn't fully understand their own process: it had contaminants, which they were unaware of, that improved the final product. When the process was restarted those contaminants were absent, and it didn't work until that was investigated.

silisili

Yup.

Remington apparently has no idea what bluing formula they used on their original 1911s.

Colt lost the ability to handfit revolvers.

0xTJ

A fun read, but I do find it a bit odd that in 30 years the author doesn't think that we would have reverse-engineered making CPUs, or at least gotten as far as the mid-70s in terms of CPU production capabilities.

Also, the 10k years lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000 figure is a general statement on the modelled failure of ICs from the 60s and 70s, not in particular for the MC68000 (which is at the tail end of that period). There are also plenty of ICs (some MOS (the company, not the transistor structure) chips come to mind) with known-poor lifespans (though that doesn't reflect on the MC68000).
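
For intuition on where figures like that come from: electromigration lifetime is usually estimated with Black's equation, MTTF = A * J^-n * exp(Ea / (k*T)). The prefactor A is empirical and process-specific, so the sketch below only compares a wide, lightly loaded old-process interconnect against a dense, hot modern one; the current densities, activation energy, and temperatures are illustrative assumptions, not measured values for the MC68000 or any other part.

    # Minimal sketch of Black's equation for electromigration lifetime:
    #   MTTF = A * J**-n * exp(Ea / (k*T))
    # The empirical prefactor A is omitted, so only the ratio between the two
    # cases means anything. All parameter values are illustrative assumptions.
    import math

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

    def relative_mttf(j_a_per_cm2, n, ea_ev, temp_k):
        """Relative electromigration MTTF (arbitrary units, prefactor omitted)."""
        return j_a_per_cm2 ** -n * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_k))

    # Wide, lightly loaded 1980s-era interconnect (assumed: cool, low current density).
    old = relative_mttf(j_a_per_cm2=5e4, n=2, ea_ev=0.7, temp_k=310)
    # Dense modern interconnect (assumed: hotter, much higher current density).
    new = relative_mttf(j_a_per_cm2=1e6, n=2, ea_ev=0.7, temp_k=360)

    print(f"old interconnect models out to ~{old / new:,.0f}x the lifetime of the new one")

The absolute numbers are meaningless without A, but the ratio is why large-node parts model out to such long lifespans.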

therealpygon

Agreed. It is a whole lot easier to recreate something you know is possible than to create something you don’t know is possible.

asciimov

> … no further silicon designs ever get manufactured

The problem wouldn’t be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing; a lot of chips go there. Then there is all of healthcare.

Lots of important chips everywhere that aren’t CPUs.

vardump

We're toast should we ever lose the ability to make CPUs.

Perhaps there should be more research into how to make small runs of chips cheaply and with simple inputs. That'd also be useful if we manage to colonize other planets.

spencerflem

We as in civilization? We made it at least a few thousand years without it.

Or do you mean the circumstances that would lead to this (nuclear war, perhaps) would make us toast?

throw0101d

> We as in civilization? We made it at least a few thousand years without it.

Civilization is a continuity of discrete points of time.

We were able to enter (so-called) Dark Ages where things were forgotten (e.g., concrete) and still continue because things were often not very 'advanced': with the decline of Rome there were other stores of knowledge, and at the time of the Black Death society wasn't much beyond blacksmithing and so was able to keep those basic skills.

But we're beyond that.

First off, modern society is highly dependent on low-cost energy, and this was kicked off by the Industrial Revolution and easily accessible coal. Coal is much depleted (often needing deeper mines). The next phase was oil, and many of the easy deposits have been used up (it used to bubble up out of the ground in the US).

So depending on how bad any collapse is, getting things back up without easily accessible fossil fuels may be more of a challenge.

AlienRobot

I'm not sure we can actually support 8 billion people's food production and distribution logistics without CPUs anymore.

spencerflem

Whatever makes us forget CPUs will make there be less than 8 billion people I'm sure.

squigz

No, but that's hardly the same as suggesting humanity would die off. We'd adapt, just at a much smaller scale.

vardump

The civilization as it is.

spencerflem

I mean, the chips they're talking about didn't exist until like 40 years ago; I think we could manage.

But tbh I don't see it as at all likely short of something like nuclear war, and that would be the much bigger problem.

Legend2440

Be more concerned about whatever nuclear war or social breakdown led to that point. Massive industrial manufacturing systems don’t shut down for nothing.

kimixa

To have zero effort in attempting to reproduce even 70s-era silicon technology for 30 years implies some real bad stuff. If the entire chain has been knocked out to that level, I doubt "silicon chip fabrication" would really be a worry for anyone during that time.

vardump

It could also happen as natural decay over centuries. There's no guarantee we'll get more advanced over time.

spencerflem

That would be a pity, but I don't see why we'd be toast.

kimixa

Eh, there's plenty of small fabs globally that do smaller, nowhere-near-cutting-edge (180nm or so) runs - you can make a pretty decent processor on that sort of tech.

It would be a pretty solid intermediate step to bootstrap automation and expansion in cases where the supply of the "best" fabs is removed (like in a disaster, or where the framework to support that level of manufacturing isn't available, such as your colony example).

roxolotl

So, taking this as the thought experiment it is, what I’m struck by is that seemingly most things would completely deteriorate in the first 10-15 years. Is that accurate? Would switches mostly fail by the 10-year mark if not replaced? I’ve been looking at buying a switch for my house; should I expect it not to last more than 10 years? I have a 10-year-old TV; should I expect it to start failing soon?

__d

My experience with retro computers is that things start to fail from around the 10-15 year mark, yes. Some things are still good after 30 years, maybe more, but capacitors leak, resistors go out of spec, etc., and that means voltages drift, and soon enough you burn something out.

You can replace known likely culprits preemptively, assuming you can get parts. But dendritic growths aren’t yet a problem for most old stuff because the feature sizes are still large enough. No one really knows what the lifetime of modern 5/4/3nm chips is going to be.
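
A lot of that 10-15 year window comes down to electrolytic capacitors, for which there's a common rule of thumb: rated life roughly doubles for every 10 °C below the rated temperature. A rough sketch with assumed, datasheet-style numbers (not any specific part), ignoring ripple current and voltage derating:

    # Rough sketch of the "10-degree" rule of thumb for aluminum electrolytic
    # capacitors: life roughly doubles per 10 degC below the rated temperature.
    # Rated life and temperature are assumed, typical-looking values.
    def estimated_cap_life_hours(rated_hours, rated_temp_c, ambient_temp_c):
        return rated_hours * 2 ** ((rated_temp_c - ambient_temp_c) / 10)

    RATED_HOURS = 2000   # assumed rating at the rated temperature
    RATED_TEMP_C = 105

    for ambient_c in (65, 45, 25):
        hours = estimated_cap_life_hours(RATED_HOURS, RATED_TEMP_C, ambient_c)
        print(f"{ambient_c} degC ambient: ~{hours:,.0f} h (~{hours / (24 * 365):.0f} years powered on)")

At a warm 45 °C inside a case, that already lands in the 10-15 year range of continuous use, which lines up with what retro-computer collectors see.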

protocolture

There's a hardware law that hardware past its half-life often lives for an excessively long time.

Really depends on brand and purpose but consumer hardware switches do die pretty frequently.

But if you bought something like a C2960 fanless switch I would expect it to outlive me.

floating-io

I have a 10+ year old Cisco 2960G and a pair of 10+ year old Dell R620's in my homelab, still humming happily along.

So, no.

FrankWilhoit

The larger point is that we are going to forget a lot of things.

datadrivenangel

It would be a bad decade, but someone would figure out how to get older microcontroller-class chip production going pretty fast, because $$$