The internet is killing old PC hardware [video]

sim7c00

The system’s been hijacked. The craft of real engineering—building sharp, efficient, elegant systems—is buried under layers of corporate sludge. It’s not about progress anymore; it’s about control. Turning users into cattle, draining every last byte, every last cent. Yeah, it sounds dramatic, but look around—we’ve already lost so much.

I’m running 24 threads at 5GHz, sipping data through a GB pipe, and somehow, sites still crawl. Apps hesitate like they need divine intervention just to start. Five billion instructions just to boot up? My 5GB/s NVMe too sluggish for a few megabytes of binary? What the hell happened?

The internet isn’t just bloated—it’s an executioner. It’s strangling new hardware, and the old hardware? That’s been dead for years. Sure, you can run a current through a corpse and watch it twitch, but that doesn’t mean it’s alive.

nashashmi

> The craft of real engineering—building sharp, efficient, elegant systems—is buried under layers of corporate sludge

No. It is buried under the laziness of build-fast, optimize-later. Except the optimizing never comes. Building fast requires lightweight development on heavyweight architectures, and that means reaching for bloat like JS frameworks.

ryandrake

It might be time for "24 core CPU and I can't move my mouse"[1][2] to make the rounds on HN again.

1: https://news.ycombinator.com/item?id=14733829

2: https://news.ycombinator.com/item?id=34095032

wat10000

There's a saying about engineering, "Anyone can design a bridge that stands. It takes an engineer to design a bridge that barely stands."

I might rephrase that for our field: anyone can design software that works. It takes an engineer to design software that barely works.

nashashmi

I would substitute "barely" with "efficiently".

aredox

This is the result of “programmer time is more expensive than computer time” (which means "than user time").

https://tonsky.me/blog/disenchantment/

didgetmaster

If it takes a programmer an hour to optimize something that saves a second each time it is run, management thinks that is a complete waste since you have to run it 3600 times to 'break even'.

You might expect their thinking to change when you point out that the code runs millions of times each day on computers all over the world, so those saved seconds really add up.

But all those millions of saved seconds do not affect the company's bottom line. Other people reap the benefits of the saved time (and power usage, and heat generated, and ...) but not the company that wrote the code. So it is still a complete waste in their minds.

Multiply this thinking across millions of programs and you get today's situation of slow, bloated code everywhere.
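
To put numbers on it, a back-of-the-envelope sketch (TypeScript; every figure is an illustrative assumption, not a measurement):

    // Back-of-the-envelope version of the break-even argument above.
    // All figures are made-up assumptions for illustration.
    const optimizationCostSeconds = 3600; // one programmer-hour spent optimizing
    const savedPerRunSeconds = 1;         // one second saved per execution
    const runsPerDay = 1_000_000;         // executions per day, worldwide

    const breakEvenRuns = optimizationCostSeconds / savedPerRunSeconds;
    const userHoursSavedPerDay = (runsPerDay * savedPerRunSeconds) / 3600;

    console.log(`Break-even after ${breakEvenRuns} runs`);                     // 3600
    console.log(`~${userHoursSavedPerDay.toFixed(0)} user-hours saved daily`); // ~278

The asymmetry is exactly as described: the 3600-run break-even is trivially cleared, but those ~278 daily user-hours land outside the company's ledger.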

theandrewbailey

As manufacturing tech and CPU designs become unable to deliver the improvements they used to, the cost of computer time will approach the cost of programmer time. As they converge and possibly flip, optimizations will become more useful (and required) to produce the gains we've become accustomed to. I'm not sure how many years away that is.

cman1444

I agree. Hardware improvement will hit a wall at some point. From then on, all performance improvement will have to come from software optimization. I do wonder if AI will allow that to happen much quicker.

Could an AI just crawl through an organization's entire code and make optimization recommendations, which are then reviewed by a human? It seems like there could be a lot of low hanging fruit.

muixoozie

It's a good blog post, but the site is ironically unreadable outside Firefox Reader due to the annoying snowflake animation.

immibis

Except back when we didn't program like this, it didn't take that much longer. It's the result of shitty technology stacks, like the archetypical Electron. We used to make things right and small at the same time.

Even Electron probably could have been fine if the browser were just a document layout engine and not a miniature operating system. There was an article going around a few years ago about Chrome - and by extension, Electron - including a driver for some silly device I don't remember, like an Xbox controller or something. Googling tells me it wasn't an Xbox controller though. Every Electron app includes an entire operating system, including the parts not needed by that app, including the parts already provided by the operating system you already have.

Language runtimes don't have to be this way, but we choose to use the ones that are!

Eikon

Pretty sure that’s written with AI. The dash gives it away.

specproc

100%. I had my first obviously AI-written email the other day, and that was one of the clear tells.

I was trying to figure out what made it so obvious, the dashes were one thing, the other things I noticed were:

- Bits of the text were in bold.

- The tone was horrible, very cringe. Full of superlatives, adjectives and cliches.

- I live somewhere where English is the third language; most people don't write emails in English without a few spelling or grammar mistakes.

- Nor do they write in paragraphs.

- It's also pretty unusual to get a quick reply.

Lots of these things are positive, I guess. I'm glad folks are finding it easier to communicate quickly and relatively effectively across languages, but it did kinda make me never want to do business with them again.

ndiddy

> The tone was horrible, very cringe. Full of superlatives, adjectives and cliches.

I wonder how much of this is caused by the AI companies using Reddit as a major training source.

SecretDreams

Bold text is a giveaway. I'm sad about the hyphens because I use them in my normal typing :(

butlike

If you can clearly read it in your head in that new sassy GPT voice, it's probably LLM

Taylor_OD

This also just sounds like the average sales email haha.

acuozzo

Not OP, but… what? On Windows I have a trigger finger for Alt+0133, Alt+0150, and Alt+0151. I use proper ellipses, en–dashes, and em—dashes all the time.

dingaling

Good grief, memorising arcane key combos? Doesn't sound ready for the desktop!

remote-dev

On Linux (maybe only certain distros, not sure) the keys are different, but you can enable a Compose key and set up special-character keybinds as well.

For example on Mint en–dash is "<compose> <minus> <minus> <period>" and em—dash is "<compose> <minus> <minus> <minus>"

blooalien

I do a similar thing on Linux with a feature called the "Compose Key". I press the compose key (caps-lock on my keyboard), and then the next couple of keypresses translate into the proper character. "a -> ä, ~n -> ñ, etc.

automatic6131

Being aware of Alt character codes puts you in the top 1% of computer users for ability.

sim7c00

100%. if i type it myself its much less eloquent ;D i wrote a whole maze of words and had chatgpt untie it.

hamhock666

Kind of ironic given what you are posting about. I think I need to get off the internet

jancsika

Since there's a human behind the wheel, I'll ask:

Are you actually measuring the load-time bottlenecks in devTools?

I don't know the exact details, but it appears a lot of sites sit around waiting on ad-tech to be delivered before they finish loading content.
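
One quick way to look, via the standard Resource Timing API (paste into the devTools console on a fully loaded page; filtering on `location.origin` is only a crude stand-in for "third-party ad-tech"):

    // List the ten slowest resources served from other origins.
    performance.getEntriesByType("resource")
      .filter((r) => !r.name.startsWith(location.origin)) // crude third-party filter
      .sort((a, b) => b.duration - a.duration)
      .slice(0, 10)
      .forEach((r) => console.log(`${r.duration.toFixed(0)} ms  ${r.name}`));

On ad-heavy sites the top of that list is usually not the site's own content.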

eqvinox

great, now I have to un-learn using 'proper' punctuation.

It's AltGr+[-] (–) or AltGr+Shift+[-] (—) on my keyboard layout (linux, de-nodeadkeys) btw.

AltGr+[,] is ·, AltGr+Shift+[,] is ×, AltGr+[.] is …, AltGr+[1] is ¹

[Compose][~][~] is ≈, [Compose][_][2] is ₂ (great for writing H₂O), [Compose][<][_] is ≤, etc.

I use all of these, and more, guess I'm an AI now :(

To be fair I intentionally use — incorrectly by putting spaces around it, just because I hate how it looks without the spaces ("correct" English grammar says there should be no spaces).

stordoff

It provides an interesting test case for the usefulness (or lack thereof) of AI detectors:

ZeroGPT: Your Text is Human written (0% AI GPT)

QuillBot: 0% of text is likely AI

GPTZero: We are highly confident this text was ai generated. 100% Probability AI generated

Grammarly: 0% of this text appears to be AI-generated

None of them gave the actual answer (human written and AI edited), even though QuillBot has a separate category for this (Human-written & AI-refined: 0%).

acheron

What's wrong with dashes?

rrr_oh_man

It’s not just the em dash—it’s the juxtaposition.

kevin_thibedeau

Nobody types em-dashes.

sim7c00

quite surprised this comment got so much debate after i immediately agreed i used chatgpt -or did i? (u see i dont know how to punctuate , i am not so punctual!)

throwaway439080

Uh oh. I use em dashes all the time in writing. Am I an AI?

zython

I'd add just one final remark: it really is not an engineering problem but rather a business decision.

After all, having paid gazillions to engineers and project managers to build the sludgefest, all that cash needs to be harvested back into the pockets.

sim7c00

not untrue. though many good projects existed because of passion for good engineering. there is much less good open source now. people want money for their time...

h2zizzle

People need to realize that leisure time - time off work, commute, chores - is paid for by their employer (as far as they're concerned). Which is to say that those of us with only a couple of hours to ourselves a day are being stiffed, no matter how much money we make. Stop letting the workaholics dictate how the rest of us live.

didgetmaster

People tell me all the time that I should just open source my project that I have spent thousands of hours developing. As if doing so would make money magically appear in my bank account.

zython

Tragically my landlord refused my proposal to pay him in github stars :(

fsflover

> Turning users into cattle, draining every last byte, every last cent. Yeah, it sounds dramatic, but look around—we’ve already lost so much.

This is nothing other than enshittification: https://pluralistic.net/2025/02/26/ursula-franklin/

fsflover

Why is this comment so controversial, receiving many upvotes and then downvotes? Isn't enshittification an accepted and known thing?

kibwen

It's accepted and known, but in an economy where most megacorps make their money via enshittification, the well-paid engineers who get paid to shovel the aforementioned shit down our throats don't like being reminded of their essential role.

high_na_euv

Single-threaded code/problems are unlikely to get faster with more cores. Fundamental limitations still exist.
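
The formal version of this is Amdahl's law: with serial fraction s of the work and N cores,

    Speedup(N) = 1 / (s + (1 - s)/N), which is bounded by 1/s as N grows.

So a workload that is even 50% serial tops out at 2x, no matter how many cores you throw at it.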

NegatioN

Although that in itself is a true statement, there is sooo much performance left on the table everywhere.

yjftsjthsd-h

It's obviously less dramatic than running across a dozen cores at once, but per-core performance has continued to improve over time

sim7c00

going from 1ghz to 5ghz should make single threads go a little faster ? IO might be a bottle neck on spinning rust, but we've come far from those days too..

austin-cheney

> I’m running 24 threads at 5GHz, sipping data through a GB pipe, and somehow, sites still crawl.

Aside from websites (we will talk about those in a minute), how is performance? I am running Windows 11 on new hardware and it is running great. I built a personal box with 64GB of the fastest DDR5 and an AMD 9900xtx. The most expensive component (and least bang for the buck)... the video card. This is my first time having an NVMe disk and it's absolutely amazing.

I am running Debian 12 on a mini computer with much less hardware and it's doing amazing there too. I can run anything on that box except AAA games and 4K video.

Now, for the web talk. I was a JavaScript developer for 15 years, and yes, it's garbage. Most of the people doing that work have absolutely no idea what they are doing, and there is no incentive to learn. The only practical goal there is to put text on screen, and most of the developers doing that work struggle to do even that. It's why I won't do that work anymore; it's a complete race to the bottom. If I see a job post that mentions React, Vue, or Angular I stop reading and move on.

avo1d

Where did you move to from JS? I am trying my best to learn low-level stuff to NOT end up in these situations. But the frameworks are already bloated, and any optimisation feels useless.

Sincere6066

But there is no 9900xtx?

austin-cheney

I stand corrected: 9900x. 9900xtx is a video card.

trinix912

Well it wouldn't be that way if it were not for all the JavaScript. If we had just kept doing server-side scripting (PHP, CGI/Perl...) and used JS only where absolutely necessary (video players, games...) like in the early 2000s, it would all work fine on 15-year-old hardware. But instead we use the browser as an OS and ship tons of JS on simple news sites.
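
For contrast, the old model in miniature (a sketch in TypeScript/Node purely for illustration; that era would have used PHP or Perl, but the principle is the same: the server ships finished HTML and the browser only has to lay it out):

    // Minimal server-rendered page: no client-side JavaScript at all,
    // so even a 15-year-old browser only has to parse and lay out HTML.
    import { createServer } from "node:http";

    const page = (headline: string): string => `<!doctype html>
    <html><head><title>${headline}</title></head>
    <body><h1>${headline}</h1><p>Rendered entirely on the server.</p></body></html>`;

    createServer((_req, res) => {
      res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
      res.end(page("Simple news site"));
    }).listen(8080);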

dredmorbius

I suspect DOM modelling/rendering also has an impact as I've surfed with JS off, using Firefox on an older (mid-aughts) iMac running Linux ... and can't get much past 2--4 tabs before performance becomes absolutely unacceptable.

Instrumentation of browsers to show performance constraints remains poor and/or a Black Art, so I'm somewhat winging this.

The same system performs wonderfully with scripts, non-browser-based requests (e.g., wget, curl), terminal-mode browsers (usually w3m or lynx), and local applications (mailers, audio ripping / playback, GIMP, word-processing, and of course shell tools), as well as a few occasional games perhaps.

h2zizzle

>Instrumentation of browsers to show performance constraints remains poor and/or a Black Art

Which makes the loss of all our DEI programs over the past month all the more painful.

tuyiown

The problem is not JS per se, it's sloppy client code on a very permissive runtime. Otherwise it wouldn't have crept into apps via Electron.

pjmlp

Which is a reflection of those writing the sloppy client code in the first place.

Back in the 8 and 16 bit days, companies managed to make software available across all of them, or at least the most common platforms, in a time where performance called for hand written Assembly, and each hardware platform was its own snowflake.

And yet in the age of high-level programming languages, the best most folks can think of is shipping a browser alongside the application; not only do they show complete disregard for the platform, they don't really care about the Web either, rather ChromeOS.

And yes, this includes VSCode, which has tons of C++ and Rust code, and partial use of WebGL rendering, to work around precisely the problems of being based on Electron.

mahrain

Even with a few 'retro' systems up and running, mostly Macs from the early 2000s, I find that what we used to do on them (chatting on Skype or MSN Messenger, listening to Shoutcast streams or downloading on Napster, playing Unreal Tournament online, etc.) is mostly defunct now. What remains are local games, clunky word processors and MP3s on the local network. It turns out to be a largely empty experience unless you really get back into Command and Conquer or Metroid.

SecretDreams

> It turns out to be a largely empty experience unless you really get back into Command and Conquer

Sounds like a sacrifice I'm willing to make!

acuozzo

> It turns out to be a largely empty experience

I'm pretty heavily involved in SD video restoration and I drive things from "retro" systems mostly because I need 32-bit PCI slots and Windows XP in order to interface with older broadcast engineering hardware.

I could shoehorn everything into a more modern system, but what I have suits my needs.

ForOldHack

Marathon rockets of fury, networked. Carmageddon and Myst/Pyst. Spaceward Ho! Where were you? {Fires rocket} Pathways Into Darkness. Lode Runner, Dungeons of Doom, Oregon Trail.

netsharc

I watched a friend's kid play a game on his offline iPhone. So. many. freaking. ad interruptions! Fucking tragic, I'm glad my childhood gaming wasn't like that...

Said childhood gaming, with what feels like a lot of latency: https://www.futrega.org/digger/

dredmorbius

I've had very similar experiences myself.

For most local applications, or simple over-the-Web fetches via curl, wget, etc., mid-aughts hardware or earlier often suffices.

Amongst my hobbies is the occasional large-scale scraping of websites. I'd done a significant amount of this circa 2018-19 on a mid-aughts iMac running Linux. One observation was that fetching content was considerably faster than processing it locally, not in a browser but using the html-xml-utils package on Debian. That is, DOM structures, even when simply parsed and extracted, pose a significant challenge to older hardware.
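
That gap is easy to reproduce today. A rough sketch (assuming Node 18+ run as an ES module, with the jsdom package standing in for html-xml-utils purely for illustration):

    // Rough timing of network fetch vs. local DOM parsing.
    // Requires: npm install jsdom (ES module, for top-level await).
    import { JSDOM } from "jsdom";

    const url = "https://example.com/";

    let t = performance.now();
    const html = await (await fetch(url)).text();
    console.log(`fetch: ${(performance.now() - t).toFixed(0)} ms`);

    t = performance.now();
    const doc = new JSDOM(html).window.document;
    const links = doc.querySelectorAll("a").length;
    console.log(`parse+query: ${(performance.now() - t).toFixed(0)} ms (${links} links)`);

On modern hardware the fetch usually dominates; on a mid-aughts machine the parse side grows much faster than the network side.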

I had the option of expanding RAM, and of swapping in a solid-state drive, both of which I suspect would have helped markedly (swapping was a major factor in slowness), though how much I'm not sure.

I'll also note, as I have for years, that this behaviour serves advertisers and merchants as a market segmentation technique. That is, making sites unusable on older kit, in a world where physical / real-estate-based market segmentation isn't possible, is an effective way of selecting for those with the discretionary income to buy modern devices, whom we presume to have higher conversion rates as well.

(Multiple network round-trip requirements are also a way to penalise those making access from distant locations, as those 100--300 ms delays add up with time, particularly for non-parallelisable fetches.)

I'm not arguing that all site devs have this in mind, but at the level which counts, specifically within major advertisers (Google, Facebook) and merchants (Amazon), this could well be an official policy which, by nature of their positions (ad brokers, browser developers, merchants, hardware suppliers), ripples throughout the industry. In the case of Apple, moving people to newer kit is their principal revenue model as well.

axpvms

It seems to hold up pretty well for an 11-year-old netbook which was quite underpowered even when it came out. The equivalent would be someone in 2014 making a video about how their Pentium 4 setup from 2003 is killed by the modern internet. And actually that's a bit unfair, as the Pentium 4 was a premium product while this netbook was not.

psychoslave

By 2003 you could buy Half-Life on Steam.

What are the webpages shown in this video supposed to do¹? Display some text, pictures, and maybe some videos. How does that feel in terms of complexity and hardware requirements compared to a 1998 FPS, which achieved an impressive gamer-experience breakthrough in a consumer-grade product? Does that seem a fairer comparison?

Now, obviously you can't expect all web-developer interns out there to reach the level of Valve engineers in 1998, sure. But the frameworks they are asked to use should make the sober way the easy path, and keep more complex achievements accessible within the remaining computational resources.

¹ As opposed to something using WebGL or other fancy things incorporated in contemporary browsers.

zokier

I don't think the problem here is the age of the system, but that it was an extremely crappy system even in its time. It's a 4-watt pre-Zen AMD CPU running at 1 GHz. It's a CPU intended for tablets, and even for that it's bottom of the barrel. Something like the i7-4770K from the same year (2013-2014), which I recall being very popular at the time, is over an order of magnitude faster. The CPU here is more comparable to CPUs from almost a decade prior; the venerable Thinkpad T61 would probably perform better.

ninalanyon

It used to be possible and practical to do real work on much less capable hardware.

Making the difficult stuff possible really should not have made the ordinary easy stuff that we already had so much more difficult.

We used to browse the web with 486s and megabytes of RAM. If all I'm doing is reading a newspaper I shouldn't need gigabytes of RAM.

zokier

At the same time, at no point in the history of computing has a 10-year-old PC been as useful and usable as it is today, provided the PC was half-decent to begin with.

Using a 2015 PC in 2025 is far less painful than using a 1995 PC in 2005.

akgerber

My desktop is from 2012, so 13 years old so far, but it is still very capable at any task I throw at it. It was originally a high-end workstation, but by 2020 it was worth so little that I got it for free from someone moving out of town. Last year, upgrading the CPU to the top-of-the-line part that fits the motherboard socket cost $17 (versus an original MSRP of $2300), and upgrading it to 128GB of RAM cost $40.

When even top-of-the-line older hardware is nearly free, it makes little sense to optimize for bottom-of-the-line older hardware.

It does very well on any modern internet task, as well as playing modern video games with a few-year-old used graphics card.

zokier

I feel Sandy Bridge (2011) and Haswell (2013) were major turning points. Haswell is especially significant because it forms the baseline for x86-64-v3, which e.g. RHEL and others are migrating towards: https://developers.redhat.com/articles/2024/01/02/exploring-...

That is also one of the potential problems with pre-Haswell hardware: distros might stop supporting it in the near future.

olyjohn

Those netbooks were hot garbage when they came out. I remember trying a bunch of them, and they were all slow and shitty even when new. I know a few people who bought various brands, and they basically got put away in a desk and never used. Most of them had such low resolution that you couldn't even see the entirety of the Control Panel dialog boxes in Windows. They could barely play videos without shit chopping up. Web surfing was slow back then. They were horrific.

zokier

Netbooks were nifty, quite useful as ssh-terminal thin clients etc.

Rochus

> A solution is needed to help those old computers

A solution could be to put another layer on top of the internet. This could be done by means of a "presentation proxy", similar to cloud gaming, e.g. based on VNC, where only a VNC client runs on the old computer and the browser runs on the presentation proxy.

dredmorbius

There are solid reasons to put your browser executable and storage somewhere other than your principal desktop in the first place.

That still leaves the attack surface of browser-based activities (do you really want your recreational sites interacting with your financial services?), but it gives you both the option of fresh respawns (OS and/or browser) and physical and network isolation of your local storage and data from your browser.

(This presumes hygiene on any browser-based downloads as well, of course.)

In general, the idea that we'd want all our data on a globally-accessible network is seeming increasingly unwise to me over time, given both technical and political developments.

Late edit: Oh, and Browsh: text-only browser-via-proxy supporting CSS/JS.

<https://www.brow.sh/>

<https://news.ycombinator.com/item?id=25129747>

notarealllama

So say we all.

My non-tech friends call me paranoid for the basic level of protection I use - separate browsers for everything, and a couple of VMs for e.g. finances.

VLANs have always been an important part of LAN security, but we're just out here with always-on, full Internet access? My home firewall logs show an insane amount of bot/scraper traffic, etc.

nxobject

It's also a good way of isolating proprietary applications that barf services, applications, and custom ways of keeping user state all over the system – I'm looking at you, Citrix.

WJW

Instead of using a more powerful computer as the presentation proxy for a less powerful one, wouldn't the more obvious solution be to upgrade to a not-so-old computer directly? Alternatively, don't try to load heavy sites on hardware that is too light for them.

la_oveja

that sounds horrifying, not gonna lie

eqvinox

Indirect point: using an ad blocker protects the environment. Considering it also helps security (loading fewer things = fewer chances at exploits), it should really be the default.

everdrive

My father-in-law did web development for years, but has been retired for a while. I mentioned this to him briefly, and he said pretty nonchalantly “yeah, we were always pressured to push everything to the client to improve response times.” I’m sure there’s more to it than just that, but it was all very simple to him.

rahen

Not only the Internet, but also its technologies. Electron apps bring older computers to their knees, and those apps are becoming ubiquitous (MS Teams, MS VS Code, WhatsApp, Signal...). Sometimes they are even labeled as "lightweight".

tdy_err

Nobody thinks Electron applications are lightweight. Maybe you have heard someone refer to PWAs as lightweight application surrogates.

rahen

Here is a quick example of an application labeled as lightweight, which turns out to be a 500MB+ Electron monstrosity:

"MarkText is a lightweight, user-friendly Markdown editor that serves as a free and open-source alternative to Typora. It’s designed for everyday users who want a clean, intuitive experience."

https://myownsys.com/2024/11/24/everyone-should-have-a-free-...

gw2

If the application is free (with no strings attached), I would not really complain. But the main offenders are apps by large companies with revenues in the billions. The problem is that most of the userbase does not complain.

alpaca128

I’ve seen multiple VS Code users claim it’s lightweight and fast. And to be fair, compared to many other Electron apps it is, but many editors still run circles around it.

vv_

> but many editors still run circles around it

Could you provide an example?

high_na_euv

Vs Code?

gw2

I find VS Code to be unmanageable for anything beyond a medium-sized project. Maybe the LSPs I use are to blame, but I find nvim less problematic in this regard.

ironblood

Language servers as well. I had a desktop bought around 2015, with 16GB of DDR3 RAM. That was quite a lot back then. For some reason, I used it for a while, and I needed an isolated development environment, so I installed Debian server with Qemu/KVM and assigned 4GB of RAM to it. It looked okay when starting neovim, taking only a few hundred MB, but when starting `tsc`, especially for two or three separate projects, the RAM was not enough anymore. The Lua language server also needs a lot of RAM.

zkmon

A new release of software or technology is not adopted for its new features. New technologies help create new security vulnerabilities, which in turn force new releases of the tech. It's a vicious circle where tech and hacks play a catch-up game. Old PC hardware is like the Mayans or an uncontacted island tribe: it can't tolerate getting exposed to the new world of the internet.

Also, companies don't want to invest in supporting multiple versions at any point in time, and can't afford the reputational risk of not forcing upgrades.

My company lets employees request and get software installed, but hardly allows them to use the features! The Risk & Compliance department wouldn't like anyone to work with or use any software properly. Any moving thing is a risk.