Software Rot
132 comments · August 6, 2025
roda73
This is one of the reasons I absolutely hate Linux-based development and the operating systems built with it.
We all know it now as dependency hell, but what it is, in fact, is just a lazy shortcut in current development that will bite you down the road. Corporate software is not a problem, because corporate users don't care as long as it works now; in the future they will still rely on paid solutions that will keep working for them.
For me, I run a local mirror of Arch Linux, because I don't want to connect to the internet every time I need to download a library or some software I may require. I like it all here, but since I haven't updated in a while, I might face some destructive update if I chose to update now. This should never happen. Another thing that should never happen is the ordeal of compiling an old version of some software. Time and time again, I will find a useful piece of software on GitHub and naturally try compiling it. It's never easy: I have to hunt down the dependencies it requires, then try compiling old versions of various libraries. It's just stupid, and I wish it were easier and built smarter. Yes, sometimes I want to run old software that has no reason not to work.
When you look at Windows, it all works magically. Well, it's not magic; it's just done smart. On GNU+Linux, smart thinking like this is not welcome, and it never has been. Instead they rely on the huge number of people who develop this software to perpetually update their programs for no reason but to satisfy a meaningless version number of a dependency.
skydhash
It’s all volunteer work, not some corporation with trillions lying around. If you want something easy, use Debian (or Ubuntu). They pretty much have everything under the sun.
What you want (downloading software from the net and running it) is what most distros have been trying to avoid. Instead, they vet the code, build it, and add it to a reputable repo, because no one wants to download Postgres from some random site.
noduerme
I wish I could write all the business logic I write on an NES and never have to worry about requirements going bad. I guess the thing is, if you're writing anything on top of a network layer of any kind, eventually it's going to require patches unless you literally own all the wires and all the nodes in the network, like a secure power plant or some money clearing system in a bank that's been running the same COBOL since the 1960s. And since you're probably not writing code that directly interfaces with the network layer, you're going to be reliant on all the libraries that do, which in turn will be subject to change at the whims of breaking changes in language specs and stuff like that, which in turn are subject to security patches, etc.
In other words, if you need your software to live in the dirty world we live in, and not just in a pristine bubble, things are gonna rot.
Picking tools, libraries, and languages that will rot less quickly, however, seems like a good idea. Which to me means not chaining myself to anything that hasn't been around for at least a decade.
I got royally screwed because 50-60% of my lifetime code output before 2018, and pretty much all the large libraries I had written, were in AS3. In a way, having so much code I would otherwise have maintained become forced abandonware was sort of liberating. But now: no more closed source, and no more reliance on any libs I don't roll myself, or branch and heavily modify.
gr4vityWall
> 50-60% of my lifetime code output before 2018, and pretty much all the large libraries I had written, were in AS3
Out of curiosity, what kind of work did you do? Regarding your old AS3, did you have any luck with Haxe? I assume it would be a straightforward port.
immibis
Building on bedrock is one extreme; the other is to become extremely fluid - build on quicksand but do it efficiently every time the quicksand shifts. AS3 may stop being useful, but if you can use some tool to recompile AS3 code to web JavaScript, you suffer no loss.
perrygeo
I don't like the term "rot" - your software isn't rotting, it's exactly the same as when you last edited it. The rest of the software ecosystem didn't "rot" either, it evolved. And your old code didn't. "Extinction" seems a much better fit.
codeflo
We as an industry need to seriously tackle the social and market dynamics that lead to this situation. When and why has "stable" become synonymous with "unmaintained"? Why is it that practically every attempt to build a stable abstraction layer has turned out to be significantly less stable than the layer it abstracts over?
dgoldstein0
So one effect I've seen over the last decade of working: if it never needs to change, and no one is adding features, then no one works on it. If no one works on it, and people quit / change teams / etc, eventually the team tasked with maintaining it doesn't know how it works. At which point they may not be suited to maintaining it anymore.
This effect gets accelerated when teams or individuals make their code more magical or even just more different than other code at the company, which makes it harder for new maintainers to step in. Add to this that not all code has all the test coverage and monitoring it should... It shouldn't be too surprising there's always some incentives to kill, change, or otherwise stop supporting what we shipped 5 years ago.
codeflo
That's probably true, but you're describing incentives and social dynamics, not a technological problem. I notice that every other kind of infrastructure in my life that I depend upon is maintained by qualified teams, sometimes for decades, who aren't incentivized to rebuild the thing every six months.
vdupras
Maybe it's a documentation problem? It seems to me that for a piece of software to be considered good, one has to be able to grok how it works internally without having written it.
bloppe
At any given moment, there are 6 LTS versions of Ubuntu. Are you proposing that there should be more than that? The tradeoffs are pretty obvious. If you're maintaining a platform and you want to innovate, you either have to deprecate old functionality or indefinitely increase your scope of responsibilities. On the other hand, if you refuse to innovate, you slide into obscurity as everyone eventually migrates to more innovative platforms. I don't want to change anything about these market and social dynamics. I like innovation.
Ygg2
> When and why has "stable" become synonymous with "unmaintained"?
Because the software ecosystem is not static.
People want your software to have more features, be more secure, and be more performant. So you and every one of your competitors are on an update treadmill. If you ARE standing (aka being stable) on the treadmill, you'll fall off.
If you are on the treadmill you are accumulating code, features, and bug fixes, until you either get too big to maintain or a faster competitor emerges, and people flock to it.
Solving this is just as easy as proving all your code is exactly as people wanted AND making sure people don't want anything more ever.
myaccountonhn
> People want your software to have more features, be more secure, and be more performant
I think it's worth noting that one reason hardware rots is that software seems to become slower and slower while still doing the same things it did 15 years ago.
codeflo
> People want your software to have more features, have fewer bugs, and not be exploited. So you and every one of your competitors are on an update treadmill. If you ARE stable, you'll probably fall off. If you are on the treadmill you are accumulating code, features, bug fixes, until you either get off or a faster competitor emerges.
Runners on treadmills don't actually move forward.
Ygg2
Kinda the point of the treadmill metaphor. If you are standing still on a treadmill, you will fall right off. It takes great effort just to stay in one spot.
gjvc
Post a link to a stable repository on GitHub on this site. Watch as several people pipe up and say "last commit 2020; must be dead".
Source code is ASCII text, and ASCII text is not alive. It doesn't need to breathe (modulo dependencies, yes). But this attitude of "not active, must be dead, therefore: avoid" leads people to believe the opposite: that unproven and buggy new stuff is always better.
Silly counter-example: vim from 10 years ago is just as usable for the 90% case as the latest one.
forgotmypw17
This and the Lindy Effect factor a lot into my choices of what to use for my projects. My choices for a project I want to be as maintenance-free as possible are special subsets of ASCII/txt, SQLite, Perl, Bash, PHP, HTML, JS, and CSS. The subsets I choose are the parts of these languages which have persisted the longest.
Using the Lindy Effect for guidance, I've built a stack/framework that works across 20 years of different versions of these languages, which increases the chances of it continuing to work without breaking changes for another 20 years.
donatj
I love PHP; however, since around 7.4 they have become pretty happy to make breaking changes to the language, including, recently, changes where you cannot satisfy older and newer versions of the runtime simultaneously.
I end up spending a couple of weeks of my life, on and off, fixing things after every major release.
eviks
This dogmatic approach means you lose out on ergonomics by using poorly designed tools like bash and perl, so you incur those costs all the time for little potential benefit far off in the future (after all, that effect is just a broad hypothesis).
zeta0134
Very helpfully, Python has stuck around for just as long and is almost always a better choice than these two specific tools for anything complicated. It's not perfect, but I'm much more likely to open a random Python script I wrote 6 years ago and at least recognize what the basic syntax is supposed to be doing. Bash beyond a certain complexity threshold is... hard to parse.
Python's standard library is just fine for most tasks, I think. It's got loads of battle-tested parsers for common formats. I use it for asset conversion pipelines in my game engines, and it has so far remained portable between Windows, Linux, and Mac systems with no maintenance on my part. The only unusual package I depend on is Pillow, which is also decently well maintained.
It becomes significantly less ideal the more pip packages you add to your requirements.txt, but I think that applies to almost anything really. Dependencies suffer their own software rot and thus vastly increase the "attack surface" for this sort of thing.
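For illustration, a minimal sketch of the kind of stdlib-plus-Pillow conversion step I mean (the paths and the raw output format are hypothetical, not from any particular engine):

    from pathlib import Path

    from PIL import Image  # Pillow, the one third-party dependency

    # Convert every PNG under assets/ into raw RGBA bytes next to it.
    for png in Path("assets").rglob("*.png"):
        img = Image.open(png).convert("RGBA")
        out = png.with_suffix(".rgba")   # hypothetical raw format
        out.write_bytes(img.tobytes())   # raw pixels, loaded engine-side
        print(f"{png} -> {out} ({img.width}x{img.height})")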
sebtron
Python is a very bad example because of the incompatibility between Python 2 and Python 3. All my pre-2012 Python code is now legacy because of this, and since most of it is not worth updating, I will only be able to run it as long as there are Python 2 interpreters around.
I like Python as a language, but I would not use it for something that I want to be around 20+ years from now, unless I am ok doing the necessary maintenance work.
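To make the incompatibility concrete, a minimal sketch of the kind of Python 2 that simply doesn't survive (valid under a Python 2 interpreter, not under Python 3):

    # Valid Python 2; a SyntaxError under Python 3:
    print "hello"

    # Silent behavior change rather than an error:
    print 1 / 2        # Python 2 prints 0; in Python 3, 1 / 2 == 0.5

    # Python 2 happily mixes text and bytes; Python 3 raises TypeError:
    print "caf" + b"e"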
forgotmypw17
My main problem with Python is that a script I wrote 6 years ago (or even 1 year ago) is not likely to run without modifications.
If it's me running it, that's fine. But if it's someone else trying to use installed software, that's not OK.
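And some of this churn is within Python 3 itself; a quick sketch of imports that quietly expired between minor versions:

    # Fine on Python 3.9; ImportError on 3.10+ (the alias moved to collections.abc):
    from collections import Callable

    # Fine on Python 3.11; gone in 3.12 (removed by PEP 632):
    import distutils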
argomo
It has to be weighed against all the time spent learning, evaluating, and struggling with new tools. Personally, I've probably wasted a lot of time learning /new/ that I should have spent learning /well/.
eviks
Right, nothing is free, but switching costs are a different argument.
forgotmypw17
Is it still dogmatic if I consider Perl to be well-designed and have already evaluated more popular tools?
eviks
If "This and Lindy Effect" do not "factors a lot", but instead the major factor is you believe perl is better designed, then no, dogmatism of vague future risk is replaced with pragmatism of the immediate usefulness
jwrallie
On the point being discussed, which is not breaking backward compatibility, Perl is indeed arguably better than more popular tools, and I believe it has other advantages too.
calvinmorrison
Calling Perl poorly designed is absurd.
blueflow
It's not "far away in the future". Every other IT job right now is supporting, maintaining, and fixing legacy software. These are the software choices of the past, and you pay for them in manpower.
oguz-ismail
> poorly designed tools like bash and perl
Skill issue, plus what's the alternative? Python was close until the 3.x fiasco.
eviks
Indeed, a double skill issue: one with the designers, and the other with users having poor tool-evaluation skills.
akkartik
For similar considerations, plus considerations of code size and hackability, I lately almost always prefer the Lua ecosystem: https://akkartik.name/freewheeling
testthetest
It’s getting harder to make perfect choices as projects grow more complex. Even simple products often mean building for iOS, Android, web, backend, etc. You can only lean on HTML/JS for parts of that, and in practice the mobile apps will probably get rewritten every few years anyway.
From my side, I think it’s more useful to focus on surfacing issues early. We want to know about bugs, slowdowns, and regressions before they hit users, so everything we write is written using TDD. But because unit tests are coupled with the environment, they "rot" together. So we usually set up monitoring, integration, and black-box tests super early on and keep them running as long as the project is online.
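For illustration, here is a minimal stdlib-only sketch of such a black-box check; the URL is hypothetical, and because the test only touches the public HTTP surface, it keeps working while the implementation underneath churns:

    import unittest
    from urllib.request import urlopen

    BASE_URL = "https://app.example.com"  # hypothetical deployment URL

    class SmokeTest(unittest.TestCase):
        # Touches only the public surface, so it survives internal rewrites.
        def test_homepage_responds(self):
            with urlopen(BASE_URL, timeout=10) as resp:
                self.assertEqual(resp.status, 200)

    if __name__ == "__main__":
        unittest.main()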
icameron
Nobody has a better ecosystem of “industrial marine grade code rot resistance” than Microsoft. That I can run the same .NET web app code, compiled 20 years ago, on a new Server 2025 is an easy experience unequaled by others. Or the same 30-year-old VBA macros still doing their thing in Excel 365. There’s a company that knows how to do backwards compatibility.
Cthulhu_
I suspect this is true for a lot of software; a modern-day JVM can still run Java from 20 years ago as long as you didn't do anything weird, Linux hasn't significantly changed since then, and the web is one of the most widely supported and standardized platforms out there, with the 1996 Space Jam website still working and rendering today as it did back then (your screen just got bigger).
Is software rot real? I'm sure it is, but it's not in the runtime. It's more likely in the availability and compatibility of dependencies, and mainly in the Node ecosystem.
js8
As somebody who works on IBM mainframes, I disagree. IBM is probably the best at forward application compatibility.
People will laugh, but they should really look.
mrkeen
I had the opposite experience. I tried out XNA one year for the Global Game Jam and was somewhat pleased with it. It was gone the next year.
dusted
Perfect little text, that article is.
Same site has this article about "bedrock platforms", which resonates deeply with me: https://permacomputing.net/bedrock_platform/
Software does not rot; the environment around the software, the very foundation that owes its existence to the singular task of enabling that software, is what rots.
Look at any environment snapshotted in time: the software keeps working like it always did. Start updating the environment, and the software stops working. Or rather, the software works fine, but the environment no longer works.
I'm not saying never to update software, but only do it if it increases speed, decreases memory usage, and broadens compatibility.
I like things better the way they were.
I like things better now than how they will be tomorrow.
I can't remember the last time I saw a software update that didn't make it worse.
prinny_
I don’t get the comparison to building a house. Houses need a ton of maintenance. You can’t build a house on steady ground and leave it unattended for 20 years either. And sometimes the needed upkeep isn’t even construction-type maintenance: it’s bills, legal paperwork, replacing old furniture just because you grew tired of that 15-year-old sofa, etc.
Daub
As a software user and teacher, I think about software rot a lot. My concern is that software has a tendency to grow by addition rather than replacement: new features are added whilst the fundamental limits of the architecture are left unattended.
The reason Blender grew from an inside joke to a real contender is the painful refactoring it underwent between 2009 and 2011.
In contrast, I can feel that the code in After Effects is now over 30 years old. Its native tracker is slow, ancient, and not viable for anything but the simplest of tasks. Tracking was 'improved' by sub-contracting the task to a sub-licensed version of Mocha via a truly inelegant integration hack.
There is so much to be said for throwing everything away and starting again, like Apple successfully did with OS X (and Steve Jobs did to his own career when he left Apple to start NeXT). However, I also remember how BlackBerry tried something similar and in the process lost most of their voodoo.
foxrider
The Python 2 situation opened my eyes to this. To this day I see a lot of py2 stuff floating around, especially in work environments. So much so, in fact, that I had to write scripts that automatically pull the sources of 2.7.18 and build them in a minimal configuration to run that stuff.
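A sketch of what such a script can look like (Python 3, assuming a Unix host with a C toolchain; the install prefix is hypothetical):

    import pathlib
    import subprocess
    import tarfile
    import urllib.request

    URL = "https://www.python.org/ftp/python/2.7.18/Python-2.7.18.tgz"
    PREFIX = pathlib.Path.home() / "opt" / "py27"  # hypothetical install prefix

    # Pull and unpack the final Python 2 release.
    urllib.request.urlretrieve(URL, "Python-2.7.18.tgz")
    with tarfile.open("Python-2.7.18.tgz") as tar:
        tar.extractall()

    # Configure, build, and install into an isolated prefix.
    src = "Python-2.7.18"
    subprocess.check_call(["./configure", f"--prefix={PREFIX}"], cwd=src)
    subprocess.check_call(["make"], cwd=src)
    subprocess.check_call(["make", "install"], cwd=src)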
Ygg2
Python 2 is a warning about making backwards-incompatible changes too late. As soon as you have a few massive libraries, your backward-compatibility risks grow exponentially.
C# made just as big a change by going from type-erased to reified generics. It broke the ecosystem in two (pre- and post-reified generics). No one talks about it because the ecosystem was so, so tiny that no one encountered it.
vrighter
When did C# have type erasure?
LittleCloud
C# 1.0 did not have generics, period. So the standard dictionary type (Hashtable†) took keys and values typed as "System.Object". As seen in the linked documentation, this class still exists in the latest .NET to this day.
Occasionally one still encounters non-generic classes like this when working with older frameworks/libraries, which causes a fair bit of impedance mismatch with modern styles of coding. (Also one of the causes of some people's complaints that C# has too many ways of doing things; old ways have to be retained for backwards compatibility, of course.)
† https://learn.microsoft.com/en-us/dotnet/api/system.collecti...
The paper that the other commentator was referring to might be this: https://www.microsoft.com/en-us/research/wp-content/uploads/...
Ygg2
I do recall some paper mentioning it, but now I'm not sure if Google is gaslighting me or it never existed. It seems you are right.
saurik
It certainly didn't help that they were annoying about it; like, they actively dropped some of the forward compatibility they had added (a key one being if you had already carefully used u and b prefixes on strings) in Python 3.0, and only added it back years later after they struggled to get adoption. If they had started their war with Python 3.5 instead of 3.0, it would have been a lot less infuriating.
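To make that concrete, a sketch (the u"" prefix was dropped in 3.0 and only restored by PEP 414 in Python 3.3):

    # Careful, forward-compatible Python 2.6+ code:
    name = u"caf\u00e9"  # explicit unicode literal
    data = b"\x00\x01"   # explicit bytes literal

    # Python 3.0-3.2 rejected the u"" prefix as a SyntaxError,
    # punishing exactly the people who had prepared for the migration;
    # Python 3.3 (PEP 414) restored it.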
flomo
I'm not a Python dev, but there must have been some huge superficial 'ick'. Back when, I was talking to a Python guy and mentioned that Python 3 was coming out. He said something like "we're just going to ignore that until they sober up and fix it." Which it seems like a lot of people actually did (or they really sobered up and rewrote in Go or something).
jgb1984
Today we vibecode our software, so the rot is built in from day one!
I'm looking at GTK here. Don't get me wrong, I like GTK and think it should be the preferred choice of GUI toolkit for many reasons. However, I have the same complaints a lot of people do about constant change and API compatibility issues. In some cases things need to change, but why, going from 3 to 4, have menus been removed and replaced with other constructs? Could you at least provide a wrapper? Don't use event struct members directly; OK, use accessor functions... but then you change the names and other details of the functions. It's not a "window" any more, it's a "surface", just because, what, Wayland calls them that? API stability is an important feature, but these guys are talking about regular (say, every 5 years) major version bumps that break things.