Is software abstraction killing civilization? (2021)
218 comments · February 8, 2025 · recursivedoubts
layer8
The concept of files and file systems is useful to regular computer users, even when they have no interest in knowing how things work under the hood. The issue is with mobile OSs, and with the fact that software companies like their apps to be as much of a walled garden for your data as possible, and therefore resist exposing your data as files living in a normal shared file system. Even if you already work with files, they have you “import” your existing data into their storage system, and you have to manually “export” (or “share”) any modifications as a new, separate copy.
DiggyJohnson
In the name of low effort, tangential, golden era HN comments: the decision to hide file format extensions on windows (and maybe other OSs) sucks soooo much.
The point about mobile devices breaking the desktop metaphor and file system norms is really interesting.
Higher quality discussion question: Files, buffers, file systems, file explorers, and window managers seem like useful abstractions to me for the human computer user. Why did we not end up with something like “every app has a folder with your files” or “if you have a new iPhone, just send or recover your user files from the official Reddit app on your old device and import them to carry on where you left off on your new device. Welcome back to the Reddit app.”
layer8
Hiding file name extensions is a bad default, but at least it’s just two clicks in the Explorer ribbon to permanently unhide them: https://static1.howtogeekimages.com/wordpress/wp-content/upl... (Well, it’s now three clicks in Windows 11 it seems.)
rjbwork
> Why did we not end up with something like ...
Because that's antithetical to control. The biggest sin in a technology business is allowing your customers to stop using your product with no negative consequences for them.
mike_hearn
Files have big downsides for both users and developers, which is why they have largely been phased out other than for specialised pro use cases:
1. Many users find recursive data structures like trees to be confusing. Recursion is the place where a lot of students fall off the train when learning programming. Trees are fundamental to file systems and explorers, but they aren't intuitive to a lot of people. That's why every app now starts with a screen that highlights recent documents and search.
2. Developers hate files because files suck. They're just flat arrays of bytes. You have to define file formats and that is really hard work many programmers have never done. Databases are where it's at but RDBMS tech never standardised file formats, hence the appeal of SQLite's "use an rdbms as your file format" elevator pitch. Apple/NeXT tried to partly fix this with the concept of bundles, but our industry never standardised a way to transmit directories that works better than PKZIP so bundles hardly work and Apple had to give up on them in most cases. None of the protocols we use understand how to move directories around.
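To make that SQLite pitch concrete, here is roughly what "use an RDBMS as your file format" looks like in practice. This is just an illustrative Node.js sketch that assumes the third-party better-sqlite3 package; the file name and schema are invented:

    import Database from "better-sqlite3";

    // The "document" the user saves, copies, and emails is just this one SQLite file.
    const doc = new Database("drawing.sketch");
    doc.exec(
      "CREATE TABLE IF NOT EXISTS shapes (id INTEGER PRIMARY KEY, kind TEXT, x REAL, y REAL)"
    );

    // Structured reads and writes come for free; no hand-rolled binary format or parser to maintain.
    doc.prepare("INSERT INTO shapes (kind, x, y) VALUES (?, ?, ?)").run("circle", 10, 20);
    const shapes = doc.prepare("SELECT kind, x, y FROM shapes").all();

The whole document still travels as a single ordinary file, which is exactly why the pitch is appealing: you get database semantics without needing the directory-shaped bundles that never standardised.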
aoanevdus
The structure of the files an app uses internally is undocumented and not intended to be a user-facing API. Who wants to be responsible for handling every insane thing the users decide to do to those files?
philwelch
File name extensions aren’t an inherent or necessary part of the desktop metaphor. They can be stored as metadata in the file system. Mac OS used to do this.
nunez
Because having the Reddit app store and load its configuration and such in iCloud, which almost every iOS user has an account for, is a much more seamless experience for most people.
Intricacies of file systems have always been a means to an end for regular computer users.
recursivedoubts
Yes, also the curse of modern desktop OSs trying to trick people into storing data in the cloud. The notion of just having files somewhere accessible and organized in a reasonable manner isn’t clear to many (most?) of my students.
dismalaf
Last I checked Google Drive shows everything as files in folders, just like a desktop OS.
And as someone whose house (whole town actually) burned down recently, I can say cloud storage saved all my data, thousands of baby photos, etc...
lmz
Which is a better security model in these days of untrusted apps vs the desktop "my screensaver can read my chrome passwords file" model.
layer8
Access permissions are orthogonal to having a file system. In fact, mobile apps still use the local file system, they just hide it from the user. And password files should still be encrypted with a master key, e.g. application-private secure enclave key where available.
lupire
Your screensaver should be running as an unprivileged user.
PaulHoule
I'll argue that files and filesystems, as we know them, aren't such a great API. [1]
Before we had the hegemony of Unix, for instance, it was expected that operating systems had richer concepts of files, such as files with a record-oriented or block structure, ISAM [2], Record Management Services [3], etc.
What happened was that Unix inspired a generation of microcomputer operating systems, and on top of that, Unix's implementation of its own abstractions is poor. For instance, I wouldn't trust any kind of file locking to work on Unix short of "directory creation being atomic", but even Luu isn't so sure about that one. Contrast that to the Win '95 era in which you could run multiple copies of Microsoft Access on different machines over an SMB file system and expect locking to work properly.
Post-2010 or so there has been a bifurcation between distributed systems that implement simpler but more scalable and reliable 'file-like' APIs like S3 (random access writes are like... aspirational, aren't they?) vs libraries that do what old facilities like DEC's DATATRIEVE did for which I'd include SQLite, Apache Arrow and such. It used to be those facilities had some support in the kernel whereas things like that now are almost entirely userspace.
[1] https://danluu.com/deconstruct-files/
[2] https://en.wikipedia.org/wiki/ISAM
[3] https://en.wikipedia.org/wiki/Record_Management_Services
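To make the locking point above concrete, the "directory creation is atomic" trick is about the only primitive I'd lean on, and it looks roughly like this. A hedged Node.js sketch, with the lock path and helper invented for illustration:

    import { mkdirSync, rmdirSync } from "node:fs";

    const LOCK_DIR = "/tmp/myapp.lock"; // hypothetical lock location

    function withLock(criticalSection: () => void) {
      try {
        // mkdir either creates the directory or fails: there is no
        // "check then create" window for another process to sneak into.
        mkdirSync(LOCK_DIR);
      } catch (err: any) {
        if (err.code === "EEXIST") throw new Error("another process holds the lock");
        throw err;
      }
      try {
        criticalSection();
      } finally {
        rmdirSync(LOCK_DIR); // a crash here leaves a stale lock, which is the classic weakness
      }
    }

Contrast that with flock/fcntl locks, which have historically behaved differently across NFS, across threads vs. processes, and across implementations, which is why I don't trust them.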
immibis
Files and directories are just one of many possible abstractions for storing data. You have files and directories. Directories contain files and directories. Your whole device is one big directory. Files are identified by name. There's absolutely no reason to think this is the best possible model.
Here's another: Your device contains a bunch of photos and videos. Photos are identified by looking at them. Videos are identified by a still image, or by playing the video within the video chooser.
Here's another: Your device contains a bunch of apps. Apps can organize their own data however they see fit.
... Microsoft's OLE really was the most well-integrated document-centric desktop we ever got, wasn't it?
PaulHoule
... OLE is one of those things that have been forgotten under POSIX hegemony.
What people remember of the Office '95 file formats was that they were 'proprietary', but it was worse than that: these were really meta file-formats that would let you embed a 'file' inside a file to be handled by some arbitrary EXE, so to process any Office '95 file you really needed a Windows installation with all the EXEs and DLLs required by that file. Thus, when Microsoft came out with ActiveX [1] I was really terrified that you'd need a Windows computer to browse the web in a year or two.
I find the 'app' model to be a user-disempowering thing since the app chooses what you can do with the data, but 'photos' and 'videos' are something really different because there is a standard format for those, so I can take photos with my DSLR or my tablet or make images with a Python program or edit them with Photoshop or MS Paint or the GIMP and view them with different apps, print them out, upload to a web server, etc. A system could provide a different API for accessing this kind of functionality, but I'd say it is fundamentally a step back to have to read data with the same program you wrote it with.
mp05
> I teach the systems class at Montana State, where we go from transistors up to a real computing system, and I have students that don't understand what a file system really is when they start my class
Admittedly I am an old grouch, but I stopped having any expectations of the current generation of "college kids".
Incidentally, I'm at Montana State getting a master's in IE, and I deal daily with this one PhD student who has demonstrated an inability to perform a simple partial derivative, which you'd think is a pretty useful skill given the subject matter. Hell, last semester in a 400-level math course, one of the students didn't understand how to add two matrices, I kid you not. It is odd that a senior in CS wouldn't know what a file system is, but that seems rather quaint compared to some of the wild bullshit I've encountered here.
My first stint in university in the 2000s felt a lot different than this, and it's a bit depressing. But man, I feel just great about my prospects in the job market next spring.
lo_zamoyski
This depends on the specialization. Computer science vs. computer or electronics engineering, for example.
Computer science, despite the misnomer, is not about computers, even if computers are an indispensable tool in the field. Computer science is all about abstraction and language modeling of a domain and their application; the computing technology of the tool, however important practically, is an implementation detail. Just as it pays for the astronomer to understand how to operate his telescope in due proportion to his needs as an astronomer, it pays for the computer science person to understand how to operate his computer to the degree that he needs to. But it is a total mistake to place the computer at the center of the universe and treat it as the point of departure for what computer science is really about. This muddles things and is a source of much historical confusion in the field, and in fact, this discussion.
In fact, even the language of "low-level" programming or whatever is abstraction and language. It is simply the language of the computing device that must be used to simulate the abstractions of the domain of discourse.
louthy
I like this description a lot, but I think it needs a caveat or two. Most people writing code today are not computer scientists and don’t understand many of the mathematical fundamentals.
I’ve always thought of programming as 1 part science, 1 part craft, and 1 part art. The percentages of those parts vary depending on the job. There may be 0% science in some roles (outside of writing functions). I think the vast majority of programmers are firmly in the craft camp.
jonhohle
I learned computer architecture using MIPS when MIPS were actually used in things. It was nice then, and is nice now.
I spend a lot of my free time decompiling MIPS assembly, and small functions can be decompiled to matching C by “hand” without needing other tools.
gerdesj
"I teach the systems class at Montana State,"
You teach and hence deal with this: "Information passed on between generations is diluted"
Is it diluted? No it isn't. Your job is teaching and books and computers help us avoid calling you a bard 8) Mind you, being called a bard is not a bad thing!
In the end this is a blog post response to another blog post. I have no idea about how "important" those bloggers are but I smell ... randomly deployed blogging for the sake of it. That's the whole point of a blog anyway. I'm being polite here ...
pyeri
The IT industry has gotten highly specialised over time. The NAND-to-Tetris style education helps create computer scientists who know their stuff from the ground up. But tech companies are constantly on the lookout for vocational pragmatists (like "Python Engineer" or "Java Engineer") who could quickly replace another like them. I think that's why the high-level-language-first approach has gotten so popular.
seanmcdirmid
You only have so many hours to teach in a course, the more kids come in with, the farther you can go. If kids have to be taught what were once taken for granted, something else has to give.
aleph_minus_one
In many countries, such a problem is solved by "weeding out" a lot of incapable students with brutal exams (in particular, "math for computer scientists" exams are prone to this) in the beginning or in the first semesters - consider it a kind of hard "sink or swim"-style curriculum.
ignoramous
> When I look at modern 64 bit architecture books that get suggested for me to teach with I just laugh
This is why I recommend Bryant/O'Hallaron's Computer Systems: A Programmer's Perspective (for comparch) to newbies, as coding is what most folks are likely to be doing early on.
https://books.google.com/books/about/Computer_Systems.html?i...
seanmcdirmid
Actual old fashioned file systems aren’t really in use anymore, and the computerized versions hide the details, you don’t even get a command prompt on an iPad. I’m pretty sure my 8 year old has no clue what a file system is yet (I learned at about 7 when my dad brought home an Osborne with pre-DOS CPM to play around with). Computers don’t require you to know that stuff anymore just to use apps and save/get files.
dismalaf
The details being hidden on toys doesn't mean they're not relevant to a course training people to be developers...
seanmcdirmid
Yes, but it does mean they might not know what they are before they take said course.
harrall
If an older web developer rants about abstraction, they will target React developers.
If a Python dev rants about abstraction, they will target older web developers.
If a C++ application dev rants about abstractions, they will target Python developers.
If a firmware dev rants about abstractions, they will target application developers.
If an electrical engineer rants about abstractions, they will target firmware developers.
Drawing the line for “excessive abstraction” based on what you personally know and calling everything afterwards as “killing civilization” is quite a take.
wredcoll
God, you're right, this is almost as bad as those "well chemistry is just applied physics and physics is just applied mathematics so math is the best" nonsense that shows up every so often.
MonkeyClub
... And math is just applied philosophy, and philosophy is just applied wine drinking.
So let's drink some wine and be merry.
tempodox
I am of the firm conviction that ganja produces better philosophy than wine.
greybox
You make some very good points here. I've watched the talk too and criticism of it is important.
I have to say though Blow is right when he says: "you can't just Draw pixels to the screen"
I am a game engine programmer at a 'medium sized' games company and it is becoming VERY difficult to hire anyone to work on graphics code. DX12 (and others of the same generation) is a massive step up from what previous generations (DX11) used to demand of the programmer. Doing anything at all with these APIs is a massive task and by Microsoft's own admission (I can't find the quote from the docs anymore, citation needed), DX12 is extremely difficult to learn without experience with previous graphics APIs.
I see a lot of people using the argument: "but these APIs are only for developers that want to really push the limits of what the graphics card can do, and want to implement very low level optimizations." and that's partially true. But it's now the industry standard and it's nigh-on unteachable to anyone without previous experience.
Unless something changes, the hiring pool is going to continue to decrease in size.
pjfin123
Blow's talk made me realize I'm not crazy when I get frustrated about basic things being incredibly difficult. Drawing a button to the screen when you want to build a software application has become so difficult that most people just use a progressive web app which is 100x slower than is possible. Are the best options for GUI applications in 2025 really Java Swing and Qt?
gilbetron
Putting pixels on the screen isn't difficult. Putting them on a screen across hardware in a performant manner is the difficult part because it is difficult.
coffeeaddict1
I agree with your point, but DX12 went in the opposite direction of abstraction: it's a much lower level API than the highly abstracted OpenGL.
GuestHNUser
Yet these "lower-level" APIs, DX12 and Vulkan, often don't significantly outperform DX11, and in many cases DX11 performs better. I put lower-level in quotes because those APIs bake assumptions about the hardware into their API which shouldn't be there to begin with, because they frequently prevent drivers from getting the most out of the hardware.
ignoramous
> Unless something changes, the hiring pool is going to continue to decrease in size.
Or, we'll see the return of "trainees"?
greybox
I guess what needs to improve more than anything is the teaching and documentation of these new APIs. There are over-arching ideas that pull things together, but they aren't alluded to in the documentation; you need to attend a training session, or speak to someone who has.
ilrwbwrkhv
I think JavaScript on the server and React and these things have really made the web a mess of software development compared to how little stuff it actually does.
I know for a fact a bunch of kids now do not even know that HTML is what gets rendered in the browser. They think that React is itself what browsers render.
Not to mention the absolute idiot of a CEO of Vercel who thinks React is the Linux kernel of development.
gherkinnn
What an odd claim to make, but he did.
https://news.ycombinator.com/item?id=42824720
I have been around for long enough to remember the vanilla js, jQuery, Knockout, and Angular 1 days and it always came with a baseline mess.
React (sometimes just JSX) can be used sensibly.
Instead, I blame VC-backed tooling like Vercel, Next, Apollo, Prisma, and all the other rubbish Webdevfluencers are paid to fill the web with.
Come to think of it, every part of building software is bloated. From Notion boards to questionable DB choices.
voidr
I agree with you that it's awful that a lot of young developers wouldn't be able to program without React.
I would like to add, as someone who has no problems working without any library, that the DOM is one of the worst APIs ever invented by humankind and that "reactive programming" is a superior model to what we used to do before.
That being said:
- NextJS rolled back years of tooling improvements; it's much slower than Vite
- I just statically built a NextJS site, and a page that has no interactivity downloads 100KB of JS to do nothing with it
- Facebook decided to make a "compiler" for React instead of just not pointlessly re-rendering components by default
- Compared to Preact, which is almost a drop-in replacement, React is huge! This shows how little Facebook cares.
erikpukinskis
> a bunch of kids now do not even know that HTML is what gets rendered in the browser. They think that React is itself what browsers render.
Ironically, they are right and you are wrong.
HTML is a serialization format, which browsers use to construct a DOM in memory.
React renders directly to the DOM without ever serializing anything to HTML.
The fact that you got this wrong and are so highly upvoted really drives home the “old man shakes fist at clouds” nature of this thread.
the__alchemist
What do you mean by "HTML is what gets rendered by the browser"? In the context of using Javascript to modify the DOM.
My understanding is that HTML is an input to the browser, which is translated to the DOM. (Which is then translated to painting the screen, handling inputs etc.) This is an important distinction in context of your point. React (Or any VDOM or similar JS lib) isn't creating HTML; it's creating a set of DOM manipulation instructions in JS.
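A rough sketch of that distinction, using nothing but the plain browser DOM API (no React; the names here are just illustrative):

    // Build UI as live DOM nodes; no HTML string is ever produced or parsed.
    const button = document.createElement("button");
    button.textContent = "Save";
    button.addEventListener("click", () => console.log("saved"));
    document.body.appendChild(button);

    // HTML only enters the picture if you ask for a serialization of the DOM:
    console.log(document.body.outerHTML);

React's reconciler ultimately issues calls like these; the HTML text form is just one way into or out of the tree.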
__MatrixMan__
That was the whole point of react, right? To create something that wouldn't work at all without JavaScript enabled and would be enough of a mess that Facebook could effectively hide their malware in it.
I think many of Blow's points are good, but that he overlooks that much of the degradation isn't a facet of some kind of generational drift or information entropy but is straight up malevolence on the part of people making the decisions.
gardenhedge
What malware has Facebook hidden in React?
piperswe
I think they were referring to spyware hidden in Facebook's JavaScript, which uses React
booleandilemma
No one knows, it's been hidden til this day
Swizec
> They think that React is itself what browsers render
My kingdom for native JSX in the browser. That would be awesome.
Something similar to how we have WebGL and Canvas, but for pure JavaScript UI without the extra step of DOM reconciliation.
There’s a small (and growing) cohort of people working on rendering React Native directly on canvas or webgl (I think). Supposedly gives you super smooth UX, but I haven’t knowingly tried it yet.
ninkendo
> rendering React Native directly on canvas or webgl
I just threw up in my mouth a little. I can’t wait to:
- not be able to double click to highlight text a word at a time, because the developers of the “super smooth UX” didn’t know that’s what typical text does.
- or triple click to highlight a paragraph at a time
- or have the standard menu available when I highlight text (which I often use to look up words, etc)
- or have text editing support any of my OS’s key bindings, like ctrl-left to move the caret one word, ctrl-shift left to highlight in the process, etc etc
- or any one of the hundreds upon hundreds of common control behaviors, accessibility behaviors, system-wide settings on text behaviors, etc etc etc be respected
Of course if they’re anything like the Flutter folks, they’ll look at every one of these things and say “well, I guess we gotta implement all that” rather than understanding the basic fact that common UI elements offered by the OS should actually be reused, not reimplemented poorly.
I really worry about what software will look like 20 years from now. At this rate we’re just going to forget every thing we learned about how it should behave.
caspper69
Don't worry, they'll figure out a way to compile Skia to webassembly and re-link it to the DOM through JS.
Maybe then the circle(jerk) will be complete.
Ugh.
raincole
> super smooth UX
I think typical front end developers (or the managers they report to) don't really know what smooth UX is. They keep reinventing things like scrolling.
foobiekr
There's almost no HCI or interaction design in what passes for UX today. They are mostly - probably 90+% of them - visual designers who have memorized some canned responses to use when asked why they aren't doing basic things.
Seriously, anyone who thinks software as a discipline is bad, well, they're right, but they have no appreciation for how appalling "UX" is today vs even 1990.
mhitza
> There’s a small (and growing) cohort of people working on rendering React Native directly on canvas or webgl (I think)
Sounds terrible for accessibility.
rglover
The lengths that some JS developers go to avoid writing HTML is unbelievable.
JSX is one of those technologies that was clever for what it was, but should have never caught on. Adding another layer of abstraction on top of JSX is the type of behavior at the root of the civilizational collapse argument.
Just because you can doesn't mean you should.
gmueckl
Just because you can doesn't mean you should in production.
If someone wants to build horrific abominations in their spare time at home, I won't stop them. They might become better engineers through that experience. But don't come to me suggesting actually using one of these horrors.
raincole
I don't think it's a very fair criticism. Of all the abstractions over HTML, JSX is the only one that forces you to learn HTML itself. So the reason why JS programmers chose JSX isn't that they refuse to write HTML.
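For example (a hypothetical component, purely to illustrate the point), the JSX vocabulary is the HTML vocabulary, so writing it still means knowing the underlying elements and attributes:

    // TSX sketch: every tag below is an ordinary HTML element.
    function Card({ title }: { title: string }) {
      return (
        <section className="card">
          <h2>{title}</h2>
          <a href="/docs">Read more</a>
        </section>
      );
    }

The abstractions that actually hide HTML are component libraries layered on top, not JSX itself.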
wruza
Mind you, I’m a former x86 asm and C guy who avoids writing HTML. Web 1 state transfers never made sense to me. It just doesn’t click, I don’t think this way and refuse to do so in the name of not sure what.
And while I despise react for its architectural delusions, I still use a js framework to render my webapps. I also don’t see any bloat or something, they all render instantly on both my pc and my phone.
FullGarden_S
oh no. At this point, we just need something that doesn't use the stupid combo of two languages (one markup and one style sheet) for UI and one (totally single threaded) for UX to let people share and view simple text and media with one another. Maybe then we can rid ourselves of constraints like having to stick to JS or a Virtual Machine like V8 while settling on poor implementations like WebGL and Canvas. WebGPU is still not a thing and it probably won't be anytime soon.
A new web browser or another JS runtime won't be the solution to the current mayhem. What could actually be helpful is an alternative to the "web browser" that operates on an entirely different stack than the currently JIT-overdosed JS engines. But since everybody is well accustomed to and excited about improvements within the current madness (like this comment), adoption of any alternative web-browser-like software will be highly unlikely even if it were several folds better at transferring media and rendering graphics with a much simpler approach and high performance. We are officially fo*ked.
sd9
Blow often makes fantastic points about development, and often completely misses the mark.
He’s accomplished great things and has ideas worth listening to - but also plenty of nonsense that’s presented indistinguishably.
I felt quite strongly that the collapse of civilisation talk was one of those pieces of nonsense, and I’ve largely ignored it (despite listening to it twice). I’m grateful to OP for providing a more principled rebuttal.
Don’t even get me started on Casey Muratori, who tries to do the Blow thing but doesn’t even get the good parts right.
dismalaf
Agreed.
Blow's taken the better part of a decade to make a game where he didn't even need to invent the mechanics and Muratori never finished a game he started a decade ago.
Meanwhile if you use modern game engines (including stuff like Raylib) you can have a pretty decent result in a weekend game jam session, and you could probably make Blow's Sokoban game in ~6 months or so (especially with a team of ~10).
wanderlust123
What do you disagree with Casey Muratori on specifically? I have seen some of his content and he seems pretty humble but opinionated on things he knows about. I also think he did a great job with handmade hero.
jiggawatts
Personally I find that Casey presents simple hard facts that rile people up, but at that point they're doing... what? Arguing with facts?
He had a particularly pointed rant about how some ancient developer tools on a 40 MHz computer could keep up with debug single-stepping (repeatedly pressing F10 or whatever), but Visual Studio 2019 on a multi-GHz, multi-core monster of a machine can't. It lags behind and is unable to update the screen at a mere 60 Hz! [1]
I have had similar experiences, such as the "new Notepad" taking nearly a minute to open a file that's just 200 MB in size. That's not "a small cost that's worth paying for some huge gain in productivity". No, that's absurdly bad. Hilariously, stupidly, clownshoes bad. But that's the best that Microsoft could do with a year or more of development effort using the "abstractions" of their current-gen GUI toolkits such as WinUI 3.
This is not progress.
[1] The whole video is worth watching end to end, but this moment is just... sad: https://youtu.be/GC-0tCy4P1U?t=1759
jakelazaroff
If I only spoke Arabic or Russian or Chinese, could I write words in my language on those ancient developer tools? Or would I be limited to ASCII characters?
If I were blind, could the computer read the interface out to me as I navigated around? If I had motor issues, could I use an assistive device to move my cursor?
I'm not saying this excuses everything, but it's easy to point at complexity and say "look how much better things used to be". But a lot of today's complexity is for things that are crucial to some subset of users.
masfuerte
Single-stepping is moderately complicated. I'm even less impressed with modern editors that can't keep up with typing. My first computer could do that. It had a 2MHz 6502.
john_the_writer
I've had the same thing happen with a drawing app.. I've an older android tablet that has a sketch app. It works great.. I got a new high powered tablet, because the battery on the old one wouldn't keep a charge.. New one has a sketch app that lags.. draw line... wait. draw line... wait. It's unusable.. It has a newer processor, and more memory, but I can't draw.
TheMode
I personally do not like how his solution boils down to "just learn more" which may be true at an individual level, but not as the general solution to awful software.
You will never be able to force developers worldwide to start writing everything in C/Assembly, or even to care beyond "it performs fine on my machine". Individuals can have fun micro-optimizing their applications, but overall, we have the apps we have because of compromises we find somewhat acceptable.
More likely the solution will be technical, making great/simple/efficient code the path of least resistance.
sarchertech
Watch some of the intros to his performance aware programming videos. He doesn’t want everyone to use C or Assembly. He also doesn’t want everyone micro-optimizing things.
>compromises we find somewhat acceptable
His entire point is that most developers aren’t actually aware of the compromises they are making. Which is why he calls it “performance aware programming” and not performance first programming.
patrick451
The real problem is that we have a broken, anti-performance culture. We have allowed "premature optimization is the root of all evil" to morph into "optimization is the root of all evil". That single quote has done untold damage to the software industry. We don't need to pass laws to force all programmers worldwide to do anything. Fixing our culture will be enough. In my view, that's what Casey is trying to do.
patrick451
Muratori's main talking point seems to be that modern software is slow. I think he is 100% correct on this. It's absolutely insane how long it takes Jira to show me a ticket or Slack to switch chat rooms, and that VS Code can't keep up with normal typing speed.
bena
Does he though?
It seems like he just makes these broad critical statements then pisses off to continue to do nothing.
I wouldn’t even say he’s accomplished great things. He’s accomplished some decent things.
He’s released two games which barely qualify as games. They’re more like puzzles. You don’t need to play them ever again after you finish the first time. Braid is ok. And the Witness is just Flow. And then after that he’s been working on a programming language for the past decade which he hasn’t released to the public because “it’s not finished”.
He got lucky and made his nut and ever since then has thought of himself as way more talented than he is.
mjr00
> He’s released two games which barely qualify as games.
Everyone's entitled to an opinion, but it's worth pointing out that Braid is not only widely considered one of the best games ever made[0], its success was also instrumental in the explosion of indie games in the late 00s/10s. The Witness didn't quite reach the same heights, but it got an 87 on Metacritic and was a financial success.
Even if it's only two games, that's a much stronger resume than most. You can argue that he takes too long to develop games, but other studios also take 8 years to make games and come out with Concord or Dragon Age: Veilguard.
[0] https://en.wikipedia.org/wiki/List_of_video_games_considered...
bluefirebrand
> Braid is not only widely considered one of the best games ever made[0]
The source here is a wikipedia article containing "A list of video games that multiple video game journalists or magazines have considered to be among the best of all time"
I don't know anyone who is into gaming that takes the opinions of "video game journalists or magazines" seriously anymore. They are so obviously just a marketing extension of the videogame industry that you can't trust anything they say. Look at recent developments with Dragon Age: The Veilguard. Reviewers gave it great scores, but it flopped hard enough that Bioware had huge layoffs [0]
Braid is a game that people barely remember exists today, and in another 10 years it will be even more obscure. It is not remotely an all time great
> its success was also instrumental in the explosion of indie games in the late 00s/10s
Don't give Braid too much credit here, it caught the indie game wave at the start but the wave was coming either way. The explosion happened at the same time Braid came out, not years later inspired by its success.
[0] https://www.forbes.com/sites/paultassi/2025/02/02/after-drag...
bena
I like Braid. I do. But, it’s more of a puzzle than a game. Once you figure out the answer, you’re done. Like Myst.
Games have some element of skill involved and Blow’s products lack that.
And that’s not to knock the products. I like puzzles. And not everything has to be a game. Will Wright has said that he does not make games, he makes toys. Digital toys.
I also think people judge his work on his reputation more than on its merits
yoyohello13
I do agree, although I do like him. I find it fun to listen to Blow’s rants and part of me feels like he is right to a degree. It would be nice if programmers knew more about the platforms they are developing on.
But it’s pretty clear that Blow’s process does not make him particularly productive. The dude obviously works insanely hard and is talented. I can’t imagine how much he would be able to produce if he leaned on some abstractions.
torlok
I think you're getting close to ad hominem. It should be sufficient to point out how little overall people like Blow have accomplished. Blow is a hard worker, and yet it took him and his team 8 years to develop The Witness, which as you pointed out isn't anything revolutionary. I think there's a weird form of elitism at play here. Working close to the hardware isn't hard. There was a point in my early 20s where I too felt like I'm better because I know how a computer works, or that making games in pure C is harder than in C++. As an adult this is cringe. I work in embedded, and I can tell you there are enough low-level programmers out there to threaten my job security.
deanCommie
The Witness isn't revolutionary, but it is crafted with the care of someone who cares about the small details of 3D worlds. Besides being easily one of the deepest/broadest puzzle games of the decade, it is GORGEOUS, and filled to the brim with visual scenes that take your breath away.
To me it's something more akin to the iPod. "No wireless. Less space than a Nomad. Lame." was absolutely a correct way to dismiss it as nothing revolutionary. And yet it's perfect well-rounded craftsmanship WAS revolutionary.
From what it seems, Blow is doing exactly the same thing with his next game too. And there's something admirable about that.
I say this as someone who disagrees with 80% of what he says, but VEHEMENTLY agree with the remaining 20.
bena
Blow routinely trashes software he’s never had to write.
Single user software that is the sole focused application when it is running. It saves nothing of importance and doesn’t alter any system to any appreciable degree.
You know what’s more impressive than The Witness? Any web browser. It has to deal with HTML, CSS, JavaScript, XML, SGML, SVG, etc. An entire runtime environment with garbage collector. A web browser can run VS Code. It can run Doom. It can be made to run The Witness.
Blow does not remotely acknowledge the complexity needed to make modern software. He treats his incredibly small domain as the pinnacle of the craft.
xanthor
Have you shipped a commercial-quality 3D game engine and game? How long did it take you?
BirAdam
There are certainly plenty of issues with the modern software landscape, and I do think too much abstraction is a problem. Yet, the opposite extreme is also bad, and people overly romanticize the past. Not only were crashes and reboots problems, not only did Amiga and such have compatibility problems between hardware versions, but even systems that strove for compatibility suffered from incompatibilities.
The fact is, even on the most unreliable modern system (Windows 11) my computer is far more reliable than any computer I had before about 2010. It can also run software written for Windows 95. That’s a very good run. A computer being usable day to day is better than one that isn’t.
travisgriggs
Not all simplifications are abstractions. Not all abstractions are simplifications. But the pursuit of simplification is usually what motivates an abstraction. I don't think that abstractions kill software, or civilization for that matter, but ill-begotten abstractions, adopted in the name of short-term simplification wins, put a drag on the flexibility, agility, and approachability of either.
Take syntactic sugar in just about any language. There’s usually a breaking point, where you can say the localized gain in simplification for this particular nuance is not worth how complex this tool (language) has become. People don’t make more mistakes in a syntax heavy language because of any particular element, but because using the tool well to solve complex problems just gets difficult (regardless of what a compiler might guarantee you).
I look at the complexity that async and coroutines adds to my experience when programming anything “threaded like” in Kotlin compared to how I deal with the same sorts of problems in Elixir/Erlang and it’s just night and day difference. Both have abstractions/simplifications to the age old problem of parallel/async computing, but one (the former) just multiplies simplicities to end up with something complex again and the other has an abstraction/simplification that Just Works(tm) and really is simple.
dostick
It appears that the author is from that newer generation and completely misses the points because he just doesn't know. Ironically, the article is an example of what Blow was talking about. Something similar happens if I talk about how Figma is destroying the design world on an unprecedented scale by normalising the bad UX, UI and product management found in Figma itself, and get responses from younger designers, completely baffled, that everything is great. You have all that knowledge because you grew up in that environment. They didn't, and it's not likely they can learn the equivalent of that culture and experience anywhere.
senordevnyc
So your rebuttal is that they’re wrong because they’re young and inexperienced? OK, maybe, but what exactly are they wrong about? You haven’t actually added anything to the conversation with this ad hominem attack.
rat87
Also, I couldn't find the author's age, but judging by the fact that they say they still use an Amiga, they're probably at least in their 40s (unless they're a huge retro fan)
john_the_writer
Also talked about C64's, like they lived through it. That brings them into their 50's. I got the feeling they've been at it a long time.
xandrius
Could you expand on how Figma is destroying the design world?
low_tech_punk
I'm curious too. I want to understand "Figma is destroying the design world on unprecedented scale by normalising bad UX, UI and product management found in Figma itself".
gtsop
Which points do you think the author has missed?
jibal
"It appears that author is from that newer generation"
No it doesn't. Nothing else you wrote here is true either.
rat87
He demolished Blow's argument, so I don't think it's important to bring up age, but it does seem like he is likely at least in his mid-forties. The Amiga was popular in the late 80s
> "The argument that software is advancing is obviously false" I still regularly use my beloved Amiga computers. A couple of weeks ago, I was copying a file to an Amiga's hard drive when the computer suddenly decided to crash on me. Not because I did something wrong, but because old home computer OS:es weren't very stable. This resulted in a corrupted hard drive partition and the OS failed to re-validate the file system. My only option was, in the end, to re-format the partition.
I don't even know much about design but I can tell your claims about Figma are totally wrong, in a similar fashion to Blow's claims. It's the nostalgia talking. You're forgetting the fact that user interfaces have always had a lot of crap, just like software, just like everything. The positives of the old cream of the crop are remembered while all the crap, and even all the failings of the well-designed things, are forgotten
foobiekr
Figma absolutely sucks.
It is slow, but more importantly, it encourages UX designers to work in high-fidelity designs where live changes are frustrating, and feedback is entirely incorporated as a bunch of mostly-hidden comments in a sandbox that is hard for reviewers to follow up on or reference back to. All it's done is make everything worse.
rglover
Ill-considered abstractions, yes. There are a lot of abstractions that you can tell are the first draft or attempt but because of tech's "religion of speed"—and being drunk on hubris—end up shipping (vs being one of several iterations).
Then, if those abstractions are part of a popular project, other people copy them due to mimetics (under the watery banner of "best practices").
Rinse and repeat for ~10-20 years and you have a gigantic mess on your hands. Even worse, in a hyper-social society (ironically, because of the technology), social consensus (for sake of not being found out as an "impostor") around these not-quite-there solutions just propagates them.
FWIW, I love that Jonathan Blow talk and come back to it at least once per year. He doesn't say anything controversial, but deep down, I think a lot of developers know that they're not shipping their best (or guiding younger generations properly), so they get upset or feel called out.
We've just reached a state of culture in which chasing novelty has become commonplace (or in some cases, celebrated). Well-considered solutions used to be a cultural standard whereas now, anything that's new (whether or not it's actually good) has become the standard.
We can navel gaze and nitpick the details of Blow's argument all we want, but the evidence is front and center—everywhere. And yes, over a long enough horizon, that can lead to civilizational collapse (and when you consider the amount of broken stuff in the world, arguably, already is).
gtsop
It is unfortunate that someone needs to pick apart a flawed thesis in such detail (as the author did with Blow). The pure empiricist is equally as detached from reality as the pure theorist, and as such Blow is making up arguments just because they fit his experience, cherry-picking examples that fit his rants and promoting the exception to be the rule.
low_tech_punk
I think this post is related: We are destroying software (https://antirez.com/news/145)
I don't disagree that software engineering is not as rigorous as it was. But software has also spread into a much wider area, allowing many more people to participate as programmers, all thanks to abstractions.
I'm drawn to the romantic/pure/spartan aspect of low level programming too but I also need to make a living by solving problems in practical and scrappy ways.
cyberax
> I don't disagree that software engineering is not as rigorous as it was.
Such utter BS. Modern run-of-the-mill software would run stadiums around the software of the 80s or 90s.
Version tracking, continuous integration, and even _bug_ tracking were not at all common in the 90s.
Testing? That's so far ahead of the state of the art of even the 2000s that it's not even funny. Even basic unit testing was a huge step ahead in the 2000s. And fuzz testing, ubiquitous sanitizers, and time-travel debugging are all novel to mainstream development.
And that's not all. We're now not just talking about formal methods, but applying them in practice (see: WUFFS). And we now have safety-oriented low-level languages like Rust (that's just basically 10 years old).
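To be fair about how low the old bar was, "basic unit testing" here means something as small as the following, which today ships in the platform itself (a Node.js sketch using the built-in node:test runner; the function under test is made up):

    import { test } from "node:test";
    import assert from "node:assert";

    // A trivial function standing in for real application code.
    function add(a: number, b: number): number {
      return a + b;
    }

    test("add sums two numbers", () => {
      assert.strictEqual(add(2, 3), 5);
    });

Even this much was rare in mainstream 90s codebases, let alone running automatically on every change.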
recursivedoubts
I teach the systems class at Montana State, where we go from transistors up to a real computing system, and I have students that don't understand what a file system really is when they start my class.
I agree that Blow is wrong on some details, but I really think we need to be thinking hard about a NAND-to-Tetris style education that starts in high school for our technical students.
I use "outdated" models like Little Man Computer and a simple visual MIPS emulator which, even though they aren't realistic, at least give the students a sense of where we came from, with a level of complexity that a normal human can get their head around. When I look at modern 64 bit architecture books that get suggested for me to teach with I just laugh.
Anyway, not to say I agree 100% with Blow on everything, but connecting technology down to the roots is a hard problem.