Kill the "user": Musings of a disillusioned technologist
104 comments
· February 5, 2025 · rogual
rcarmo
Personally, I think the web is 200% to blame. It has wiped out generations of careful UX design (remember Tog on Interface?) and very, very responsive and usable native (and mostly standard-looking) UI toolkits in favor of bland, whitespace-laden scrolling nightmares that are instantly delivered to millions of people through a browser and thus create a low-skills, high-maintenance accretion disc of stuff that isn't really focused on the needs of users--at least not "power" ones.
I know that front-end devs won't like this, but modern web development is the epitome of quantity (both in terms of reach and of the insane amount of approaches used to compensate for the browser's constraints) over quality, and I suppose that will stay the same forever now that any modern machine can run the complexity equivalent of multiple operating systems while showing cat pictures.
Earw0rm
Front-end dev is optimising for a different set of constraints than HIG-era UIs.
Primarily that constraint is "looks good in a presentation for an MBA with a 30-second attention span". Secondarily "new and hook-y enough to pull people in".
That said...
HIG UIs are good at what they're good at. But there is an element of a similar phenomenon to how walled-gardens (Facebook most of all, but also Google, Slack, Discord..) took over from open, standards-based protocols and clients. Their speed of integration and iteration gave them an evolutionary edge that the open-client world couldn't keep up with.
Similarly if you look at e.g. navigation apps or recipe apps. Can an HIG UI do a fairly good job? Sure, and in ways that are predictable, accessible, intuitive and maybe even scriptable. But a scrolly, amorphous web-style UI will be able to do the job quicker and with more distinctive branding/style and less visual clutter.
Basically I don't think a standardised child-of-HIG formalised UI/UX grammar could have kept up with the pace of change of the last 10-15 years. Probably the nearest we have is Material Design?
mckn1ght
> walled-gardens (Facebook most of all, but also Google, Slack, Discord..) took over from open, standards-based protocols and clients. Their speed of integration and iteration gave them an evolutionary edge that the open-client world couldn't keep up with
Seems to me to be a combination of things, none of which indicate that the new products are implicitly better than the old. The old products could’ve incorporated the best elements of the new. But there are a few problems with that:
- legacy codebases are harder to change, it’s easier to just replace them, at least until the new system becomes legacy. slack and discord are now at the “helpful onboarding tooltip” stage
- the tooling evolved: languages, debuggers, IDEs, design tools, collaboration tools and computers themselves all evolved in the time since those HIG UIs were originally released. That partially explains how rapidly the replacements could be built. and, true, there was time for the UX to sink in and think about what would be nice to add, like reactjis in chat
- incentive structures: VCs throw tons of money at a competing idea in hopes that it pays off big by becoming the new standard. They can’t do that with either open source or an existing enterprise company
Analemma_
I also think the web is 200% to blame, but for a different reason: ad-tech in general and Google+Apple in particular taught users that software should cost $0. Once that happened they didn't go back, and it torpedoed the ISV market for paid programs. You used to go to CompUSA and buy software on a CD for $300; that can't happen now. Which would be fine, except adware filled the revenue gap, which by necessity brought a new set of design considerations. Free-as-in-beer software fucked us over.
leidenfrost
I was about to say the same thing.
It even happens in the FOSS world. Open Source theorists tell us all the time that "free" only means "free-as-in-freedom". That we can share the code and sell the builds.
But whenever someone actually wants to charge users money for their own FOSS apps, even if it's only a few bucks to pay for hosting and _some_ of the work, outraged users quickly fork the project to offer free builds. And those forks never, ever contribute back to the project. All they do is `git pull && git merge && git push`.
Maybe the Google+Apple move was a strategy against piracy. Or maybe it was a move against the FOSS movement. And maybe the obsession with zero-dollars software was a mistake. Piracy advocates thought they were being revolutionaries, and in the end we ended up with an even worse world.
AshamedCaptain
One example in MS Word is the ribbon. It is a relatively recent invention, and when it was introduced, _at least_ they went to the effort of using telemetry to guess which features were actually often utilized versus which ones were not, and designed the ribbons accordingly.
Nowadays every new "feature" introduced in MS Word is just randomly appended to the right end of the main ribbon. As it is now you open a default install of MS Word and at least 1/3 of the ribbon is stuff that is being pushed to the users, not necessarily stuff that users want to use.
At least I can keep customizing it to remove the new crap they add, but how long until this customization ability is removed for "UI consistency"?
scotty79
> At the core of Photoshop is a consistent, powerful, tightly-coded, thoughtfully-designed set of tools for creating and manipulating images. Once you learn the conventions,[...]
Photoshop has a terrible set of conventions. I'd take Macromedia Fireworks any day of the week instead. But Adobe bought Macromedia and gradually killed Fireworks over 8 years due to "overlap in functionality" between it and 3 other Adobe products.
That action pretty much enabled enshi*tification of Photoshop which wrapped the terrible core of Photoshop with terrible layers of web and ads.
rcarmo
I still run Fireworks under WINE in Fedora. The only real pain point is that the fonts are stupefyingly small in comparison to other apps due to the way modern GUIs and resolutions have grown (this is fixable, but not everywhere).
mistrial9
Fireworks is great, still used today
DanielHB
I think you have a bit of rose-tinted glasses; I remember back in the day how much people complained about every feature being shoved into the menu bars of Word and Photoshop, eventually growing the menus too long, with a bunch of features no one cared about obscuring the actually useful ones.
rcarmo
Yeah. This (and Fitts's Law) is actually why the ribbon UI came about. 80% of the most common stuff is right there, and you can customize or search for the rest.
ghaff
Yeah, I actually saw a presentation at Microsoft's Mix (the web-oriented conference they ran for a while) about the ribbon. I never loved the ribbon but it was really an attempt to deal with all the features that maybe 1% (at most) of users ever utilized but that those (many different) 1%s REALLY cared about.
One of the reasons I like Google Workspace. I'm mostly not a power user these days, so the simpler option set really works for me even if I very rarely run into something I can't quite do the way I'd prefer.
DanielHB
Not only that, but in Photoshop and Word, dialogs for features released in 1995 had different design conventions from dialogs for features released in 1999. Not the UI controls and stuff like that (everything was still using win32), but the design language itself: how to display things to the user, how to align buttons, and stuff like that.
If anything, these kinds of layout patterns are better today than they were back then. What the OP is complaining about was ALWAYS a problem for any long-lived software. But given how much _older_ some software is today, it is no wonder it is way more noticeable.
Win95 had a bunch of Win3.1 settings dialogs just like Win11 still has a bunch of Windows98 setting dialogs (the network adapter TCP/IP configuration one comes to mind).
I mean just look at this:
https://www.reddit.com/r/Windows10/comments/l85hsx/windows_3...
the_third_wave
> One day, all these products will be gone, and people will only know MBA-ware. They won't know it can be any other way.
Just like vinyl made a comeback, 'real' software will come back as well, maybe running in an emulator in a browser. Yes, there will be copyright problems, yes, there will be hurdles, but if and when MBA-ware becomes the norm, real software will persevere. Free software for sure, commercial software most likely, even if legally on shaky foundations. The tighter they grip, the more users will slip from their clutches.
patates
> vinyl made a comeback
Well, at least one of us must be living in a bubble.
batch12
I don't know about a full comeback, but it is really the only physical format I see for sale in major retail stores in the US (like Walmart, etc.)
The format had recently outpaced CDs [0], but I think that's partly due to a decline in CD sales too.
[0] (2023) https://www.bbc.com/news/64919126.amp
the_third_wave
It went from 'nearly dead, scrap the factories' to 'we need more manufacturing capacity to supply the increasing demand'. Not a bubble; I don't do vinyl but I know several people who do. The same will happen to 'real software' once the current iteration of productivity tools can no longer be distinguished from advertisements for such, once you need to really dig down to do simple things but have "AI enhanced autoclippy" under every "button". It is just the way things go no matter the field: 'craft' beer, vinyl, sourdough, vintage whatever. In some cases it is actually rational, in others (vinyl) it is mostly driven by emotional factors. The 'craft software' revival would be an example of a rational move.
AndrewKemendo
What a great description
We live in the timeline where the Ferengi were the ones who bootstrapped the Borg with their hyper-consumption trade culture
All will be consumed into the financial market (Borg cube) eventually
dist-epoch
> At the core of Photoshop is a consistent, powerful, tightly-coded, thoughtfully-designed set of tools for creating and manipulating images. Once you learn the conventions, it feels like the computer is on your side, you're free, you're force-multiplied, your thoughts are manifest.
It's funny that today there still isn't free image editing software comparable to the Photoshop of 2000. Krita is close, but still cumbersome to use.
misnome
I’m sure that it is a complete coincidence that CS6, the last before they moved to a subscription model, is the last time it was mostly nonsense-free
marsovo
Indeed. The trouble with subscriptions is that you don't need to make the new version actually good enough to convince people to upgrade, you just need to make it not bad enough for people to abandon the subscription entirely.
I think the same thing happened to Windows and Office.
Having said that, there's probably another elephant in the room, namely the current generation that grew up with phones and tablets and didn't really learn to use traditional computers fluently
noduerme
I would love to jettison Photoshop/Illustrator and just use Affinity. Illustrator has recently gone from taking 30 seconds to over 2 minutes to launch on my M1. It's an atrocity. But Adobe software is so entrenched in printing that, even though print media is only 10% or less of what I do these days, it would just be an endless headache to deal with file conversions to and from other designers, print shops and publishers. And anyway I expect Affinity will go the same way soon, now that they're owned by Canva.
rcarmo
I'm surprised nobody mentioned GIMP in the 2 hours since you wrote this (I still use Fireworks inside WINE).
card_zero
I'm not particularly surprised: who wants to try to hold up GIMP as an exemplar of a good interface? Maybe by arguing "it's not as bad as it was, and now it hardly sucks at all".
I'm an old Photoshop user who has GIMP now. I'd like to do a breakdown of everything that's wrong with its interface behavior, but analysing exactly how it does behave would be a major mission. There's something - several things - wrong with how it selects, moves, deselects, selects layers, and zooms, compared to what I expect for the workflow I try to have. Possibly this is just a matter of needing to learn new conventions, but possibly I have learned the GIMP conventions and they're just clunky.
Interesting, though, since this is organic, grass roots, free software interface crappiness, not the coercive corporate kind.
Jach
Indeed, especially because Krita is terrible for a lot of what I think of as "image editing"; I'd much rather use Gimp for work like that. Though I do quite like the recent AI stuff available to Krita at the moment; e.g. there's a plugin that lets you do an "object select" performed by AI, so e.g. you click on a person's shirt and it selects the shirt, or add other parts of the person (or draw a box) and get the person themselves, separate from the background. Or click on a bird, or speaker, or whatever. And you can use the ai-diffusion stuff to remove it, easier than the old heal tool techniques. (The selection is of course not perfect, but a great complement to the other selection tools that more or less overlap with Gimp's, though I prefer Gimp's knobs and behaviors after the selection. And I'm sure Photoshop has similar AI stuff by now, but I remember over the years it's seemed like a lot of stuff crops up in open source first, e.g. I think for quite a while "heal" / "smart patch" was just a script-fu plugin...)
I just appreciate that there are many options, and that they can talk to each other pretty well. If one becomes unusable, I have others, and sometimes there are newer ones. I did a stage banner project last December and had Gimp, Krita, and Inkscape all open at the same time. (With a quick use of an old version of Illustrator in Wine to export into an Illustrator template matching the dimension outlines and particular color space the printing company needed...)
Photoshop tried to be the everything tool, and it probably is and will continue to be the best kitchen sink (and if I knew it better and had a license, it probably could have sufficed by itself for my project), but for any specific thing there's going to be something else that's better for at least that thing (maybe even one of Adobe's other products, like Illustrator). Krita isn't competing with Photoshop so much as with Photoshop's usefulness in drawing and making art, and in that space are also Clip Studio Paint or Procreate on iPads, both quite popular with hobbyist and professional artists. Gimp isn't competing so much on the art creation side (or even making simple animations like Krita lets you do more easily) as it is on the editing and manipulation side. And when editing camera raws, you'd use Lightroom/Darktable/RawTherapee. Inkscape is vector graphics, a whole other use case and competitive landscape.
(Speaking of old/dead software, I remember using Xara Xtreme LX for a while, it was really slick...)
zoomerknowledge
Photopea
ge96
Windows has been annoying me with this notification that keeps popping up from time to time, "Hey, want to use Adobe?", in the bottom right corner.
reginald78
I just disable the notifications entirely. If you don't want me to view it as a garbage dump stop putting all your garbage there.
freetonik
I wrote a short blog post, "User is Dead", a while ago: https://rakhim.org/user-is-dead/
> User is dead. User remains dead. And we have killed him. How shall we comfort ourselves, the developers, the designers, the growth hackers? What was holiest and the final judge of all that the world has yet owned has bled to death under our A/B-tests and new features. Who will wipe this blood off us? What garbage collector is there for us to clean ourselves? What conference of atonement, what disruptive technology, what sacred meeting shall we have to invent?
Tasteless, but I felt, and still feel, like the notion of a user is truly lost. Somehow, the only technology which technically allows direct 1-many relationships between a small group of builders and a vast number of users, managed to create an industry which actively prevents and disincentivizes such relationships.
crabbone
> Somehow?
Oh no! Working with end-users is madness. It's tiring, exhausting, counterproductive. Users don't know what they want, and will demand the worst possible solution for them. They will resist and circumvent security measures in the program. They will make sure to use the program in an unintended way and then endlessly complain about it not working in the way it was never meant to work.
The day I transitioned from B2C to B2B my emotional well-being improved tenfold.
Now, on a more serious note: making an individual user facing product is more expensive. It's easier to pitch and sell the product to an organization managing multiple users because the organization will agree and compromise on many aspects of the product, and then will create internal organizational policy for its users to use the product only in permitted ways. It will make feedback and improvements requests expensive for the users and will serve both as the customer surveyor and as the first tier of the customer support for the software shop. It's a match made in heaven (and that's how Microsoft and the likes built their empire). There's nothing surprising about that.
card_zero
See also: customers, and clients. Getting in the way, wanting things, causing chaos. Every business runs much more smoothly without them.
rjbwork
>Somehow, the only technology which technically allows direct 1-many relationships between a small group of builders and a vast number of users, managed to create an industry which actively prevents and disincentivizes such relationships
Bean counters and social-status game players came in, spewed money everywhere, and said to the engineers: "do your engineer thing, we'll handle the money and the people".
kevingadd
I think this string of questions from the middle of the post really gets to the heart of it:
⁃ Does the “user” feel respected by the software?
⁃ How does this software affect the mental health of the “user”?
⁃ How does the software fit into the rest of the “user’s” lifestyle?
⁃ Does this software help the “user” perform a task/entertain them without coercion?
A lot of modern software looks really bad if evaluated through the lens of these four questions.
dist-epoch
> How does this software affect the mental health of the “user”?
We need to talk about Jira...
dartos
Yeah… it’s awful.
The user is often not the end customer of any software, so it’s not optimized for their benefit.
atoav
This is why I predominantly use CLI tools where it makes sense. Two CLI tools are more alike than two AI tools, they tend to respect the user and their intelligence more, there is no sign-in, they work together with other software, and they will work for decades.
I'd love it if GUI applications were similarly stringent or even had the goal of creating an ecosystem, but they don't; they are competing against each other, trying to grab user attention, bending users to their will, locking them in. Not all of them of course, but the mental overhead with CLI is much smaller.
hello_computer
CLI presently selects for users with reading comprehension, who are a hard sell for trojan horses. If the great unwashed ever took a shine to CLIs, CLIs would become just as bad. You can already see a bit of this in PowerShell (i.e. marketing & telemetry).
teucris
I’ve read a lot of articles like this over the past decade, and books, and papers. I agree with most of it: we need to bring technology back “to the people”. But everything I’ve read, including this, has two problems:
1. Consistently, people reminisce about older tech that they loved, and wishing they could have stuff like that again. But the reality is that people like us, on HN, are not the average person. The tools we adored were great to us because we knew how to wield their power and felt empowered to do so. What parts of those applications could the broader population use easily?
2. What should we be using to create personal/folk/situated software? How do we even accomplish this goal? Again, we (HN readers) know the tools and feel empowered. But for the ideas in this article to come to light, many more people need to be empowered to solve their own problems and tailor their tools to their needs. What technologies should they use? I never get a good answer to that one.
ickelbawd
I’ve been coming around to a similar point of view that modern software technology removes human agency. Everything is being automated—we thought it would free up our time for other things, but to my eyes we’ve become less free. AI is only going to accelerate this phenomenon—robbing us of even higher levels of agency as well as our ability to think independently and deeply. All in the name of efficiency and engagement. I struggle with this daily since I work in and have been steeped in tech for decades. I used to love it. Part of me still sees the good that modern technology has enabled too. I’m not sure what the solution is here besides logging off the internet and returning to live in the real world with real human interaction and slower, more meaningful connections.
PaulHoule
I like the sentiment but the article itself ought to be edited to be 1/3 the length, certain themes should be broken out. The story of 'cyborg software' should be told in depth, for those of us who live that dream it doesn't need a lot of explanation but for other people it needs to be spelled out.
There are some interesting business patterns, such as 'investment then disinvestment', which is common in communications applications. An application like Zoom, for instance, needs a lot of engineering work to deal with old computers, new computers, people with four cameras attached to their computers, people with slow internet connections, etc. A 'just works' experience is essential and the money is there to support it. A decade later it will be in a harvesting phase, the money won't be there to fix funny little bugs, and now it 'just doesn't work'.
There are other problems around recurring subscriptions, for which Creative Cloud is hardly the worst example. See the gaming industry, which has moved away from experiences that are 20-120 hours toward wanting to be the one game you play for decades.
tobr
I got into interaction design and UX design because so much of it was so bad. At some point along the way, it seems we got too good at it. Many of the points in this article frankly make me feel somewhat embarrassed to describe myself as a UX designer. Maybe I should start to think of myself as a… personal computing designer? small software designer? dignity designer? (No, surely designing ”dignity” is somehow an even worse pretense than designing ”experience”.)
gyomu
We lost the plot when interface design became UI/UX (and all the associated modern variants on this terminology).
The goal of an interface is clear: it is how a human interacts with a machine. Buttons, dials, latches, sliders - those are interfaces. We can reason about them, make taxonomies, determine what operations they are appropriate for (or not), and so on.
“User experience” tries to capture everything into a nebulous haze that exists not to serve a human with a task to accomplish that a tool will assist with - but a business and how it will capture “users” and guide them on the “journey” it deems most appropriate to reach its sales goals.
Design students won’t be able to formulate a cogent thought on what the properties of appropriate interface feedback are, but they’ll be great at cranking out “personas” and sleek landing pages that enumerate marketing points. Something’s rotten.
samiv
We lost the plot when the MBA ass clowns took over and started adding "engagement" and other features that only serve the interest of the developer not the user.
The typical corporate software now primarily serves the needs of the developer, and the user is secondary. In many cases the user has no choice but to succumb (due to lack of competition, or enterprise/workplace policies, etc.) and eat their frustration, because there's no alternative, or they're not allowed to use the alternative, or the alternative is equally trash.
eXpl0it3r
I disagree that UI/UX as interface design is where we lost the plot. After all, UI mostly refers to the looks and UX to the interaction; both have always existed and on their own are certainly not bad.
For me, one of the biggest issues is that neither the developer nor the UI/UX designer are actual users of the software they write. If you don't actually understand how users end up using the software (or even skip the interviews with users entirely), you'll never be able to write software that truly serves the users.
Additionally, you have the separation between developers and UI/UX, which leads to another area where trade-offs need to be made to accommodate (new) requirements. In that sense you might be right: when the developers created everything, they were able to shape something way more efficient and concise, but often also at the cost of looks and potential accessibility.
A second issue I see is that software these days is always developed for the average user, whereas in the past software was developed for experts. As such, the look becomes "simplified" and the UX dictates a layout that is cumbersome to use for experts and power users.
slfnflctd
I feel compelled to say that this is the most concise and accurate summary of the whole mess I've yet seen.
When you take something straightforward which is grounded in hard logic and measurable outcomes, and try to combine it with abstract concepts about feelings and organizational goals that overlap with multiple philosophies, a massive amount of space for endless argument over details is created.
card_zero
I found an archive of Apple's Human Interface Guidelines.
Here's 1985:
https://archive.org/details/apple-hig/1985_Apple_II_Human_In...
Here's 2013:
https://archive.org/details/apple-hig/MacOSX_HIG_2013_10_22/...
(More in sidebar)
Do they still have these? Anyway I present it as a curiosity. Maybe they're heroes for sticking relatively closely to objective considerations like how a slider should behave and when to go fullscreen, or maybe it's all their fault for being pretentious and tastefully elegant and dumbing things down with a 1-button mouse in the first place. Maybe it shows a smooth evolution from good intentions, and successful corralling of terrible erratic interface choices (from the dawn of time, i.e. the 80s and 90s) into something logical and standard, through into an inevitable rise of nebulous "user experience" over function. Or maybe it was all nebulous and abstract from the start, and the high point was in the middle, somewhere around 2000, where it went through a phase of nitty-gritty good sense which faded away again, I don't know.
andrepd
> Something's rotten
It's called late capitalism. We cannot structure a society around not only profit, but infinite growth, and expect that not to give us any problems. It does.
You cannot just sell as many copies of Photoshop as last year: you must sell more, each quarter, and at a higher price. You cannot sell as many phones as last year: you must make batteries non-removable so phones break quicker, you must spend 3x your R&D budget on ads to manipulate people into buying an identical phone to last year's. Etc etc
moron4hire
I think the big red flag with UI/UX was that it was a new term invented for a concept we already had: HCI. Human-Computer Interaction is what it was called when companies like Apple and Microsoft were releasing style guides for application developers to adhere to, to ensure their applications remained consistent with every other application on the system.
By inventing a new term, that generation of designers signalled that they were ignorant of that past work. And the specific choice of terminology further signalled the shift in values for the designer. It was no longer about creating optimal interface points between the human and the computer, it was about creating "an experience".
It centered on creating a distinct brand identity for the application through the design, a goal that is anathema to the goals of HCI. Because of this, it was not enough to use the native UI toolkit of the system, which had been carefully designed around a consistent experience. It became necessary to reinvent UI tooling from scratch, with the primitive elements of the browser engine readily available for doing so. The cross-platform nature of the browser also presented an opportunity for the MBA- and SV-driven focus on monopoly to pursue total market dominance, rather than just creating one good app for the users you have on the one system.
skydhash
> It centered creating a distinct brand identity for the application through the design, a goal that is anathema to the goals of HCI
Not really, as the constraints of HCI are relatively loose. The fact is that current UX designers don't read the literature and don't take usability into account. Instead, UI design is just aesthetics and UX is ruled by sales.
A platform should be consistent, but you can add your own variation as long as it's usable, which most UI toolkits allow.
atoav
UX can be used and abused. The biggest sin is that instead of thinking about the big picture of someone sitting at their computer and trying to do a thing that requires multiple programs to work together, bad UX designers assume they start from scratch without any existing environment and have only to answer to their own project.
Imagine if something like command-line pipes existed for GUI applications, and then ask yourself why they don't (the closest thing might be copy/paste).
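As a rough illustration of the kind of composition being referred to (the file name is just a hypothetical example), here is how several standard Unix tools, written independently of each other, cooperate through pipes with no sign-in and no shared vendor:

```sh
# Split a text file into words, count the unique ones, show the five most common.
# tr, sort, uniq and head know nothing about each other; the pipe is the contract.
tr -cs '[:alpha:]' '\n' < notes.txt | sort | uniq -c | sort -rn | head -5
```

Nothing equivalent lets you feed the output of one GUI application into another without going through files or the clipboard.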
If you as a UX designer try to empower your users and do that in a way that does not break all existing conventions without a good reason, you will be fine.
An industrial designer can design medical injection systems for drug cartels with the goal of catching more sheep, or for a medical non-profit with the goal of making actual medical help cheaper — design is not the problem, the business interest it might be used for is.
Designers aren't always in the position to make their decisions freely, but you can put your weight behind the right side and everybody in here will appreciate you for doing so.
ben_w
> No, surely designing ”dignity” is somehow an even worse pretense than designing ”experience”
Much.
"Dignity" seems be be mostly used when it is missing — without any dignity, lost their dignity, helping others regain their dignity…
And Dignitas.
"Experience" seems still positive to me, at worst a bit cliché.
samiv
Based on my 25 yoe in the industry the article is extremely cynical but unfortunately very much spot on.
Today in the corporate world the MBAs are running the show, and they need continuous growth and engagement. These are all needs of the developer, and they manifest themselves in the software as features that are only there to serve the developer, not the user, often combined with (deliberate) dark patterns and confusing UX just to drive up the KPIs.
What a load of BS.
And the user often has no choice but to eat their frustration due to lack of alternatives, corporate/workplace policies or the alternatives being equally bad.
Unfortunately I expect the whole concept of "PC" as in Personal Computing is going to go away and will be replaced by "locked down, you don't own shit computing".
All the major platforms today are more or less locked down (Android, iOS, macOS), and Windows is on its way. I expect in the next 5 years Microsoft will introduce mandatory app signing and will lock the platform down so that the Microsoft Store is the only place for installing apps. They can then shove all the Candy Crush, Azure, etc. garbage in the face of users who have no choice but to eat it.
Linux is the only bastion of hope, but unfortunately that's still (on the desktop) alpha quality and will likely never move past alpha.
Hopefully I can retire soon.
WhyNotHugo
> I expect in the next 5 years Microsoft will introduce mandatory app signing and will lock the platform down so that the Microsoft store is the only place for installing apps.
This is already the case in lower end hardware.
> Linux is the only bastion of hope, but unfortunately that's still (on the desktop) alpha quality and will likely never move past alpha.
I fear that a lot of major software is suffering from a similar issue: targeting a fantasy “user” who is both literate enough to install, maintain and use Linux, but illiterate enough that displaying keyboard shortcuts or showing a menu when right-clicking would confuse them.
There’s little effort to stabilise software. The priority is often “I want all apps to use my chosen theme”, and not “I want apps that work out of the box”.
And there’s a huge trend of “X for gnome” or “Y for KDE”. Portable software has a become a niche thing.
kbolino
I think a bifurcation is likely. You can't program these locked-down devices/systems on themselves. At the end of the day, some of us have to be able to write the code that makes these things work. So I think there will always be "unlocked" or "developer" operating systems and hardware. But those systems will be considered "too powerful" or "too dangerous" for mundane tasks, especially entertainment.
Right now, the legacy operating systems like Windows and macOS are trying to straddle the line. I think in the not too distant future you'll have to choose between being able to run a debugger and hack the OS vs. being able to watch shows, play video games, and access your bank account. I'm not exactly sure how the split will happen, it could be through totally different ecosystems, or an irreversible flag in the firmware, or maybe something else entirely, but it seems almost inevitable at this point.
stuartjohnson12
Android has already gone this way - certain features like access to cardless payments are disabled if you're running a "not locked down" version of Android in any way.
I bought a new 2nd hand phone recently and learned that one the hard way when I went to pay for my journey to work. The previous owner had enrolled the phone in the Android beta program, which, this being Google, is remotely controlled and required me to spend half an hour Googling to understand how to unenrol myself and my phone.
skydhash
I don't mind locked-down hardware as long as it's usable. I don't open a microwave or a TV for fun, but I mind the experience while using them. If you want to do one thing, do it well. And do it even after the manufacturer collapses, unless there's a service involved. What I don't like is when it's my computing resources that are involved, but they want to reassure the mothership that it's legitimate use.
Let's take an ereader as an example. I can buy it knowing that it can only display books from Amazon Kindle and that I need an active subscription or a license for the book. Or I can buy it knowing that I can display any ebook file I have. It's one or the other, or both together independently.
Which is why I don't mind the old App Store model. I know that it's locked down and that I can license apps which will be tied to my account. When an app is no longer supported, I can still download the version that works. But now licenses are temporary, so as soon as you're not able to pay, the app breaks.
WhyNotHugo
> I think a bifurcation is likely.
Surely Apple has some other OS used internally for debugging/testing hardware prototypes. I heard rumours that it's Linux, but it might just be something else built on their BSD core. Whatever it is, it's kept secret and not available to the general population.
> I think in the not too distant future you'll have to choose between being able to run a debugger and hack the OS vs. being able to watch shows, play video games, and access your bank account.
This is already the case on Android. Technically on iOS too; you can't run a debugger there at all.
openrisk
> Folk music is enmeshed in a particular culture. It is knowledge transmitted across generations, but which evolves to meet the challenges of the times and changing sensibilities.
This is a beautiful and unexpected connection. Drawing analogies between software and other forms of cultural expression is a long overdue mental shift. The use of linguistic expressions such as "tech" and "engineering" highlights the prevailing desire to think of software as some sort of thing apart, less social, less political (and thus something we can profitably pursue with fewer moral qualms).
The switch from the original mentality of software as a product (literally shipped in a box), to the current business model of "user as a product" and software merely being the bait and hook is so profound that we are not really talking about the same industry anymore.
It's not clear where the strange and twisted journey of software will lead. Infinite reproducibility at zero cost is not something current economic systems can handle. Either the enshittification continues, further enshittifying society, or open source becomes the norm.
layer8
This reminds me of the concept of “user” as presented in the 1982 Tron movie, where users were regarded as sovereigns with god-like powers by the software (“programs”). The notion of “user” in the article is almost the reverse. We should return to that older conception.
lokimedes
I like the idea of either augmented computing or calm computing. Either it fits comfortably on/in me, or it is seamlessly integrated into the ambient environment. The basic idea that we are a mechanistic factory line, where computers are industrial tools and I have to master my role in the process or get my hands mangled, is sickening.
I really hope the LLM wave makes this view realistic. We still mostly see tools that enhance our use of tools, but I believe the LLM's strongest functionality is that it provides efficient translation between human needs and machine instructions.
skydhash
What about just computing? Giving us tools and the manuals that come with them. You bought a computer, you bought/got a set of software that helps accomplish some tasks, and that's it. Mac and macOS used to be that, but they've wrapped it up in hazardous "services" brought in by MBAs. Windows used to be that, more versatile and fragile, but they've destroyed it. Linux is that, but you have to learn CLI speak.
You can see this in venerable software which has lived through the times of "designing for the user" and is still being developed in the times of "designing for the business".
Take Photoshop, for example, first created in 1987, last updated yesterday.
Use it and you can see the two ages like rings in a tree. At the core of Photoshop is a consistent, powerful, tightly-coded, thoughtfully-designed set of tools for creating and manipulating images. Once you learn the conventions, it feels like the computer is on your side, you're free, you're force-multiplied, your thoughts are manifest. It's really special and you can see there's a good reason this program achieved total dominance in its field.
And you can also see, right beside and on top of and surrounding that, a more recent accretion disc of features with a more modern sensibility. Dialogs that render in web-views and take seconds to open. "Sign in". Literal advertisements in the UI, styled to look like tooltips. You know the thing that pops up to tell you about the pen tool? There's an identically-styled one that pops up to tell you about Adobe Whatever, only $19.99/mo. And then of course there's Creative Cloud itself.
This is evident in Mac OS X, too, another piece of software that spans both eras. You've still got a lot of the stuff from the 2000s, with 2000s goals like being consistent and fast and nice to use. A lot of that is still there, perhaps because Apple's current crop of engineers can't really touch it without breaking it (not that it always stops them, but some of them know their limits). And right next to and amongst that, you've got ads in System Settings, you've got Apple News, you've got Apple Books that breaks every UI convention it can find.
There are many such cases. Windows, too. And MS Word.
One day, all these products will be gone, and people will only know MBA-ware. They won't know it can be any other way.